WorldWideScience

Sample records for survival probability parameters

  1. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
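
    The location-scale result invites a quick numerical check. The following is a minimal Monte Carlo sketch, assuming log-normal losses and a plug-in maximum-likelihood threshold (it is not the authors' closed-form calculation); the realized failure frequency comes out above the nominal level, and, as the abstract notes, the size of the gap does not depend on the true parameters.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu, sigma = 0.0, 1.0                 # hypothetical true log-normal parameters
    n, eps, trials = 50, 0.01, 20_000    # sample size, nominal failure prob., MC runs

    failures = 0
    for _ in range(trials):
        sample = rng.lognormal(mu, sigma, size=n)
        logs = np.log(sample)
        mu_hat, sigma_hat = logs.mean(), logs.std(ddof=0)   # plug-in MLE
        # threshold chosen so the *estimated* distribution fails with probability eps
        threshold = np.exp(mu_hat + sigma_hat * norm.ppf(1 - eps))
        # a fresh draw of the risk factor exceeding the threshold is a failure
        failures += rng.lognormal(mu, sigma) > threshold

    print(f"nominal: {eps}, realized frequency: {failures / trials:.4f}")
    ```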

  2. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    This NASA technical memorandum presents four papers on the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. The class provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.

  3. A nonparametric method for predicting survival probabilities

    NARCIS (Netherlands)

    van der Klaauw, B.; Vriend, S.

    2015-01-01

    Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose

  4. Energy dependence of gap survival probability and antishadowing

    OpenAIRE

    Troshin, S M; Tyurin, N. E.

    2004-01-01

    We discuss the energy dependence of the gap survival probability that follows from the rational form of amplitude unitarization. In contrast to the eikonal form of unitarization, which leads to a decreasing energy dependence of the gap survival probability, we predict a non-monotonic form for this dependence.

  5. Duality of circulation decay statistics and survival probability

    Science.gov (United States)

    2010-09-01

    Survival probability and circulation decay history have both been used for setting wake turbulence separation standards. Conceptually, a strong correlation should exist between these two characterizations of the vortex behavior; however, the literatur...

  6. [Analysis of survival and mortality curves with the model of vital receptors. The maximal life span. Effect of temperature on the life span. The mortality probability density function (mortality curve) and its parameters].

    Science.gov (United States)

    Poltorakov, A P

    2001-01-01

    We have continued the analysis of survival curves with the model of vital receptors (MVR). The main types of survival function (E-, TW- and GM-distributions) have been considered. It was found that the maximal life span depends on the threshold concentration of vital receptors. Equations are obtained for the dependence of the maximal life span on the kinetic parameters of the reactions of inactivation and destruction. The dependence of the maximal life span on the initial size of the population has also been considered. The influence of temperature on the survival curves is analysed with the E-distribution, and equations are derived for the description of thermosurvival and thermoinactivation curves. Equations are also obtained for the dependence of the mortality probability density function and its characteristics (modal and antimodal age, coefficient of asymmetry) on the MVR parameters. It was shown that the E-, TW- and GM-distributions have different types of asymmetry, and that the coefficient of asymmetry of the GM-distribution is related to the MVR parameters. It is assumed that the symmetry of the curves of mortality and birth rate is coordinated by the mechanisms of the MVR.

  7. Coverage Probability of Wald Interval for Binomial Parameters

    OpenAIRE

    Chen, Xinjia

    2008-01-01

    In this paper, we develop an exact method for computing the minimum coverage probability of the Wald interval for estimation of binomial parameters. A similar approach can be used for other types of confidence intervals.
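
    The minimum being computed can be approximated by brute force on a dense grid of true parameter values; the sketch below does only that, whereas the paper gives an exact method. Note that the minimum occurs near the boundaries of the parameter space, where Wald coverage collapses.

    ```python
    import numpy as np
    from scipy.stats import binom, norm

    def wald_coverage(p, n, alpha=0.05):
        """Probability that the Wald interval for Binomial(n, p) covers p."""
        z = norm.ppf(1 - alpha / 2)
        k = np.arange(n + 1)
        phat = k / n
        half = z * np.sqrt(phat * (1 - phat) / n)
        covers = (phat - half <= p) & (p <= phat + half)
        return binom.pmf(k[covers], n, p).sum()

    n = 30
    grid = np.linspace(0.001, 0.999, 5001)   # dense grid of true p values
    cov = np.array([wald_coverage(p, n) for p in grid])
    i = cov.argmin()
    print(f"approx. minimum coverage for n={n}: {cov[i]:.3f} at p={grid[i]:.3f}")
    ```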

  8. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  9. Seasonal survival probabilities suggest low migration mortality in migrating bats.

    Directory of Open Access Journals (Sweden)

    Simone Giavi

    Migration is adaptive if survival benefits are larger than costs of residency. Many aspects of bat migration ecology, such as migratory costs, stopover site use and fidelity, are largely unknown. Since many migrating bats are endangered, such information is urgently needed to promote conservation. We selected the migrating Leisler's bat (Nyctalus leisleri) as model species and collected capture-recapture data in southern Switzerland year round during 6 years. We estimated seasonal survival and site fidelity with Cormack-Jolly-Seber models that accounted for the presence of transients, fitted with Bayesian methods, and assessed differences between sexes and seasons. Activity peaked in autumn and spring, whereas very few individuals were caught during summer. We hypothesize that the study site is a migratory stopover site used during fall and spring migration by most individuals, but there is also evidence for wintering. Additionally, we found strong clues for mating during fall. Summer survival, which included two major migratory journeys, was identical to winter survival in males and slightly higher in females, suggesting that the migratory journeys did not bear significant costs in terms of survival. Transience probability was higher in males than in females in both seasons. Our results suggest that, similarly to birds, Leisler's bat also uses stopover sites during migration with high site fidelity. In contrast to most birds, the stopover site was also used for mating, and migratory costs in terms of survival seemed to be low. The transient analyses highlighted strong individual variation in site use, which makes the study and modelling of these populations, as well as their conservation, particularly challenging.

  10. Seasonal survival probabilities suggest low migration mortality in migrating bats.

    Science.gov (United States)

    Giavi, Simone; Moretti, Marco; Bontadina, Fabio; Zambelli, Nicola; Schaub, Michael

    2014-01-01

    Migration is adaptive if survival benefits are larger than costs of residency. Many aspects of bat migration ecology, such as migratory costs, stopover site use and fidelity, are largely unknown. Since many migrating bats are endangered, such information is urgently needed to promote conservation. We selected the migrating Leisler's bat (Nyctalus leisleri) as model species and collected capture-recapture data in southern Switzerland year round during 6 years. We estimated seasonal survival and site fidelity with Cormack-Jolly-Seber models that accounted for the presence of transients, fitted with Bayesian methods, and assessed differences between sexes and seasons. Activity peaked in autumn and spring, whereas very few individuals were caught during summer. We hypothesize that the study site is a migratory stopover site used during fall and spring migration by most individuals, but there is also evidence for wintering. Additionally, we found strong clues for mating during fall. Summer survival, which included two major migratory journeys, was identical to winter survival in males and slightly higher in females, suggesting that the migratory journeys did not bear significant costs in terms of survival. Transience probability was higher in males than in females in both seasons. Our results suggest that, similarly to birds, Leisler's bat also uses stopover sites during migration with high site fidelity. In contrast to most birds, the stopover site was also used for mating, and migratory costs in terms of survival seemed to be low. The transient analyses highlighted strong individual variation in site use, which makes the study and modelling of these populations, as well as their conservation, particularly challenging.
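
    For readers unfamiliar with the machinery, the sketch below is a minimal Cormack-Jolly-Seber likelihood with constant survival (phi) and recapture (p) probabilities, fitted by maximum likelihood to toy capture histories; the transient, seasonal and Bayesian structure of the actual study is omitted.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def cjs_nll(params, histories):
        """Negative log-likelihood of a constant-(phi, p) CJS model,
        conditional on each individual's first capture."""
        phi, p = 1 / (1 + np.exp(-params))        # logit scale -> probabilities
        T = histories.shape[1]
        chi = np.ones(T)                          # chi[t] = Pr(never seen after t)
        for t in range(T - 2, -1, -1):
            chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
        ll = 0.0
        for h in histories:
            seen = np.flatnonzero(h)
            first, last = seen[0], seen[-1]
            for t in range(first, last):          # occasions between sightings
                ll += np.log(phi) + np.log(p if h[t + 1] else 1 - p)
            ll += np.log(chi[last])               # never seen after last sighting
        return -ll

    # toy capture histories: rows = individuals, columns = capture occasions
    H = np.array([[1,1,0,1], [1,0,1,1], [1,1,1,0], [1,0,0,0], [1,1,0,0]])
    fit = minimize(cjs_nll, x0=np.zeros(2), args=(H,))
    phi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
    print(f"phi = {phi_hat:.2f}, p = {p_hat:.2f}")
    ```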

  11. Survival probability and order statistics of diffusion on disordered media.

    Science.gov (United States)

    Acedo, L; Yuste, S B

    2002-07-01

    We investigate the first passage time t(j,N) to a given chemical or Euclidean distance of the first j of a set of N>1 independent random walkers all initially placed on a site of a disordered medium. To solve this order-statistics problem we assume that, for short times, the survival probability (the probability that a single random walker is not absorbed by a hyperspherical surface during some time interval) decays for disordered media in the same way as for Euclidean and some class of deterministic fractal lattices. This conjecture is checked by simulation on the incipient percolation aggregate embedded in two dimensions. Arbitrary moments of t(j,N) are expressed in terms of an asymptotic series in powers of 1/ln N, which is formally identical to those found for Euclidean and (some class of) deterministic fractal lattices. The agreement of the asymptotic expressions with simulation results for the two-dimensional percolation aggregate is good when the boundary is defined in terms of the chemical distance. The agreement worsens slightly when the Euclidean distance is used.

  12. Probability density functions of instantaneous Stokes parameters on weak scattering

    Science.gov (United States)

    Chen, Xi; Korotkova, Olga

    2017-10-01

    The single-point probability density functions (PDF) of the instantaneous Stokes parameters of a polarized plane-wave light field scattered from a three-dimensional, statistically stationary, weak medium with Gaussian statistics and Gaussian correlation function have been studied for the first time. Apart from the scattering geometry the PDF distributions of the scattered light have been related to the illumination's polarization state and the correlation properties of the medium.

  13. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First grade College, Department of Physics, Kolar, Karnataka (India)

    2017-05-15

    As part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of {sup 289-297}Ts, we have calculated the transmission probability (T{sub l}), compound nucleus formation probability (P{sub CN}) and survival probability (P{sub sur}) of possible projectile-target combinations. We have also studied the fusion, survival and fission cross sections for different projectile-target combinations leading to {sup 289-297}Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction system. The most probable reactions to synthesize the super heavy nuclei {sup 289-297}Ts are worked out and listed explicitly. We have also studied the variation of P{sub CN} and P{sub sur} with the mass numbers of the projectile and target nuclei. This work is useful for the synthesis of the super heavy element Z = 117. (orig.)

  14. Modeling of thermal stresses and probability of survival of tubular SOFC

    Energy Technology Data Exchange (ETDEWEB)

    Nakajo, Arata [Laboratory for Industrial Energy Systems (LENI), Faculty of Engineering, Swiss Federal Institute of Technology, 1015 Lausanne (Switzerland); Stiller, Christoph; Bolland, Olav [Department of Energy and Process Engineering, Norwegian University of Science and Technology, Trondheim N-7491 (Norway); Haerkegaard, Gunnar [Department of Engineering Design and Materials, Norwegian University of Science and Technology, Trondheim N-7491 (Norway)

    2006-07-14

    The temperature profile generated by a thermo-electro-chemical model was used to calculate the thermal stress distribution in a tubular solid oxide fuel cell (SOFC). The solid heat balances were calculated separately for each layer of the MEA (membrane electrode assembly) in order to resolve the radial thermal gradients more precisely. It appeared that the electrolyte undergoes high tensile stresses in limited areas at the ends of the cell and that the anode is subjected to moderate tensile stresses. A simplified version of the widely used Weibull analysis was used to calculate the global probability of survival for the assessment of the risks related to both operating points and load changes. The cell at room temperature was also considered and proved to be critical. As a general trend, the computed probabilities of survival were too low for the typical requirements of a commercial product. A sensitivity analysis showed a strong influence of the thermal expansion mismatch between the layers of the MEA on the probability of survival. The lack of knowledge of mechanical material properties, as well as uncertainties about the phenomena occurring in the cell, proved to be a limiting factor for the simulation of thermal stresses. (author)
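
    In analyses of this kind, the global probability of survival is typically a weakest-link product of element-wise two-parameter Weibull terms. A minimal sketch with hypothetical stresses, volumes and material parameters follows; the paper's simplified Weibull analysis may differ in detail.

    ```python
    import numpy as np

    def weibull_survival(stress, volume, m, sigma0, v0=1.0):
        """Two-parameter Weibull probability of survival of a volume element
        under (approximately uniform) tensile stress."""
        stress = np.maximum(stress, 0.0)          # compressive stress ignored
        return np.exp(-(volume / v0) * (stress / sigma0) ** m)

    # hypothetical element-wise tensile stresses (MPa) and volumes (mm^3)
    stresses = np.array([80.0, 120.0, 95.0, 60.0])
    volumes = np.array([0.5, 0.4, 0.6, 0.5])
    m, sigma0 = 8.0, 150.0                        # assumed Weibull modulus and scale

    # weakest-link assumption: the cell survives only if every element survives
    ps_global = np.prod(weibull_survival(stresses, volumes, m, sigma0))
    print(f"global probability of survival: {ps_global:.4f}")
    ```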

  15. Probability of detection as a function of multiple influencing parameters

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato

    2014-10-15

    Non-destructive testing is subject to measurement uncertainties. In safety-critical applications, an assessment of the reliability of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter influencing the probability of detection (POD) of the flaw, which is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for structural integrity. This dissertation investigates applications in which several factors have an important influence on the POD. To devise a reliable estimate of the NDT system capability, it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed, which enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. In addition, a technique called the volume POD was developed to present POD data spatially. POD data coming from multiple inspections of the same component with different sensors are fused to arrive at the overall POD of the inspection system.
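
    One natural way to realize a multi-parameter POD model is binary regression on several influencing parameters rather than on flaw size alone. The sketch below fits such a model to synthetic hit/miss data, with flaw depth as an assumed second parameter; the dissertation's actual model may differ.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 400
    df = pd.DataFrame({
        "a": rng.uniform(0.5, 5.0, n),        # flaw size (mm)
        "depth": rng.uniform(0.0, 30.0, n),   # assumed second influencing parameter
    })
    # synthetic outcomes: detectability grows with size and falls with depth
    eta = -3.0 + 2.0 * np.log(df["a"]) - 0.05 * df["depth"]
    df["hit"] = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

    pod = smf.logit("hit ~ np.log(a) + depth", data=df).fit(disp=0)
    new = pd.DataFrame({"a": [3.0], "depth": [10.0]})
    print(f"POD(a=3 mm, depth=10 mm) = {pod.predict(new).iloc[0]:.3f}")
    ```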

  16. Time preference and its relationship with age, health, and survival probability

    Directory of Open Access Journals (Sweden)

    Li-Wei Chao

    2009-02-01

    Although theories from economics and evolutionary biology predict that one's age, health, and survival probability should be associated with one's subjective discount rate (SDR), few studies have empirically tested for these links. Our study analyzes in detail how the SDR is related to age, health, and survival probability by surveying a sample of individuals in townships around Durban, South Africa. In contrast to previous studies, we find that age is not significantly related to the SDR, but both physical health and survival expectations have a U-shaped relationship with the SDR. Individuals in very poor health have high discount rates, and those in very good health also have high discount rates. Similarly, those with expected survival probabilities at the extremes have high discount rates. Therefore, health and survival probability, and not age, seem to be predictors of one's SDR in an area of the world with high morbidity and mortality.

  17. Some properties of a 5-parameter bivariate probability distribution

    Science.gov (United States)

    Tubbs, J. D.; Brewer, D. W.; Smith, O. E.

    1983-01-01

    A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.

  18. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  19. Monte Carlo based protocol for cell survival and tumour control probability in BNCT.

    Science.gov (United States)

    Ye, S J

    1999-02-01

    A mathematical model to calculate the theoretical cell survival probability (nominally, the cell survival fraction) is developed to evaluate preclinical treatment conditions for boron neutron capture therapy (BNCT). A treatment condition is characterized by the neutron beam spectra, single or bilateral exposure, and the choice of boron carrier drug (boronophenylalanine (BPA) or boron sulfhydryl hydride (BSH)). The cell survival probability defined from Poisson statistics is expressed with the cell-killing yield, the 10B(n,alpha)7Li reaction density, and the tolerable neutron fluence. The radiation transport calculation from the neutron source to tumours is carried out using Monte Carlo methods: (i) reactor-based BNCT facility modelling to yield the neutron beam library at an irradiation port; (ii) dosimetry to limit the neutron fluence below a tolerance dose (10.5 Gy-Eq); (iii) calculation of the 10B(n,alpha)7Li reaction density in tumours. A shallow surface tumour could be effectively treated by single exposure producing an average cell survival probability of 10(-3)-10(-5) for probable ranges of the cell-killing yield for the two drugs, while a deep tumour will require bilateral exposure to achieve comparable cell kills at depth. With very pure epithermal beams eliminating thermal, low epithermal and fast neutrons, the cell survival can be decreased by factors of 2-10 compared with the unmodified neutron spectrum. A dominant effect of cell-killing yield on tumour cell survival demonstrates the importance of choice of boron carrier drug. However, these calculations do not indicate an unambiguous preference for one drug, due to the large overlap of tumour cell survival in the probable ranges of the cell-killing yield for the two drugs. The cell survival value averaged over a bulky tumour volume is used to predict the overall BNCT therapeutic efficacy, using a simple model of tumour control probability (TCP).
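
    The Poisson step described in the abstract can be written compactly: if the mean number of lethal events per cell is the cell-killing yield times the 10B(n,alpha)7Li reaction density, survival is exponential in that mean. All numbers below are hypothetical, chosen only to land in the quoted range.

    ```python
    import numpy as np

    def cell_survival(killing_yield, reactions_per_cell):
        """Poisson cell survival, S = exp(-lambda), with lambda the mean number
        of lethal events per cell (a schematic reading of the abstract)."""
        return np.exp(-killing_yield * reactions_per_cell)

    reactions_per_cell = 1.0e4                       # hypothetical reaction density
    for drug, y in [("BPA", 8e-4), ("BSH", 3e-4)]:   # assumed cell-killing yields
        print(f"{drug}: S = {cell_survival(y, reactions_per_cell):.1e}")
    ```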

  20. Probability of Survival Scores in Different Trauma Registries: A Systematic Review.

    Science.gov (United States)

    Stoica, Bogdan; Paun, Sorin; Tanase, Ioan; Negoi, Ionut; Chiotoroiu, Alexandru; Beuran, Mircea

    2016-01-01

    A mixed score to predict the probability of survival has a key role in modern trauma systems. The aim of the current study is to summarize the current knowledge about estimation of survival in major trauma patients in different trauma registries. Systematic review of the literature using electronic search in the PubMed/Medline, Web of Science Core Collection and EBSCO databases. We used as MeSH terms or truncated words a combination of trauma, "probability of survival" and "mixed scores". The search strategy in PubMed was: "((((trauma(MeSH Major Topic)) OR injury(Title/Abstract)) AND score (Title/Abstract)) AND survival) AND registry (Title/Abstract))))". Only English-language literature was selected. There is no consensus between the major trauma registries regarding probability of survival estimation in major trauma patients. The scores of the German (RISC II) and United Kingdom (PS Model 14) trauma registries are based on the largest populations, with demographics updated to the present-day European injury pattern. The revised TRISS, resulting from the USA National Trauma Database, seems to be inaccurate for trauma systems managing predominantly blunt injuries. The probability of survival should be evaluated in all major trauma patients, with a score derived from a population which reproduces the current demographics. Only a careful audit of the unpredicted deaths may continuously improve our care for severely injured patients. Celsius.
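
    For orientation, the revised TRISS mentioned above combines the Revised Trauma Score (RTS), the Injury Severity Score (ISS) and an age index in a logistic model. A sketch using the widely quoted MTOS coefficients follows; registry-specific scores such as RISC II or PS Model 14 re-derive the coefficients on their own populations.

    ```python
    import math

    def triss_ps(rts, iss, age_index, blunt=True):
        """TRISS probability of survival: Ps = 1 / (1 + exp(-b)),
        b = b0 + b1*RTS + b2*ISS + b3*AgeIndex (age_index = 1 if age > 54).
        Coefficients are the commonly quoted MTOS values, shown for illustration."""
        coeffs = {
            True: (-0.4499, 0.8085, -0.0835, -1.7430),    # blunt mechanism
            False: (-2.5355, 0.9934, -0.0651, -1.1360),   # penetrating mechanism
        }
        b0, b1, b2, b3 = coeffs[blunt]
        b = b0 + b1 * rts + b2 * iss + b3 * age_index
        return 1.0 / (1.0 + math.exp(-b))

    # example: blunt trauma, RTS = 6.0, ISS = 25, patient aged 54 or younger
    print(f"Ps = {triss_ps(6.0, 25, age_index=0):.2f}")
    ```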

  1. 30-Day Survival Probabilities as a Quality Indicator for Norwegian Hospitals: Data Management and Analysis.

    Science.gov (United States)

    Hassani, Sahar; Lindman, Anja Schou; Kristoffersen, Doris Tove; Tomic, Oliver; Helgeland, Jon

    2015-01-01

    The Norwegian Knowledge Centre for the Health Services (NOKC) reports 30-day survival as a quality indicator for Norwegian hospitals. The indicators have been published annually since 2011 on the website of the Norwegian Directorate of Health (www.helsenorge.no), as part of the Norwegian Quality Indicator System authorized by the Ministry of Health. Openness regarding calculation of quality indicators is important, as it provides the opportunity to critically review and discuss the method. The purpose of this article is to describe the data collection, data pre-processing, and data analyses, as carried out by NOKC, for the calculation of 30-day risk-adjusted survival probability as a quality indicator. Three diagnosis-specific 30-day survival indicators (first-time acute myocardial infarction (AMI), stroke and hip fracture) are estimated based on all-cause deaths, occurring in-hospital or out-of-hospital, within 30 days counting from the first day of hospitalization. Furthermore, a hospital-wide (i.e. overall) 30-day survival indicator is calculated. Patient administrative data from all Norwegian hospitals and information from the Norwegian Population Register are retrieved annually, and linked to datasets for previous years. The outcome (alive/dead within 30 days) is attributed to every hospital by the fraction of time spent in each hospital. A logistic regression followed by a hierarchical Bayesian analysis is used for the estimation of risk-adjusted survival probabilities. A multiple testing procedure with a false discovery rate of 5% is used to identify hospitals, hospital trusts and regional health authorities with significantly higher/lower survival than the reference. In addition, estimated risk-adjusted survival probabilities are published per hospital, hospital trust and regional health authority. The variation in risk-adjusted survival probabilities across hospitals for AMI shows a decreasing trend over time: estimated survival probabilities for AMI in

  2. 30-Day Survival Probabilities as a Quality Indicator for Norwegian Hospitals: Data Management and Analysis.

    Directory of Open Access Journals (Sweden)

    Sahar Hassani

    The Norwegian Knowledge Centre for the Health Services (NOKC) reports 30-day survival as a quality indicator for Norwegian hospitals. The indicators have been published annually since 2011 on the website of the Norwegian Directorate of Health (www.helsenorge.no), as part of the Norwegian Quality Indicator System authorized by the Ministry of Health. Openness regarding calculation of quality indicators is important, as it provides the opportunity to critically review and discuss the method. The purpose of this article is to describe the data collection, data pre-processing, and data analyses, as carried out by NOKC, for the calculation of 30-day risk-adjusted survival probability as a quality indicator. Three diagnosis-specific 30-day survival indicators (first-time acute myocardial infarction (AMI), stroke and hip fracture) are estimated based on all-cause deaths, occurring in-hospital or out-of-hospital, within 30 days counting from the first day of hospitalization. Furthermore, a hospital-wide (i.e. overall) 30-day survival indicator is calculated. Patient administrative data from all Norwegian hospitals and information from the Norwegian Population Register are retrieved annually, and linked to datasets for previous years. The outcome (alive/dead within 30 days) is attributed to every hospital by the fraction of time spent in each hospital. A logistic regression followed by a hierarchical Bayesian analysis is used for the estimation of risk-adjusted survival probabilities. A multiple testing procedure with a false discovery rate of 5% is used to identify hospitals, hospital trusts and regional health authorities with significantly higher/lower survival than the reference. In addition, estimated risk-adjusted survival probabilities are published per hospital, hospital trust and regional health authority. The variation in risk-adjusted survival probabilities across hospitals for AMI shows a decreasing trend over time: estimated survival probabilities
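
    As a rough illustration of the risk-adjustment logic in the two records above, the sketch below uses plain logistic regression and observed-versus-expected (indirect) standardization on simulated data; the fractional-stay attribution, hierarchical Bayesian estimation and multiple-testing procedure of the actual pipeline are not reproduced, and all covariates are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 2000
    df = pd.DataFrame({
        "died30": (rng.random(n) < 0.12).astype(int),
        "age": rng.normal(70, 10, n),
        "comorbidity": rng.integers(0, 6, n),       # assumed case-mix covariate
        "hospital": rng.choice(["A", "B", "C"], n),
    })

    # step 1: case-mix model on patient-level risk factors
    fit = smf.logit("died30 ~ age + comorbidity", data=df).fit(disp=0)
    df["expected"] = fit.predict(df)

    # step 2: observed-vs-expected standardization per hospital
    g = df.groupby("hospital").agg(obs=("died30", "mean"), exp=("expected", "mean"))
    g["adj_survival"] = 1 - (g["obs"] / g["exp"]) * df["died30"].mean()
    print(g.round(3))
    ```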

  3. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using the survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
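
    With the lifelines library, the 12-month probability of default per cohort and the log-rank comparison of the complete survival period can be sketched as follows. The loan data are simulated, with default times administratively censored at an assumed loan term.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(3)

    def make_cohort(scale, term, n=300):
        """Default times ~ Exponential(scale), censored at the loan term."""
        t = rng.exponential(scale, n)
        return pd.DataFrame({"t": np.minimum(t, term), "d": t < term})

    short = make_cohort(scale=80, term=36)   # hypothetical 36-month loans
    long_ = make_cohort(scale=60, term=60)   # hypothetical 60-month loans

    for name, c in [("36-month", short), ("60-month", long_)]:
        kmf = KaplanMeierFitter().fit(c["t"], event_observed=c["d"])
        pd12 = 1 - kmf.survival_function_at_times(12).iloc[0]
        print(f"{name} cohort: 12-month PD = {pd12:.3f}")

    res = logrank_test(short["t"], long_["t"],
                       event_observed_A=short["d"], event_observed_B=long_["d"])
    print(f"log-rank p-value = {res.p_value:.4f}")
    ```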

  4. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short-lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly, and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems. Of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation, or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation), is clearly essential in developing an understanding of the behavior of strongly stochastic systems.
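
    A lumped sketch of these quantities: model the fission chain as a Galton-Watson process in which each neutron is captured with probability 1 - pf or induces fission with multiplicity distribution p_nu. The extinction probability is the smallest fixed point of the offspring generating function on [0, 1], and its complement is the POI. All numbers are illustrative, not evaluated nuclear data.

    ```python
    p_nu = [0.03, 0.16, 0.33, 0.30, 0.13, 0.05]   # Pr(nu = 0..5 | fission), illustrative
    pf = 0.45                                     # fission probability per neutron

    def offspring_pgf(s):
        """PGF of the number of next-generation neutrons from one neutron."""
        return (1 - pf) + pf * sum(pk * s**k for k, pk in enumerate(p_nu))

    # fixed-point iteration from q = 0 converges monotonically to the
    # smallest root of q = f(q), i.e. the extinction probability
    q = 0.0
    for _ in range(500):
        q = offspring_pgf(q)
    print(f"extinction probability = {q:.4f}, probability of initiation = {1 - q:.4f}")
    ```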

  5. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    Pilliod, D.S.; Muths, E.; Scherer, R. D.; Bartelt, P.E.; Corn, P.S.; Hossack, B.R.; Lambert, B.A.; Mccaffery, R.; Gaughan, C.

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31-42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5-7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations. Journal compilation © 2010 Society for Conservation Biology. No claim to original US government works.

  6. A Failure Probability Calculation Method for Power Equipment Based on Multi-Characteristic Parameters

    Directory of Open Access Journals (Sweden)

    Hang Liu

    2017-05-01

    Although traditional fault diagnosis methods can qualitatively identify the failure modes of power equipment, it is difficult to evaluate the failure probability quantitatively. In this paper, a failure probability calculation method for power equipment based on multi-characteristic parameters is proposed. After collecting the historical data of different fault characteristic parameters, the distribution functions and the cumulative distribution functions of each parameter, which are applied to discretizing the parameters and calculating the differential warning values, are calculated by using the two-parameter Weibull model. To calculate the membership functions of the parameters for each failure mode, the Apriori algorithm is chosen to mine the association rules between parameters and failure modes. After that, the failure probability of each failure mode is obtained by integrating the membership functions of the different parameters by a weighted method, in which the importance weight of each parameter is calculated from the differential warning values. According to the failure probability calculation result, a series model is established to estimate the failure probability of the equipment. Finally, an application example for two 220 kV transformers is presented to show the detailed process of the method. Compared with traditional fault diagnosis methods, the calculation results not only identify the failure modes correctly, but also accurately reflect the trend of the equipment's failure probability.
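
    The two-parameter Weibull step can be sketched with scipy on synthetic data; the association-rule mining (Apriori) and weighted fusion stages of the method are omitted here.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    # hypothetical history of one fault characteristic parameter
    rng = np.random.default_rng(4)
    data = rng.weibull(2.2, 500) * 35.0

    shape, loc, scale = weibull_min.fit(data, floc=0)   # floc=0: two-parameter form
    print(f"shape k = {shape:.2f}, scale = {scale:.2f}")

    # the fitted CDF supports discretizing the parameter and setting a
    # differential warning value, e.g. at the 95th percentile
    warn = weibull_min.ppf(0.95, shape, loc=0, scale=scale)
    print(f"95% warning threshold = {warn:.1f}")
    ```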

  7. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  8. Analysis of feedbacks between nucleation rate, survival probability and cloud condensation nuclei formation

    Science.gov (United States)

    Westervelt, D. M.; Pierce, J. R.; Adams, P. J.

    2014-06-01

    Aerosol nucleation is an important source of particle number in the atmosphere. However, in order to become cloud condensation nuclei (CCN), freshly nucleated particles must undergo significant condensational growth while avoiding coagulational scavenging. In an effort to quantify the contribution of nucleation to CCN, this work uses the GEOS-Chem-TOMAS global aerosol model to calculate changes in CCN concentrations across a broad range of nucleation rates and mechanisms. We then quantify the factors that control CCN formation from nucleation, including daily nucleation rates, growth rates, coagulation sinks, condensation sinks, survival probabilities, and CCN formation rates, in order to examine feedbacks that may limit growth of nucleated particles to CCN. Nucleation rate parameterizations tested in GEOS-Chem-TOMAS include ternary nucleation (with multiple tuning factors), activation nucleation (with two pre-factors), binary nucleation, and ion-mediated nucleation. We find that nucleation makes a significant contribution to boundary layer CCN(0.2%), but this contribution is only modestly sensitive to the choice of nucleation scheme, ranging from a 49% to a 78% increase in concentrations over a control simulation with no nucleation. Moreover, a two order-of-magnitude increase in the globally averaged nucleation rate (via changes to tuning factors) results in small changes (less than 10%) to global CCN(0.2%) concentrations. To explain this, we present a simple theory showing that survival probability has an exponentially decreasing dependence on the square of the condensation sink. This functional form stems from a negative correlation between condensation sink and growth rate and a positive correlation between condensation sink and coagulational scavenging. Conceptually, with a fixed condensable vapor budget (sulfuric acid and organics), any increase in CCN concentrations due to higher nucleation rates necessarily entails an increased aerosol surface area in the

  9. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    System degradation is usually caused by the degradation of multiple parameters. Assessments of system reliability by the universal generating function have low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on probability density evolution with multiple parameters is therefore presented for complexly degraded systems. Firstly, the system output function is established according to the transitive relation between component parameters and the system output performance. Then, a probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.

  10. Nonlinear effects of winter sea ice on the survival probabilities of Adélie penguins.

    Science.gov (United States)

    Ballerini, Tosca; Tavecchia, Giacomo; Olmastroni, Silvia; Pezzo, Francesco; Focardi, Silvano

    2009-08-01

    The population dynamics of Antarctic seabirds are influenced by variations in winter sea ice extent and persistence; however, the type of relationship differs according to the region and the demographic parameter considered. We used annual presence/absence data obtained from 1,138 individually marked birds to study the influence of environmental and individual characteristics on the survival of Adélie penguins Pygoscelis adeliae at Edmonson Point (Ross Sea, Antarctica) between 1994 and 2005. About 25% of 600 birds marked as chicks were reobserved at the natal colony. The capture and survival rates of Adélie penguins at this colony increased with the age of individuals, and five age classes were identified for both parameters. Mean adult survival was 0.85 (SE = 0.01), and no effect of sex on survival was evident. Breeding propensity, as measured by adult capture rates, was close to one, indicating a constant breeding effort through time. Temporal variations in survival were best explained by a quadratic relationship with winter sea ice extent anomalies in the Ross Sea, suggesting that for this region optimal conditions are intermediate between too much and too little winter sea ice. This is likely the result of a balance between suitable wintering habitat and food availability. Survival rates were not correlated with the Southern Oscillation Index. Low adult survival after a season characterized by severe environmental conditions at breeding but favorable conditions during winter suggested an additional mortality mediated by the reproductive effort. Adélie penguins are sensitive indicators of environmental changes in the Antarctic, and the results from this study provide insights into regional responses of this species to variability in winter sea ice habitat.

  11. Cell survival probability in a spread-out Bragg peak for novel treatment planning

    Science.gov (United States)

    Surdutovich, Eugene; Solov'yov, Andrey V.

    2017-08-01

    The problem of variable cell survival probability along the spread-out Bragg peak is one of the long-standing problems in the planning and optimisation of ion-beam therapy. This problem is considered using the multiscale approach to the physics of ion-beam therapy. The physical reasons for the problem are analysed and understood on a quantitative level, and a recipe for its solution is suggested within this approach. This recipe can be used in the design of novel treatment planning and optimisation based on fundamental science.

  12. Lower survival probabilities for adult Florida manatees in years with intense coastal storms

    Science.gov (United States)

    Langtimm, C.A.; Beck, C.A.

    2003-01-01

    The endangered Florida manatee (Trichechus manatus latirostris) inhabits the subtropical waters of the southeastern United States, where hurricanes are a regular occurrence. Using mark-resighting statistical models, we analyzed 19 years of photo-identification data and detected significant annual variation in adult survival for a subpopulation in northwest Florida where human impact is low. That variation coincided with years when intense hurricanes (Category 3 or greater on the Saffir-Simpson Hurricane Scale) and a major winter storm occurred in the northern Gulf of Mexico. Mean survival probability during years with no or low intensity storms was 0.972 (approximate 95% confidence interval = 0.961-0.980) but dropped to 0.936 (0.864-0.971) in 1985 with Hurricanes Elena, Kate, and Juan; to 0.909 (0.837-0.951) in 1993 with the March "Storm of the Century"; and to 0.817 (0.735-0.878) in 1995 with Hurricanes Opal, Erin, and Allison. These drops in survival probability were not catastrophic in magnitude and were detected because of the use of state-of-the-art statistical techniques and the quality of the data. Because individuals of this small population range extensively along the north Gulf coast of Florida, it was possible to resolve storm effects on a regional scale rather than the site-specific local scale common to studies of more sedentary species. This is the first empirical evidence in support of storm effects on manatee survival and suggests a cause-effect relationship. The decreases in survival could be due to direct mortality, indirect mortality, and/or emigration from the region as a consequence of storms. Future impacts to the population by a single catastrophic hurricane, or series of smaller hurricanes, could increase the probability of extinction. With the advent in 1995 of a new 25- to 50-yr cycle of greater hurricane activity, and longer term change possible with global climate change, it becomes all the more important to reduce mortality and injury

  13. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    Science.gov (United States)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting the annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method performed similarly in the regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters that affect ET0 can affect the distribution of the reference evapotranspiration itself.
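
    The L-moment side of such an analysis starts from the sample L-moments and the ratio diagram of (t3, t4). A self-contained sketch of the estimators, applied to simulated annual ET0 values, follows.

    ```python
    import numpy as np

    def sample_lmoments(x):
        """First two sample L-moments plus L-skewness t3 and L-kurtosis t4,
        computed via probability-weighted moments."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        i = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((i - 1) * x) / (n * (n - 1))
        b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
        b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
        l1, l2 = b0, 2 * b1 - b0
        l3 = 6 * b2 - 6 * b1 + b0
        l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
        return l1, l2, l3 / l2, l4 / l2

    # hypothetical 55-year series of annual ET0 (mm) at one station
    rng = np.random.default_rng(5)
    et0 = rng.gamma(shape=40.0, scale=30.0, size=55)
    l1, l2, t3, t4 = sample_lmoments(et0)
    print(f"l1 = {l1:.0f}, l2 = {l2:.0f}, t3 = {t3:.3f}, t4 = {t4:.3f}")
    # plotting (t3, t4) for all stations against theoretical distribution curves
    # is the L-moment ratio diagram step used to select, e.g., Pearson type III
    ```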

  14. Probability of survival of implant-supported metal ceramic and CAD/CAM resin nanoceramic crowns.

    Science.gov (United States)

    Bonfante, Estevam A; Suzuki, Marcelo; Lorenzoni, Fábio C; Sena, Lídia A; Hirata, Ronaldo; Bonfante, Gerson; Coelho, Paulo G

    2015-08-01

    To evaluate the probability of survival and failure modes of implant-supported resin nanoceramic relative to metal-ceramic crowns. Resin nanoceramic molar crowns (LU) (Lava Ultimate, 3M ESPE, USA) were milled and metal-ceramic (MC) (Co-Cr alloy, Wirobond C+, Bego, USA) with identical anatomy were fabricated (n=21). The metal coping and a burnout-resin veneer were created by CAD/CAM, using an abutment (Stealth-abutment, Bicon LLC, USA) and a milled crown from the LU group as models for porcelain hot-pressing (GC-Initial IQ-Press, GC, USA). Crowns were cemented, the implants (n=42, Bicon) embedded in acrylic-resin for mechanical testing, and subjected to single-load to fracture (SLF, n=3 each) for determination of step-stress profiles for accelerated-life testing in water (n=18 each). Weibull curves (50,000 cycles at 200N, 90% CI) were plotted. Weibull modulus (m) and characteristic strength (η) were calculated and a contour plot used (m versus η) for determining differences between groups. Fractography was performed in SEM and polarized-light microscopy. SLF mean values were 1871N (±54.03) for MC and 1748N (±50.71) for LU. Beta values were 0.11 for MC and 0.49 for LU. Weibull modulus was 9.56 and η=1038.8N for LU, and m=4.57 and η=945.42N for MC (p>0.10). Probability of survival (50,000 and 100,000 cycles at 200 and 300N) was 100% for LU and 99% for MC. Failures were cohesive within LU. In MC crowns, porcelain veneer fractures frequently extended to the supporting metal coping. Probability of survival was not different between crown materials, but failure modes differed. In load bearing regions, similar reliability should be expected for metal ceramics, known as the gold standard, and resin nanoceramic crowns over implants. Failure modes involving porcelain veneer fracture and delamination in MC crowns are less likely to be successfully repaired compared to cohesive failures in resin nanoceramic material. Copyright © 2015 Academy of Dental Materials

  15. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps.

    Science.gov (United States)

    Arkhincheev, V E

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It is shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  16. Parameter-free testing of the shape of a probability distribution.

    Science.gov (United States)

    Broom, M; Nouvellet, P; Bacon, J P; Waxman, D

    2007-01-01

    The Kolmogorov-Smirnov test determines the consistency of empirical data with a particular probability distribution. Often, parameters in the distribution are unknown, and have to be estimated from the data. In this case, the Kolmogorov-Smirnov test depends on the form of the particular probability distribution under consideration, even when the estimated parameter-values are used within the distribution. In the present work, we address a less specific problem: to determine the consistency of data with a given functional form of a probability distribution (for example the normal distribution), without enquiring into values of unknown parameters in the distribution. For a wide class of distributions, we present a direct method for determining whether empirical data are consistent with a given functional form of the probability distribution. This utilizes a transformation of the data. If the data are from the class of distributions considered here, the transformation leads to an empirical distribution with no unknown parameters, and hence is susceptible to a standard Kolmogorov-Smirnov test. We give some general analytical results for some of the distributions from the class of distributions considered here. The significance level and power of the tests introduced in this work are estimated from simulations. Some biological applications of the method are given.

  17. Inelastic cross section and survival probabilities at the LHC in minijet models

    Science.gov (United States)

    Fagundes, Daniel A.; Grau, Agnes; Pancheri, Giulia; Shekhovtsova, Olga; Srivastava, Yogendra N.

    2017-09-01

    Recent results for the total and inelastic hadronic cross sections from LHC experiments are compared with predictions from a single-channel eikonal minijet model driven by parton density functions and from an empirical model. The role of soft gluon resummation in the infrared region in taming the rise of minijets and their contribution to the increase of the total cross sections at high energies are discussed. Survival probabilities at the LHC, whose theoretical estimates range from circa 10% to a few per mille, are estimated in this model and compared with results from QCD-inspired models and from multichannel eikonal models. We revisit a previous calculation and examine the origin of these discrepancies.

  18. Survival probabilities of first and second clutches of blackbird (Turdus merula) in an urban environment

    Directory of Open Access Journals (Sweden)

    Kurucz Kornelia

    2010-01-01

    The breeding success of blackbirds was investigated in April and June of 2008 and 2009 in the Botanical Garden of the University of Pecs, with a total of 50 artificial nests in each of the four sessions (with 1 quail egg and 1 plasticine egg placed in every nest). In all four study periods of the two years, 2 nests (4%) were destroyed by predators. Six nests (12%) were not discovered in either of the cases. The survival probability of the artificial nests was greater in April than in June (in both years), but the difference was significant only in 2008. Nests placed into a curtain of ivy (Hedera helix) on a wall were located higher up than those in bushes, yet their predation rates were quite similar. The predation values of quail vs. plasticine eggs did not differ in 2008. In 2009, however, significantly more quail eggs were discovered (mostly removed) than plasticine eggs. Marks left on the plasticine eggs originated mostly from small mammals and small-bodied birds, but the disappearance of a large number of quail and plasticine eggs was probably caused by larger birds, primarily jays.

  19. 10-Day survival of Hyalella azteca as a function of water quality parameters.

    Science.gov (United States)

    Javidmehr, Alireza; Kass, Philip H; Deanovic, Linda A; Connon, Richard E; Werner, Inge

    2015-05-01

    Estuarine systems are among the ecosystems most impacted by anthropogenic contaminants; however, they present unique challenges to toxicity testing with regard to varying water quality parameters. The euryhaline amphipod species Hyalella azteca is widely used in toxicity testing and well suited for testing estuarine water samples. Nevertheless, the influence of relevant water quality parameters on test endpoints must be quantified in order to efficiently use this species for routine monitoring. Here, we studied the influence of five water quality parameters (electrical conductivity, pH, un-ionized ammonia, dissolved oxygen and temperature) on H. azteca survival in a water column toxicity test. A model was developed to quantify and predict the independent and interacting effects of these water quality variables on 10-day survival. The model allows simultaneous assessment of multiple potential predictors recorded during the tests. Data used for modeling came from 1089 tests performed on ambient water samples over a period of three years (2006-2008). The final model reflects significant effects of the predictors and their two-way interactions. The effect of each level of every predictor on the survival probability of H. azteca was examined by comparing the levels of one predictor at a time, while holding all others at their lowest (reference) level. This study showed that predictors of survival should not be evaluated in isolation when interpreting H. azteca water column tests. Our model provides a useful tool to predict expected control survival based on relevant water quality parameters, and thus enables the use of H. azteca tests for toxicity monitoring in estuaries with a wide range of water quality conditions. Copyright © 2015 Elsevier Inc. All rights reserved.
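
    A model of this shape, main effects of the five water quality parameters plus their two-way interactions, can be sketched with statsmodels; the data below are simulated, not the 1089 ambient tests analysed in the study.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 1089
    df = pd.DataFrame({
        "EC": rng.uniform(100, 2000, n),     # electrical conductivity (uS/cm)
        "pH": rng.uniform(6.5, 9.0, n),
        "NH3": rng.uniform(0.0, 0.4, n),     # un-ionized ammonia (mg/L)
        "DO": rng.uniform(4.0, 12.0, n),     # dissolved oxygen (mg/L)
        "temp": rng.uniform(15.0, 25.0, n),
    })
    # synthetic survival outcomes with an NH3 x EC interaction built in
    eta = (2.0 - 5.0 * df["NH3"] - 0.001 * df["EC"] + 0.1 * df["DO"]
           - 0.002 * df["NH3"] * df["EC"])
    df["surv"] = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

    # main effects plus all two-way interactions, as in the model described above
    fit = smf.logit("surv ~ (EC + pH + NH3 + DO + temp) ** 2", data=df).fit(disp=0)
    print(fit.summary().tables[1])
    ```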

  20. Asymptotic coverage probabilities of bootstrap percentile confidence intervals for constrained parameters

    OpenAIRE

    Wang, Chunlin; Marriott, Paul; Li, Pengfei

    2017-01-01

    The asymptotic behaviour of the commonly used bootstrap percentile confidence interval is investigated when the parameters are subject to linear inequality constraints. We concentrate on the important one- and two-sample problems with data generated from general parametric distributions in the natural exponential family. The focus of this paper is on quantifying the coverage probabilities of the parametric bootstrap percentile confidence intervals, in particular their limiting behaviour near ...
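
    As a minimal illustration of the construction analyzed in this record (not the authors' code), the sketch below computes a parametric bootstrap percentile interval for a mean constrained to be nonnegative; the distribution, sample size, and estimator are hypothetical, and the point of the paper is precisely that such intervals can miscover when the true parameter sits near the constraint boundary.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative setup: normal data whose mean is constrained to be >= 0,
      # estimated by the constrained MLE max(sample mean, 0).
      data = rng.normal(loc=0.05, scale=1.0, size=50)

      def constrained_mle(x):
          return max(x.mean(), 0.0)

      theta_hat = constrained_mle(data)

      # Parametric bootstrap: resample from the fitted model, re-estimate,
      # and take the 2.5% and 97.5% percentiles of the bootstrap estimates.
      B = 5000
      boot = np.array([constrained_mle(rng.normal(theta_hat, 1.0, data.size))
                       for _ in range(B)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"estimate={theta_hat:.3f}, 95% percentile CI=({lo:.3f}, {hi:.3f})")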

  1. Curve fitting for probability of detection data: A 4-parameter generalization

    Science.gov (United States)

    Spencer, Floyd W.

    2014-02-01

    The hit-miss data taken from NDE validation and inspector qualification exercises have traditionally been used with logit or probit binary regression models to estimate probability of detection (POD) curves. These models are specified by functions with two parameters that determine the location and shape of the resulting POD curve, expressed in terms of an independent flaw size variable. A generalization of these models is discussed in which two additional parameters are added that confine the range of the POD function to a subset of the 0-to-1 interval. Thus the POD curve can have a lower asymptote other than zero and an upper asymptote other than one. The additional parameters model naturally occurring inspection phenomena such as detections and misses independent of flaw size. The relationship of this 4-parameter model to non-parametric POD estimation is also discussed. Determining the need for, or the desirability of, fitting the additional parameters is addressed in terms of their statistical significance. Other strategies for judging the ability of the POD curve resulting from the 4-parameter fit to reflect the inspection data more adequately are also considered.
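
    The 4-parameter generalization described above lends itself to a compact sketch. The following hypothetical Python function confines a logit POD curve to the interval [c, d]; the parameter names and values are illustrative, not taken from the report.

      import numpy as np

      def pod4(a, mu, sigma, c, d):
          """4-parameter POD curve: a logit model in log flaw size whose
          range is confined to [c, d] rather than [0, 1]."""
          z = (np.log(a) - mu) / sigma
          return c + (d - c) / (1.0 + np.exp(-z))

      flaw_sizes = np.array([0.2, 0.5, 1.0, 2.0, 5.0])   # arbitrary units
      # c > 0 models chance detections independent of flaw size;
      # d < 1 models misses that persist even for large flaws.
      print(pod4(flaw_sizes, mu=np.log(1.0), sigma=0.5, c=0.02, d=0.95))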

  2. Odds and Probabilities Estimation for the Survival of Breast Cancer Patients with Cancer Stages 2 & 3

    Directory of Open Access Journals (Sweden)

    Urrutia Jackie D.

    2016-01-01

    Breast cancer is one of the leading causes of death in the Philippines. One out of four women diagnosed with breast cancer dies within the first five years, no less than 40 percent die within 10 years, and the numbers continue to rise over time. It is therefore important to know the factors that affect the survival of patients. The purpose of this study is to identify the best possible treatment or combination of treatments. The researchers considered four independent variables: completed surgery, completed chemotherapy, completed hormonotherapy and completed radiotherapy. The study was limited to 160 patients with stage 2 and 135 patients with stage 3 cancer, for a total of 295 patients, using data gathered from three hospitals in Metro Manila. The names of the hospitals are not disclosed for reasons of data confidentiality. Logistic regression analysis was used to estimate the odds, probabilities and odds ratios of patients and thereby identify the best treatment or combination of treatments.
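
    As a hedged sketch of the kind of analysis described (with entirely synthetic data, not the study's patient records), the following fits a logistic regression of survival on four binary treatment indicators and reads off odds ratios as exponentiated coefficients.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)

      # Synthetic data: four binary treatment indicators (surgery, chemo,
      # hormonotherapy, radiotherapy) and a binary survival outcome.
      n = 295
      X = rng.integers(0, 2, size=(n, 4)).astype(float)
      true_logit = -0.5 + X @ np.array([0.9, 0.7, 0.4, 0.5])
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

      fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
      print("odds ratios:", np.exp(fit.params))          # exp(coefficients)
      print("first 5 survival probabilities:",
            fit.predict(sm.add_constant(X))[:5])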

  3. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume-synchronized Probability of Informed trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal hours before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, in which roughly $1 trillion in market value briefly disappeared, only to be recovered twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set a handful of free parameters, whose values significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search; Audet, Le Digabel, and Tribes 2009; Le Digabel 2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). The results show that 2 parameters dominate the computation of the FPR. Using the outputs from the NOMAD optimization and the sensitivity analysis, we recommend a range of values for each of the free parameters that performs well on a large set of futures trading records.
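
    A minimal sketch of the VPIN computation itself may help fix ideas. It assumes trades have already been aggregated into equal-volume buckets and uses bulk volume classification to split each bucket into buy and sell volume; the bucket volume and window length correspond to two of the free parameters discussed, and all values here are hypothetical.

      import numpy as np
      from scipy.stats import norm

      def vpin(price_changes, bucket_volume, n_window):
          """Moving average of order-flow imbalance over equal-volume buckets.
          Buy volume per bucket via bulk volume classification:
          V_buy = V * Phi(dP / sigma_dP)."""
          sigma = np.std(price_changes)
          v_buy = bucket_volume * norm.cdf(price_changes / sigma)
          v_sell = bucket_volume - v_buy
          imbalance = np.abs(v_buy - v_sell) / bucket_volume
          kernel = np.ones(n_window) / n_window
          return np.convolve(imbalance, kernel, mode="valid")

      rng = np.random.default_rng(2)
      dP = rng.normal(0.0, 1.0, 500)   # hypothetical per-bucket price changes
      print(vpin(dP, bucket_volume=10_000.0, n_window=50)[:5])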

  4. Model and test in a fungus of the probability that beneficial mutations survive drift

    NARCIS (Netherlands)

    Gifford, D.R.; Visser, de J.A.G.M.; Wahl, L.M.

    2013-01-01

    Determining the probability of fixation of beneficial mutations is critically important for building predictive models of adaptive evolution. Despite considerable theoretical work, models of fixation probability have stood untested for nearly a century. However, recent advances in experimental and

  5. EXPERIMENTAL DETERMINATION OF PARAMETERS OF THE LAW GOVERNING DISTRIBUTION OF TIME PROBABILITIES BY PRECISE OPERATION OF AN ELECTRONIC APPARATUS

    Science.gov (United States)

    For experimental determination of the parameters of the law governing the time probability distribution of correct operation, the demarcation of failures by causes ... shows practically no effect on reliability. Parameters of the law of probability distribution, determined by numerical values of dispersion and ...

  6. How the probability of presentation to a primary care clinician correlates with cancer survival rates: a European survey using vignettes.

    Science.gov (United States)

    Harris, Michael; Frey, Peter; Esteva, Magdalena; Gašparović Babić, Svjetlana; Marzo-Castillejo, Mercè; Petek, Davorina; Petek Ster, Marija; Thulesius, Hans

    2017-03-01

    European cancer survival rates vary widely. System factors, including whether or not primary care physicians (PCPs) are gatekeepers, may account for some of these differences. This study explores where patients who may have cancer are likely to present for medical care in different European countries, and how the probability of presentation to a primary care clinician correlates with cancer survival rates. Seventy-eight PCPs in a range of European countries assessed four vignettes representing patients who might have cancer, and consensus groups agreed how likely those patients were to present to different clinicians in their own countries. These data were compared with national cancer survival rates. Setting: A total of 14 countries. Subjects: Consensus groups of PCPs. Main outcome measure: Probability of initial presentation to a PCP for four clinical vignettes. There was no significant correlation between overall national 1-year relative cancer survival rates and the probability of initial presentation to a PCP (r = -0.16, 95% CI -0.39 to 0.08). Within that, there was large variation depending on the type of cancer, with significantly poorer lung cancer survival in countries where patients were more likely to initially consult a PCP (lung: r = -0.57, 95% CI -0.83 to -0.12; ovary: r = -0.13, 95% CI -0.57 to 0.38; breast: r = 0.14, 95% CI -0.36 to 0.58; bowel: r = 0.20, 95% CI -0.31 to 0.62). There were wide variations in the degree of gatekeeping between countries, with no simple binary model as to whether or not a country has a "PCP-as-gatekeeper" system. While there was case-by-case variation, there was no overall evidence of a link between a higher probability of initial consultation with a PCP and poorer cancer survival. KEY POINTS: European cancer survival rates vary widely, and health system factors may account for some of these differences. The data from 14 European countries show a wide variation in the probability of initial presentation to a PCP. The degree to ...

  7. Estimation of the probability of bacterial population survival: Development of a probability model to describe the variability in time to inactivation of Salmonella enterica.

    Science.gov (United States)

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2017-12-01

    Despite the development of numerous predictive microbial inactivation models, a model focusing on the variability in time to inactivation for a bacterial population has not been developed. Additionally, an appropriate estimation of the risk of there being any remaining bacterial survivors in foods after the application of an inactivation treatment has not yet been established. Here, the Gamma distribution, as a representative probability distribution, was used to estimate the variability in time to inactivation for a bacterial population. Salmonella enterica serotype Typhimurium was evaluated for survival in a low relative humidity environment. We prepared bacterial cells with an initial concentration that was adjusted to 2 × 10^n colony-forming units/2 μl (n = 1, 2, 3, 4, 5) by performing a serial 10-fold dilution, and then we placed 2 μl of the inocula into each well of 96-well microplates. The microplates were stored in a desiccated environment at 10-20% relative humidity at 5, 15, or 25 °C. The survival or death of bacterial cells for each well in the 96-well microplate was confirmed by adding tryptic soy broth as an enrichment culture. The changes in the death probability of the 96 replicated bacterial populations were described as a cumulative Gamma distribution. The variability in time to inactivation was described by transforming the cumulative Gamma distribution into a Gamma distribution. We further examined the bacterial inactivation on almond kernels and radish sprout seeds. Additionally, we described certainty levels of bacterial inactivation that ensure the death probability of a bacterial population at six decimal reduction levels, ranging from 90 to 99.9999%. Consequently, the probability model developed in the present study enables us to estimate the death probability of bacterial populations in a desiccated environment over time. This probability model may be useful for risk assessment to estimate the amount of remaining bacteria in a given ...
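
    A brief sketch of the probability model described, assuming a Gamma distribution has already been fitted to time-to-inactivation data (the shape and scale below are hypothetical, not the paper's estimates):

      from scipy.stats import gamma

      # Hypothetical fit: time to inactivation (days) of a bacterial
      # population follows a Gamma distribution with shape k and scale theta.
      k, theta = 3.0, 5.0
      time_to_death = gamma(a=k, scale=theta)

      print("P(population dead by day 20):", time_to_death.cdf(20.0))
      # Storage time needed to reach the 99.9999% certainty level
      # (the highest death probability discussed above).
      print("days for 99.9999% certainty:", time_to_death.ppf(0.999999))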

  8. An informative prior probability distribution of the gompertz parameters for bayesian approaches in paleodemography.

    Science.gov (United States)

    Sasaki, Tomohiko; Kondo, Osamu

    2016-03-01

    In paleodemography, the Bayesian approach has been suggested to provide an effective means by which mortality profiles of past populations can be adequately estimated, and thus avoid problems of "age-mimicry" inherent in conventional approaches. In this study, we propose an application of the Gompertz model using an "informative" prior probability distribution by revising a recent example of the Bayesian approach based on an "uninformative" distribution. Life-table data of 134 human populations, including those of contemporary hunter-gatherers, were used to determine the Gompertz parameters of each population. In each population, we used both raw life-table data and the Gompertz parameters to calculate demographic values such as the mean life-span, to confirm the representativeness of the model. Then, the correlation between the two Gompertz parameters (the Strehler-Mildvan correlation) was re-established. We incorporated the correlation into the Bayesian approach as an "informative" prior probability distribution, and tested its effectiveness using simulated data. Our analyses showed that the mean life-span (≥ age 15) and the proportion of living persons aged over 45 were well reproduced by the Gompertz model. The simulation showed that using the correlation as an informative prior provides a narrower estimation range in the Bayesian approach than does the uninformative prior. The Gompertz model can be assumed to accurately estimate the mean life-span and/or the proportion of old people in a population. We suggest that the Strehler-Mildvan correlation can be used as a useful constraint in demographic reconstructions of past human populations. © 2015 Wiley Periodicals, Inc.
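
    For readers unfamiliar with the Gompertz model, the following sketch computes the survival function implied by the Gompertz hazard h(x) = a*exp(b*x) and the two summary quantities the record mentions (mean adult life-span and the proportion surviving past 45); the parameter values are hypothetical.

      import numpy as np

      def gompertz_survival(x, a, b, x0=15.0):
          """Survival from age x0 under a Gompertz hazard h(x) = a*exp(b*x):
          S(x | x0) = exp(-(a/b) * (exp(b*x) - exp(b*x0)))."""
          return np.exp(-(a / b) * (np.exp(b * x) - np.exp(b * x0)))

      a, b = 0.002, 0.08                         # hypothetical parameters
      ages = np.linspace(15.0, 110.0, 2000)
      S = gompertz_survival(ages, a, b)

      mean_lifespan = 15.0 + np.trapz(S, ages)   # mean life-span (>= age 15)
      p_over_45 = gompertz_survival(45.0, a, b)  # proportion alive past 45
      print(f"mean life-span: {mean_lifespan:.1f} y, "
            f"P(alive past 45): {p_over_45:.2f}")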

  9. Estimating the probability of survival of individual shortleaf pine (Pinus echinata mill.) trees

    Science.gov (United States)

    Sudip Shrestha; Thomas B. Lynch; Difei Zhang; James M. Guldin

    2012-01-01

    A survival model is needed in a forest growth system which predicts the survival of trees on an individual or stand basis (Gertner, 1989). An individual-tree modeling approach is one of the better methods available for predicting growth and yield, as it provides essential information about a particular tree species: tree size, tree quality and present tree status...

  10. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries, and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (95% CI = 0.67-0.78; n = 3254) was obtained, and should be considered a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  11. Survival probability of an immobile target in a sea of evanescent diffusive or subdiffusive traps: a fractional equation approach.

    Science.gov (United States)

    Abad, E; Yuste, S B; Lindenberg, Katja

    2012-12-01

    We calculate the survival probability of an immobile target surrounded by a sea of uncorrelated diffusive or subdiffusive evanescent traps (i.e., traps that disappear in the course of their motion). Our calculation is based on a fractional reaction-subdiffusion equation derived from a continuous time random walk model of the system. Contrary to an earlier method valid only in one dimension (d=1), the equation is applicable in any Euclidean dimension d and elucidates the interplay between anomalous subdiffusive transport, the irreversible evanescence reaction, and the dimension in which both the traps and the target are embedded. Explicit results for the survival probability of the target are obtained for a density ρ(t) of traps which decays (i) exponentially and (ii) as a power law. In the former case, the target has a finite asymptotic survival probability in all integer dimensions, whereas in the latter case there are several regimes where the values of the decay exponent for ρ(t) and the anomalous diffusion exponent of the traps determine whether or not the target has a chance of eternal survival in one, two, and three dimensions.

  12. Additional components of risk assessment and their impact on the probability parameter

    Directory of Open Access Journals (Sweden)

    Piotr Saja

    2017-04-01

    The article raises the issue of risk assessment and its impact on the quality and safety of work. In the risk assessment of a turning lathe workstation, additional components associated with the personalization of the job were taken into account. Paragraph 2 item 7 of the Regulation of the Minister of Labor and Social Policy of 26 September 1997 on general safety regulations defines occupational risk as the likelihood of an adverse event. The authors draw attention to the fact that the occurrence of an accident sometimes depends on the predisposition of the employee. A correct estimation of the probability of an accident, allowing a timely reaction, therefore appears extremely important. This parameter can be assessed more accurately if a number of additional components resulting from the characteristics of the employee are taken into account. The results of a personalized risk assessment may allow appropriate planning of corrective and preventive actions.

  13. Simultaneous parameter and tolerance optimization of structures via probability-interval mixed reliability model

    DEFF Research Database (Denmark)

    Luo, Yangjun; Wu, Xiaoxiang; Zhou, Mingdong

    2015-01-01

    Both structural sizes and dimensional tolerances strongly influence the manufacturing cost and the functional performance of a practical product. This paper presents an optimization method to simultaneously find the optimal combination of structural sizes and dimensional tolerances. Based on a probability-interval mixed reliability model, the imprecision of design parameters is modeled as interval uncertainties fluctuating within allowable tolerance bounds. The optimization model is defined as to minimize the total manufacturing cost under mixed reliability index constraints, which are further ... points (TPPs) and the worst case points (WCPs), which shows better performance than traditional approaches for highly nonlinear problems. Numerical results reveal that reasonable dimensions and tolerances can be suggested for the minimum manufacturing cost and a desirable structural safety.

  14. Age-specific survival and reproductive probabilities: evidence for senescence in male fallow deer (Dama dama)

    National Research Council Canada - National Science Library

    A. G. McElligott; R. Altwegg; T. J. Altwegg

    2002-01-01

    The best-fitting model revealed that fallow bucks have four life-history stages: yearling, pre-reproductive, prime-age and senescent. Pre-reproductive males (2 and 3 years old) had the highest survival...

  15. Surviving probability indicators of landing juvenile magellanic penguins arriving along the southern Brazilian coast

    Directory of Open Access Journals (Sweden)

    Sandra Carvalho Rodrigues

    2010-04-01

    The aim of this work was to monitor the hematocrit and weight of juvenile penguins, with and without oil cover, found alive along the southern coast of Brazil, from capture until eventual death or release. Released juvenile penguins showed higher weight and hematocrit (3.65 ± 0.06 kg and 44.63 ± 0.29%, respectively) than those that died (2.88 ± 0.08 kg and 34.42 ± 1.70%, respectively). Penguins with higher hematocrit and weight after capture had a higher mean weight gain than their counterparts with smaller values after capture. Moreover, juveniles with higher hematocrit and weight after capture had higher survival rates, independent of the presence or absence of oil. The results suggest that juveniles covered with oil may have been healthier than the juveniles without oil: the animals without oil probably died as a consequence of health disturbances, while the animals with oil were possibly healthy before contact with oil at sea.

  16. Integrating hyper-parameter uncertainties in a multi-fidelity Bayesian model for the estimation of a probability of failure

    OpenAIRE

    Stroh, Rémi; Bect, Julien; Demeyer, Séverine; Fischer, Nicolas; Vazquez, Emmanuel

    2017-01-01

    A multi-fidelity simulator is a numerical model, in which one of the inputs controls a trade-off between the realism and the computational cost of the simulation. Our goal is to estimate the probability of exceeding a given threshold on a multi-fidelity stochastic simulator. We propose a fully Bayesian approach based on Gaussian processes to compute the posterior probability distribution of this probability. We pay special attention to the hyper-parameters of the model...

  17. Survival probability and first-passage-time statistics of a Wiener process driven by an exponential time-dependent drift

    Science.gov (United States)

    Urdapilleta, Eugenio

    2011-02-01

    The survival probability and the first-passage-time statistics are important quantities in different fields. The Wiener process is the simplest stochastic process with continuous variables, and important results can be explicitly found from it. The presence of a constant drift does not modify its simplicity; however, when the process has a time-dependent component the analysis becomes difficult. In this work we analyze the statistical properties of the Wiener process with an absorbing boundary, under the effect of an exponential time-dependent drift. Based on the backward Fokker-Planck formalism we set the time-inhomogeneous equation and conditions that rule the diffusion of the corresponding survival probability. We propose as the solution an expansion series in terms of the intensity of the exponential drift, resulting in a set of recurrence equations. We explicitly solve the expansion up to second order and comment on higher-order solutions. The first-passage-time density function arises naturally from the survival probability and preserves the proposed expansion. Explicit results, related properties, and limit behaviors are analyzed and extensively compared to numerical simulations.
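
    Since the paper validates its expansion against simulations, a crude Monte Carlo sketch of the same quantity may be a useful reference point: the survival probability of a Wiener process with an absorbing boundary under an exponential time-dependent drift. All parameter values are hypothetical.

      import numpy as np

      # Monte Carlo estimate of the survival probability S(T) of a Wiener
      # process started at x0 > 0 with an absorbing boundary at x = 0 and
      # an exponential time-dependent drift mu(t) = mu0 * exp(-lam * t).
      rng = np.random.default_rng(3)
      x0, mu0, lam = 1.0, -0.5, 1.0      # hypothetical parameters
      dt, T, n_paths = 0.01, 5.0, 20_000

      x = np.full(n_paths, x0)
      alive = np.ones(n_paths, dtype=bool)
      for i in range(int(T / dt)):
          drift = mu0 * np.exp(-lam * i * dt)
          x[alive] += drift * dt + np.sqrt(dt) * rng.standard_normal(alive.sum())
          alive &= x > 0.0               # absorb paths at or below the boundary
      print("survival probability S(T):", alive.mean())
      # Note: the Euler scheme misses intra-step crossings, so this slightly
      # overestimates survival; a smaller dt reduces the bias.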

  19. Intraseasonal variation in survival and probable causes of mortality in greater sage-grouse Centrocercus urophasianus

    Science.gov (United States)

    Blomberg, Erik J.; Gibson, Daniel; Sedinger, James S.; Casazza, Michael L.; Coates, Peter S.

    2013-01-01

    The mortality process is a key component of avian population dynamics, and understanding factors that affect mortality is central to grouse conservation. Populations of greater sage-grouse Centrocercus urophasianus have declined across their range in western North America. We studied cause-specific mortality of radio-marked sage-grouse in Eureka County, Nevada, USA, during two seasons, nesting (2008-2012) and fall (2008-2010), when survival was known to be lower compared to other times of the year. We used known-fate and cumulative incidence function models to estimate weekly survival rates and cumulative risk of cause-specific mortalities, respectively. These methods allowed us to account for temporal variation in sample size and staggered entry of marked individuals into the sample to obtain robust estimates of survival and cause-specific mortality. We monitored 376 individual sage-grouse during the course of our study, and investigated 87 deaths. Predation was the major source of mortality, and accounted for 90% of all mortalities during our study. During the nesting season (1 April - 31 May), the cumulative risk of predation by raptors (0.10; 95% CI: 0.05-0.16) and mammals (0.08; 95% CI: 0.03-0.13) was relatively equal. In the fall (15 August - 31 October), the cumulative risk of mammal predation was greater (M_mam = 0.12; 95% CI: 0.04-0.19) than either predation by raptors (M_rap = 0.05; 95% CI: 0.00-0.10) or hunting harvest (M_hunt = 0.02; 95% CI: 0.00-0.06). During both seasons, we observed relatively few additional sources of mortality (e.g. collision) and observed no evidence of disease-related mortality (e.g. West Nile Virus). In general, we found little evidence for intraseasonal temporal variation in survival, suggesting that the nesting and fall seasons represent biologically meaningful time intervals with respect to sage-grouse survival.

  20. Computer-assisted predictive formulas expressing survival probability and life expectancy in US adults, men and women, 2001.

    Science.gov (United States)

    Chung, Sung J

    2007-06-01

    The National Center for Health Statistics (NCHS) reported the United States life tables, 2001 for the US total, male and female populations on the basis of 2001 mortality statistics, the 2000 decennial census and data from the Medicare program [E. Arias, United States life tables, 2001, Natl. Vital Stat. Rep. 52 (2004) 1-40]. The life tables show life expectancy, survival and death rates at each year of age between birth and 100 years. In this study, formulas expressing survival probability and life expectancy in US adults, men and women are constructed from the NCHS data. A model based on the 'probacent'-probability equation previously published by the author is employed. Analysis of the formula-predicted values against the NCHS-reported data indicates that the formulas are accurate and reliable, with close agreement. The formula, representing a generalized lognormal distribution, might be useful for biomedical investigation and for epidemiological and demographic studies in US adults, men and women.

  1. Sugar administration to newly emerged Aedes albopictus males increases their survival probability and mating performance.

    Science.gov (United States)

    Bellini, Romeo; Puggioli, Arianna; Balestrino, Fabrizio; Brunelli, Paolo; Medici, Anna; Urbanelli, Sandra; Carrieri, Marco

    2014-04-01

    Aedes albopictus male survival in laboratory cages is no more than 4-5 days when males are kept without access to sugar, indicating their need to feed on a sugar source soon after emergence. We therefore developed a device to administer energetic substances to newly emerged males when released as pupae as part of a sterile insect technique (SIT) programme, made of a polyurethane sponge 4 cm thick perforated with holes 2 cm in diameter. The sponge was imbibed with the required sugar solution and, owing to its high retention capacity, the solution remained available for males to feed on for at least 48 h. When evaluated in lab cages, comparing adults that emerged from the device with sugar solution vs the device with water only (negative control), about half of the males tested positive for fructose using the Van Handel anthrone test, compared to none of the males in the control cage. We then tested the tool in semi-field and field conditions with different sugar concentrations (10%, 15%, and 20%) and compared results to controls fed with water only. Males were recaptured with a battery-operated manual aspirator at 24 and 48 h after pupae release. A rather high share (10-25%) of males captured in the vicinity of the control stations tested positive for fructose, while in the vicinity of the sugar stations around 40-55% of males were positive, though variability between replicates was large. The sugar-positive males in the control test may have been released males that had access to natural sugar sources close to the release station and/or wild males present in the environment. Only a slight increase in the proportion of positive males was obtained by increasing the sugar concentration in the feeding device from 10% to 20%. Surprisingly, modification of the device to add a black plastic inverted funnel above the container reduced rather than increased the proportion of fructose-positive males collected around the station. No evidence of difference in the ...

  2. Construction and identification of a D-Vine model applied to the probability distribution of modal parameters in structural dynamics

    Science.gov (United States)

    Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.

    2018-01-01

    This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameter probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of modes considered in our context. In this respect, a mode selection preprocessing step is proposed; it allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared: the first is based on the context of the study, whereas the second is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.

  3. IDENTIFICATION OF OPTIMAL PARAMETERS OF REINFORCED CONCRETE STRUCTURES WITH ACCOUNT FOR THE PROBABILITY OF FAILURE

    Directory of Open Access Journals (Sweden)

    Filimonova Ekaterina Aleksandrovna

    2012-10-01

    The author suggests splitting the aforementioned parameters into two groups, namely natural parameters and value-related parameters, the latter introduced to assess the costs of development, transportation, construction and operation of a structure, as well as the costs of its potential failure. The author proposes a new, improved methodology for the identification of the above parameters that ensures optimal solutions to non-linear objective functions accompanied by the non-linear constraints that are critical to the design of reinforced concrete structures. Any structural failure may be interpreted as an excursion of the random process of surplus bearing capacity into the negative domain. Monte Carlo numerical methods make it possible to assess the probability of such excursions into the unacceptable domain.
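
    The failure-probability interpretation in the last two sentences reduces, in its simplest Monte Carlo form, to estimating P(R - S < 0) for a resistance R and a load effect S; the distributions below are stand-ins chosen for illustration only.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical limit state: surplus bearing capacity G = R - S with
      # resistance R (lognormal) and load effect S (normal); failure is the
      # excursion G < 0 into the negative domain.
      n = 1_000_000
      R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)
      S = rng.normal(loc=200.0, scale=25.0, size=n)
      print(f"Monte Carlo failure probability: {np.mean(R - S < 0.0):.2e}")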

  4. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of five consulting hydrologists

    Energy Technology Data Exchange (ETDEWEB)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

    The Columbia River basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among the hydrologic parameters of particular interest are the effective porosity of the Cohassett basalt flow top and flow interior and the vertical-to-horizontal hydraulic conductivity, or anisotropy ratio, of the Cohassett basalt flow interior. The Cohassett basalt flow is the prime candidate horizon for repository studies. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of a Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. The results indicate significant differences of expert opinion for cumulative probabilities of less than 10% and greater than 90%, but relatively close agreement in the middle range of values. The principal causes of the diversity of opinion are believed to be the lack of site-specific data and the absence of a single, widely accepted, conceptual or theoretical basis for analyzing these variables.

  5. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Science.gov (United States)

    Santos, Sara M; Carvalho, Filipe; Mira, António

    2011-01-01

    Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road
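
    Carcass persistence of this kind is naturally summarized with a Kaplan-Meier survival curve. The sketch below uses the lifelines library on made-up data (carcasses still present at the end of the study are treated as right-censored); it is an illustration of the approach, not the study's analysis.

      import numpy as np
      from lifelines import KaplanMeierFitter

      rng = np.random.default_rng(5)

      # Made-up data: days until each carcass disappeared from the road;
      # event = 1 if removal was observed, 0 if still present at study end
      # (right-censored).
      days = rng.exponential(scale=1.2, size=200) + 0.05
      event = rng.binomial(1, 0.9, size=200)

      kmf = KaplanMeierFitter()
      kmf.fit(days, event_observed=event, label="carcass persistence")
      # Probability a carcass is still on the road after 1, 2, and 3 days;
      # low day-1 persistence argues for daily (or finer) monitoring.
      print(kmf.predict([1.0, 2.0, 3.0]))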

  6. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Directory of Open Access Journals (Sweden)

    Sara M Santos

    BACKGROUND: Road mortality is probably the best-known and most visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. METHODOLOGY/PRINCIPAL FINDINGS: Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon); daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; a 1-day interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and a 2-day interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet and dry seasons, respectively. CONCLUSION/SIGNIFICANCE: The guidance given here on monitoring frequencies is particularly relevant to provide ...

  7. How Long Do the Dead Survive on the Road? Carcass Persistence Probability and Implications for Road-Kill Monitoring Surveys

    Science.gov (United States)

    Santos, Sara M.; Carvalho, Filipe; Mira, António

    2011-01-01

    Background Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Methodology/Principal Findings Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. Conclusion/Significance The guidance given here on monitoring frequencies is particularly relevant to provide conservation and

  8. Automatic Sleep Stage Determination by Multi-Valued Decision Making Based on Conditional Probability with Optimal Parameters

    Science.gov (United States)

    Wang, Bei; Sugi, Takenao; Wang, Xingyu; Nakamura, Masatoshi

    Data from human sleep studies may be affected by internal and external influences. Recorded sleep data contain complex and stochastic factors, which increase the difficulty of applying computerized sleep stage determination techniques in clinical practice. The aim of this study is to develop an automatic sleep stage determination system which is optimized for variable sleep data. The methodology comprises two modules: expert knowledge database construction and automatic sleep stage determination. Visual inspection by a qualified clinician is utilized to obtain the probability density functions of parameters during the learning process of expert knowledge database construction. Parameter selection is introduced in order to make the algorithm flexible. Automatic sleep stage determination is performed on the basis of conditional probability. The results showed close agreement with visual inspection by the clinician. The developed system can meet customized requirements in hospitals and institutions.
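
    A toy version of the decision rule sketched above: choose the stage that maximizes the prior probability times the product of per-parameter likelihoods drawn from the expert knowledge database. The stages, priors, and Gaussian likelihoods below are stand-ins for illustration.

      import numpy as np
      from scipy.stats import norm

      # Stand-in expert knowledge database: per-stage priors and Gaussian
      # likelihoods for two normalized parameters (e.g., delta power, EMG).
      prior = {"Wake": 0.3, "REM": 0.2, "NREM": 0.5}
      likelihood = {                       # (mean, std) per parameter
          "Wake": [(0.2, 0.1), (0.8, 0.2)],
          "REM":  [(0.3, 0.1), (0.2, 0.1)],
          "NREM": [(0.7, 0.2), (0.3, 0.1)],
      }

      def determine_stage(params):
          """Pick the stage maximizing P(stage) * prod_i P(param_i | stage)."""
          score = {s: prior[s] * np.prod([norm(m, sd).pdf(p)
                                          for (m, sd), p in zip(likelihood[s], params)])
                   for s in prior}
          return max(score, key=score.get)

      print(determine_stage([0.65, 0.25]))   # -> "NREM" for these stand-ins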

  9. THE REGULARITY OF INFLUENCE OF TRAFFIC PARAMETERS ON THE PROBABILITY OF REALISATION OF PLANNED PASSENGER TRANSFER AT TRANSFER NODES

    Directory of Open Access Journals (Sweden)

    G. Samchuk

    2017-06-01

    The article deals with the determination of traffic parameters that ensure the minimum transfer waiting time for passengers. On the basis of experimental results, a regression equation is proposed to determine the probability of realisation of a planned transfer between a pair of vehicles. Using the identified regression equation, the transfer waiting time can be assessed for any headway exceeding 7.5 min.

  10. Consistent modeling of scalar mixing for presumed, multiple parameter probability density functions

    Science.gov (United States)

    Mortensen, Mikael

    2005-01-01

    In this Brief Communication we describe a consistent method for calculating the conditional scalar dissipation (or diffusion) rate for inhomogeneous turbulent flows. The model follows from the transport equation for the conserved scalar probability density function (PDF) using a gradient diffusion closure for the conditional mean velocity and a presumed PDF depending on any number of mixture fraction moments. With the presumed β PDF, the model is an inhomogeneous modification to the homogeneous model of Girimaji ["On the modeling of scalar diffusion in isotropic turbulence," Phys. Fluids A 4, 2529 (1992)]. An important feature of the model is that it makes the classical approach to the conditional moment closure completely conservative for inhomogeneous flows.

  11. Dynamic identification of growth and survival kinetic parameters of microorganisms in foods

    Science.gov (United States)

    Inverse analysis is a mathematical method used in predictive microbiology to determine the kinetic parameters of microbial growth and survival in foods. The traditional approach to inverse analysis relies on isothermal experiments that are time-consuming and labor-intensive, and errors are accumulated ...

  12. Genetic parameters and factors influencing survival to 24 hrs after birth in Danish meat sheep breeds

    DEFF Research Database (Denmark)

    Maxa, J; Sharifi, A R; Pedersen, J

    2009-01-01

    ... negative, which will make breeding for this trait more difficult. However, on the basis of the estimated genetic parameters, it can be concluded that it is possible to improve survival to 24 h after birth in meat sheep breeds by accounting for both direct and maternal genetic effects in breeding programs ...

  13. Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score

    DEFF Research Database (Denmark)

    Gedeborg, R.; Warner, M.; Chen, L. H.

    2014-01-01

    BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10)-based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine if DSPs based on data pooled from several countries could increase the accuracy, precision, utility, and international comparability of DSPs and ICISS. METHODS: Australia, Argentina, Austria, Canada, Denmark, New Zealand, and Sweden provided ICD-10-coded injury hospital discharge data, including in-hospital mortality status. Data from the seven countries were pooled using four different methods to create an international collaborative effort ICISS (ICE-ICISS). The ability of the ICISS to predict mortality using the country-specific DSPs and the pooled DSPs was estimated and compared. RESULTS: The pooled DSPs ... generated empirically derived DSPs. These pooled DSPs facilitate international comparisons and enable the use of ICISS in all settings where ICD-10 hospital discharge diagnoses are available. The modest reduction in performance of the ICE-ICISS compared with the country-specific scores is unlikely ...
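
    The ICISS itself is simply the product of the DSPs attached to a patient's injury diagnoses, which makes the pooled-DSP idea easy to see in code; the codes and probabilities below are hypothetical.

      import math

      # Hypothetical diagnosis-specific survival probabilities (DSPs), as
      # would be derived empirically from pooled ICD-10 discharge data.
      dsp = {"S06.5": 0.85, "S27.3": 0.92, "S32.4": 0.97}

      def iciss(diagnoses):
          """ICISS = product of the DSPs of all recorded injury diagnoses;
          lower values indicate a less survivable set of injuries."""
          return math.prod(dsp[code] for code in diagnoses)

      print(f"ICISS = {iciss(['S06.5', 'S27.3', 'S32.4']):.3f}")   # ~0.759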

  14. Probability Distributions for Cyclone Key Parameters and Cyclonic Wind Speed for the East Coast of Indian Region

    Directory of Open Access Journals (Sweden)

    Pradeep K. Goyal

    2011-09-01

    This paper presents a study of the probabilistic distribution of key cyclone parameters and cyclonic wind speed, based on an analysis of cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975-2007, with latitude/longitude and landfall locations, is used to map the cyclone tracks in the region of study. Statistical tests were performed to find the best-fit distribution of the track data for each cyclone parameter. These parameters include the central pressure difference, the radius of maximum wind speed, the translation velocity, and the track angle with the site, and are used to generate digitally simulated cyclones using wind field simulation techniques. For this, different sets of values for all the cyclone key parameters are generated randomly from their probability distributions. Using these simulated values of the cyclone key parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and using the distributions of the cyclone key parameters as published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful in cyclone disaster mitigation.
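
    The Monte Carlo step described above can be sketched compactly: draw the key parameters from their fitted distributions and push each draw through a wind model. Here a Holland (1980)-style maximum-wind relation stands in for the paper's wind field simulation, and every distribution and constant is illustrative.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 100_000

      # Illustrative fitted distributions for two cyclone key parameters.
      dp = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)  # pressure drop, hPa
      vt = rng.gamma(shape=4.0, scale=1.2, size=n)              # translation speed, m/s

      # Holland (1980)-style maximum-wind relation as a stand-in wind model:
      # Vmax = sqrt(B * dp / (rho * e)), with dp in Pa and rho = air density.
      B, rho = 1.5, 1.15
      v_max = np.sqrt(B * dp * 100.0 / (rho * np.e)) + 0.5 * vt  # crude motion term
      print("median:", np.median(v_max), "m/s; 99th pct:", np.percentile(v_max, 99))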

  15. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    Science.gov (United States)

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

    This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movements of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movements of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps, and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database http://earthquake.usgs.gov/regional/qfaults/ . CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  16. Nomogram including pretherapeutic parameters for prediction of survival after SIRT of hepatic metastases from colorectal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Fendler, Wolfgang Peter [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Klinik und Poliklinik fuer Nuklearmedizin, Munich (Germany); Ilhan, Harun [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Paprottka, Philipp M. [Ludwig-Maximilians-University of Munich, Department of Clinical Radiology, Munich (Germany); Jakobs, Tobias F. [Hospital Barmherzige Brueder, Department of Diagnostic and Interventional Radiology, Munich (Germany); Heinemann, Volker [Ludwig-Maximilians-University of Munich, Department of Internal Medicine III, Munich (Germany); Ludwig-Maximilians-University of Munich, Comprehensive Cancer Center, Munich (Germany); Bartenstein, Peter; Haug, Alexander R. [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Ludwig-Maximilians-University of Munich, Comprehensive Cancer Center, Munich (Germany); Khalaf, Feras [University Hospital Bonn, Department of Nuclear Medicine, Bonn (Germany); Ezziddin, Samer [Saarland University Medical Center, Department of Nuclear Medicine, Homburg (Germany); Hacker, Marcus [Vienna General Hospital, Department of Nuclear Medicine, Vienna (Austria)

    2015-09-15

    Pre-therapeutic prediction of outcome is important for clinicians and patients in determining whether selective internal radiation therapy (SIRT) is indicated for hepatic metastases of colorectal cancer (CRC). Pre-therapeutic characteristics of 100 patients with colorectal liver metastases (CRLM) treated by radioembolization were analyzed to develop a nomogram for predicting survival. Prognostic factors were selected by univariate Cox regression analysis and subsequently tested by multivariate analysis for predicting patient survival. The nomogram was validated with reference to an external patient cohort (n = 25) from the Bonn University Department of Nuclear Medicine. Of the 13 parameters tested, four were independently associated with reduced patient survival in multivariate analysis. These parameters included no liver surgery before SIRT (HR: 1.81, p = 0.014), CEA serum level ≥ 150 ng/ml (HR: 2.08, p = 0.001), transaminase toxicity level ≥ 2.5 x upper limit of normal (HR: 2.82, p = 0.001), and summed computed tomography (CT) size of the two largest liver lesions ≥ 10 cm (HR: 2.31, p < 0.001). The area under the receiver-operating characteristic curve for our prediction model was 0.83 for the external patient cohort, indicating superior performance of our multivariate model compared to a model ignoring covariates. The nomogram developed in our study, entailing four pre-therapeutic parameters, gives good prediction of patient survival after SIRT. (orig.)

  17. Determination of the compound nucleus survival probability Psurv for various "hot" fusion reactions based on the dynamical cluster-decay model

    Science.gov (United States)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-03-01

    After a recent successful attempt to define and determine the compound nucleus (CN) fusion/formation probability P_CN within the dynamical cluster-decay model (DCM), we introduce and estimate here for the first time the survival probability P_surv of the CN against fission, again within the DCM. Calculated as a dynamical fragmentation process, P_surv is defined as the ratio of the evaporation residue (ER) cross section σ_ER to the sum of σ_ER and the fusion-fission (ff) cross section σ_ff, i.e., the CN formation cross section σ_CN, where each contributing fragmentation cross section is determined in terms of its formation and barrier penetration probabilities P_0 and P. In the DCM, deformations up to hexadecapole and "compact" orientations for both in-plane (coplanar) and out-of-plane (noncoplanar) configurations are allowed. Some 16 "hot" fusion reactions, forming compound nuclei of mass number A_CN ≈ 100 up to superheavy nuclei, are analyzed for various nuclear interaction potentials, and the variation of P_surv with CN excitation energy E*, fissility parameter χ, CN mass A_CN, and Coulomb parameter Z_1 Z_2 is investigated. An interesting result is that three groups, namely weakly fissioning, radioactive, and strongly fissioning superheavy nuclei, are identified with P_surv ≈ 1, ≈ 10^-6, and ≈ 10^-10, respectively. For the weakly fissioning group (100 ...

  18. Local relapse after breast-conserving surgery and radiotherapy. Effects on survival parameters

    Energy Technology Data Exchange (ETDEWEB)

    Hammer, Josef; Spiegl, Kurt J.; Feichtinger, Johannes; Braeutigam, Elisabeth [Dept. of Radiation Oncology, Barmherzige Schwesten Hospital, Linz (Austria); Track, Christine [Dept. of Radiation Oncology, Barmherzige Schwesten Hospital, Linz (Austria); Comprehensive Breast Health Center, Barmherzige Schwesten Hospital, Linz (Austria); Seewald, Dietmar H. [Dept. of Radiation Oncology, General Hospital, Voecklabruck (Austria); Petzer, Andreas L. [Dept. of Internal Medicine I - Hematology and Oncology, Barmherzige Schwesten Hospital, Linz (Austria); Langsteger, Werner [Dept. of Nuclear Medicine and PET Center, Barmherzige Schwesten Hospital, Linz (Austria); Poestlberger, Sabine [Comprehensive Breast Health Center, Barmherzige Schwesten Hospital, Linz (Austria); Dept. of Surgery, Barmherzige Schwesten Hospital, Linz (Austria)

    2009-07-15

    Purpose: This retrospective analysis of 1,610 women treated for breast cancer, 88 of whom developed local relapse, aims to document the poor survival parameters after local failure, to evaluate risk factors, and to compare them with other published studies and analyses. Patients and methods: Between 1984 and 1997, 1,610 patients presenting with a total of 1,635 pT1-2 invasive and noninvasive carcinomas of the breast were treated at the authors' institution. The mean age was 57.1 years (range 25-85 years). Treatment protocols involved breast-conserving surgery with or without systemic therapy and whole-breast radiotherapy in all women, followed by a boost dose to the tumor bed according to risk factors for local recurrence. All axillary node-positive patients underwent systemic therapy (six cycles of classic CMF or tamoxifen 20 mg/day for 2-5 years). The time of diagnosis of local relapse was defined as time 0 for the survival curves after local failure. The association of clinicopathologic factors was studied using uni- and multivariate analyses. Survival and local control were calculated by the Kaplan-Meier actuarial method and significance by the log-rank test. Results: After a mean follow-up of 104 months, 88 local failures were recorded (5.4%). Calculated from the time of diagnosis of local relapse, 5-year overall survival (OS) was 62.8%, metastasis-free survival 60.6%, and disease-specific survival 64.2%. In patients with failure during the first 5 years after treatment, the survival parameters were worse (OS 50.6%) compared to those who relapsed after 5 years (OS 78.8%; p < 0.028). Significant associations were also found for initial T- and N-stage and type of failure (solid tumor vs. diffuse spread). Conclusion: This analysis again shows that survival parameters worsen after local relapse, especially in case of early occurrence. In breast cancer treatment, therefore, the goal remains to avoid local failure. (orig.)

  19. Alterations in erythrocyte survival parameters in rats after 19.5 days aboard Cosmos 782

    Science.gov (United States)

    Leon, H. A.; Serova, L. V.; Cummins, J.; Landaw, S. A.

    1978-01-01

    Rats were subjected to 19.5 days of weightless space flight aboard the Soviet biosatellite Cosmos 782. Based on the output of 14CO, survival parameters of a cohort of erythrocytes labeled 15.5 days preflight were evaluated upon return from orbit. These were compared to vivarium control rats injected at the same time. Statistical evaluation indicates that all survival factors were altered by the space flight. The mean potential lifespan, which was 63.0 days in the control rats, was decreased to 59.0 days in the flight rats, and random hemolysis was increased three-fold in the flight rats. The measured size of the cohort was decreased, lending further support to the idea that hemolysis was accelerated during some portion of the flight. A number of factors that might contribute to these changes are discussed, including forces associated with launch and reentry, atmospheric and environmental parameters, dietary factors, radiation, and weightlessness.

  20. Predictive parameters of survival in hemodialysis patients with restless leg syndrome

    Directory of Open Access Journals (Sweden)

    Radojica V Stolic

    2014-01-01

    Full Text Available Restless leg syndrome (RLS) affects the quality of life and survival of patients on hemodialysis (HD). The aim of this study was to determine the characteristics and survival parameters of patients on HD with RLS. This was a non-randomized clinical study involving 204 patients on HD, of whom 71 were female and 133 were male. Symptoms of RLS were defined as positive responses to the four questions comprising the diagnostic criteria of RLS. We recorded the outcome of treatment and the biochemical, demographic, sexual, anthropometric and clinical characteristics of all study patients. Patients with RLS who completed the study had a significantly higher body mass index and lower intima-media thickness and flow through the arteriovenous fistula. Among patients with RLS who died, there were more smokers as well as higher incidences of cardiovascular disease and diabetes mellitus. Among patients with RLS who survived, a greater number had preserved diuresis and received erythropoietin therapy. Patients who completed the study had significantly higher levels of hemoglobin, creatinine, serum iron and transferrin saturation. Diabetes mellitus (B = 1.802; P = 0.002) and low Kt/V (B = -5.218; P = 0.001) were the major predictive parameters of survival.

  1. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
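
    The pipeline described (penalized Cox selection, then individual survival probabilities at a fixed timepoint) can be sketched as below. This is an illustration with the lifelines library and a lasso-type penalty, not the authors' code; the trial size, gene names, and simulated data are hypothetical.

        # Sketch: lasso-penalized Cox model, then per-patient survival probabilities at t = 60.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n, p = 300, 20                                    # hypothetical trial size, biomarker count
        X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"gene{i}" for i in range(p)])
        X["treatment"] = rng.integers(0, 2, n)
        X["gene0_x_trt"] = X["gene0"] * X["treatment"]    # biomarker-by-treatment interaction
        risk = 0.8 * X["gene0"] - 0.6 * X["gene0_x_trt"]
        X["time"] = rng.exponential(60 * np.exp(-risk))   # months to event
        X["event"] = rng.random(n) < 0.7                  # True = event observed

        cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)    # lasso penalty
        cph.fit(X, duration_col="time", event_col="event")
        surv60 = cph.predict_survival_function(X.drop(columns=["time", "event"]), times=[60]).T
        print(surv60.head())                              # per-patient S(60) estimates

    Bootstrap confidence intervals, as proposed in the paper, would wrap the fit-and-predict step in a resampling loop over patients.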

  2. Quantifying Uranium Isotope Ratios Using Resonance Ionization Mass Spectrometry: The Influence of Laser Parameters on Relative Ionization Probability

    Energy Technology Data Exchange (ETDEWEB)

    Isselhardt, Brett H. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure relative uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process to provide a distinction between uranium atoms and potential isobars without the aid of chemical purification and separation. We explore the laser parameters critical to the ionization process and their effects on the measured isotope ratio. Specifically, the use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of 235U/238U ratios to decrease laser-induced isotopic fractionation. By broadening the bandwidth of the first laser in a 3-color, 3-photon ionization process from a bandwidth of 1.8 GHz to about 10 GHz, the variation in sequential relative isotope abundance measurements decreased from >10% to less than 0.5%. This procedure was demonstrated for the direct interrogation of uranium oxide targets with essentially no sample preparation. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variation in laser parameters on the measured isotope ratio. This work demonstrates that RIMS can be used for the robust measurement of uranium isotope ratios.
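
    The rate-equation approach mentioned can be illustrated with a minimal two-step toy model (ground state to resonant excited state to ion); the rates and pulse length below are placeholders, not the laser parameters of the study.

        # Sketch: two-step rate equations for a resonance ionization probability.
        import numpy as np
        from scipy.integrate import solve_ivp

        W01 = 5e7   # s^-1, resonant (de)excitation rate -- hypothetical
        W12 = 1e7   # s^-1, ionization rate from the excited state -- hypothetical

        def rates(t, y):
            n0, n1, nion = y
            return [-W01 * n0 + W01 * n1,
                    W01 * n0 - W01 * n1 - W12 * n1,
                    W12 * n1]

        sol = solve_ivp(rates, (0.0, 1e-8), [1.0, 0.0, 0.0])  # 10 ns pulse
        print(f"ionization probability ~ {sol.y[2, -1]:.3f}")

    Scanning the excitation rate across isotope-shifted resonances would mimic how laser bandwidth shapes the relative ionization probability of 235U versus 238U.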

  3. Colon cancer: association of histopathological parameters and patients' survival with clinical presentation.

    Science.gov (United States)

    Alexiusdottir, Kristin K; Snaebjornsson, Petur; Tryggvadottir, Laufey; Jonasson, Larus; Olafsdottir, Elinborg J; Björnsson, Einar Stefan; Möller, Pall Helgi; Jonasson, Jon G

    2013-10-01

    Available data correlating symptoms of colon cancer patients with the severity of the disease are very limited. In a population-based setting, we correlated information on symptoms of colon cancer patients with several pathological tumor parameters and survival. Information on all patients diagnosed with colon cancer in Iceland in 1995-2004 for this retrospective, population-based study was obtained from the Icelandic Cancer Registry. Information on patients' symptoms and blood hemoglobin was collected from patients' files. Pathological parameters were obtained from a previously performed standardized tumor review. A total of 768 patients entered this study; the median age was 73 years. Tumors in patients presenting at diagnosis with visible blood in stools were significantly more likely to be of lower grade and to have a pushing border, conspicuous peritumoral lymphocytic infiltration, and a lower frequency of vessel invasion. Patients with abdominal pain and anemia were significantly more likely to have vessel invasion. Logistic regression showed that visible blood in stools was significantly associated with protective pathological factors (OR range 0.38-0.83, p < 0.05). Tumors in patients presenting with abdominal pain were strongly associated with an infiltrative margin and scarce peritumoral lymphocytic infiltration (OR = 1.95 and 2.18, respectively; p < 0.05). Changes in bowel habits were strongly associated with vessel invasion (OR = 2.03, p < 0.05). Cox regression showed that blood in stools predicted survival (HR = 0.54). In conclusion, visible blood in stools correlates significantly with all the beneficial pathological parameters analyzed and with better survival of patients. Anemia, general symptoms, changes in bowel habits, acute symptoms, and abdominal pain correlate with more aggressive tumor characteristics and adverse outcomes for patients. © 2013 APMIS Published by Blackwell Publishing Ltd.

  4. Effect of noise and detector sensitivity on a dynamical process: inverse power law and Mittag-Leffler interevent time survival probabilities.

    Science.gov (United States)

    Pramukkul, Pensri; Svenkeson, Adam; Grigolini, Paolo

    2014-02-01

    We study the combined effects of noise and detector sensitivity on a dynamical process that generates intermittent events mimicking the behavior of complex systems. By varying the sensitivity level of the detector we move between two forms of complexity, from inverse power law to Mittag-Leffler interevent time survival probabilities. Here fluctuations fight against complexity, causing an exponential truncation to the survival probability. We show that fluctuations of relatively weak intensity have a strong effect on the generation of Mittag-Leffler complexity, providing a reason why stretched exponentials are frequently found in nature. Our results afford a more unified picture of complexity resting on the Mittag-Leffler function and encompassing the standard inverse power law definition.
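
    For readers unfamiliar with the Mittag-Leffler survival form, a minimal numeric sketch follows: the one-parameter series E_alpha(-(t/tau)^alpha), evaluated by direct summation (adequate only for moderate t/tau; the values of alpha and tau are illustrative).

        # Sketch: Mittag-Leffler survival probability Psi(t) = E_alpha(-(t/tau)^alpha).
        from math import gamma

        def mittag_leffler(z, alpha, n_terms=80):
            # Direct series: E_alpha(z) = sum_k z^k / Gamma(alpha*k + 1).
            return sum(z**k / gamma(alpha * k + 1) for k in range(n_terms))

        alpha, tau = 0.8, 1.0            # illustrative values
        for t in [0.1, 1.0, 3.0]:
            psi = mittag_leffler(-(t / tau)**alpha, alpha)
            print(f"t = {t:4.1f}  Psi(t) = {psi:.4f}")

    The Mittag-Leffler form interpolates between a stretched exponential at small t and an inverse power law at large t, which is the crossover between the two forms of complexity discussed above.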

  6. Evaluation of Genetic Diversity Using Parameters Based on Probability of Gene Origin in the Slovak Spotted Bulls

    Directory of Open Access Journals (Sweden)

    E. Hazuchová

    2012-05-01

    Full Text Available The aim of this study was to assess diversity based on the probability of gene origin in Slovak Spotted bulls. Pedigree information was available from The Breeding Services of the Slovak Republic, s. e. The pedigree file consisted of 752 individuals. The 62 sires born from 1995 to 2009 and registered in the Herd book set up the analyzed reference population (RP). The total number of founders in the RP was 308, the effective number of founders was 115, and the effective number of ancestors was 37. The number of ancestors explaining 50% of the diversity was 15, and the founder genome equivalent was 20.46. The sires GS Malf and Horwein, with 16 offspring, were the most frequently used bulls in artificial insemination. We found that the genetic conservation index for the RP was 16.34%. The results will be used in the genetic management of breeding work in Slovak Spotted cattle and in monitoring parameters characterizing genetic diversity and their development.
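
    As a hedged illustration of the probability-of-gene-origin parameters reported here, the snippet below computes the effective number of founders from expected founder contributions q_k, using f_e = 1 / sum(q_k^2); the contribution vector is made up for the example.

        # Sketch: effective number of founders from expected founder contributions.
        import numpy as np

        q = np.array([0.20, 0.15, 0.10, 0.05] + [0.50 / 40] * 40)  # hypothetical q_k, sums to 1
        assert abs(q.sum() - 1.0) < 1e-9
        f_e = 1.0 / np.sum(q**2)
        print(f"effective number of founders f_e = {f_e:.1f}")

    The effective number of ancestors is obtained analogously from marginal ancestor contributions, which additionally accounts for pedigree bottlenecks.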

  7. Application of Survival Analysis to Study Timing and Probability of Outcome Attainment by a Community College Student Cohort

    Science.gov (United States)

    Mourad, Roger; Hong, Ji-Hee

    2008-01-01

    This study applies competing risks survival analysis to describe outcome attainment for an entire cohort of students who first attended a Midwestern community college in the Fall Semester 2001. Outcome attainment included transfer to a four-year institution, degree/certificate attainment from the community college under study, and transfer to a…

  8. RATIO BETWEEN CARIOMETRIC PARAMETERS OF THE NUCLEUS AND THE SURVIVAL DURATION OF THE PATIENTS SUFFERING FROM THE INTRAORAL LOCALIZATION CARCINOMA

    Directory of Open Access Journals (Sweden)

    Biljana Đorđević

    2002-11-01

    Full Text Available Carcinoma of intraoral localization is characterized by an intense and short biological course, with a low five-year survival rate of 30-50%. In the second half of the twentieth century, cariometry developed as a subdiscipline of pathohistology. The aim of this paper is to determine the relationship between cariometric parameters of the nuclei of intraoral carcinoma cells and the survival duration of patients with this kind of tumor. The research included 75 patients; multivariate regression analysis was applied to find the correlation between cariometric parameters of the nucleus and survival duration. No statistically significant correlation was established between cariometric parameters of the nucleus and survival duration in patients with intraoral carcinoma; thus, these nuclear parameters cannot provide relevant data on the biological course of intraoral carcinoma.

  9. Significance of platelet and AFP levels and liver function parameters for HCC size and survival.

    Science.gov (United States)

    Carr, Brian I; Guerra, Vito; Giannini, Edoardo G; Farinati, Fabio; Ciccarese, Francesca; Rapaccini, Gian Ludovico; Di Marco, Maria; Benvegnù, Luisa; Zoli, Marco; Borzio, Franco; Caturelli, Eugenio; Chiaramonte, Maria; Trevisani, Franco

    2014-09-30

    Hepatocellular carcinoma (HCC) is a heterogeneous disease in which both tumor and liver factors are involved. We aimed to investigate HCC clinical phenotypes and factors related to HCC size. Prospectively collected data on HCC patients from a large Italian database were arranged according to maximum tumor diameter (MTD) and divided into tumor size terciles, which were then compared in terms of several common clinical parameters and patient survival. A higher MTD tercile was significantly associated with increased blood alpha-fetoprotein (AFP), gamma-glutamyl transpeptidase (GGTP), and platelet levels. Patients with higher platelet levels had larger tumors and higher GGTP levels, with lower bilirubin levels. However, patients with the highest AFP levels had larger tumors and higher bilirubin levels, reflecting an aggressive biology. AFP correlation analysis revealed the existence of two different groups of patients: those with higher and those with lower AFP levels, each with different patient and tumor characteristics. The Cox proportional-hazards model showed that a higher risk of death was correlated with GGTP and bilirubin levels, tumor size and number, and portal vein thrombosis (PVT), but not with AFP or platelet levels. Increased tumor size was associated with increased blood platelet counts and AFP and GGTP levels. Platelet and AFP levels were important indicators of tumor size, but not of survival.

  10. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    NARCIS (Netherlands)

    Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.

    2014-01-01

    Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate

  11. Survival

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data provide information on the survival of California red-legged frogs in a unique ecosystem to better conserve this threatened species while restoring...

  12. Survival probability of Baltic larval cod in relation to spatial overlap patterns with their prey obtained from drift model studies

    DEFF Research Database (Denmark)

    Hinrichsen, H.H.; Schmidt, J.O.; Petereit, C.

    2005-01-01

    This study investigated the influence of drift patterns on the overlap of Baltic cod larvae with their prey. A three-dimensional hydrodynamic model was used to analyse spatio-temporally resolved drift patterns of larval Baltic cod. A coefficient of overlap between modelled larval and idealized prey distributions indicated the probability of predator-prey overlap, dependent on the hatching time of cod larvae. By performing model runs for the years 1979-1998, we investigated the intra- and interannual variability of potential spatial overlap between predator and prey. Assuming uniform prey distributions, we generally found the overlap to have decreased since the mid-1980s, but with the highest variability during the 1990s. Seasonally, predator-prey overlap on the Baltic cod spawning grounds was highest in summer and lowest at the end of the cod spawning season. Horizontally variable prey distributions generally resulted in decreased overlap coefficients...

  13. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided according to diameter into narrow (Ø3.3 × 10 mm) and extra-narrow (Ø2.9 × 10 mm) groups (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for missions of 50,000 and 100,000 cycles at 50, 100, 150 and 180 N were calculated. For the finite element analysis (FEA), two virtual models simulating the samples tested in fatigue were constructed. Loads of 50 N and 100 N were applied 30° off-axis at the crown. The von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but more likely were associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture in both groups. In the FEA at a 50 N load, the Ø3.3 mm implant showed higher von Mises stress in the abutment (7.75%) and implant (2%) than the Ø2.9 mm implant. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar in both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
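
    A hedged numeric sketch of the Weibull reliability calculation described, R(t) = exp(-(t/eta)^beta), at a mission of 100,000 cycles; only the beta values come from the abstract, while the characteristic life eta is invented for illustration.

        # Sketch: Weibull reliability at a mission time, R(t) = exp(-(t/eta)^beta).
        import numpy as np

        def reliability(t, beta, eta):
            return np.exp(-(t / eta)**beta)

        mission = 100_000                      # cycles
        for label, beta in [("narrow", 0.67), ("extra-narrow", 1.32)]:
            eta = 5e6                          # hypothetical characteristic life (cycles)
            print(f"{label}: R({mission:,}) = {reliability(mission, beta, eta):.3f}")

    A shape parameter β < 1, as fitted for the narrow group, implies a failure rate that does not grow with accumulated cycles.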

  14. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    Directory of Open Access Journals (Sweden)

    Simon van Mourik

    2014-06-01

    Full Text Available Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
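
    A minimal sketch of the workflow described, assuming a toy exponential-decay model and the emcee sampler in place of the paper's Matlab Differential Evolution Markov Chain code; the data, prior, and dimensions are all illustrative.

        # Sketch: sample the parameter posterior, then propagate the sample to a prediction.
        import numpy as np
        import emcee

        rng = np.random.default_rng(2)
        t_obs = np.linspace(0, 5, 12)
        y_obs = 2.0 * np.exp(-0.7 * t_obs) + rng.normal(0, 0.05, t_obs.size)  # toy data

        def log_prob(theta):
            a, k = theta
            if a <= 0 or k <= 0:              # flat positive prior
                return -np.inf
            resid = y_obs - a * np.exp(-k * t_obs)
            return -0.5 * np.sum((resid / 0.05)**2)

        nwalkers, ndim = 16, 2
        p0 = np.abs(rng.normal([2.0, 0.7], 0.1, (nwalkers, ndim)))
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
        sampler.run_mcmc(p0, 2000)
        sample = sampler.get_chain(discard=500, flat=True)

        # Per-prediction uncertainty: push the whole parameter sample through the model.
        t_new = 8.0
        pred = sample[:, 0] * np.exp(-sample[:, 1] * t_new)
        print(f"y({t_new}) = {pred.mean():.3f} +/- {pred.std():.3f}")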

  16. Estimation of Survival Probabilities for Use in Cost-effectiveness Analyses: A Comparison of a Multi-state Modeling Survival Analysis Approach with Partitioned Survival and Markov Decision-Analytic Modeling.

    Science.gov (United States)

    Williams, Claire; Lewsey, James D; Mackay, Daniel F; Briggs, Andrew H

    2017-05-01

    Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results.
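
    To make the partitioned-survival bookkeeping concrete, the sketch below derives mean time in the progression state as the area between OS and PFS curves (numerical integration; the exponential curves are placeholders, not the trial data).

        # Sketch: partitioned survival -- time in progression = AUC(OS) - AUC(PFS).
        import numpy as np

        t = np.linspace(0, 120, 1201)          # months, including extrapolation horizon
        os_curve = np.exp(-t / 60.0)           # hypothetical overall survival
        pfs_curve = np.exp(-t / 30.0)          # hypothetical progression-free survival

        mean_os = np.trapz(os_curve, t)        # mean life-months
        mean_pfs = np.trapz(pfs_curve, t)
        print(f"mean time progression-free: {mean_pfs:.1f} months")
        print(f"mean time in progression:   {mean_os - mean_pfs:.1f} months")

    A multi-state model would instead estimate the progression state directly via transition hazards, which is the comparison the article draws.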

  17. Effects of some probable antioxidants on selenite-induced cataract formation and oxidative stress-related parameters in rats.

    Science.gov (United States)

    Orhan, H; Marol, S; Hepşen, I F; Sahin, G

    1999-12-06

    The effect of several natural and synthetic compounds on selenite-induced cataract was investigated in rat pups. Simultaneous determinations of glutathione S-transferase (GST), selenium-dependent glutathione peroxidase (Se-GPx), catalase (CAT) and superoxide dismutase (SOD) activities and of malondialdehyde (MDA) levels were carried out in the lens, erythrocytes and plasma. The results showed that propolis, diclofenac, vitamin C (Vit-C) and quercetin prevented cataract formation to the extent of 70, 60, 58.4, and 40%, respectively. A standardized extract of Ginkgo biloba (EGb 761) did not affect cataract formation. Selenite treatment caused a significant decrease in the activity of erythrocyte SOD, accompanied by a simultaneous increase in MDA levels both in the lens and in plasma. A significant increase was shown in erythrocyte GST (substrate: ethacrynic acid, EAA) and GPx activities and in lens GST (substrate: chlorodinitrobenzene, CDNB) activity. Antioxidant treatment caused significant changes in enzyme activities and MDA levels. Neither selenite nor the antioxidants affected total body weight gain during the course of the study. Blood parameters did not correlate with lens parameters following selenite treatment. Our results suggest that antioxidant supplementation following selenite exposure may prevent cataract formation and may enhance the antioxidant defence of blood and lens.

  18. The influence of printing parameters on cell survival rate and printability in microextrusion-based 3D cell printing technology.

    Science.gov (United States)

    Zhao, Yu; Li, Yang; Mao, Shuangshuang; Sun, Wei; Yao, Rui

    2015-11-02

    Three-dimensional (3D) cell printing technology provides a versatile methodology to fabricate cell-laden tissue-like constructs and in vitro tissue/pathological models for tissue engineering, drug testing and screening applications. However, it remains a challenge to print bioinks with high viscoelasticity that achieve a long-term stable structure while maintaining a high cell survival rate after printing. In this study, we systematically investigated the influence of 3D cell printing parameters, i.e. composition and concentration of bioink, holding temperature and holding time, on printability and cell survival rate in microextrusion-based 3D cell printing technology. Rheological measurements were utilized to characterize the viscoelasticity of gelatin-based bioinks. Results demonstrated that bioink viscoelasticity increased with increasing bioink concentration, increasing holding time and decreasing holding temperature below the gelation temperature. A decline in cell survival rate after the 3D cell printing process was observed with increasing viscoelasticity of the gelatin-based bioinks. However, different process parameter combinations could result in similar rheological characteristics and thus showed similar cell survival rates after the 3D bioprinting process. On the other hand, bioink viscoelasticity should also reach a certain point to ensure good printability and shape fidelity. Finally, we propose a protocol for 3D bioprinting of temperature-sensitive gelatin-based hydrogel bioinks with both a high cell survival rate and good printability. This research should be useful for biofabrication researchers in adjusting 3D bioprinting process parameters quickly, and as a reference template for designing new bioinks.

  19. Effect of inactive yeast cell wall on growth performance, survival rate and immune parameters in Pacific White Shrimp (Litopenaeus vannamei)

    Directory of Open Access Journals (Sweden)

    Rutchanee Chotikachinda

    2008-10-01

    Full Text Available The effects of dietary inactive yeast cell wall on growth performance, survival rate, and immune parameters in Pacific white shrimp (Litopenaeus vannamei) were investigated. Three dosages of inactive yeast cell wall (0, 1, and 2 g kg-1) were tested in three replicate groups of juvenile shrimp with an average initial weight of 7.15 ± 0.05 g for four weeks. There were no significant differences in final weight, survival rate, specific growth rate, feed conversion ratio, feed intake, protein efficiency ratio, or apparent net protein utilization among treatments. However, different levels of inactive yeast cell wall showed an effect on certain immune parameters (p < 0.05). Total hemocyte count, granular hemocyte count, and bacterial clearance were better in shrimp fed diets supplemented with 1 and 2 g kg-1 inactive yeast cell wall than in the control group.

  20. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles incorporate roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors already installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential for obtaining an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of a vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and its parameters remain within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
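
    A hedged sketch of the core idea (a Kalman update followed by truncation of the state estimate to physically meaningful bounds); this is a 1-D toy for the roll angle only, not the paper's full neural-network/dual-filter pipeline, and every number is illustrative.

        # Sketch: scalar Kalman update, then PDF truncation to physical bounds.
        import numpy as np
        from scipy.stats import truncnorm

        x, P = 0.0, 0.5          # prior roll-angle estimate (rad) and variance
        z, R = 0.12, 0.02        # measurement and its noise variance -- hypothetical
        lo, hi = -0.35, 0.35     # physically plausible roll-angle bounds (rad)

        # Standard Kalman measurement update (observation matrix H = 1).
        K = P / (P + R)
        x = x + K * (z - x)
        P = (1 - K) * P

        # Truncate the Gaussian posterior to [lo, hi] and keep its first two moments.
        s = np.sqrt(P)
        a, b = (lo - x) / s, (hi - x) / s
        x_t = truncnorm.mean(a, b, loc=x, scale=s)
        P_t = truncnorm.var(a, b, loc=x, scale=s)
        print(f"truncated estimate: {x_t:.4f} rad, variance {P_t:.5f}")

    In the paper's dual-filter arrangement, a second, parallel filter estimates the slowly varying parameters, with the same truncation step keeping them physically meaningful.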

  1. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of three basalt waste isolation project staff hydrologists

    Energy Technology Data Exchange (ETDEWEB)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

    The present study implemented a probability encoding method to estimate the probability distributions of selected hydrologic variables for the Cohassett basalt flow top and flow interior, and the anisotropy ratio of the interior of the Cohassett basalt flow beneath the Hanford Site. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. However, this information is required to complete preliminary performance assessment studies. Rockwell chose a probability encoding method developed by SRI International to generate credible and auditable estimates of the probability distributions of effective porosity and hydraulic conductivity anisotropy. The results indicate significant differences of opinion among the experts. This was especially true of the values of the effective porosity of the Cohassett basalt flow interior, for which estimates differ by more than five orders of magnitude. The experts are in greater agreement about the values of effective porosity of the Cohassett basalt flow top; their estimates for this variable are generally within one to two orders of magnitude of each other. For the anisotropy ratio, the expert estimates are generally within two or three orders of magnitude of each other. Based on this study, the Rockwell hydrologists estimate the effective porosity of the Cohassett basalt flow top to be generally higher than do the independent experts. For the effective porosity of the Cohassett basalt flow top, the estimates of the Rockwell hydrologists indicate a smaller uncertainty than do the estimates of the independent experts. On the other hand, for the effective porosity and anisotropy ratio of the Cohassett basalt flow interior, the estimates of the Rockwell hydrologists indicate a larger uncertainty than do the estimates of the independent experts.

  2. Post-treatment changes of tumour perfusion parameters can help to predict survival in patients with high-grade astrocytoma

    Energy Technology Data Exchange (ETDEWEB)

    Sanz-Requena, Roberto; Marti-Bonmati, Luis [Hospital Quironsalud Valencia, Radiology Department, Valencia (Spain); Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); Revert-Ventura, Antonio J.; Salame-Gamarra, Fares [Hospital de Manises, Radiology Department, Manises (Spain); Garcia-Marti, Gracian [Hospital Quironsalud Valencia, Radiology Department, Valencia (Spain); Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); CIBER-SAM, Instituto de Salud Carlos III, Madrid (Spain); Perez-Girbes, Alexandre [Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); Molla-Olmos, Enrique [Hospital La Ribera, Radiology Department, Alzira (Spain)

    2017-08-15

    Vascular characteristics of tumour and peritumoral volumes of high-grade gliomas change with treatment. This work evaluates variations of T2*-weighted perfusion parameters as overall survival (OS) predictors. Forty-five patients with histologically confirmed high-grade astrocytoma (8 grade III and 37 grade IV) were included. All patients underwent pre- and post-treatment T2*-weighted contrast-enhanced magnetic resonance (MR) imaging. Tumour, peritumoral and control volumes were segmented. Relative variations of cerebral blood flow (CBF), cerebral blood volume (CBV), mean transit time (MTT), Ktrans-T2*, kep-T2*, ve-T2* and vp-T2* were calculated. Differences regarding tumour grade and extent of surgical resection were evaluated with ANOVA tests. For each parameter, two groups were defined by unsupervised clusterisation. Survival analyses were performed on these groups. For the tumour region, an increase or stagnation of the 90th-percentile CBV was associated with shorter survival, while a decrease was related to longer survival (393 ± 189 vs 594 ± 294 days; log-rank p = 0.019; Cox hazard ratio, 2.31; 95% confidence interval [CI], 1.12-4.74). Ktrans-T2* showed similar results (414 ± 177 vs 553 ± 312 days; log-rank p = 0.037; hazard ratio, 2.19; 95% CI, 1.03-4.65). The peritumoral area values showed no relationship with OS. Post-treatment variations of the highest CBV and Ktrans-T2* values in the tumour volume are predictive factors of OS in patients with high-grade gliomas. (orig.)

  4. Echocardiographic Parameters and Survival in Chagas Heart Disease with Severe Systolic Dysfunction

    Energy Technology Data Exchange (ETDEWEB)

    Rassi, Daniela do Carmo, E-mail: dani.rassi@hotmail.com [Faculdade de Medicina e Hospital das Clínicas da Universidade Federal de Goiás (UFG), Goiânia, GO (Brazil); Vieira, Marcelo Luiz Campos [Instituto do Coração da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Arruda, Ana Lúcia Martins [Instituto de Radiologia da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Hotta, Viviane Tiemi [Instituto do Coração da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Furtado, Rogério Gomes; Rassi, Danilo Teixeira; Rassi, Salvador [Faculdade de Medicina e Hospital das Clínicas da Universidade Federal de Goiás (UFG), Goiânia, GO (Brazil)

    2014-03-15

    Echocardiography provides important information on the cardiac evaluation of patients with heart failure. The identification of echocardiographic parameters in severe Chagas heart disease would help implement treatment and assess prognosis. To correlate echocardiographic parameters with the endpoint cardiovascular mortality in patients with ejection fraction < 35%. Study with retrospective analysis of pre-specified echocardiographic parameters prospectively collected from 60 patients included in the Multicenter Randomized Trial of Cell Therapy in Patients with Heart Diseases (Estudo Multicêntrico Randomizado de Terapia Celular em Cardiopatias) - Chagas heart disease arm. The following parameters were collected: left ventricular systolic and diastolic diameters and volumes; ejection fraction; left atrial diameter; left atrial volume; indexed left atrial volume; systolic pulmonary artery pressure; integral of the aortic flow velocity; myocardial performance index; rate of increase of left ventricular pressure; isovolumic relaxation time; E, A, Em, Am and Sm wave velocities; E wave deceleration time; E/A and E/Em ratios; and mitral regurgitation. In the mean 24.18-month follow-up, 27 patients died. The mean ejection fraction was 26.6 ± 5.34%. In the multivariate analysis, the parameters ejection fraction (HR = 1.114; p = 0.3704), indexed left atrial volume (HR = 1.033; p < 0.0001) and E/Em ratio (HR = 0.95; p = 0.1261) were excluded. The indexed left atrial volume was an independent predictor in relation to the endpoint, and values > 70.71 mL/m² were associated with a significant increase in mortality (log rank p < 0.0001). The indexed left atrial volume was the only independent predictor of mortality in this population of Chagasic patients with severe systolic dysfunction.

  5. Reproductive parameters and cub survival of brown bears in the Rusha area of the Shiretoko Peninsula, Hokkaido, Japan.

    Science.gov (United States)

    Shimozuru, Michito; Yamanaka, Masami; Nakanishi, Masanao; Moriwaki, Jun; Mori, Fumihiko; Tsujino, Masakatsu; Shirane, Yuri; Ishinazaka, Tsuyoshi; Kasai, Shinsuke; Nose, Takane; Masuda, Yasushi; Tsubota, Toshio

    2017-01-01

    Knowing the reproductive characteristics of a species is essential for the appropriate conservation and management of wildlife. In this study, we investigated the demographic parameters, including age of primiparity, litter size, inter-birth interval, reproductive rate, and cub survival rate, of Hokkaido brown bears (Ursus arctos yesoensis) in the Rusha area on the Shiretoko Peninsula, Hokkaido, Japan, based on a long-term, individual-based monitoring survey. A total of 15 philopatric females were observed nearly every year from 2006 to 2016, and these observations were used to estimate reproductive parameters. The mean age of primiparity was 5.3 ± 0.2 (SE) years (n = 7, 95% CI = 5.0-5.6). We observed 81 cubs in 46 litters from 15 bears. Litter size ranged from one to three cubs, and averaged 1.76 ± 0.08 (SE) cubs/litter (95% CI = 1.61-1.91). Inter-birth intervals ranged from 1 to 4 years, and the mean value was estimated as 2.43 (95% CI = 2.16-2.76) and 2.53 (95% CI = 2.26-2.85) years in all litters and in litters that survived at least their first year, respectively. The reproductive rate was estimated at 0.70 to 0.76 young born/year/reproductive adult female, depending on the method of calculation. The cub survival rate between 0.5 and 1.5 years ranged from 60 to 73%. Most cub disappearances occurred in July and August, suggesting that cub mortality is mainly due to poor nutrition in the summer. All reproductive parameters observed in the Rusha area on the Shiretoko Peninsula fell within the ranges reported in Europe and North America; age of primiparity, litter size, and inter-birth interval were among the lowest or shortest, while reproductive rate ranked at a high level.

  6. The association of ¹⁸F-FDG PET/CT parameters with survival in malignant pleural mesothelioma

    Energy Technology Data Exchange (ETDEWEB)

    Klabatsa, Astero; Lang-Lazdunski, Loic [Guys and St Thomas' NHS Foundation Trust, Department of Thoracic Oncology, London (United Kingdom); Chicklore, Sugama; Barrington, Sally F.; Goh, Vicky [Kings College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Cook, Gary J.R. [Kings College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Kings College London, Clinical PET Centre, Division of Imaging Sciences and Biomedical Engineering, St Thomas' Hospital, London (United Kingdom)

    2014-02-15

    Malignant pleural mesothelioma (MPM) is a disease with poor prognosis despite multimodal therapy but there is variation in survival between patients. Prognostic information is therefore potentially valuable in managing patients, particularly in the context of clinical trials where patients could be stratified according to risk. Therefore we have evaluated the prognostic ability of parameters derived from baseline 2-[¹⁸F]fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography (¹⁸F-FDG PET/CT). In order to determine the relationships between metabolic activity and prognosis we reviewed all ¹⁸F-FDG PET/CT scans used for pretreatment staging of MPM patients in our institution between January 2005 and December 2011 (n = 60) and measured standardised uptake values (SUV) including mean, maximum and peak values, metabolic tumour volume (MTV) and total lesion glycolysis (TLG). Overall survival (OS) or time to last censor was recorded, as well as histological subtypes. Median follow-up was 12.7 months (1.9-60.9) and median OS was 14.1 months (1.9-54.9). By univariable analysis histological subtype (p = 0.013), TLG (p = 0.024) and MTV (p = 0.038) were significantly associated with OS and SUVmax was borderline (p = 0.051). On multivariable analysis histological subtype and TLG were associated with OS but at borderline statistical significance (p = 0.060 and 0.058, respectively). No statistically significant differences in any PET parameters were found between the epithelioid and non-epithelioid histological subtypes. ¹⁸F-FDG PET/CT parameters that take into account functional volume (MTV, TLG) show significant associations with survival in patients with MPM before adjusting for histological subtype and are worthy of further evaluation to determine their ability to stratify patients in clinical trials. (orig.)

  7. Effect of gamma radiation on the growth, survival, hematology and histological parameters of rainbow trout (Oncorhynchus mykiss) larvae

    Energy Technology Data Exchange (ETDEWEB)

    Oujifard, Amin, E-mail: oujifard.amin@gmail.com [Fisheries Department, Faculty of Agriculture and Natural Resources, Persian Gulf University, Borazjan, Bushehr (Iran, Islamic Republic of); Amiri, Roghayeh [Department of Veterinary, Agricultural Medical and Industrial Research School, Nuclear Science and Technology Research Institute, AEOI, Karaj (Iran, Islamic Republic of); Shahhosseini, Gholamreza [Fisheries Department, Faculty of Natural Resources and Marine Sciences, TarbiatModares University, Noor, Mazandaran (Iran, Islamic Republic of); Davoodi, Reza [Fisheries Department, Faculty of Agriculture and Natural Resources, Persian Gulf University, Borazjan, Bushehr (Iran, Islamic Republic of); Moghaddam, Jamshid Amiri [Fisheries Department, Faculty of Natural Resources and Marine Sciences, TarbiatModares University, Noor, Mazandaran (Iran, Islamic Republic of)

    2015-08-15

    Highlights: • Increasing gamma radiation doses have negative effects on fish larvae. • Radiation adversely affected the weight, blood cells and intestinal morphology of the larvae. • No mortality was observed at low doses of gamma radiation. - Abstract: The effects of low (1, 2.5 and 5 Gy) and high (10, 20 and 40 Gy) doses of gamma radiation on the growth, survival, blood parameters and intestinal morphology of rainbow trout (Oncorhynchus mykiss) larvae (103 ± 20 mg) were examined after 12 weeks of exposure. Negative effects of gamma radiation on growth and survival were observed as radiation level and time increased. Changes were well documented at 10 and 20 Gy, and all fish died at the dose of 40 Gy. In all treatments, red blood cell (RBC) counts, hematocrit (HCT) and hemoglobin (HB) declined significantly (P < 0.05) as irradiation levels increased, whereas mean corpuscular volume (MCV) and mean corpuscular hemoglobin (MCH) did not change. No significant differences (P > 0.05) were found in white blood cell (WBC), lymphocyte and monocyte counts. Destruction of intestinal epithelium cells was observed at irradiation levels of 1 Gy and above. The highest growth, survival, specific growth rate (SGR), condition factor (CF) and protein efficiency ratio (PER) were obtained in the control treatment. The results show that gamma rays can damage rainbow trout cells.

  8. Prognostic value of pre-treatment DCE-MRI parameters in predicting disease free and overall survival for breast cancer patients undergoing neoadjuvant chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Pickles, Martin D. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: m.pickles@hull.ac.uk; Manton, David J. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: d.j.manton@hull.ac.uk; Lowry, Martin [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: m.lowry@hull.ac.uk; Turnbull, Lindsay W. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: l.w.turnbull@hull.ac.uk

    2009-09-15

    The purpose of this study was to investigate whether dynamic contrast-enhanced MRI (DCE-MRI) data, both pharmacokinetic and empirical, can predict, prior to neoadjuvant chemotherapy, which patients are likely to have a shorter disease-free survival (DFS) and overall survival (OS) interval following surgery. Traditional prognostic parameters were also included in the survival analysis; consequently, a comparison of prognostic value could be made between all the parameters studied. MR examinations were conducted on a 1.5 T system in 68 patients prior to the initiation of neoadjuvant chemotherapy. DCE-MRI consisted of a fast spoiled gradient echo sequence acquired over 35 phases with a mean temporal resolution of 11.3 s. Both pharmacokinetic and empirical parameters were derived from the DCE-MRI data. Kaplan-Meier survival plots were generated for each parameter and group comparisons were made utilising logrank tests. The results from the 54 patients entered into the univariate survival analysis demonstrated that traditional prognostic parameters (tumour grade, hormonal status and size), empirical parameters (maximum enhancement index, enhancement index at 30 s, area under the curve and initial slope) and adjuvant therapies showed significant differences in survival intervals. Further multivariate Cox regression survival analysis revealed that the empirical enhancement parameters provided the greatest predictive value for both DFS and OS in the resulting models. In conclusion, this study has demonstrated that in patients who exhibit high levels of perfusion and vessel permeability pre-treatment, evidenced by elevated empirical DCE-MRI parameters, a significantly shorter disease-free survival and overall survival can be expected.

  9. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, provided a given probability of ...

  10. Survival probabilities of Pugh-child-PBC classified patients in the Euricterus primary biliary cirrhosis population, based on the Mayo clinic prognostic model

    NARCIS (Netherlands)

    Reisman, Y; vanDam, GM; Gips, CH; Lavelle, SM; CuervasMons, [No Value; deDombal, FT; Gauthier, A; MalchowMoller, A; Molino, G; Theodossi, A; Tsiftsis, DD; Dawids, S; Larsson, L

    1997-01-01

    Background/Aims: Estimation of prognosis becomes increasingly important in primary biliary cirrhosis (PBC) with advancing disease, and also with regard to patient management. The ubiquitously used Pugh score for severity of disease is simple, while the Mayo model, which has been validated for survival

  11. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time.

    Science.gov (United States)

    Baker, Stuart G; Sargent, Daniel J; Buyse, Marc; Burzykowski, Tomasz

    2012-03-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. © 2011, The International Biometric Society No claim to original US government works.
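
    A minimal sketch of the leave-one-trial-out idea, assuming a simple linear prediction model between surrogate and true log effects across historical trials (the numbers are synthetic; the paper's models are richer, covering mixture and principal-stratification forms).

        # Sketch: predict a true-endpoint effect from a surrogate, with leave-one-trial-out error.
        import numpy as np

        rng = np.random.default_rng(3)
        surr = rng.normal(0.0, 0.3, 10)                 # historical surrogate log-effects
        true = 0.8 * surr + rng.normal(0.0, 0.05, 10)   # historical true log-effects

        def fit_predict(x, y, x_new):
            b1, b0 = np.polyfit(x, y, 1)                # linear prediction model
            return b1 * x_new + b0

        # Leave-one-trial-out residuals estimate the random extrapolation error.
        loo = np.array([true[i] - fit_predict(np.delete(surr, i), np.delete(true, i), surr[i])
                        for i in range(len(surr))])

        x_target = 0.25                                  # surrogate effect in the target trial
        pred = fit_predict(surr, true, x_target)
        half_width = 1.96 * loo.std(ddof=1)
        print(f"predicted true effect: {pred:.3f} "
              f"(95% CI {pred - half_width:.3f} to {pred + half_width:.3f})")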

  12. A study of V79 cell survival after proton and carbon ion beam irradiation as represented by the parameters of Katz' track structure model

    DEFF Research Database (Denmark)

    Grzanka, Leszek; Waligórski, M. P. R.; Bassler, Niels

    Katz's theory of cellular track structure (1) is an amorphous analytical model which applies a set of four cellular parameters representing survival of a given cell line after ion irradiation. Usually the values of these parameters are best fitted to a full set of experimentally measured survival curves available for a variety of ions. Once fitted, using these parameter values and the analytical formulae of the model, cellular survival curves and RBE may be predicted for that cell line after irradiation by any ion, including mixed ion fields. While it is known that the Katz model ... carbon irradiation. References: 1. Katz, R., Track structure in radiobiology and in radiation detection. Nuclear Track Detection 2: 1-28 (1978). 2. Furusawa Y. et al. Inactivation of aerobic and hypoxic cells from three different cell lines by accelerated 3He-, 12C- and 20Ne beams. Radiat Res. 2012 Jan; 177...

  13. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  15. Survival probability of larval sprat in response to decadal changes in diel vertical migration behavior and prey abundance in the Baltic Sea

    DEFF Research Database (Denmark)

    Hinrichsen, Hans-Harald; Peck, Myron A.; Schmidt, Jörn

    2010-01-01

    We employed a coupled three-dimensional biophysical model to explore long-term inter- and intra-annual variability in the survival of sprat larvae in the Bornholm Basin, a major sprat spawning area in the Baltic Sea. Model scenarios incorporated observed decadal changes in larval diel vertical migration ... in the 1990s compared to the 1980s. After changing their foraging strategy by shifting from a mid-depth, low-prey environment to near-surface waters, first-feeding larvae experienced much higher prey encounter rates and almost optimal feeding conditions, and had a much higher growth potential. Consequently...

  16. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  17. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  18. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think… By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
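
    The intuition behind such events is the single-big-jump principle: for subexponential summands, P(S_n > x) ~ n P(X_1 > x) as x grows, i.e. the sum is large because one summand is large. A minimal numerical check in Python, assuming lognormal summands and illustrative parameter values:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n, x = 10, 200.0      # number of summands and tail threshold (assumed values)
    mu, sigma = 1.0, 1.0  # lognormal parameters (assumed values)

    # Monte Carlo estimate of the tail probability P(X_1 + ... + X_n > x)
    samples = rng.lognormal(mu, sigma, size=(1_000_000, n)).sum(axis=1)
    p_mc = (samples > x).mean()

    # Single-big-jump asymptotic for subexponential summands: n * P(X_1 > x)
    p_asymptotic = n * stats.lognorm.sf(x, s=sigma, scale=np.exp(mu))

    print(p_mc, p_asymptotic)  # agreement in order of magnitude for large x
    ```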

  19. Influence of binder type and process parameters on the compression properties and microbial survival in diclofenac tablet formulations

    Directory of Open Access Journals (Sweden)

    John Oluwasogo Ayorinde

    2011-12-01

    The influence of binder type and process parameters on the compression properties and microbial survival in diclofenac tablet formulations was studied using a novel gum from Albizia zygia. Tablets were produced from diclofenac formulations containing corn starch, lactose and dicalcium phosphate. Formulations were analyzed using the Heckel and Kawakita plots. Microbial viability was determined on compressed tablets prepared from both contaminated and uncontaminated formulations. Direct compression imparted a higher plasticity on the materials than the wet granulation method. Tablets produced by wet granulation presented a higher crushing strength than those produced by the direct compression method. Significantly higher microbial survival (p<0.05) was observed in formulations prepared by direct compression. The percentage survival of Bacillus subtilis spores decreased with increasing binder concentration. The study showed that Albizia gum is capable of imparting greater plasticity to the materials and showed the greater reduction in…

  20. Long-term Survival and Clinical Benefit from Adoptive T-cell Transfer in Stage IV Melanoma Patients Is Determined by a Four-Parameter Tumor Immune Signature.

    Science.gov (United States)

    Melief, Sara M; Visconti, Valeria V; Visser, Marten; van Diepen, Merel; Kapiteijn, Ellen H W; van den Berg, Joost H; Haanen, John B A G; Smit, Vincent T H B M; Oosting, Jan; van der Burg, Sjoerd H; Verdegaal, Els M E

    2017-02-01

    The presence of tumor-infiltrating immune cells is associated with longer survival and a better response to immunotherapy in early-stage melanoma, but a comprehensive study of the in situ immune microenvironment in stage IV melanoma has not been performed. We investigated the combined influence of a series of immune factors on survival and response to adoptive cell transfer (ACT) in stage IV melanoma patients. Metastases of 73 stage IV melanoma patients, 17 of which were treated with ACT, were studied with respect to the number and functional phenotype of lymphocytes and myeloid cells as well as for expression of galectins-1, -3, and -9. Single factors associated with better survival were identified using Kaplan-Meier curves and multivariate Cox regression analyses, and those factors were used for interaction analyses. The results were validated using The Cancer Genome Atlas database. We identified four parameters that were associated with a better survival: CD8(+) T cells, galectin-9(+) dendritic cells (DC)/DC-like macrophages, a high M1/M2 macrophage ratio, and the expression of galectin-3 by tumor cells. The presence of at least three of these parameters formed an independent positive prognostic factor for long-term survival. Patients displaying this four-parameter signature were found exclusively among patients responding to ACT and were the ones with sustained clinical benefit. Cancer Immunol Res; 5(2); 170-9. ©2017 AACR.

  1. Effects of Garlic (Allium sativum) and chloramphenicol on growth performance, physiological parameters and survival of Nile tilapia (Oreochromis niloticus)

    Directory of Open Access Journals (Sweden)

    A. M. Shalaby

    2006-04-01

    We studied and compared the effects of the antibiotic chloramphenicol and garlic (Allium sativum), used as immunostimulants and growth promoters, on some physiological parameters, growth performance, survival rate, and bacteriological characteristics of Nile tilapia (Oreochromis niloticus). Fish (7±1 g/fish) were assigned to eight treatments, with three replicates each. Treatment groups had different levels of Allium sativum (10, 20, 30, and 40 g/kg diet) and chloramphenicol (15, 30, and 45 mg/kg diet) added to their diets; the control group diet was free from garlic and antibiotic. Diets also contained 32% crude protein (CP) and were administered at a rate of 3% live body weight twice daily for 90 days. Results showed that the final weight and specific growth rate (SGR) of O. niloticus increased significantly with increasing levels of Allium sativum and chloramphenicol. The highest growth performance was verified with 30 g Allium sativum/kg diet and 30 mg chloramphenicol/kg diet. The lowest feed conversion ratio (FCR) was observed with 30 g Allium sativum/kg diet and 30 mg chloramphenicol/kg diet. There were significant differences in the protein efficiency ratio (PER) with all treatments, except with 45 mg chloramphenicol/kg diet. No changes in the hepatosomatic index and survival rate were observed. Crude protein content in whole fish increased significantly in the group fed on 30 g Allium sativum/kg diet, while total lipids decreased significantly in the same group. Ash of whole fish showed significantly high values with 30 g Allium sativum and 15 mg chloramphenicol/kg diet, while the lowest value was observed in the control group. Blood parameters, erythrocyte count (RBC), and hemoglobin content in fish fed on diets containing 40 g Allium sativum and all levels of chloramphenicol were significantly higher than in the control. Significantly higher hematocrit values were seen with 30 and 45 mg chloramphenicol/kg diet. There were no significant differences…

  2. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  3. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  4. Two-step probability plot for parameter estimation of lifetime distribution affected by defect clustering in time-dependent dielectric breakdown

    Science.gov (United States)

    Yokogawa, Shinji

    2017-07-01

    In this study, a simple method of statistical parameter estimation is proposed for lifetime distribution that has three parameters due to the defect clustering in the middle-of-line and back-end-of-line. A two-step procedure provides the estimations of distribution parameters effectively for the time-dependent dielectric breakdown. In the first step, a clustering parameter of distribution, which is one of the shape parameters, is estimated by a linearization treatment of plotted data on the proposed chart. Then, in the second step, shape and scale parameters are estimated by calculating of a slope and an intercept, respectively. The statistical accuracy of the estimates is evaluated using the Monte-Carlo simulation technique and mean squared error of estimates.
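
    The record does not reproduce the paper's exact distribution, but a commonly used three-parameter clustered-Weibull form (assumed here for illustration), with shape m, scale η and clustering parameter α, shows why a two-step linearization works:

    ```latex
    % Assumed clustered-Weibull lifetime CDF (negative-binomial defect clustering)
    F(t) = 1 - \left(1 + \frac{(t/\eta)^{m}}{\alpha}\right)^{-\alpha}

    % Step 1: choose \alpha so the transformed data fall on a straight line;
    % Step 2: the slope gives the shape m and the intercept gives -m \ln\eta:
    \ln\!\left[\alpha\left((1 - F)^{-1/\alpha} - 1\right)\right] = m \ln t - m \ln \eta
    ```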

  5. p53/Survivin Ratio as a Parameter for Chemotherapy Induction Response in Children with Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Rinaldi Lenggana

    2016-11-01

    Acute myeloid leukemia (AML) is a malignancy that is often found in children. Failure of apoptosis, or programmed cell death, one of the most important regulatory mechanisms of cellular homeostasis, is closely linked to the development of cancer, and the regulation of the apoptotic (p53) and anti-apoptotic (survivin) proteins influences treatment outcome. One role of p53 is to monitor the cellular stress necessary to induce apoptosis. Survivin (BIRC5) belongs to the inhibitor-of-apoptosis group of proteins and works by inhibiting caspase-3; its role is considered very important in oncogenesis, proliferation and the regulation of cell growth. Chemotherapy in childhood AML can inhibit cell growth and induce slowing or arrest of the cell cycle. The aim of this study was therefore to compare p53 and survivin before and after induction chemotherapy in children with AML, and to determine the p53/survivin ratio. Peripheral blood mononuclear cells were collected from children with AML before treatment and three months after starting induction therapy. p53 and survivin were measured by flow cytometry using monoclonal antibodies. Data were analyzed by t-test for comparison between groups and Spearman's test to determine the correlation between variables, with significance at p < 0.05. A total of 8 children were evaluated. The intensity of p53 expression was not significantly increased after induction-phase chemotherapy (p = 0.224), but survivin expression and the p53/survivin ratio were significantly increased in the treatment group compared with the levels prior to chemotherapy (p = 0.002 and p = 0.034, respectively), and there was a strong negative correlation between p53 and survivin after chemotherapy (r = −0.63, p = 0.049).

  6. COUNTRY-LEVEL SOCIOECONOMIC INDICATORS ASSOCIATED WITH SURVIVAL PROBABILITY OF BECOMING A CENTENARIAN AMONG OLDER EUROPEAN ADULTS: GENDER INEQUALITY, MALE LABOUR FORCE PARTICIPATION AND PROPORTIONS OF WOMEN IN PARLIAMENTS.

    Science.gov (United States)

    Kim, Jong In; Kim, Gukbin

    2017-03-01

    This study confirms an association between the survival probability of becoming a centenarian (SPBC) for those aged 65 to 69 and country-level socioeconomic indicators in Europe: the gender inequality index (GII), male labour force participation (MLP) rates and the proportion of seats held by women in national parliaments (PWP). The analysis was based on SPBC data from 34 countries obtained from the United Nations (UN). Country-level socioeconomic indicator data were obtained from the UN and World Bank databases. The associations between socioeconomic indicators and SPBC were assessed using correlation coefficients and multivariate regression models. The findings show significant correlations between the SPBC for women and men aged 65 to 69 and country-level socioeconomic indicators: GII (r=-0.674, p=0.001), MLP (r=0.514, p=0.002) and PWP (r=0.498, p=0.003). The SPBC predictors for women and men were lower GII and higher MLP and PWP (R²=0.508, p=0.001). Country-level socioeconomic indicators appear to have an important effect on the probability of becoming a centenarian among European adults aged 65 to 69. Country-level gender equality policies in European countries may decrease the risk of unhealthy old age and increase longevity in elders through greater national gender equality; disparities in GII and other country-level socioeconomic indicators impact longevity probability. National longevity strategies should target country-level gender inequality.

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  8. Clinicopathological characteristics of Barrett's carcinoma, cardia carcinoma type II and distal gastric carcinoma: Influence of observed parameters on the five-year postoperative survival of patients

    Directory of Open Access Journals (Sweden)

    Jovanović Ivan

    2009-01-01

    Introduction. In the past two decades, an increased frequency of adenocarcinoma of the distal esophagus, esophagogastric junction and proximal stomach has been observed. The vast majority of these tumours are diagnosed in advanced stages, when the prognosis is poorer than in other gastric cancers. Objective. The aim of our study was to analyze the demographic and clinicopathological characteristics of patients operated on for Barrett's, cardia and distal gastric adenocarcinomas, to study the influence of each cancer's presentation on the studied clinicopathological parameters, and to analyze the 5-year survival rate of patients surgically treated for cardia adenocarcinoma in relation to patients operated on for distal gastric adenocarcinoma. Methods. We analyzed gender and age, tumour type, depth of tumour invasion, and involvement of blood and lymph vessels in 66 patients surgically treated at the Centre for Oesophageal Surgery of the Institute for Digestive Diseases of the Belgrade Clinical Centre. Results. Except for significant differences in the depth of tumour invasion at surgery, there were no other statistically significant differences between the studied groups of patients. In the patients operated on for Barrett's and cardia cancers, the tumours invaded the wall layers more deeply, i.e. they were significantly more invasive than the distal gastric tumours. Lymph node involvement was present in 87.5% of patients with Barrett's cancer, in 80% with cardia cancer and in 87% with distal gastric cancer. The 3-year survival rate of patients operated on for cardia cancer was 47.4% and the 5-year survival rate was 31.6%, while the 3-year survival rate of patients operated on for distal gastric cancer was 46.2% and the 5-year survival rate was 34.6%. These differences were not statistically significant (Wilcoxon 0.036; p=0.85). Singly, the patients' gender, cancer type and the degree of tumour…

  9. Optimizing the parameters of the Lyman-Kutcher-Burman, Källman, and Logit+EUD models for the rectum - a comparison between normal tissue complication probability and clinical data

    Science.gov (United States)

    Trojková, Darina; Judas, Libor; Trojek, Tomáš

    2014-11-01

    Minimizing late rectal toxicity in prostate cancer patients is a very important and widely discussed topic. Normal tissue complication probability (NTCP) models can be used to evaluate competing treatment plans. In our work, the parameters of the Lyman-Kutcher-Burman (LKB), Källman, and Logit+EUD models were optimized by minimizing the Brier score for a group of 302 prostate cancer patients. The NTCP values were calculated and compared with the values obtained using previously published parameter values. χ² statistics were calculated as a check of the goodness of optimization.
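
    For reference, the standard LKB formulation whose three parameters (n, m, TD50) are optimized here can be written with v_i and D_i denoting the dose-volume histogram bins and Φ the standard normal CDF:

    ```latex
    \mathrm{NTCP} = \Phi\!\left(\frac{\mathrm{gEUD} - TD_{50}}{m\,TD_{50}}\right),
    \qquad
    \mathrm{gEUD} = \left(\sum_i v_i\, D_i^{1/n}\right)^{n}
    ```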

  10. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  11. Perfusion Parameters on Breast Dynamic Contrast-Enhanced MRI Are Associated With Disease-Specific Survival in Patients With Triple-Negative Breast Cancer.

    Science.gov (United States)

    Park, Vivian Youngjean; Kim, Eun-Kyung; Kim, Min Jung; Yoon, Jung Hyun; Moon, Hee Jung

    2017-03-01

    The aim of this study was to investigate the association between perfusion parameters on MRI performed before treatment and survival outcome (disease-free survival [DFS], disease-specific survival [DSS]) in patients with triple-negative breast cancer (TNBC). Sixty-one patients (median age, 50 years; age range, 27-77 years) with TNBC (tumor size on MRI: median, 25.5 mm; range, 11.0-142.0 mm) were included. We analyzed clinical and pathologic variables and MRI parameters. Cox proportional hazards models were used to determine associations with survival outcome. The median follow-up time was 46.1 months (range, 13.9-58.4 months). Eleven of 61 (18.0%) patients had events (i.e., local, regional, or distant recurrence or contralateral breast cancer) and seven (11.5%) died of breast cancer. Among the pretreatment variables, a larger tumor size on MR images (hazard ratio [HR] = 1.024, p = 0.003) was associated with worse DFS at univariate analysis. In multivariate pretreatment models for DSS, a higher fractional volume of extravascular extracellular space per unit volume of tissue (ve) value (HR = 1.658, p = 0.038), higher peak enhancement (HR = 1.843, p = 0.018), and a larger tumor size on MR images (HR = 1.060, p = 0.001) were associated with worse DSS. In multivariate posttreatment models, a larger pathologic tumor size (HR for DFS, 1.074 [p = 0.005]; HR for DSS, 1.050 [p = 0.042]) and metastasis in surgically resected axillary lymph nodes (HR for DFS, 5.789 [p = 0.017]; HR for DSS, 23.717 [p = 0.005]) were associated with worse survival outcome. A higher ve value, higher peak enhancement, and larger tumor size of the primary tumor on pretreatment MRI were independent predictors of worse DSS in patients with TNBC.
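
    A hedged sketch of this kind of multivariate Cox analysis, written with the lifelines library on synthetic data (every variable name and value below is an illustrative assumption, not the study's data):

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 200

    # Synthetic stand-ins for the pretreatment MRI covariates
    ve   = rng.uniform(0.1, 0.8, n)    # extravascular extracellular volume fraction
    peak = rng.uniform(0.5, 2.5, n)    # peak enhancement
    size = rng.uniform(10.0, 80.0, n)  # tumour size on MRI, mm

    # Exponential survival times whose hazard rises with each covariate
    hazard = 0.01 * np.exp(1.0 * ve + 0.5 * peak + 0.02 * size)
    time = rng.exponential(1.0 / hazard)
    event = (time < 60.0).astype(int)      # administrative censoring at 60 months
    time = np.minimum(time, 60.0)

    df = pd.DataFrame({"months": time, "died": event,
                       "ve": ve, "peak": peak, "size_mm": size})

    cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
    cph.print_summary()  # hazard ratios > 1 indicate worse survival
    ```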

  12. Influence of Deceased Donor and Pretransplant Recipient Parameters on Early Overall Kidney Graft-Survival in Germany

    Directory of Open Access Journals (Sweden)

    Carl-Ludwig Fischer-Fröhlich

    2015-01-01

    Full Text Available Background. Scarcity of grafts for kidney transplantation (KTX caused an increased consideration of deceased donors with substantial risk factors. There is no agreement on which ones are detrimental for overall graft-survival. Therefore, we investigated in a nationwide multicentre study the impact of donor and recipient related risks known before KTX on graft-survival based on the original data used for allocation and graft acceptance. Methods. A nationwide deidentified multicenter study-database was created of data concerning kidneys donated and transplanted in Germany between 2006 and 2008 as provided by the national organ procurement organization (Deutsche Stiftung Organtransplantation and BQS Institute. Multiple Cox regression (significance level 5%, hazard ratio [95% CI] was conducted (n=4411, isolated KTX. Results. Risk factors associated with graft-survival were donor age (1.020 [1.013–1.027] per year, donor size (0.985 [0.977–0.993] per cm, donor’s creatinine at admission (1.002 [1.001–1.004] per µmol/L, donor treatment with catecholamine (0.757 [0.635–0.901], and reduced graft-quality at procurement (1.549 [1.217–1.973], as well as recipient age (1.012 [1.003–1.021] per year, actual panel reactive antibodies (1.007 [1.002–1.011] per percent, retransplantation (1.850 [1.484–2.306], recipient’s cardiovascular comorbidity (1.436 [1.212–1.701], and use of IL2-receptor antibodies for induction (0.741 [0.619–0.887]. Conclusion. Some donor characteristics persist to impact graft-survival (e.g., age while the effect of others could be mitigated by elaborate donor-recipient match and care.

  13. Impact of immune parameters on long-term survival in metastatic renal cell carcinoma

    DEFF Research Database (Denmark)

    Donskov, Frede; Maase, Hans von der

    2006-01-01

    …with estimated 5-year survival rates of 60%, 25%, and 0%, respectively. These findings were apparent both in our own prognostic model and in an extended Memorial Sloan-Kettering Cancer Center (New York, NY) prognostic model. CONCLUSION: This study points to five clinical and three…

  14. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  15. Central Versus Peripheral Pulmonary Embolism: Analysis of the Impact on the Physiological Parameters and Long-term Survival

    Science.gov (United States)

    Alonso Martinez, José Luis; Anniccherico Sánchez, Francisco Javier; Urbieta Echezarreta, Miren Aranzazu; García, Ione Villar; Álvaro, Jorge Rojo

    2016-01-01

    Background: Studies aimed at assessing whether emboli lodged in the central pulmonary arteries carry a worse prognosis than more peripheral emboli have yielded controversial results. Aims: To explore the impact of central pulmonary embolism on survival and long-term prognosis. Patients and Methods: Consecutive patients diagnosed with acute symptomatic pulmonary embolism by means of computed tomography (CT) angiography were evaluated at the index episode and traced through the computed system of clinical recording and follow-up. Central pulmonary embolism was diagnosed when thrombi were seen in the trunk or in the main pulmonary arteries, and peripheral pulmonary embolism when segmental or subsegmental arteries were affected. Results: A total of 530 consecutive patients diagnosed with pulmonary embolism were evaluated; 255 patients had central pulmonary embolism and 275 patients had segmental or subsegmental pulmonary embolism. Patients with central pulmonary embolism were older and had higher plasma levels of N-terminal prohormone of brain natriuretic peptide (NT-proBNP), troponin I, D-dimer, alveolar-arterial gradient, and shock index (P < .05). Patients with central pulmonary embolism had an all-cause mortality of 40%, while patients with segmental or subsegmental pulmonary embolism (PE) had an overall mortality of 27%, odds ratio 1.81 [confidence interval (CI) 95% 1.16-1.9]. Survival was lower in patients with central PE than in patients with segmental or subsegmental pulmonary embolism, even after adjusting for confounders (P = .018). Conclusions: Apart from a greater impact on hemodynamics, gas exchange, and right ventricular dysfunction, central pulmonary embolism is associated with shorter survival and increased long-term mortality. PMID:27114970

  16. Association analysis of insulin-like growth factor-1 axis parameters with survival and functional status in nonagenarians of the Leiden Longevity Study

    DEFF Research Database (Denmark)

    van der Spoel, Evie; Rozing, Maarten P; Houwing-Duistermaat, Jeanine J

    2015-01-01

    Reduced insulin/insulin-like growth factor 1 (IGF-1) signaling has been associated with longevity in various model organisms. However, the role of insulin/IGF-1 signaling in human survival remains controversial. The aim of this study was to test whether circulating IGF-1 axis parameters associate with old age survival and functional status in nonagenarians from the Leiden Longevity Study. This study examined 858 Dutch nonagenarian (males ≥89 years; females ≥91 years) siblings from 409 families, without selection on health or demographic characteristics. Nonagenarians were divided over sex… …91) compared to the quartile with the highest ratio (ptrend=0.002). Functional status was assessed by (Instrumental) Activities of Daily Living ((I)ADL) scales. Compared to those in the quartile with the highest IGF-1/IGFBP3 ratio, nonagenarians in the lowest quartile had higher scores for ADL (ptrend=0…

  17. Discriminant Analysis of 18F-Fluoro-Thymidine Kinetic Parameters to Predict Survival in Patients with Recurrent High-Grade Glioma

    Science.gov (United States)

    Wardak, Mirwais; Schiepers, Christiaan; Dahlbom, Magnus; Cloughesy, Timothy; Chen, Wei; Satyamurthy, Nagichettiar; Czernin, Johannes; Phelps, Michael E.; Huang, Sung-Cheng

    2011-01-01

    Purpose: The primary objective of this study was to investigate if changes in 18F-FLT kinetic parameters, taken at an early stage after the start of therapy, could predict overall survival (OS) and progression-free survival (PFS) in patients with recurrent malignant glioma undergoing treatment with bevacizumab and irinotecan. Experimental Design: High-grade recurrent brain tumors were investigated in 18 patients (8M, 10F), aged 26-76 yr. Each had 3 dynamic PET studies: at baseline, after 2 weeks, and after 6 weeks from the start of treatment. 2.0 MBq/kg of 18F-FLT was injected intravenously and dynamic PET images were acquired for 1 hr. Factor analysis generated factor images from which blood and tumor uptake curves were derived. A 3-compartment, 2-tissue model was applied to estimate the tumor 18F-FLT kinetic rate constants using a metabolite- and partial-volume-corrected input function. Different combinations of predictor variables were exhaustively searched in a discriminant function to accurately classify patients into their known OS and PFS groups. A leave-one-out cross-validation technique was used to assess the generalizability of the model predictions. Results: In this study population, changes in single parameters such as standardized uptake value or influx rate constant did not accurately classify patients into their respective OS groups. Discriminant analysis using changes in 18F-FLT kinetic parameters early during treatment appears to be a powerful method for evaluating the efficacy of therapeutic regimens. PMID:21868765

  18. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds… …probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures…

  19. Change in volume parameters induced by neoadjuvant chemotherapy provide accurate prediction of overall survival after resection in patients with oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Tamandl, Dietmar; Fueger, Barbara; Kinsperger, Patrick; Haug, Alexander; Ba-Ssalamah, Ahmed [Medical University of Vienna, Department of Biomedical Imaging and Image-Guided Therapy, Comprehensive Cancer Center GET-Unit, Vienna (Austria); Gore, Richard M. [University of Chicago Pritzker School of Medicine, Department of Radiology, Chicago, IL (United States); Hejna, Michael [Medical University of Vienna, Department of Internal Medicine, Division of Medical Oncology, Comprehensive Cancer Center GET-Unit, Vienna (Austria); Paireder, Matthias; Schoppmann, Sebastian F. [Medical University of Vienna, Department of Surgery, Upper-GI-Service, Comprehensive Cancer Center GET-Unit, Vienna (Austria)

    2016-02-15

    To assess the prognostic value of volumetric parameters measured with CT and PET/CT in patients with neoadjuvant chemotherapy (NACT) and resection for oesophageal cancer (EC). Patients with locally advanced EC, who were treated with NACT and resection, were retrospectively analysed. Data from CT volumetry and 18F-FDG PET/CT (maximum standardized uptake value [SUVmax], metabolic tumour volume [MTV], and total lesion glycolysis [TLG]) were recorded before and after NACT. The impact of volumetric parameter changes induced by NACT (MTVratio, TLGratio, etc.) on overall survival (OS) was assessed using a Cox proportional hazards model. Eighty-four patients were assessed using CT volumetry; of those, 50 also had PET/CT before and after NACT. Low post-treatment CT volume and thickness, MTV, TLG, and SUVmax were all associated with longer OS (p < 0.05), as were CTthicknessratio, MTVratio, TLGratio, and SUVmaxratio (p < 0.05). In the multivariate analysis, only MTVratio (hazard ratio, HR 2.52 [95% confidence interval, CI 1.33-4.78], p = 0.005), TLGratio (HR 3.89 [95% CI 1.46-10.34], p = 0.006), and surgical margin status (p < 0.05) were independent predictors of OS. MTVratio and TLGratio are independent prognostic factors for survival in patients after NACT and resection for EC.

  20. Pretreatment F-18 FDG PET/CT Parameters to Evaluate Progression-Free Survival in Gastric Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeonghun; Lim, Seok Tae; Na, Chang Ju; Han, Yeonhee; Kim, Chanyoung; Jeong, Hwanjeong; Sohn, Myunghee [Chonbuk National Univ., Jeonju (Korea, Republic of)

    2014-03-15

    We performed this study to evaluate the predictive value of pretreatment F-18 FDG PET/CT for progression-free survival (PFS) in patients with gastric cancer. Of 321 patients with a diagnosis of gastric cancer, we retrospectively enrolled 97 patients (men:women = 61:36, age 59.8±13.2 years) who underwent pretreatment F-18 fluoro-2-deoxyglucose positron emission tomography/computed tomography (F-18 FDG PET/CT) from January 2009 to December 2009. The maximum standardized uptake value (SUVmax) was measured for each case with detectable primary lesions. In the remaining non-detectable cases, SUVmax was measured from the corresponding site seen on gastroduodenoscopy. In subgroup analysis, metabolic tumor volume (MTV) was measured in 50 patients with clearly distinguishable primary lesions. SUVmax, stage, depth of tumor invasion and presence of lymph node metastasis were analyzed in terms of PFS. Receiver operating characteristic (ROC) curves were used to find optimal cutoff values of SUVmax and MTV for disease progression. The relationship between SUVmax, MTV and PFS was analyzed using the Kaplan-Meier method with log-rank test and Cox's proportional hazards regression. Of 97 patients, 15 (15.5%) had disease progression. The mean follow-up duration was 29.6±10.2 months. The mean PFS of the low SUVmax group (≤5.74) was significantly longer than that of the high SUVmax group (>5.74) (30.9±8.0 vs 24.3±13.6 months, p=0.008). In univariate analysis, stage (I vs II, III, IV), depth of tumor invasion (T1 vs T2, T3, T4), presence of lymph node metastasis and SUVmax (>5.74 vs ≤5.74) were significantly associated with recurrence. In multivariate analysis, high SUVmax (>5.74) was the only poor prognostic factor for PFS (p=0.002, HR 11.03, 95% CI 2.48-49.05). Subgroup multivariate analysis revealed that high MTV (>16.42) was the only poor prognostic factor for PFS (p=0.034, HR 3.59, 95% CI 1.10-11.71). In gastric cancer, SUVmax measured by pretreatment F-18 FDG PET/CT…
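
    The Kaplan-Meier/log-rank machinery used here is compact to reproduce with the lifelines library; the follow-up times below are invented placeholders split at the reported SUVmax cutoff of 5.74:

    ```python
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical (months, progression-event) follow-up data per SUVmax group
    low  = pd.DataFrame({"t": [36, 40, 29, 33, 45], "e": [0, 0, 1, 0, 0]})
    high = pd.DataFrame({"t": [12, 25, 30, 18, 28], "e": [1, 1, 0, 1, 0]})

    km_low  = KaplanMeierFitter().fit(low["t"],  low["e"],  label="SUVmax <= 5.74")
    km_high = KaplanMeierFitter().fit(high["t"], high["e"], label="SUVmax > 5.74")

    result = logrank_test(low["t"], high["t"],
                          event_observed_A=low["e"], event_observed_B=high["e"])
    print(result.p_value)  # analogous to the p=0.008 group comparison above
    ```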

  1. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    Directory of Open Access Journals (Sweden)

    Peter Caley

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes.
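
    For contrast with the Bayesian models developed in the paper, the classical constant-sighting-rate baseline (Solow, 1993) reduces to a one-line formula for the probability of a sighting gap this long given persistence; the sighting times below are hypothetical:

    ```python
    def solow_pvalue(sighting_times, T):
        """P(no sightings in (t_n, T] | species extant), assuming a
        stationary Poisson sighting process (Solow 1993)."""
        t_n, n = max(sighting_times), len(sighting_times)
        return (t_n / T) ** n

    # Hypothetical record: sightings in years 1, 3, 4 and 7 of a
    # 15-year observation window
    print(solow_pvalue([1, 3, 4, 7], T=15))  # ~0.047
    ```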

  2. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p…

  3. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i…

  4. A new formalism for modelling parameters α and β of the linear-quadratic model of cell survival for hadron therapy

    Science.gov (United States)

    Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe

    2017-10-01

    We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E1 and E2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
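
    For reference, the linear-quadratic surviving fraction that the formalism parameterizes is:

    ```latex
    % Surviving fraction after a single absorbed dose D
    S(D) = e^{-\alpha D - \beta D^{2}}
    ```

    The properties discussed above then concern the beam-quality dependence of the coefficients: α(LET) should exhibit a single maximum, and α/β should increase monotonically with LET.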

  5. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  6. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  7. On the probability of cure for heavy-ion radiotherapy.

    Science.gov (United States)

    Hanin, Leonid; Zaider, Marco

    2014-07-21

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, both in individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells, as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure, for a given population of patients treated with newer, still unexplored treatment modalities, from the empirically determined probability of a cure for the same or a similar population treated with conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
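
    A minimal sketch of the extinction view of cure, under the common assumption of a Poisson-distributed initial clonogen number (this specific form is an assumption, not necessarily the authors' exact expression):

    ```latex
    % Tumour control probability with mean initial clonogen number N
    % and per-cell survival probability S(D) after dose D
    \mathrm{TCP} = \exp\bigl(-N\,S(D)\bigr)
    ```

    An upper bound on N then translates directly into a lower bound on TCP for a given S(D), which is the logic behind the bounds described above.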

  8. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  9. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertainty…

  10. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram is expressed in terms of the

  11. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  12. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    …, since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different…

  13. Methods for investigating parameter redundancy

    Directory of Open Access Journals (Sweden)

    Gimenez, O.

    2004-06-01

    The quantitative study of marked individuals relies mainly on the use of meaningful biological models. Classical inference is then conducted based on the model likelihood, parameterized by parameters such as survival, recovery, transition and recapture probabilities. In classical statistics, we seek parameter estimates by maximising the likelihood. However, models are often overparameterized and, as a consequence, some parameters cannot be estimated separately. Identifying how many and which (functions of) parameters are estimable is thus crucial not only for proper model selection based upon likelihood ratio tests or information criteria, but also for the interpretation of the estimates obtained. In this paper, we provide the reader with a description of the tools available to check for parameter redundancy. We aim to assist people in choosing the most appropriate method to solve their own specific problems.
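
    One symbolic approach of this kind (in the spirit of the Catchpole-Morgan derivative-matrix method) can be sketched in Python with sympy for a 3-occasion Cormack-Jolly-Seber model, in which only phi1, p2 and the product phi2*p3 are separately estimable:

    ```python
    import sympy as sp

    phi1, phi2, p2, p3 = sp.symbols('phi1 phi2 p2 p3', positive=True)

    # Multinomial cell probabilities for the capture histories of animals
    # released at occasion 1 in a 3-occasion CJS model
    cells = sp.Matrix([
        phi1 * p2 * phi2 * p3,        # history 111
        phi1 * p2 * (1 - phi2 * p3),  # history 110
        phi1 * (1 - p2) * phi2 * p3,  # history 101
    ])

    # Rank of the derivative matrix = number of estimable parameter combinations
    D = cells.jacobian([phi1, phi2, p2, p3])
    print(D.rank())  # 3 < 4, so the model is parameter-redundant
    ```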

  14. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space evolved by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are developed, so that the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
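
    Under the stated Gaussian assumption, the one-sided probability of exceeding a displacement criterion is a single normal tail. A minimal Python sketch with assumed numbers:

    ```python
    from scipy.stats import norm

    sigma = 0.4e-6   # RMS relative displacement of the isolated mass, m (assumed)
    x_crit = 1.0e-6  # vibration criterion, m (assumed)

    # Probability that a zero-mean Gaussian response exceeds the criterion
    p_exceed = norm.sf(x_crit / sigma)
    print(p_exceed)  # compare against the 0.04 target mentioned above
    ```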

  15. Serum autotaxin is a parameter for the severity of liver cirrhosis and overall survival in patients with liver cirrhosis--a prospective cohort study.

    Directory of Open Access Journals (Sweden)

    Thomas Pleli

    Autotaxin (ATX) and its product lysophosphatidic acid (LPA) are considered to be involved in the development of liver fibrosis, and elevated levels of serum ATX have been found in patients with hepatitis C virus associated liver fibrosis. However, the clinical role of systemic ATX in the stages of liver cirrhosis was unknown. Here we investigated the relation of ATX serum levels to the severity of cirrhosis as well as the prognosis of cirrhotic patients. Patients with liver cirrhosis were prospectively enrolled and followed until death, liver transplantation or last contact. Blood samples drawn at the day of inclusion in the study were assessed for ATX content by an enzyme-linked immunosorbent assay. ATX levels were correlated with the stage as well as complications of cirrhosis. The prognostic value of ATX was investigated by uni- and multivariate Cox regression analyses. LPA concentration was determined by liquid chromatography-tandem mass spectrometry. 270 patients were enrolled. Subjects with liver cirrhosis showed elevated serum levels of ATX as compared to healthy subjects (0.814±0.42 mg/l vs. 0.258±0.40 mg/l, P<0.001). Serum ATX levels correlated with the Child-Pugh stage, the MELD (model of end-stage liver disease) score and LPA levels (r = 0.493, P = 0.027). Patients with hepatic encephalopathy (P = 0.006), esophageal varices (P = 0.002) and portal hypertensive gastropathy (P = 0.008) had higher ATX levels than patients without these complications. Low ATX levels were a parameter independently associated with longer overall survival (hazard ratio 0.575, 95% confidence interval 0.365-0.905, P = 0.017). Serum ATX is an indicator of the severity of liver disease and the prognosis of cirrhotic patients.

  16. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  17. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...

  18. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio between 50 and the number of years in each epoch. The frequency has been referred to level VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters described above is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation and have drawn Map No. 1, in percentages. The epicenters with intensity from X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against periods computed from the first to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the contingent seismic phenomenon, and it improves on conventional maps showing the equal-intensity curves corresponding to the maximal values at a given site.

  19. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  20. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  2. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  3. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  5. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  6. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  7. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.
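
    For orientation, boundary-crossing problems of this kind are classically governed by a Volterra integral equation of the first kind; one standard form is Fortet's equation, reproduced below in LaTeX as an illustration (a textbook form, not necessarily the authors' exact notation):

        P\bigl(X_t \ge b(t)\bigr) = \int_0^t g(s)\, P\bigl(X_t \ge b(t) \mid X_s = b(s)\bigr)\, ds

    Here g is the first-passage (boundary-crossing) density, and the probability of crossing the boundary within [0, T] is \int_0^T g(s)\, ds.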

  8. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  9. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  10. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  11. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  12. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 pp.), corresponding to this lecture, is written in French and is provided in the proceedings of the book SOS 2008.

  13. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  14. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of various...

  15. Genetic parameters of natural antibody isotypes and survival analysis in beak-trimmed and non-beak-trimmed crossbred laying hens

    NARCIS (Netherlands)

    Sun, Y.; Ellen, E.D.; Parmentier, H.K.; Poel, van der J.J.

    2013-01-01

    Natural antibodies (NAb) are important humoral components of innate immunity. As the first line of defense, NAb provide protection against infection and support adaptive immunity. An earlier study indicated that serum levels of NAb isotypes IgM and IgG at a young age were predictive for survival in

  16. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  17. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean Geometry or Peano Arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.

  18. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  19. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  20. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  1. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  2. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  3. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  4. Feeding ω-3 PUFA enriched rotifers to Galaxias maculatus (Jenyns, 1842 larvae reared at different salinity conditions: effects on growth parameters, survival and fatty acids profile

    Directory of Open Access Journals (Sweden)

    Patricio Dantagnan

    2013-07-01

    Full Text Available Despite the well known importance of ω-3 polyunsaturated fatty acids (PUFA) in marine and freshwater fish larvae, there are few studies on how essential fatty acid requirements and whole-body composition can be altered by changes in water salinity. The present study aimed to determine the effect of salinity on ω-3 PUFA requirements, larval growth, survival and fatty acid composition of Galaxias maculatus larvae cultured at two different salinities (0 and 15 g L-1) for 20 days while fed rotifers containing two different levels of ω-3 PUFA (1.87 and 3.16%). The results denoted a marked difference in ω-3 PUFA requirements and in the pattern of fatty acid deposition in the whole body of larvae reared at different salinities, depending on the ω-3 PUFA level in the diet. Thus, to improve growth and survival, larvae of G. maculatus reared at 0 g L-1 require higher levels of ω-3 PUFA, principally 18:3 ω-3. Larvae reared at a salinity of 15 g L-1 require low levels of ω-3 PUFA for optimal survival, especially 18:3 ω-3. Eicosapentaenoic acid and docosahexaenoic acid content in the whole body of larvae was also affected by water salinity.

  5. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  6. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  7. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  9. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  10. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  11. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  12. Slow and continuous delivery of a low dose of nimodipine improves survival and electrocardiogram parameters in rescue therapy of mice with experimental cerebral malaria.

    Science.gov (United States)

    Martins, Yuri C; Clemmer, Leah; Orjuela-Sánchez, Pamela; Zanini, Graziela M; Ong, Peng Kai; Frangos, John A; Carvalho, Leonardo J M

    2013-04-24

    Human cerebral malaria (HCM) is a life-threatening complication caused by Plasmodium falciparum infection that continues to be a major global health problem despite optimal anti-malarial treatment. In the experimental model of cerebral malaria (ECM) by Plasmodium berghei ANKA, bolus administration of nimodipine at high doses together with artemether increases survival of mice with ECM. However, the dose and administration route used are associated with cardiovascular side effects such as hypotension and bradycardia in humans and mice, which could preclude its potential use as adjunctive treatment in HCM. In the present study, alternative delivery systems for nimodipine during late-stage ECM in association with artesunate were investigated to define optimal protocols that achieve maximum efficacy in increasing survival in rescue therapy while causing the fewest cardiac side effects. The baseline electrocardiogram (ECG) and arterial pressure characteristics of uninfected control animals and of mice with ECM, and their response to rescue treatment with artesunate with or without nimodipine, are also analysed. Nimodipine, given at 0.5 mg/kg/day via a slow and continuous delivery system by osmotic pumps, increases survival of mice with ECM when used as adjunctive treatment to artesunate. Mice with ECM showed hypotension and ECG changes, including bradycardia and increases in PR, QRS, QTc and ST interval duration. ECM mice also showed increased QTc dispersion, heart rate variability (HRV), RMSSD, and low frequency (LF) and high frequency (HF) bands of the power spectrum. Both sympathetic and parasympathetic inputs to the heart were increased, but there was a predominance of sympathetic tone as demonstrated by an increased LF/HF ratio. Nimodipine potentiated bradycardia when given by bolus injection, but not when given via osmotic pumps. In addition, nimodipine shortened PR duration and improved HRV, RMSSD, and LF and HF powers in mice with ECM. In addition, nimodipine did not increase

  13. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
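
    A minimal simulation sketch of the model's key ingredient - correct-classification probabilities that vary across sampling units on the logit scale (all numbers are hypothetical; this is not the authors' code):

        import numpy as np
        from scipy.special import expit  # inverse-logit

        rng = np.random.default_rng(1)
        n_units, n_per_unit = 50, 30
        mu, sigma = 1.5, 0.7  # logit-scale mean and sd (hypothetical values)

        # Each sampling unit gets its own correct-classification probability.
        p_correct = expit(rng.normal(mu, sigma, size=n_units))

        # Number of correctly classified items per unit.
        correct = rng.binomial(n_per_unit, p_correct)
        print(p_correct[:3], correct[:3])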

  14. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  15. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  16. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  17. Fetal lung growth represented by longitudinal changes in MRI-derived fetal lung volume parameters predicts survival in isolated left-sided congenital diaphragmatic hernia.

    Science.gov (United States)

    Coleman, Alan; Phithakwatchara, Nisarat; Shaaban, Aimen; Keswani, Sundeep; Kline-Fath, Beth; Kingma, Paul; Haberman, Beth; Lim, Foong-Yen

    2015-02-01

    The aim of this study was to evaluate fetal lung growth rate for isolated left-sided congenital diaphragmatic hernia (CDH) using serial magnetic resonance imaging (MRI)-based volumetric measures. Early and late gestational (22-30 and >30 weeks' gestation) lung volumetry was obtained by fetal MRI in 47 cases of isolated left-sided CDH. At both of these time points, lung volume indices, including total lung volume (TLV), observed to expected TLV (o/e TLV), and percentage of predicted lung volume (PPLV) as well as their change rates (Δ) and relative Δ during gestation were calculated and analyzed in regard to their capacity to predict neonatal survival. TLV, o/e TLV, and PPLV had various changes during gestation. Late TLV, early and late o/e TLV, and late PPLV were predictive of neonatal survival. Non-survivors had lower ΔTLV and more negative relative ΔPPLV than survivors (1.18 vs 1.85 mL/week, P = 0.004 and -4.15%/week vs -1.95%/week, P = 0.002, respectively). The severity of pulmonary hypoplasia is dynamic and can worsen in the third trimester. MRI lung volumetry repeated in late gestation can provide additional information on individual lung growth that may facilitate prenatal counseling and focus perinatal management. © 2014 John Wiley & Sons, Ltd.

  18. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  20. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Table of contents: Basic Notions (Sample Space and Events; Probabilities; Counting Techniques). Independence and Conditional Probability (Independence; Conditioning; The Borel-Cantelli Theorem). Discrete Random Variables (Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation). Generating Functions; Branching Processes; Random Walk Revisited (Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk). Markov Chains (Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity). Continuous Random Variables (Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...)

  1. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  2. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
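
    The empirical-cdf estimator that the study uses as a baseline comparator can be written in a few lines of Python (an illustrative sketch, not the paper's weighted estimators):

        import numpy as np

        def empirical_tail(sample, x):
            """Estimate P(X > x) as the fraction of observations exceeding x.
            This returns 0 beyond the sample maximum, which is one motivation
            for the weighted and extrapolated estimators studied in the paper."""
            sample = np.asarray(sample)
            return float(np.mean(sample > x))

        # Hypothetical data: 1000 standard-exponential draws.
        data = np.random.default_rng(0).exponential(size=1000)
        print(empirical_tail(data, 3.0))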

  3. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  4. Survival analysis of cervical cancer using stratified Cox regression

    Science.gov (United States)

    Purnami, S. W.; Inayati, K. D.; Sari, N. W. Wulan; Chosuvivatwong, V.; Sriplung, H.

    2016-04-01

    Cervical cancer is one of the most common causes of cancer death among women in the world, including in Indonesia. Most cervical cancer patients arrive at the hospital already at an advanced stage. As a result, treatment of cervical cancer becomes more difficult and the risk of death increases. One parameter that can be used to assess the success of treatment is the probability of survival. This study examines the survival of cervical cancer patients at Dr. Soetomo Hospital using stratified Cox regression based on six factors: age, stage, treatment initiation, comorbid disease, complication, and anemia. A stratified Cox model is used because one independent variable, stage, does not satisfy the proportional hazards assumption. The results of the stratified Cox model show that the complication variable is a significant factor influencing the survival probability of cervical cancer patients. The estimated hazard ratio is 7.35, meaning that a cervical cancer patient with complications is at 7.35 times greater risk of dying than a patient without complications. The adjusted survival curves show that stage IV has the lowest probability of survival.
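
    A minimal sketch of a stratified Cox fit in Python's lifelines library (the file and column names are hypothetical placeholders, not the study's data):

        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("cervical.csv")  # hypothetical: time, death, covariates
        cph = CoxPHFitter()
        # Stratifying on stage gives each stage its own baseline hazard,
        # sidestepping the proportional-hazards violation noted above.
        cph.fit(df, duration_col="time", event_col="death", strata=["stage"])
        cph.print_summary()  # the exp(coef) column reports hazard ratios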

  5. Early differential cell death and survival mechanisms initiate and contribute to the development of OPIDN: A study of molecular, cellular, and anatomical parameters

    Energy Technology Data Exchange (ETDEWEB)

    Damodaran, T.V., E-mail: tdamodar@nccu.edu [Dept of Medicine, Duke University Medical Center, Durham, NC (United States); Pharmacology and Cancer biology, Duke University Medical Center, Durham, NC (United States); Dept of Biology, North Carolina Central University, Durham, NC 27707 (United States); Attia, M.K. [Pharmacology and Cancer biology, Duke University Medical Center, Durham, NC (United States); Abou-Donia, M.B., E-mail: donia@mc.duke.edu [Pharmacology and Cancer biology, Duke University Medical Center, Durham, NC (United States)

    2011-11-15

    Organophosphorus-ester induced delayed neurotoxicity (OPIDN) is a neurodegenerative disorder characterized by ataxia progressing to paralysis with a concomitant central and peripheral, distal axonopathy. A single dose of diisopropylphosphorofluoridate (DFP) produces OPIDN in the chicken, resulting in mild ataxia in 7-14 days and severe paralysis as the disease progresses. White leghorn layer hens were treated with DFP (1.7 mg/kg, sc) after prophylactic treatment with atropine (1 mg/kg, sc) in normal saline and eserine (1 mg/kg, sc) in dimethyl sulfoxide. Control groups were treated with the vehicle propylene glycol (0.1 ml/kg, sc), atropine in normal saline and eserine in dimethyl sulfoxide. The hens were euthanized at different time points (1, 2, 5, 10 and 20 days), and the tissues from cerebrum, midbrain, cerebellum, brainstem and spinal cord were quickly dissected and frozen for mRNA (northern) studies. Northern blots were probed with BCL2, GADD45, beta actin, and 28S RNA to investigate their expression patterns. Another set of hens was treated for a series of time points and perfused with phosphate buffered saline and fixative for histological studies. Various staining protocols, such as Hematoxylin and Eosin (H and E), Sevier-Munger, Cresyl echt Violet for Nissl substance, and Gallocyanin stain for Nissl granules, were used to assess various patterns of cell death and degenerative changes. Complex cell death mechanisms may be involved in the neuronal and axonal degeneration. These data indicate altered and differential mRNA expression of BCL2 (an anti-apoptotic gene) and GADD45 (a DNA damage-inducible gene) in various tissues. Increased cell death and other degenerative changes, noted more in the susceptible regions (spinal cord and cerebellum) than in the resistant region (cerebrum), may indicate complex molecular pathways via altered BCL2 and GADD45 gene expression, causing a homeostatic imbalance between cell survival and cell death mechanisms. Semi-quantitative...

  6. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  7. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  8. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  9. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  10. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  11. The Theory of Probability

    Indian Academy of Sciences (India)

    The Theory of Probability. Andrei Nikolaevich Kolmogorov. Resonance – Journal of Science Education, Classics, Volume 3, Issue 4, April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112.

  12. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Probability Theory Without Tears! S Ramasubramanian. Book Review. Resonance – Journal of Science Education, Volume 1, Issue 2, February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  13. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  14. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ..... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  15. On Randomness and Probability

    Indian Academy of Sciences (India)

    On Randomness and Probability: How to Mathematically Model Uncertain Events ... Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India. Resonance – Journal of Science Education, Volume 1, Issue 2.

  16. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards - the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_n D cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_n D.

  17. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probabilities of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
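
    A minimal sketch of the kind of logistic-regression model described, using scikit-learn (the file and feature names are hypothetical placeholders, not the study's variables):

        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        df = pd.read_csv("buildings.csv")  # hypothetical building attributes
        X = df[["floor_area_m2", "building_age", "n_floors"]]  # hypothetical predictors
        y = df["had_fire"]  # 1 if a fire was recorded for the building

        model = LogisticRegression().fit(X, y)
        # Per-building fire probabilities, ready to be joined to map geometry.
        df["fire_probability"] = model.predict_proba(X)[:, 1]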

  19. Compensatory effects of recruitment and survival when amphibian populations are perturbed by disease

    Science.gov (United States)

    Muths, E.; Scherer, R. D.; Pilliod, D.S.

    2011-01-01

    The need to increase our understanding of factors that regulate animal population dynamics has been catalysed by recent, observed declines in wildlife populations worldwide. Reliable estimates of demographic parameters are critical for addressing basic and applied ecological questions and understanding the response of parameters to perturbations (e.g. disease, habitat loss, climate change). However, to fully assess the impact of perturbation on population dynamics, all parameters contributing to the response of the target population must be estimated. We applied the reverse-time model of Pradel in Program MARK to 6 years of capture-recapture data from two populations of Anaxyrus boreas (boreal toad), one with disease and one without. We then assessed a priori hypotheses about differences in survival and recruitment relative to local environmental conditions and the presence of disease. We further explored the relative contribution of survival probability and recruitment rate to population growth and investigated how shifts in these parameters can alter population dynamics when a population is perturbed. High recruitment rates (0.41) are probably compensating for low survival probability (range 0.51-0.54) in the population challenged by an emerging pathogen, resulting in a relatively slow rate of decline. In contrast, the population with no evidence of disease had high survival probability (range 0.75-0.78) but lower recruitment rates (0.25). Synthesis and applications. We suggest that the relationship between survival and recruitment may be compensatory, providing evidence that populations challenged with disease are not necessarily doomed to extinction. A better understanding of these interactions may help to explain, and be used to predict, population regulation and persistence for wildlife threatened with disease. Further, reliable estimates of population parameters such as recruitment and survival can guide the formulation and implementation of
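
    In Pradel's reverse-time framework, realized population growth decomposes as \lambda_t = \varphi_t + f_t (survival probability plus per-capita recruitment). Plugging the midpoints of the estimates quoted above into this identity gives a back-of-the-envelope illustration of the compensation the authors describe (an illustrative calculation, not a value reported in the paper):

        \lambda_{\text{diseased}} \approx 0.52 + 0.41 = 0.93, \qquad
        \lambda_{\text{no disease}} \approx 0.77 + 0.25 = 1.02

    That is, the diseased population declines slowly (about 7% per year) while the disease-free population is roughly stable.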

  20. Refinement of Probability of Survival Decision Aid (PSDA)

    Science.gov (United States)

    2014-03-01

    Adam W. Potter. Keywords: thermoregulation, search and rescue (SaR), predictive modeling. ... SCTM, then posts or updates the display predictions for cold functional time (i.e., the point in time when core temperature reaches 34 °C) ...

  1. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  2. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations and the nondeterministic powerdomain, which reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures.

  3. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize the fuel-handling events that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
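
    The rate calculation described reduces to simple event counting; a minimal sketch with hypothetical counts (not figures from the Framatome ANP report):

        # Point estimate of the per-move misload probability from event counts.
        n_misload_events = 4       # hypothetical: categorized misload events
        n_fa_moves = 250_000       # hypothetical: total FA movements observed
        p_misload = n_misload_events / n_fa_moves
        print(f"estimated misload probability per FA move: {p_misload:.2e}")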

  4. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  5. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  6. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  7. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ^2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
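
    The generic zero-mean form of such a variance superposition, with smearing density ω(v) over the variance, written in LaTeX (an illustrative form consistent with the abstract, not a formula quoted from the paper):

        p(x) = \int_0^\infty \omega(v)\, \frac{1}{\sqrt{2\pi v}}\, e^{-x^2/(2v)}\, dv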

  8. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
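
    One way such a correlation can be computed, sketched for the special case of a single-predictor probit model with latent equation y* = beta*x + e, e ~ N(0,1) (an illustration under stated assumptions, not the authors' general method):

        import numpy as np

        def latent_corr(beta, sd_x):
            """corr(y*, x) for y* = beta*x + e with e ~ N(0,1), x independent of e.
            Follows from var(y*) = beta^2*sd_x^2 + 1 and cov(y*, x) = beta*sd_x^2."""
            return beta * sd_x / np.sqrt((beta * sd_x) ** 2 + 1.0)

        print(latent_corr(beta=0.8, sd_x=1.2))  # hypothetical coefficient and sd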

  9. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    [The indexed excerpt contains only table-of-contents fragments: ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...]

  10. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.

  11. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  12. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  13. Network ties and survival

    DEFF Research Database (Denmark)

    Acheampong, George; Narteh, Bedman; Rand, John

    2017-01-01

    Poultry farming has been touted as one of the major ways by which poverty can be reduced in low-income economies like Ghana. Yet, anecdotally there is a high failure rate among these poultry farms. This study seeks to understand the relationship between network ties and the survival chances of small commercial poultry farms (SCPFs). We utilize data from a 2-year network survey of SCPFs in rural Ghana. The survival of these poultry farms is modelled using a lagged probit model of farms that persisted from 2014 into 2015. We find that network ties are important to the survival chances of these farms; this probability declines as the number of industry ties increases, although moderation by the firm's dynamic capability reverses this trend. Our findings show that not all network ties aid survival and therefore small commercial poultry farmers need to be circumspect in the network ties they cultivate and develop.

  14. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    Directory of Open Access Journals (Sweden)

    Hafiz M. R. Khan

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973-2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance Epidemiology and End Results (SEER) database to derive the statistical probability models. We used three common model-building criteria, the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Deviance Information Criterion (DIC), to measure goodness of fit, and found that the Black Hispanic female patients' survival data better fit the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as to derive the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov Chain Monte Carlo (MCMC) method was used for obtaining the summary results of posterior parameters. Additionally, we reported predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
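    The exponentiated exponential model favored by the data has a closed-form survival function, sketched below with invented parameter values rather than the paper's posterior estimates:

      import numpy as np

      def ee_survival(t, alpha, lam):
          """Survival function of the exponentiated exponential distribution,
          S(t) = 1 - (1 - exp(-lam * t))**alpha.  Parameter values below are
          illustrative, not the posterior estimates reported in the paper."""
          return 1.0 - (1.0 - np.exp(-lam * t)) ** alpha

      alpha, lam = 1.8, 0.12        # illustrative shape and rate; times in years
      for t in (1, 5, 10):
          print(t, round(float(ee_survival(t, alpha, lam)), 3))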

  15. Probability distribution functions

    OpenAIRE

    Sousa, Paulo Baltarejo; Ferreira, Luís Lino

    2007-01-01

    This technical report describes the PDFs which have been implemented to model the behaviours of certain parameters of the Repeater-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (RHW2PNetSim) and Bridge-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (BHW2PNetSim).

  16. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  17. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  18. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  19. A track-event theory of cell survival

    Energy Technology Data Exchange (ETDEWEB)

    Besserer, Juergen; Schneider, Uwe [Zuerich Univ. (Switzerland). Inst. of Physics; Radiotherapy Hirslanden, Zuerich (Switzerland)

    2015-09-01

    When fractionation schemes for hypofractionation and stereotactic body radiotherapy are considered, a reliable cell survival model at high dose is needed for calculating doses of similar biological effectiveness. In this work a simple model for cell survival which is valid also at high dose is developed from Poisson statistics. An event is defined by two double strand breaks (DSB) on the same or different chromosomes. An event is always lethal due to direct lethal damage or lethal binary misrepair by the formation of chromosome aberrations. Two different mechanisms can produce events: one-track events (OTE) or two-track events (TTE). The target for an OTE is always a lethal event; the target for a TTE is one DSB. At least two TTEs on the same or different chromosomes are necessary to produce an event. Both the OTE and the TTE are statistically independent. From the stochastic nature of cell kill, which is described by the Poisson distribution, the cell survival probability was derived. It was shown that a solution based on Poisson statistics exists for cell survival. It exhibits exponential cell survival at high dose and a finite gradient of cell survival at vanishing dose, which is in agreement with experimental cell studies. The model fits the experimental data nearly as well as the three-parameter formula of Hug-Kellerer and is based on only two free parameters. It is shown that the LQ formalism is an approximation of the model derived in this work. It could also be shown that the derived model predicts a fractionated cell survival experiment better than the LQ model. It was shown that cell survival can be described with a simple analytical formula on the basis of Poisson statistics. This solution represents in the limit of large dose the typical exponential behavior and predicts cell survival after fractionated dose application better than the LQ model.
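    The Poisson logic is easy to state in code: if lethal events per cell occur with mean m(D), survival is P(no event) = exp(-m(D)). The sketch below uses the familiar LQ choice m(D) = αD + βD², which the abstract identifies as an approximation of the track-event model; the parameter values are typical-order illustrations, not the paper's fits:

      import numpy as np

      # Poisson cell-kill logic: with mean lethal-event count m(D) per cell,
      # survival is P(no event) = exp(-m(D)).  The LQ formalism corresponds
      # to m(D) = alpha*D + beta*D**2 (parameters illustrative only).
      alpha, beta = 0.15, 0.05      # Gy^-1 and Gy^-2, typical order of magnitude

      def lq_survival(dose_gy):
          events = alpha * dose_gy + beta * dose_gy ** 2
          return np.exp(-events)

      for d in (2.0, 8.0, 20.0):
          print(d, lq_survival(d))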

  20. Determining the Probability of Close Approach between Two Satellites

    Science.gov (United States)

    1986-12-01

    [The indexed excerpt contains only the report's list of tables: orbital parameters of test cases; simulated probability of close approach versus COPCA and EOPCA probabilities of close approach; and COPCA test results for distance thresholds below and at or above 4000 km.]

  1. Efficacy of Sanitizer Treatments on Survival and Growth Parameters of Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes on Fresh-Cut Pieces of Cantaloupe during Storage.

    Science.gov (United States)

    Ukuku, Dike O; Huang, Lihan; Sommers, Christopher

    2015-07-01

    For health reasons, people are consuming fresh-cut fruits with or without minimal processing and, thereby, exposing themselves to the risk of foodborne illness if such fruits are contaminated with bacterial pathogens. This study investigated survival and growth parameters of Escherichia coli O157:H7, Salmonella, Listeria monocytogenes, and aerobic mesophilic bacteria transferred from cantaloupe rind surfaces to fresh-cut pieces during fresh-cut preparation. All human bacterial pathogens inoculated on cantaloupe rind surfaces averaged ∼4.8 log CFU/cm², and the populations transferred to fresh-cut pieces before washing treatments ranged from 3 to 3.5 log CFU/g for all pathogens. A nisin-based sanitizer developed in our laboratory and chlorinated water at 1,000 mg/liter were evaluated for effectiveness in minimizing transfer of bacterial populations from cantaloupe rind surface to fresh-cut pieces. Inoculated and uninoculated cantaloupes were washed for 5 min before fresh-cut preparation and storage of fresh-cut pieces at 5 and 10°C for 15 days and at 22°C for 24 h. In fresh-cut pieces from cantaloupe washed with chlorinated water, only Salmonella was found (0.9 log CFU/g), whereas E. coli O157:H7 and L. monocytogenes were positive only by enrichment. The nisin-based sanitizer prevented transfer of human bacteria from melon rind surfaces to fresh-cut pieces, and the populations in fresh-cut pieces were below detection even by enrichment. Storage temperature affected survival and the growth rate for each type of bacteria on fresh-cut cantaloupe. Specific growth rates of E. coli O157:H7, Salmonella, and L. monocytogenes in fresh-cut pieces were similar, whereas the aerobic mesophilic bacteria grew 60 to 80% faster and had shorter lag phases.

  2. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

    We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations to synthesize the superheavy element Z=124. From this we have identified the most probable projectile-target combinations for synthesizing the superheavy element Z=124: Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may be a guide for future experiments in the synthesis of superheavy nuclei with Z=124. (orig.)

  3. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  4. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students benefited from the probability workshop, showing a positive improvement in their performance on the probability topic compared with before the workshop. In addition, there was a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  6. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
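    The probability-weighting component of cumulative prospect theory referred to here is often parameterised with the Tversky-Kahneman function; the sketch below uses an illustrative γ, not the study's fitted parameters:

      import numpy as np

      def tk_weight(p, gamma):
          """Tversky-Kahneman probability weighting function, one common way
          to parameterise cumulative prospect theory's response to probability.
          gamma < 1 overweights small and underweights large probabilities."""
          return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

      # A prospect "win 100 with probability p" is valued ~ w(p) * v(100).
      for p in (0.01, 0.5, 0.99):
          print(p, round(float(tk_weight(p, gamma=0.61)), 3))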

  7. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with uncertain environments. These processes involve increases in the probability of events favorable to these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of it. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range up to approximately 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographic specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
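    The 3-parameter logistic form mentioned can be sketched as follows; the parameter values are invented for illustration and are not the study's fitted curves:

      import numpy as np

      def liquefaction_probability(pga_g, a, b, c):
          """A three-parameter logistic of the kind the abstract says fits most
          curves: upper plateau `a`, midpoint `b` (g), and slope scale `c`.
          Parameter values used below are invented for illustration only."""
          return a / (1.0 + np.exp(-(pga_g - b) / c))

      # Illustrative curve for one deposit, M7.5, water table at 1.5 m
      for pga in (0.1, 0.25, 0.5):
          print(pga, round(float(liquefaction_probability(pga, a=0.6, b=0.3, c=0.1)), 3))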

  9. Verification and estimation of a posterior probability and probability density function using vector quantization and neural network

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Hee Seok; Kim, Hyun Duck [Kyungnam University, Masan (Korea, Republic of); Lee, Kwang Seok [Chinju National University (Korea, Republic of)

    1996-02-01

    In this paper, we propose an estimation method for a posterior probability and PDF (probability density function) using a feed-forward neural network and the codebooks of VQ (vector quantization). We estimate a posterior probability and probability density function, which together with the well-known Mel cepstrum compose a new parameter, and verify the performance for five vowels taken from syllables using an NN (neural network) and a PNN (probabilistic neural network). With the new parameter, the probabilistic neural network showed the best result, with recognition rates averaging 83.02%. (author). 7 refs., 4 figs., 3 tabs.

  10. Survival Predictions of Ceramic Crowns Using Statistical Fracture Mechanics.

    Science.gov (United States)

    Nasrin, S; Katsube, N; Seghi, R R; Rokhlin, S I

    2017-05-01

    This work establishes a survival probability methodology for interface-initiated fatigue failures of monolithic ceramic crowns under simulated masticatory loading. A complete 3-dimensional (3D) finite element analysis model of a minimally reduced molar crown was developed using commercially available hardware and software. Estimates of material surface flaw distributions and fatigue parameters for 3 reinforced glass-ceramics (fluormica [FM], leucite [LR], and lithium disilicate [LD]) and a dense sintered yttrium-stabilized zirconia (YZ) were obtained from the literature and incorporated into the model. Utilizing the proposed fracture mechanics-based model, crown survival probability as a function of loading cycles was obtained from simulations performed on the 4 ceramic materials utilizing identical crown geometries and loading conditions. The weaker ceramic materials (FM and LR) resulted in lower survival rates than the more recently developed higher-strength ceramic materials (LD and YZ). The simulated 10-y survival rate of crowns fabricated from YZ was only slightly better than those fabricated from LD. In addition, 2 of the model crown systems (FM and LD) were expanded to determine regional-dependent failure probabilities. This analysis predicted that the LD-based crowns were more likely to fail from fractures initiating from margin areas, whereas the FM-based crowns showed a slightly higher probability of failure from fractures initiating from the occlusal table below the contact areas. These 2 predicted fracture initiation locations have some agreement with reported fractographic analyses of failed crowns. In this model, we considered the maximum tensile stress tangential to the interfacial surface, as opposed to the more universally reported maximum principal stress, because it more directly impacts crack propagation. While the accuracy of these predictions needs to be experimentally verified, the model can provide a fundamental understanding of the

  11. Local linear estimation of concordance probability with application to covariate effects models on association for bivariate failure-time data.

    Science.gov (United States)

    Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing

    2015-01-01

    Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.
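    The one-to-one correspondence under the Clayton copula can be made concrete: for copula parameter θ, Kendall's τ = θ/(θ+2), and the concordance probability is (τ+1)/2. A sketch of this correspondence (not the paper's local linear estimator itself):

      def clayton_concordance(theta):
          """Concordance probability implied by a Clayton copula parameter,
          via Kendall's tau = theta / (theta + 2) and P(concordant) = (tau + 1) / 2
          for continuous variables."""
          tau = theta / (theta + 2.0)
          return (tau + 1.0) / 2.0

      for theta in (0.5, 2.0, 8.0):
          print(theta, round(clayton_concordance(theta), 3))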

  12. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  13. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  14. An Improved Upper Bound for the Critical Probability of the Frog Model on Homogeneous Trees

    Science.gov (United States)

    Lebensztayn, Élcio; Machado, Fábio P.; Popov, Serguei

    2005-04-01

    We study the frog model on homogeneous trees, a discrete-time system of simple symmetric random walks whose description is as follows. There are active and inactive particles living on the vertices. Each active particle performs a simple symmetric random walk having a geometrically distributed random lifetime with parameter (1 - p). When an active particle hits an inactive particle, the latter becomes active. We obtain an improved upper bound for the critical parameter for indefinite survival of active particles, in the case of the one-particle-per-vertex initial configuration. The main tool is to construct a class of branching processes which are dominated by the frog model and to analyze their supercritical behavior. This approach also allows us to present an upper bound for the critical probability in the case of random initial configuration.

  15. Optimized lower leg injury probability curves from postmortem human subject tests under axial impacts.

    Science.gov (United States)

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko

    2014-01-01

    Objective: to derive optimum injury probability curves describing human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subject (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was chosen based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the interval were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-olds at the 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk and 9.6, 7.7, and 6.1 kN at 25% risk
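    A Weibull risk curve of the kind such an analysis produces is easy to sketch; the shape and scale below are invented for illustration, and in the paper the scale would also depend on specimen age:

      import numpy as np

      def injury_risk(force_kn, shape, scale_kn):
          """Weibull injury risk curve of the kind parametric survival analysis
          produces: P(fracture at or below force F) = 1 - exp(-(F/scale)^shape).
          Shape and scale values below are invented for illustration, not the
          paper's estimates; there, the scale also depends on age."""
          return 1.0 - np.exp(-(force_kn / scale_kn) ** shape)

      shape, scale_kn = 4.0, 12.0       # illustrative only
      for f in (5.1, 6.5, 8.1):
          print(f, round(float(injury_risk(f, shape, scale_kn)), 3))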

  16. Flexural strength and the probability of failure of cold isostatic pressed zirconia core ceramics.

    Science.gov (United States)

    Siarampi, Eleni; Kontonasaki, Eleana; Papadopoulou, Lambrini; Kantiranis, Nikolaos; Zorba, Triantafillia; Paraskevopoulos, Konstantinos M; Koidis, Petros

    2012-08-01

    The flexural strength of zirconia core ceramics must predictably withstand the high stresses developed during oral function. The in-depth interpretation of strength parameters and the probability of failure during clinical performance could assist the clinician in selecting the optimum materials while planning treatment. The purpose of this study was to evaluate the flexural strength, based on survival probability and Weibull statistical analysis, of 2 zirconia cores for ceramic restorations. Twenty bar-shaped specimens were milled from 2 core ceramics, IPS e.max ZirCAD and Wieland ZENO Zr, and were loaded until fracture according to ISO 6872 (3-point bending test). An independent samples t test was used to assess significant differences in fracture strength (α=.05). Weibull statistical analysis of the flexural strength data provided 2 parameter estimates: the Weibull modulus (m) and the characteristic strength (σ0). The fractured surfaces of the specimens were evaluated by scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS). The investigation of the crystallographic state of the materials was performed with x-ray diffraction analysis (XRD) and Fourier transform infrared (FTIR) spectroscopy. The higher mean flexural strength of WZ ceramics was associated with a lower m and more voids in their microstructure. These findings suggest a greater scattering of strength values and a flaw distribution that are expected to increase failure probability. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
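    For readers unfamiliar with the two Weibull estimates, the classic Weibull plot recovers m (the slope) and σ0 (the strength at 63.2% failure probability) by linear regression; the sketch below uses synthetic strengths, not the study's data:

      import numpy as np

      # Weibull-plot estimation of modulus m and characteristic strength
      # sigma_0 from fracture strengths (synthetic values, not the study's).
      # Failure probability ranks use the common (i - 0.5)/n rule.
      strengths = np.sort(np.array([812., 845., 878., 901., 923., 948., 970.,
                                    995., 1021., 1060.]))       # MPa, synthetic
      n = strengths.size
      prob_fail = (np.arange(1, n + 1) - 0.5) / n

      x = np.log(strengths)
      y = np.log(np.log(1.0 / (1.0 - prob_fail)))
      m, intercept = np.polyfit(x, y, 1)        # slope is the Weibull modulus
      sigma_0 = np.exp(-intercept / m)          # strength at 63.2% failure

      print(f"m = {m:.1f}, sigma_0 = {sigma_0:.0f} MPa")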

  17. Probability plots based on student’s t-distribution

    NARCIS (Netherlands)

    Hooft, R.W.W.|info:eu-repo/dai/nl/109722213; Straver, L.H.; Spek, A.L.|info:eu-repo/dai/nl/156517566

    2009-01-01

    The validity of the normal distribution as an error model is commonly tested with a (half) normal probability plot. Real data often contain outliers. The use of t-distributions in a probability plot to model such data more realistically is described. It is shown how a suitable value of the parameter

  18. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  19. Modelling survival

    DEFF Research Database (Denmark)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

    2016-01-01

    well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates...

  20. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  1. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  2. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  3. Test for age-specificity in survival of the common tern

    Science.gov (United States)

    Nisbet, I.C.T.; Cam, E.

    2002-01-01

    Much effort in life-history theory has been addressed to the dependence of life-history traits on age, especially the phenomenon of senescence and its evolution. Although senescent declines in survival are well documented in humans and in domestic and laboratory animals, evidence for their occurrence and importance in wild animal species remains limited and equivocal. Several recent papers have suggested that methodological issues may contribute to this problem, and have encouraged investigators to improve sampling designs and to analyse their data using recently developed approaches to modelling of capture-mark-recapture data. Here we report on a three-year, two-site, mark-recapture study of known-aged common terns (Sterna hirundo) in the north-eastern USA. The study was nested within a long-term ecological study in which large numbers of chicks had been banded in each year for > 25 years. We used a range of models to test the hypothesis of an influence of age on survival probability. We also tested for a possible influence of sex on survival. The cross-sectional design of the study (one year's parameter estimates) avoided the possible confounding of effects of age and time. The study was conducted at a time when one of the study sites was being colonized and numbers were increasing rapidly. We detected two-way movements between the sites and estimated movement probabilities in the year for which they could be modelled. We also obtained limited data on emigration from our study area to more distant sites. We found no evidence that survival depended on either sex or age, except that survival was lower among the youngest birds (ages 2-3 years). Despite the large number of birds included in the study (1599 known-aged birds, 2367 total), confidence limits on estimates of survival probability were wide, especially for the oldest age-classes, so that a slight decline in survival late in life could not have been detected. In addition, the cross-sectional design of this

  4. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
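    A minimal illustration of the idea, estimating individual class probabilities with a random forest on synthetic data (scikit-learn here, rather than the R packages from which the paper provides sample code):

      # A "probability machine": a random forest used to estimate individual
      # class probabilities rather than hard labels.  Synthetic data; the
      # paper's own examples (appendicitis, Pima) are not reproduced here.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
      probs = rf.predict_proba(X_te)[:, 1]   # individual P(y = 1 | x) estimates
      print(probs[:5])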

  5. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  6. Foreign Ownership and Long-term Survival

    DEFF Research Database (Denmark)

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    Does foreign ownership enhance or decrease a firm's chances of survival? Over the 100-year period 1895-2001 this paper compares the survival of foreign subsidiaries in Denmark to a control sample matched by industry and firm size. We find that foreign-owned companies have higher survival probability. On average, exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for almost a century. Relative foreign survival increases with company age. However, the foreign survival...

  7. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  8. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  9. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals’ decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention, by eliciting individuals' subjective probabilities of sickness and survival, with

  10. Clustered survival data with left-truncation

    DEFF Research Database (Denmark)

    Eriksson, Frank; Martinussen, Torben; Scheike, Thomas H.

    2015-01-01

    Left-truncation occurs frequently in survival studies, and it is well known how to deal with this for univariate survival times. However, there are few results on how to estimate dependence parameters and regression effects in semiparametric models for clustered survival data with delayed entry...

  11. Relief for surviving relatives following a suicide.

    NARCIS (Netherlands)

    Oud, MJT; de Groot, MH

    2006-01-01

    Relief for surviving relatives following a suicide. - After the suicide of a 43-year-old woman with known depression, a 41-year-old paraplegic man who recently developed diarrhoea and a 41-year-old woman with probable depression with symptoms of psychosis, the general practitioners of the surviving

  12. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  13. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  14. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    In this first paper of 1911 relating to the sex ratio at birth, Gini repurposed Laplace's succession rule in a Bayesian version. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for the determination of prior probabilities according to the statistical frequency obtained from statistical data.

  15. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by (DNV, 2011) and model performance is evaluated. Also, the effects that weather forecast uncertainty has on the output Probabilities of Failure are analysed and reported.

  16. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an ongoing study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes measured using time-resolved laser-induced fluorescence on a slow atom beam with branching fractions measured from high resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We will report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5%, and the transition probability uncertainties range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.

  17. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

    Science.gov (United States)

    Schaub, Michael; Royle, J. Andrew

    2014-01-01

    Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
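    The identity the abstract rests on is easy to state directly; the numbers below are illustrative:

      # CJS identity: apparent survival is the product of true survival and
      # study-area fidelity, so it understates true survival whenever
      # fidelity < 1.  Values are illustrative.
      true_survival = 0.85
      fidelity = 0.90                  # probability of staying in the study area
      apparent_survival = true_survival * fidelity
      print(apparent_survival)         # 0.765 < 0.85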

  18. Extensions and applications of the Cox-Aalen survival model.

    Science.gov (United States)

    Scheike, Thomas H; Zhang, Mei-Jie

    2003-12-01

    Cox's regression model is the standard regression tool for survival analysis in most applications. Often, however, the model only provides a rough summary of the effect of some covariates. Therefore, if the aim is to give a detailed description of covariate effects and consequently to calculate predicted probabilities, more flexible models are needed. In an earlier article (Scheike and Zhang, 2002, Scandinavian Journal of Statistics 29, 75-88), we suggested a flexible extension of Cox's regression model which aims at extending the Cox model only for those covariates where additional flexibility is needed. One important advantage of the suggested approach is that even though covariates are allowed a nonparametric effect, the difficulty of selecting smoothing parameters is avoided. We show how the extended model also leads to simple formulae for predicted probabilities and their standard errors, for example, in the competing risks framework.
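    For the multiplicative (Cox) part, predicted probabilities follow from the familiar proportional-hazards formula S(t|x) = S0(t)^exp(βx); the sketch below uses invented values and shows only this part, not the Cox-Aalen extension's additive, time-varying components:

      import numpy as np

      def cox_predicted_survival(baseline_survival, beta, x):
          """Predicted survival under a proportional-hazards term:
          S(t | x) = S0(t) ** exp(beta @ x).  A Cox-Aalen model generalises
          this by letting some covariates act additively with time-varying
          effects; only the multiplicative part is shown here."""
          return baseline_survival ** np.exp(np.dot(beta, x))

      s0_5yr = 0.80                     # illustrative baseline S0(5 years)
      beta = np.array([0.4, -0.2])      # illustrative log hazard ratios
      x = np.array([1.0, 0.5])          # covariates for one subject
      print(cox_predicted_survival(s0_5yr, beta, x))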

  19. An all-timescales rainfall probability distribution

    Science.gov (United States)

    Papalexiou, S. M.; Koutsoyiannis, D.

    2009-04-01

    The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance as typically the hydraulic design strongly depends on the rainfall model choice. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable to adequately describe the rainfall in all timescales, is a difficult task. A search in hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall in a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.

  20. [Epidemiological analysis of leukemia survival in Cracow for cases registered in 1980-1990].

    Science.gov (United States)

    Fornal, Maria; Janicki, Kazimierz; Grodzicki, Tomasz

    2003-01-01

    The aim of the study was an epidemiological analysis of survival from all types of leukemia occurring in Cracow in the years 1980-1990. The study focused on survival times in patients according to (a) the cytologico-clinical type of leukemia and (b) the timeframe in which treatment was initiated (1980-1985 versus 1986-1990). All patients diagnosed with leukemia between the years 1980-1990, living in Cracow, and whose cytologico-clinical picture was determined had their survival times and censored survival times established. Survival until 1997 was taken into account. For each cytologico-clinical type of leukemia the survival function according to Kaplan-Meier was calculated. The Cox model was implemented to analyze the risk of death depending on the period in which the disease appeared; two time frames were established, 1980-1985 and 1986-1990. Other parameters considered were age, sex, and the area in which the patient lived (suburb). In practically all types of leukemia a higher probability of survival was found in patients in whom leukemia was diagnosed (and consequently treated) in the second period, i.e., 1986-1990. The greatest improvement was observed in acute lymphoblastic leukemia in children, in which the relative 5-year survival probability rose from 35% in the years 1980-1985 to 78% in the years 1986-1990, thus reaching the level of well-developed countries. A similar picture was seen in chronic lymphocytic leukemia, where the relative 5-year survival probability rose from 57% to 77%, and in chronic granulocytic leukemia, where the 5-year survival probabilities were 23% and 39%, respectively. All cited values for the second period of analysis are at the levels noted in the United States and in Europe. The positive changes in survival times observed in patients with leukemia in the second half of the 1980s (in comparison with the period 1980-1985) have been interpreted as the result of advances in the therapy of the disease in Cracow.
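    The Kaplan-Meier estimator used for the survival functions can be sketched in a few lines; the data below are synthetic, not the Cracow registry data:

      import numpy as np

      def kaplan_meier(times, events):
          """Kaplan-Meier survival estimates (the estimator used in the study).
          `times` are follow-up times; `events` is 1 for death, 0 for censored.
          Ties are handled by processing deaths before censorings."""
          times, events = np.asarray(times, float), np.asarray(events, int)
          order = np.lexsort((1 - events, times))   # by time, deaths first
          surv, at_risk, curve = 1.0, len(times), []
          for t, d in zip(times[order], events[order]):
              if d == 1:
                  surv *= (at_risk - 1) / at_risk
                  curve.append((t, surv))
              at_risk -= 1
          return curve

      print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))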

  1. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  2. Probability-based TCP congestion control mechanism

    Science.gov (United States)

    Xu, Changbiao; Yang, Shizhong; Xian, Yongju

    2005-11-01

    To mitigate TCP global synchronization and improve network throughput, an improved TCP congestion control mechanism, P-TCP, is proposed: it adjusts the congestion window probabilistically and independently for each connection when network congestion occurs. As a result, some P-TCP connections may decrease their congestion windows sharply while others decrease them only slightly. Simulation results show that TCP global synchronization can be effectively mitigated, leading to efficient utilization of network resources as well as effective mitigation of network congestion. The simulation results also give useful guidance for choosing the related P-TCP parameters.
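
    The paper's exact window-adjustment rule is not given here, so the following is only an illustrative sketch of the general idea: on a congestion signal, each connection samples its decrease factor independently, so competing flows stop backing off in lockstep. The factors 0.5/0.875 and the probability p_deep are assumptions, not the paper's parameters.

    ```python
    import random

    def on_congestion(cwnd: float, p_deep: float = 0.5) -> float:
        """Return the new congestion window after a congestion signal."""
        if random.random() < p_deep:
            return max(1.0, cwnd * 0.5)    # deep decrease (standard TCP halving)
        return max(1.0, cwnd * 0.875)      # light decrease

    random.seed(0)
    windows = [32.0] * 5                   # five competing connections
    windows = [on_congestion(w) for w in windows]
    print(windows)                         # connections no longer move in lockstep
    ```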

  3. Innovations’ Survival

    Directory of Open Access Journals (Sweden)

    Jakub Tabas

    2016-01-01

    Full Text Available Innovations currently represent a tool for maintaining the going concern of a business entity and its competitiveness. However, the effects of an innovation are not unlimited; if innovation is to continually preserve the life of a business entity, it must form a continual chain of innovations, i.e. a continual process. The effective life of a single innovation is limited, with the limit deriving especially from the industry. The paper provides the results of research on the effects of innovations on the financial performance of small and medium-sized enterprises in the Czech Republic. The objective of this paper is to determine the length and intensity of the effects of technical innovations on a company's financial performance. The economic effect of innovations has been measured by applying the company's gross production power, with Deviation Analysis applied to three-year time series. Subsequently, Survival Analysis has been applied. The analyses are elaborated for three statistical samples of SMEs constructed according to industry. The results obtained show significant differences in innovations' survival among these three samples of enterprises. The results are quite specific to the industries and are confronted and discussed with the results of the authors' former research on the issue.

  4. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
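
    A minimal Monte Carlo sketch of such simultaneous intervals (an assumed stand-in for the paper's construction, shown here for the known-parameter case): widen symmetric per-point bands for the order statistics until their joint coverage under normality reaches 1-α.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, alpha, reps = 50, 0.05, 20000

    # Order statistics of `reps` standard-normal samples of size n.
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)

    # Find the narrowest symmetric per-point band whose JOINT coverage is
    # still at least 1 - alpha (eps = per-point exclusion probability).
    lo = hi = None
    for eps in np.linspace(alpha, alpha / n, 100):
        lo_c = np.quantile(sims, eps / 2, axis=0)
        hi_c = np.quantile(sims, 1 - eps / 2, axis=0)
        joint = np.all((sims >= lo_c) & (sims <= hi_c), axis=1).mean()
        if joint >= 1 - alpha:
            lo, hi = lo_c, hi_c
            break

    # Graphical test: a sample "passes" iff all its order statistics fall
    # inside the band simultaneously (known-parameter H0: standard normal).
    x = np.sort(rng.standard_normal(n))
    print("normality not rejected:", bool(np.all((x >= lo) & (x <= hi))))
    ```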

  5. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  6. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  7. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  8. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling at a small scale, whereas only a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects at a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability for each cell, based on differently weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features of the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
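
    A minimal sketch of the weighted-factor index described above; the layer names, their orientations, and the weights are illustrative assumptions, not the study's calibrated values.

    ```python
    import numpy as np

    def normalize(layer: np.ndarray) -> np.ndarray:
        """Rescale a raster layer to [0, 1]."""
        return (layer - layer.min()) / (layer.max() - layer.min())

    rng = np.random.default_rng(3)
    shape = (100, 100)                     # raster grid of the study area
    layers = {                             # synthetic PCD rasters, pre-oriented
        "flatness": rng.random(shape),     # so higher values = more flood-prone
        "clay_soil": rng.random(shape),
        "urban_cover": rng.random(shape),
    }
    weights = {"flatness": 0.5, "clay_soil": 0.3, "urban_cover": 0.2}

    # Per-cell flood probability index: weighted sum of normalized layers.
    index = sum(w * normalize(layers[name]) for name, w in weights.items())
    print(f"flood index: min {index.min():.3f}, max {index.max():.3f}")
    ```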

  9. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
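
    A one-line illustration of such a constraint, from standard probability theory rather than from this record: Bayes' identity already restricts which conditionals can coexist.

    ```latex
    \[
      P(A \mid B)\,P(B) \;=\; P(A \cap B) \;=\; P(B \mid A)\,P(A)
      \quad\Longrightarrow\quad
      \frac{P(A)}{P(B)} = \frac{P(A \mid B)}{P(B \mid A)}.
    \]
    % E.g. P(A|B) = 0.9 and P(B|A) = 0.1 force P(A) = 9 P(B); demanding in
    % addition that P(A) = P(B) leaves no joint probability that realizes
    % both conditionals.
    ```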

  10. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  11. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  14. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  15. Survival of radio-implanted drymarchon couperi (Eastern Indigo Snake) in relation to body size and sex

    Science.gov (United States)

    Hyslop, N.L.; Meyers, J.M.; Cooper, R.J.; Norton, Terry M.

    2009-01-01

    Drymarchon couperi (eastern indigo snake) has experienced population declines across its range, primarily as a result of extensive habitat loss, fragmentation, and degradation. Conservation efforts for D. couperi have been hindered, in part, by informational gaps regarding the species, including a lack of data on population ecology and estimates of demographic parameters such as survival. We conducted a 2-year radiotelemetry study of D. couperi on Fort Stewart Military Reservation and adjacent private lands in southeastern Georgia to assess individual characteristics associated with probability of survival. We used known-fate modeling to estimate survival, and an information-theoretic approach, based on a priori hypotheses, to examine intraspecific differences in survival probabilities relative to individual covariates (sex, size, size standardized by sex, and overwintering location). Annual survival in 2003 and 2004 was 0.89 (95% CI = 0.73-0.97, n = 25) and 0.72 (95% CI = 0.52-0.86; n = 27), respectively. Results indicated that body size, standardized by sex, was the most important covariate determining survival of adult D. couperi, suggesting lower survival for larger individuals within each sex. We are uncertain of the mechanisms underlying this result, but possibilities may include greater resource needs for larger individuals within each sex, necessitating larger or more frequent movements, or a population with older individuals. Our results may also have been influenced by analysis limitations due to sample size, other sources of individual variation, or environmental conditions. © 2009 by The Herpetologists' League, Inc.

  16. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  17. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    George, L.L.

    1983-01-01

    This paper describes seismic risk, load combination, and probabilistic risk problems in power plant reliability, and it suggests applications of extreme value theory. Seismic risk analysis computes the probability of power plant failure in an earthquake and the resulting risk. Components fail if their peak responses to an earthquake exceed their strengths. Dependent stochastic processes represent responses, and peak responses are maxima. A Boolean function of component failures and survivals represents plant failure. Load combinations analysis computes the cdf of the peak of the superposition of stochastic processes that represent earthquake and operating loads. It also computes the probability of pipe fracture due to crack growth (a Markov process) caused by loads. Pipe fracture is an absorbing state. Probabilistic risk analysis computes the cdf's of probabilities which represent uncertainty. These cdf's are induced by randomizing parameters of cdf's and by randomizing properties of stochastic processes such as initial crack size distributions, marginal cdf's, and failure criteria.

  18. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  19. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
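
    A brute-force sketch of the quantity analysed above, for a toy alphabet where exact enumeration is still feasible (the paper's approximations exist precisely because this computation blows up at realistic sizes):

    ```python
    from itertools import product

    letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # toy first-order model
    word_len = 4

    # Probability of each word under letter independence (first-order model).
    probs = []
    for letters in product(letter_probs, repeat=word_len):
        p = 1.0
        for ch in letters:
            p *= letter_probs[ch]
        probs.append(p)

    # Guess words in decreasing order of probability; the expected number of
    # guesses is the probability-weighted rank.
    probs.sort(reverse=True)
    avg_guesses = sum(rank * p for rank, p in enumerate(probs, start=1))
    print(f"{len(probs)} words, average guesses = {avg_guesses:.2f}")
    ```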

  20. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  1. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
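
    A minimal check of the claim above under the equal-configuration postulate (equivalently, Bose-Einstein counting); the occupancy formula is standard combinatorics, not taken from the paper.

    ```python
    # With P indistinguishable balls in L distinguishable boxes and all
    # C(P+L-1, L-1) configurations equally likely, the probability that a
    # given box holds exactly k balls is C(P-k+L-2, L-2) / C(P+L-1, L-1),
    # which is maximal at k = 0: the empty box is the most probable.
    from math import comb

    def occupancy_pmf(P: int, L: int, k: int) -> float:
        """P(given box holds exactly k balls) under uniform configurations."""
        return comb(P - k + L - 2, L - 2) / comb(P + L - 1, L - 1)

    P, L = 100, 5                      # dense system: P much greater than L
    pmf = [occupancy_pmf(P, L, k) for k in range(P + 1)]
    print("P(k=0) =", round(pmf[0], 4), " P(k=P/L) =", round(pmf[P // L], 4))
    assert abs(sum(pmf) - 1.0) < 1e-12 and max(pmf) == pmf[0]
    ```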

  2. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  3. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  4. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  5. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  6. The effect of chemical weapons incineration on the survival rates of Red-tailed Tropicbirds

    Science.gov (United States)

    Schreiber, E.A.; Schenk, G.A.; Doherty, P.F.

    2001-01-01

    In 1992, the Johnston Atoll Chemical Agent Disposal System (JACADS) began incinerating U.S. chemical weapons stockpiles on Johnston Atoll (Pacific Ocean), where about 500,000 seabirds breed, including Red-tailed Tropicbirds (Phaethon rubricauda). We hypothesized that survival rates were lower in birds nesting downwind of the incinerator smokestack compared to those upwind, and that birds might move away from the area. From 1992-2000 we monitored survival and movements between areas upwind and downwind of the JACADS facility. We used a multi-strata mark-recapture approach to model survival, probability of recapture, and movement. Probability of recapture was significantly higher for birds in downwind areas (owing to greater recapture effort) and thus was an important 'nuisance' parameter to take into account in modeling. We found no difference in survival between birds nesting upwind (0.8588) and downwind (0.8550). There was no consistent difference in movement rates between upwind and downwind areas from year to year: the differences found may be attributed to differing vegetation growth and human activities between the areas. Our results suggest that JACADS has had no documentable influence on the survival and year-to-year movement of Red-tailed Tropicbirds.

  7. rft1d: Smooth One-Dimensional Random Field Upcrossing Probabilities in Python

    Directory of Open Access Journals (Sweden)

    Todd C. Pataky

    2016-07-01

    Full Text Available Through topological expectations regarding smooth, thresholded n-dimensional Gaussian continua, random field theory (RFT) describes probabilities associated with both the field-wide maximum and threshold-surviving upcrossing geometry. A key application of RFT is a correction for multiple comparisons which affords field-level hypothesis testing for both univariate and multivariate fields. For unbroken isotropic fields just one parameter in addition to the mean and variance is required: the ratio of a field's size to its smoothness. Ironically, the simplest manifestation of RFT (1D unbroken fields) has rarely surfaced in the literature, even during its foundational development in the late 1970s. This Python package implements 1D RFT primarily for exploring and validating RFT expectations, but also describes how it can be applied to yield statistical inferences regarding sets of experimental 1D fields.
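
    A minimal usage sketch; the function names and signatures follow my reading of the rft1d documentation and should be treated as assumptions.

    ```python
    import rft1d

    Q, FWHM, u = 101, 10.0, 3.0
    # RFT probability that the maximum of a smooth 1D Gaussian field
    # (Q nodes, smoothness FWHM) exceeds the threshold u.
    print("P(max z > u) ≈", rft1d.norm.sf(u, Q, FWHM))

    # Cross-check by simulation: generate smooth Gaussian fields and count
    # threshold-surviving maxima.
    fields = rft1d.randn1d(5000, Q, FWHM)    # array of shape (5000, Q)
    print("simulated:", (fields.max(axis=1) > u).mean())
    ```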

  8. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    In probability theory, statistics, statistical mechanics, communication theory, and other fields of science, the calculation of Rényi and Tsallis entropies [1–3] for a probability density function ρ(x) involves the integral $\int_a^b [\rho(x)]^q\,dx$, where q ≥ 0 is a parameter. The aim of this paper is to present a procedure for the discretization of ...

  9. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  10. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  11. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own right which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but one that is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  12. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  13. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  14. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  15. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  16. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  17. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
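
    For the reader's convenience, the entropy in question in standard large-deviations notation (general theory, not a result specific to this work):

    ```latex
    \[
      \Lambda(\lambda) = \log \mathbb{E}\, e^{\langle \lambda,\, X \rangle},
      \qquad
      I(x) = \Lambda^{*}(x)
           = \sup_{\lambda}\bigl\{ \langle \lambda, x \rangle - \Lambda(\lambda) \bigr\},
    \]
    % i.e. I is the Legendre-Fenchel transform of the logarithmic moment
    % generating function of the probability measure.
    ```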

  18. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating the reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation; the second estimates parameters for a model of reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented.

  19. Joint modelling of longitudinal CEA tumour marker progression and survival data on breast cancer

    Science.gov (United States)

    Borges, Ana; Sousa, Inês; Castro, Luis

    2017-06-01

    This work proposes the use of biostatistics methods to study breast cancer in patients of Braga's Hospital Senology Unit, located in Portugal. The primary motivation is to contribute to the understanding of the progression of breast cancer within the Portuguese population, using more complex statistical model assumptions than the traditional analysis, which take into account a possible serial correlation structure within the observations of the same subject. We aim to infer which risk factors affect the survival of Braga's Hospital patients diagnosed with breast tumour, whilst also analysing risk factors that affect a tumour marker used in the surveillance of disease progression, the carcinoembryonic antigen (CEA). As the survival and longitudinal processes may be associated, it is important to model these two processes together. Hence, a joint modelling of these two processes was conducted to infer on their association. A data set of 540 patients, along with 50 variables, was collected from medical records of the Hospital. A joint model approach was used to analyse these data. Two different joint models, with different parameterizations giving different interpretations to the model parameters, were applied to the same data set; these were used by convenience, as the ones implemented in the R software. Results from the two models were compared. Results from the joint models showed that the longitudinal CEA values were significantly associated with the survival probability of these patients. A comparison between parameter estimates obtained in this analysis and previous independent survival [4] and longitudinal analyses [5][6] leads us to conclude that independent analysis yields biased parameter estimates. Hence, an assumption of association between the two processes in a joint model of breast cancer data is necessary. Results indicate that the longitudinal progression of CEA is significantly associated with the probability of survival of these patients. Hence, an assumption of

  20. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of the study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies (a problem of frequentist probability theory); and using the hypothetical chance mechanism of the phenomenon studied to deduce rules for adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, the hypothesis tested, the importance of the power of the test used, and practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: an example of an isolated problem, an example of connected problems treated routinely, empirical Bayes theory, and point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, and construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  1. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  2. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  3. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...

  4. Tobit regression for modeling mean survival time using data subject to multiple sources of censoring.

    Science.gov (United States)

    Gong, Qi; Schaubel, Douglas E

    2018-01-22

    Mean survival time is often of inherent interest in medical and epidemiologic studies. In the presence of censoring and when covariate effects are of interest, Cox regression is the strong default, but mostly due to convenience and familiarity. When survival times are uncensored, covariate effects can be estimated as differences in mean survival through linear regression. Tobit regression can validly be performed through maximum likelihood when the censoring times are fixed (i.e., known for each subject, even in cases where the outcome is observed). However, Tobit regression is generally inapplicable when the response is subject to random right censoring. We propose Tobit regression methods based on weighted maximum likelihood which are applicable to survival times subject to both fixed and random censoring times. Under the proposed approach, known right censoring is handled naturally through the Tobit model, with inverse probability of censoring weighting used to overcome random censoring. Essentially, the re-weighted data are intended to represent those that would have been observed in the absence of random censoring. We develop methods for estimating the Tobit regression parameter, then the population mean survival time. A closed-form large-sample variance estimator is proposed for the regression parameter estimator, with a semiparametric bootstrap standard error estimator derived for the population mean. The proposed methods are easily implementable using standard software. Finite-sample properties are assessed through simulation. The methods are applied to a large cohort of patients wait-listed for kidney transplantation. Copyright © 2018 John Wiley & Sons, Ltd.
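
    A minimal sketch of Tobit estimation by weighted maximum likelihood in the spirit described above: a right-censored normal log-likelihood with per-subject weights (unit weights below; inverse-probability-of-censoring weights would be plugged in for randomly censored data). This is illustrative, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def neg_loglik(theta, X, y, delta, c, w):
        """theta = (beta..., log sigma); delta=1 observed, 0 censored at c."""
        beta, sigma = theta[:-1], np.exp(theta[-1])
        mu = X @ beta
        ll_obs = norm.logpdf(y, loc=mu, scale=sigma)   # uncensored subjects
        ll_cen = norm.logsf(c, loc=mu, scale=sigma)    # y known only to exceed c
        return -np.sum(w * np.where(delta == 1, ll_obs, ll_cen))

    rng = np.random.default_rng(7)
    n = 500
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y_true = X @ np.array([5.0, 1.5]) + rng.normal(scale=1.0, size=n)
    c = np.full(n, 6.5)                                # fixed censoring times
    delta = (y_true <= c).astype(int)
    y = np.minimum(y_true, c)
    w = np.ones(n)                                     # IPC weights would go here

    res = minimize(neg_loglik, x0=np.zeros(3), args=(X, y, delta, c, w))
    print("beta:", res.x[:2], " sigma:", np.exp(res.x[2]))
    ```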

  5. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  6. Social class and survival on the S.S. Titanic.

    Science.gov (United States)

    Hall, W

    1986-01-01

    Passengers' chances of surviving the sinking of the S.S. Titanic were related to their sex and their social class: females were more likely to survive than males, and the chances of survival declined with social class as measured by the class in which the passenger travelled. The probable reasons for these differences in rates of survival are discussed as are the reasons accepted by the Mersey Committee of Inquiry into the sinking.

  7. On estimating the fracture probability of nuclear graphite components

    Science.gov (United States)

    Srinivasan, Makuteswara

    2008-10-01

    The properties of nuclear grade graphites exhibit anisotropy and can vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations of these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
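
    A minimal sketch of the underlying two-parameter Weibull survival computation; the characteristic strength and Weibull modulus below are illustrative, not specification values for any nuclear graphite grade.

    ```python
    # For a service tensile stress s, characteristic strength s0 and Weibull
    # modulus m:  P_survival(s) = exp[-(s/s0)^m],  P_fracture = 1 - P_survival.
    import numpy as np

    def weibull_survival(stress, s0=30.0, m=8.0):
        """Survival probability of a component at a given tensile stress (MPa)."""
        return np.exp(-(np.asarray(stress) / s0) ** m)

    for s in (10.0, 20.0, 25.0, 30.0):
        ps = weibull_survival(s)
        print(f"stress {s:5.1f} MPa: survival {ps:.4f}, fracture {1 - ps:.4f}")
    ```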

  8. Survival function of hypo-exponential distributions

    OpenAIRE

    Lotfy, Mamdouh M.; Abdelsamad, Ali S.

    1985-01-01

    Approved for public release; distribution is unlimited. The reliability of a system is the probability that the system will survive or complete an intended mission of a certain duration. Describing all possible ways that a system can survive a mission in reliability shorthand gives a simple approach to reliability computations. Reliability computation for a system defined by shorthand notation is greatly dependent upon the convolution problem. Assuming constant component failure rates, this p...
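
    A minimal sketch of the hypo-exponential survival function for distinct stage rates, via the standard partial-fraction expansion (the thesis's notation and method may differ):

    ```python
    # S(t) = sum_i [ prod_{j != i} r_j / (r_j - r_i) ] * exp(-r_i * t)
    import math

    def hypoexp_survival(t: float, rates: list[float]) -> float:
        """P(stage1 + ... + stageK > t) for distinct exponential rates."""
        s = 0.0
        for i, ri in enumerate(rates):
            coeff = 1.0
            for j, rj in enumerate(rates):
                if j != i:
                    coeff *= rj / (rj - ri)
            s += coeff * math.exp(-ri * t)
        return s

    print(hypoexp_survival(2.0, [1.0, 2.0, 3.0]))    # 3-stage mission
    # Sanity check: with one stage this reduces to a plain exponential survival.
    assert abs(hypoexp_survival(1.0, [0.5]) - math.exp(-0.5)) < 1e-12
    ```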

  9. Assessing magnitude probability distribution through physics-based rupture scenarios

    Science.gov (United States)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what is the probability that an earthquake ruptures several neighboring segments simultaneously. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that the variability of the input parameters necessary for the dynamic rupture modeling is exhaustively explored. Given the random nature of some parameters (e.g. hypocenter location) and the limitations of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for: fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break at 15%, while single-segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.

  10. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure.

  11. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  12. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  13. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.
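
    A minimal sketch of the kind of estimation described above: recovering a first-order rate constant from noisy concentration data by nonlinear least squares; the reaction model and data are illustrative.

    ```python
    # Fit k in the first-order model dC/dt = -k*C, i.e. C(t) = C0 * exp(-k*t).
    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, c0, k):
        return c0 * np.exp(-k * t)

    rng = np.random.default_rng(11)
    t = np.linspace(0.0, 10.0, 25)
    c_obs = first_order(t, 1.0, 0.35) + rng.normal(scale=0.02, size=t.size)

    (c0_hat, k_hat), cov = curve_fit(first_order, t, c_obs, p0=[1.0, 0.1])
    print(f"C0 = {c0_hat:.3f}, k = {k_hat:.3f} (1/s), "
          f"std(k) = {np.sqrt(cov[1, 1]):.3f}")
    ```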

  14. On the Probability Density Functions of Forster-Greer-Thorbecke ...

    African Journals Online (AJOL)

    This study considers the possibility of using the Pearson system of distributions to approximate the probability density functions of Forster-Greer-Thorbecke (FGT) poverty indices. The application of the Pearson system reveals the potential of normal and four-parameter distributions in poverty analysis. Keywords: Distributional ...

  15. Applications of the Dirichlet distribution to forensic match probabilities.

    Science.gov (United States)

    Lange, K

    1995-01-01

    The Dirichlet distribution provides a convenient conjugate prior for Bayesian analyses involving multinomial proportions. In particular, allele frequency estimation can be carried out with a Dirichlet prior. If data from several distinct populations are available, then the parameters characterizing the Dirichlet prior can be estimated by maximum likelihood and then used for allele frequency estimation in each of the separate populations. This empirical Bayes procedure tends to moderate extreme multinomial estimates based on sample proportions. The Dirichlet distribution can also be employed to model the contributions from different ancestral populations in computing forensic match probabilities. If the ancestral populations are in genetic equilibrium, then the product rule for computing match probabilities is valid conditional on the ancestral contributions to a typical person of the reference population. This fact facilitates computation of match probabilities and tight upper bounds to match probabilities.
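
    A minimal sketch of the empirical-Bayes moderation described above; the prior pseudo-counts are illustrative, whereas the paper estimates the Dirichlet parameters by maximum likelihood across populations.

    ```python
    # With a Dirichlet(alpha) prior on allele frequencies, the posterior mean
    # for a population with allele counts x is (x_i + alpha_i) / (n + sum(alpha)),
    # which pulls extreme sample proportions toward the prior.
    import numpy as np

    alpha = np.array([4.0, 2.0, 2.0])     # assumed prior pseudo-counts
    x = np.array([0, 3, 17])              # allele counts in a small sample
    n = x.sum()

    raw = x / n                           # raw multinomial estimate
    posterior_mean = (x + alpha) / (n + alpha.sum())
    print("raw:", raw.round(3), "moderated:", posterior_mean.round(3))
    # The zero-count allele no longer gets probability 0, so downstream match
    # probabilities (products of allele frequencies) avoid degenerate zeros.
    ```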

  16. Parameter estimation through ignorance.

    Science.gov (United States)

    Du, Hailiang; Smith, Leonard A

    2012-07-01

    Dynamical modeling lies at the heart of our understanding of physical systems. Its role in science is deeper than mere operational forecasting, in that it allows us to evaluate the adequacy of the mathematical structure of our models. Despite the importance of model parameters, there is no general method of parameter estimation outside linear systems. A relatively simple method of parameter estimation for nonlinear systems is introduced, based on variations in the accuracy of probability forecasts. It is illustrated on the logistic map, the Henon map, and the 12-dimensional Lorenz96 flow, and its ability to outperform linear least squares in these systems is explored at various noise levels and sampling rates. As expected, it is more effective when the forecast error distributions are non-Gaussian. The method selects parameter values by minimizing a proper, local skill score for continuous probability forecasts as a function of the parameter values. This approach is easier to implement in practice than alternative nonlinear methods based on the geometry of attractors or the ability of the model to shadow the observations. Direct measures of inadequacy in the model, the "implied ignorance," and the information deficit are introduced.
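
    A minimal sketch of the method on the logistic map: score each candidate parameter by the mean ignorance (negative log forecast density) of ensemble one-step forecasts and take the minimizer. Ensemble size, noise level, and the Gaussian-kernel forecast density are illustrative choices, not the paper's exact settings.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    a_true, n, noise = 3.7, 200, 0.01

    # Simulate the logistic map x_{t+1} = a*x_t*(1-x_t) and add observation noise.
    x = np.empty(n)
    x[0] = 0.3
    for t in range(n - 1):
        x[t + 1] = a_true * x[t] * (1 - x[t])
    obs = x + rng.normal(scale=noise, size=n)

    def ignorance(a, obs, members=64, kernel=0.02, seed=0):
        """Mean -log p(next obs) from ensemble one-step forecasts under a."""
        local = np.random.default_rng(seed)    # same perturbations for every a
        scores = []
        for t in range(len(obs) - 1):
            seeds = obs[t] + local.normal(scale=noise, size=members)
            ens = a * seeds * (1 - seeds)      # ensemble one-step forecasts
            p = norm.pdf(obs[t + 1], loc=ens, scale=kernel).mean()
            scores.append(-np.log(max(p, 1e-300)))
        return float(np.mean(scores))

    grid = np.linspace(3.5, 3.9, 41)
    a_hat = min(grid, key=lambda a: ignorance(a, obs))
    print("estimated a =", round(a_hat, 3), "(true 3.7)")
    ```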

  17. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  18. A practical overview on probability distributions

    OpenAIRE

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-01-01

    The aim of this paper is a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict; this link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also define as variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a bino...

  19. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  20. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence probability are

  1. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  2. Response-probability volume histograms and iso-probability of response charts in treatment plan evaluation.

    Science.gov (United States)

    Mavroidis, Panayiotis; Ferreira, Brigida Costa; Lopes, Maria do Carmo

    2011-05-01

    This study aims at demonstrating a new method for treatment plan evaluation and comparison based on the radiobiological response of individual voxels. This is performed by applying them to three different cancer types and treatment plans of different conformalities. Furthermore, their usefulness is examined in conjunction with traditionally applied radiobiological and dosimetric treatment plan evaluation criteria. Three different cancer types (head and neck, breast and prostate) were selected to quantify the benefits of the proposed treatment plan evaluation method. In each case, conventional conformal radiotherapy (CRT) and intensity modulated radiotherapy (IMRT) treatment configurations were planned. Iso-probability of response charts were produced by calculating the response probability in every voxel using the linear-quadratic-Poisson model and the dose-response parameters of the corresponding structure to which this voxel belongs. The overall probabilities of target and normal tissue responses were calculated using the Poisson and the relative seriality models, respectively. The 3D dose distribution converted to 2 Gy fractionation (D_2Gy) and iso-BED distributions are also shown and compared with the proposed methodology. Response-probability volume histograms (RVH) were derived and compared with common dose volume histograms (DVH). The different dose distributions were also compared using the complication-free tumor control probability, P+, the biologically effective uniform dose, D, and common dosimetric criteria. 3D iso-probability of response distributions are very useful for plan evaluation since their visual information focuses on the doses that are likely to have a larger clinical effect in that particular organ. The graphical display becomes independent of the prescription dose, highlighting the local radiation therapy effect in each voxel without the loss of important spatial information. For example, due to the exponential nature of the Poisson
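    A minimal sketch of the voxel-wise calculation is given below, using one common parameterization of the Poisson dose-response model with parameters D50 and γ; the parameter values are assumed, and the exact functional form used by the authors may differ.

    ```python
    import numpy as np

    def poisson_response(D, D50, gamma):
        """One common form of the Poisson dose-response model:
        P(D) = exp(-exp(e*gamma - (D/D50) * (e*gamma - ln(ln 2)))),
        where D50 is the dose giving 50% response and gamma the maximum
        normalized dose-response gradient. Parameter values are assumed."""
        e = np.e
        return np.exp(-np.exp(e * gamma - (D / D50) * (e * gamma - np.log(np.log(2)))))

    # Hypothetical voxel doses (Gy) and tumour parameters:
    voxel_dose = np.array([60.0, 64.0, 68.0, 72.0])
    p_voxel = poisson_response(voxel_dose, D50=50.0, gamma=2.0)
    print(p_voxel)                           # per-voxel response probabilities
    # Overall target response under the Poisson model: product over voxels
    # weighted by relative voxel volume (equal volumes assumed here).
    print(np.prod(p_voxel ** (1 / len(p_voxel))))
    ```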

  3. Foreign Ownership and long-term Survival

    OpenAIRE

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    Does foreign ownership enhance or decrease a firm’s chances of survival? Over the 100 year period 1895-2001 this paper compares the survival of foreign subsidiaries in Denmark to a control sample matched by industry and firm size. We find that foreign-owned companies have higher survival probability. On average exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for alm...

  4. Dependence in Probability and Statistics

    CERN Document Server

    Doukhan, Paul; Surgailis, Donatas; Teyssiere, Gilles

    2010-01-01

    This volume collects recent works on weakly dependent, long-memory and multifractal processes and introduces new dependence measures for studying complex stochastic systems. Other topics include the statistical theory for bootstrap and permutation statistics for infinite variance processes, the dependence structure of max-stable processes, and the statistical properties of spectral estimators of the long memory parameter. The asymptotic behavior of Fejer graph integrals and their use for proving central limit theorems for tapered estimators are investigated. New multifractal processes are intr

  5. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples.

    Science.gov (United States)

    Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, for ensuring unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method-specific tools available to properly account for sampling design, too often in the analysis of ecological data, sample design is ignored and consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools--linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
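    A minimal sketch of the IPB idea on a toy unequal-probability sample: units are resampled with replacement with weights proportional to the inverse of their inclusion probabilities, and the model-based estimate (here simply a mean) is averaged over resamples. The data and inclusion probabilities are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def ipb_resample(data, incl_prob, rng):
        """Inverse probability bootstrap: draw an equal-probability resample
        (with replacement) from an unequal-probability sample by weighting
        each unit proportionally to 1 / inclusion probability."""
        w = 1.0 / incl_prob
        p = w / w.sum()
        idx = rng.choice(len(data), size=len(data), replace=True, p=p)
        return data[idx]

    # Toy unequal-probability sample (values and probabilities hypothetical):
    y = np.array([2.1, 2.3, 5.0, 5.2, 4.9, 2.2])
    pi = np.array([0.8, 0.8, 0.2, 0.2, 0.2, 0.8])   # inclusion probabilities

    est = np.mean([ipb_resample(y, pi, rng).mean() for _ in range(2000)])
    print("naive mean:", y.mean(), " IPB-debiased mean:", est)
    ```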

  6. Tumor expression, plasma levels and genetic polymorphisms of the coagulation inhibitor TFPI are associated with clinicopathological parameters and survival in breast cancer, in contrast to the coagulation initiator TF.

    Science.gov (United States)

    Tinholt, Mari; Vollan, Hans Kristian Moen; Sahlberg, Kristine Kleivi; Jernström, Sandra; Kaveh, Fatemeh; Lingjærde, Ole Christian; Kåresen, Rolf; Sauer, Torill; Kristensen, Vessela; Børresen-Dale, Anne-Lise; Sandset, Per Morten; Iversen, Nina

    2015-03-26

    Hypercoagulability in malignancy increases the risk of thrombosis, but is also involved in cancer progression. Experimental studies suggest that tissue factor (TF) and tissue factor pathway inhibitor (TFPI) are involved in cancer biology as a tumor promoter and suppressor, respectively, but the clinical significance is less clear. Here, we aimed to investigate the clinical relevance of TF and TFPI genetic and phenotypic diversity in breast cancer. The relationship between tumor messenger RNA (mRNA) expression and plasma levels of TF and TFPI (α and β), tagging single nucleotide polymorphisms (tagSNPs) in F3 (TF) (n=6) and TFPI (n=18), and clinicopathological characteristics and molecular tumor subtypes were explored in 152 treatment-naive breast cancer patients. The effect of tumor expressed TF and TFPIα and TFPIβ on survival was investigated in a merged breast cancer dataset of 1881 patients. Progesterone receptor negative patients had higher mRNA expression of total TFPI (α+β) (P=0.021) and TFPIβ (P=0.014) in tumors. TF mRNA expression was decreased in grade 3 tumors (P=0.003). In plasma, total TFPI levels were decreased in patients with larger tumors (P=0.013). SNP haplotypes of TFPI, but not TF, were associated with specific clinicopathological characteristics like tumor size (odds ratio (OR) 3.14, P=0.004), triple negativity (OR 2.4, P=0.004), lymph node spread (OR 3.34, P=0.006), and basal-like (OR 2.3, P=0.011) and luminal B (OR 3.5, P=0.005) molecular tumor subtypes. Increased expression levels of TFPIα and TFPIβ in breast tumors were associated with better outcome in all tumor subtypes combined (P=0.007 and P=0.005) and in multiple subgroups, including lymph node positive subjects (P=0.006 and P=0.034). This study indicates that genetic and phenotypic variation of both TFPIα and TFPIβ, more than TF, are markers of cancer progression. Together with the previously demonstrated tumor suppressor effects of TFPI, the beneficial effect of tumor

  7. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  8. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  9. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...

  10. Probability output modeling for support vector machines

    Science.gov (United States)

    Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper we propose an approach to model the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. The posterior probability output of multi-class SVMs is modeled by directly solving the equations that are based on the combination of the probability outputs of binary classifiers using Bayes's rule. The differences and the different weights among the two-class SVM classifiers, based on their posterior probabilities, are taken into account when combining the probability outputs of these two-class SVM classifiers. The comparative experiment results show that our method achieves better classification precision and a better distribution of the posterior probability than the pairwise coupling method and Hastie's optimization method.
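    The binary building block, sigmoid (Platt-style) calibration of SVM decision values, can be sketched as follows; the paper's specific multi-class combination scheme is not reproduced here, and the dataset is synthetic.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression
    from sklearn.datasets import make_classification

    # Train a binary SVM, then fit p(y=1|f) = 1 / (1 + exp(A*f + B)) on
    # held-out decision values f, which is exactly a 1-D logistic regression.
    X, y = make_classification(n_samples=400, n_features=5, random_state=0)
    svm = SVC(kernel="rbf", gamma="scale").fit(X[:300], y[:300])

    f_cal = svm.decision_function(X[300:]).reshape(-1, 1)
    platt = LogisticRegression().fit(f_cal, y[300:])

    f_new = svm.decision_function(X[:5]).reshape(-1, 1)
    print(platt.predict_proba(f_new)[:, 1])   # calibrated posterior probabilities
    ```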

  11. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  12. Bayesian probability theory applications in the physical sciences

    CERN Document Server

    Linden, Wolfgang von der; Toussaint, Udo von

    2014-01-01

    From the basics to the forefront of modern research, this book presents all aspects of probability theory, statistics and data analysis from a Bayesian perspective for physicists and engineers. The book presents the roots, applications and numerical implementation of probability theory, and covers advanced topics such as maximum entropy distributions, stochastic processes, parameter estimation, model selection, hypothesis testing and experimental design. In addition, it explores state-of-the art numerical techniques required to solve demanding real-world problems. The book is ideal for students and researchers in physical sciences and engineering.

  13. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time. ... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay
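    Under the exponential assumption suggested in the record, the expected knock-on delay between two trains separated by a buffer time b has a closed form, E[max(X - b, 0)] = exp(-λb)/λ, with λ estimated from observed delay data. A minimal sketch with made-up delay values:

    ```python
    import numpy as np

    # Non-negative train delays assumed exponentially distributed; the rate
    # is estimated from (hypothetical) observed delay data at a station.
    observed_delays = np.array([0.5, 2.0, 1.2, 0.1, 3.4, 0.8])  # minutes
    lam = 1.0 / observed_delays.mean()          # MLE for the exponential rate

    def expected_transferred_delay(buffer_minutes: float) -> float:
        # E[max(X - b, 0)] for X ~ Exp(lam)
        return np.exp(-lam * buffer_minutes) / lam

    for b in (0.0, 1.0, 2.0, 5.0):
        print(f"buffer {b:4.1f} min -> expected knock-on delay "
              f"{expected_transferred_delay(b):.2f} min")
    ```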

  14. CRC standard probability and statistics tables and formulae Student ed.

    CERN Document Server

    Kokoska, Stephen

    2000-01-01

    Users of statistics in their professional lives and statistics students will welcome this concise, easy-to-use reference for basic statistics and probability. It contains all of the standardized statistical tables and formulas typically needed plus material on basic statistics topics, such as probability theory and distributions, regression, analysis of variance, nonparametric statistics, and statistical quality control. For each type of distribution the authors supply: definitions; tables; relationships with other distributions, including limiting forms; statistical parameters, such as variance a

  15. Probability plots based on Student's t-distribution.

    Science.gov (United States)

    Hooft, Rob W W; Straver, Leo H; Spek, Anthony L

    2009-07-01

    The validity of the normal distribution as an error model is commonly tested with a (half) normal probability plot. Real data often contain outliers. The use of t-distributions in a probability plot to model such data more realistically is described. It is shown how a suitable value of the parameter nu of the t-distribution can be determined from the data. The results suggest that even data that seem to be modeled well using a normal distribution can be better modeled using a t-distribution.
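    A minimal sketch of the idea with SciPy: ν is chosen by maximizing the t log-likelihood on robustly standardized data, and the probability plot is then drawn against the fitted t-distribution rather than the normal. The selection grid and robust scale estimate are choices of this illustration, not necessarily the authors' procedure.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Simulated data with outliers: mostly normal errors plus a heavy tail.
    data = np.concatenate([rng.normal(0, 1, 190), rng.standard_cauchy(10)])

    # Choose nu by maximum likelihood over a grid, with robust loc/scale.
    nus = np.arange(1, 31)
    loglik = [stats.t.logpdf(data, df=nu,
                             loc=np.median(data),
                             scale=stats.iqr(data) / 1.349).sum() for nu in nus]
    nu_hat = nus[int(np.argmax(loglik))]
    print("selected nu:", nu_hat)

    # Probability plot against the fitted t-distribution instead of the normal:
    (osm, osr), (slope, intercept, r) = stats.probplot(data, dist=stats.t(nu_hat))
    print("correlation of the t probability plot:", r)
    ```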

  16. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  17. Probability of flooding: An uncertainty analysis

    NARCIS (Netherlands)

    Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.

    1998-01-01

    In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could

  18. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  19. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  20. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (Mass Media Services, General Procedures), Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...

  1. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  2. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  3. Estimating demographic parameters using a combination of known-fate and open N-mixture models

    Science.gov (United States)

    Schmidt, Joshua H.; Johnson, Devin S.; Lindberg, Mark S.; Adams, Layne G.

    2015-01-01

    Accurate estimates of demographic parameters are required to infer appropriate ecological relationships and inform management actions. Known-fate data from marked individuals are commonly used to estimate survival rates, whereas N-mixture models use count data from unmarked individuals to estimate multiple demographic parameters. However, a joint approach combining the strengths of both analytical tools has not been developed. Here we develop an integrated model combining known-fate and open N-mixture models, allowing the estimation of detection probability, recruitment, and the joint estimation of survival. We demonstrate our approach through both simulations and an applied example using four years of known-fate and pack count data for wolves (Canis lupus). Simulation results indicated that the integrated model reliably recovered parameters with no evidence of bias, and survival estimates were more precise under the joint model. Results from the applied example indicated that the marked sample of wolves was biased toward individuals with higher apparent survival rates than the unmarked pack mates, suggesting that joint estimates may be more representative of the overall population. Our integrated model is a practical approach for reducing bias while increasing precision and the amount of information gained from mark–resight data sets. We provide implementations in both the BUGS language and an R package.

  4. Inventory parameters

    CERN Document Server

    Sharma, Sanjay

    2017-01-01

    This book provides a detailed overview of various parameters/factors involved in inventory analysis. It especially focuses on the assessment and modeling of basic inventory parameters, namely demand, procurement cost, cycle time, ordering cost, inventory carrying cost, inventory stock, stock out level, and stock out cost. In the context of economic lot size, it provides equations related to the optimum values. It also discusses why the optimum lot size and optimum total relevant cost are considered to be key decision variables, and uses numerous examples to explain each of these inventory parameters separately. Lastly, it provides detailed information on parameter estimation for different sectors/products. Written in a simple and lucid style, it offers a valuable resource for a broad readership, especially Master of Business Administration (MBA) students.
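    In the economic lot size context the record mentions, the classical optimum follows from balancing ordering and carrying costs; a minimal sketch with hypothetical parameter values:

    ```python
    from math import sqrt

    def eoq(annual_demand: float, ordering_cost: float, holding_cost: float) -> float:
        """Classical economic order quantity, Q* = sqrt(2 * D * S / H)."""
        return sqrt(2.0 * annual_demand * ordering_cost / holding_cost)

    # Hypothetical parameter values:
    D, S, H = 12_000, 50.0, 2.4        # units/yr, cost/order, cost/unit/yr
    q_star = eoq(D, S, H)
    total_relevant_cost = D / q_star * S + q_star / 2 * H   # ordering + carrying
    print(f"optimum lot size: {q_star:.0f} units, "
          f"optimum total relevant cost: {total_relevant_cost:.2f}")
    ```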

  5. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
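    For small systems, a pairwise maximum-entropy model of the kind discussed above can be evaluated exactly by enumerating states; the field and coupling values in this sketch are random placeholders, not fitted to any data.

    ```python
    import itertools
    import numpy as np

    # Pairwise maximum-entropy (Ising-type) model over n binary units:
    # P(s) = exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) / Z, computed
    # exactly by enumeration for small n.
    n = 4
    rng = np.random.default_rng(3)
    h = rng.normal(0, 0.5, n)                   # biases ("fields")
    J = np.triu(rng.normal(0, 0.3, (n, n)), 1)  # pairwise couplings, i < j

    states = np.array(list(itertools.product([0, 1], repeat=n)))
    energies = states @ h + np.einsum("ki,ij,kj->k", states, J, states)
    p = np.exp(energies)
    p /= p.sum()                                # normalize (partition function)

    # The model is fully determined by means and pairwise correlations:
    print("state probabilities:", np.round(p, 4))
    print("means:", states.T @ p)
    ```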

  6. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  7. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  8. Approximating the Probability of Mortality Due to Protracted Radiation Exposures

    Science.gov (United States)

    2016-06-01

    Radiological weapons (“dirty bombs”) will in most cases disperse radionuclides whose half-life is long enough that... Under the current Nuclear Survivability and Forensics contract, HDTRA1-14-D-0003; 0005, Dr. Paul Blake of DTRA/NTPR has supported the transition of... present approximate methods for estimating the probability of mortality due to radiological environments from nuclear weapon detonations or from a

  9. Adolescents' misinterpretation of health risk probability expressions.

    Science.gov (United States)

    Cohn, L D; Schydlower, M; Foley, J; Copeland, R L

    1995-05-01

    Objective: To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Design: Cross-sectional. Setting: Adolescent medicine and pediatric orthopedic outpatient units. Participants: 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Main outcome measures: Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Results: Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Conclusions: Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).

  10. A practical overview on probability distributions.

    Science.gov (United States)

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-03-01

    The aim of this paper is a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict; this link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also define as variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Distributions of probability are briefly described together with some examples for their possible application.

  11. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
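    Step (v) of the procedure reduces to a simple per-pixel combination rule, sketched below on hypothetical probability values:

    ```python
    import numpy as np

    # integrated probability = max(release prob., impact prob. * zonal prob.)
    p_release = np.array([0.02, 0.10, 0.00, 0.05])  # pixel-based release prob.
    p_impact = np.array([0.30, 0.05, 0.60, 0.20])   # from angle-of-reach CDF
    p_zonal = np.array([0.40, 0.40, 0.40, 0.40])    # zone-level release prob.

    p_integrated = np.maximum(p_release, p_impact * p_zonal)
    print(p_integrated)
    ```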

  12. Can confidence indicators forecast the probability of expansion in Croatia?

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2016-04-01

    Full Text Available The aim of this paper is to investigate how reliable confidence indicators are in forecasting the probability of expansion. We consider three Croatian Business Survey indicators: the Industrial Confidence Indicator (ICI), the Construction Confidence Indicator (BCI) and the Retail Trade Confidence Indicator (RTCI). The quarterly data used in the research cover the period from 1999/Q1 to 2014/Q1. The empirical analysis consists of two parts. The non-parametric Bry-Boschan algorithm is used for distinguishing periods of expansion from periods of recession in the Croatian economy. Then, various nonlinear probit models were estimated. The models differ with respect to the regressors (confidence indicators) and the time lags. The positive signs of the estimated parameters suggest that the probability of expansion increases with an increase in the confidence indicators. Based on the obtained results, the conclusion is that ICI is the most powerful predictor of the probability of expansion in Croatia.
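    A minimal sketch of the probit specification with statsmodels, using simulated placeholder data rather than the Croatian survey series: the binary dependent variable marks expansion quarters, and the regressor is a lagged confidence indicator.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    ici = rng.normal(0, 1, 60)                       # standardized ICI, quarterly
    # Simulated expansion indicator that depends on last quarter's ICI:
    expansion = (0.8 * np.roll(ici, 1) + rng.normal(0, 1, 60) > 0).astype(int)

    X = sm.add_constant(np.roll(ici, 1)[1:])         # one-quarter lag of ICI
    model = sm.Probit(expansion[1:], X).fit(disp=0)
    print(model.params)                              # positive slope expected
    print(model.predict(X)[:4])                      # probability of expansion
    ```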

  13. Transition probabilities and radiative lifetimes of levels in F I

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Gueltekin, E-mail: gultekin@selcuk.edu.tr; Dogan, Duygu; Ates, Sule; Taser, Mehmet

    2012-07-15

    The electric dipole transition probabilities and the lifetimes of excited levels have been calculated using the weakest bound electron potential model theory (WBEPMT) and the quantum defect orbital theory (QDOT) in atomic fluorine. In the calculations, many transition arrays, including both multiplet and fine-structure transitions, are considered. We employed Numerical Coulomb Approximation (NCA) wave functions and numerical non-relativistic Hartree-Fock (NRHF) wave functions for the expectation values of radii in the determination of parameters. The necessary energy values have been taken from experimental energy data in the literature. The calculated transition probabilities and lifetimes have been compared with available theoretical and experimental results, and good agreement with the literature has been obtained. Moreover, some transition probability and lifetime values not available in the literature for some highly excited levels have been obtained using these methods.

  14. Probability distributions for the magnification of quasars due to microlensing

    Science.gov (United States)

    Wambsganss, Joachim

    1992-01-01

    Gravitational microlensing can magnify the flux of a lensed quasar considerably and therefore possibly influence quasar source counts or the observed quasar luminosity function. A large number of distributions of magnification probabilities due to gravitational microlensing for finite sources are presented, with a reasonable coverage of microlensing parameter space (i.e., surface mass density, external shear, mass spectrum of lensing objects). These probability distributions were obtained from smoothing two-dimensional magnification patterns with Gaussian source profiles. Different source sizes ranging from 10^14 cm to 5 x 10^16 cm were explored. The probability distributions show a large variety of shapes. Coefficients of fitted slopes for large magnifications are presented.

  15. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
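    The Monte Carlo core of such an approach can be sketched compactly: draw uncertain soil and hydrologic parameters from assumed distributions, evaluate the infinite-slope factor of safety, and report the fraction of draws that fail. The Landlab component implements a fuller formulation; the parameter distributions and the simplified factor-of-safety expression below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    n = 100_000
    slope = np.deg2rad(35.0)                     # slope angle
    phi = np.deg2rad(rng.normal(33, 3, n))       # friction angle (uncertain)
    c = rng.uniform(2e3, 8e3, n)                 # root + soil cohesion, Pa
    soil_depth = rng.uniform(0.5, 1.5, n)        # m
    wetness = rng.uniform(0.0, 1.0, n)           # relative saturation (recharge)
    rho_s, rho_w, g = 1800.0, 1000.0, 9.81

    # Infinite-slope factor of safety with water table fraction `wetness`:
    num = c + (rho_s - wetness * rho_w) * g * soil_depth * np.cos(slope)**2 * np.tan(phi)
    den = rho_s * g * soil_depth * np.sin(slope) * np.cos(slope)
    fs = num / den
    print("probability of initiation:", np.mean(fs < 1.0))
    ```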

  16. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  17. Trisomy 13 (Patau syndrome) with an 11-year survival.

    Science.gov (United States)

    Zoll, B; Wolf, J; Lensing-Hebben, D; Pruggmayer, M; Thorpe, B

    1993-01-01

    Trisomy 13 is very rare in live-born children. Only a small number of these children survive the first year, and very few cases are reported to live longer. Survival time depends partly on the cytogenetic findings--full trisomy 13 or trisomy 13 mosaicism--and partly on the existence of serious somatic malformations. We report on an 11-year-old girl with full trisomy 13. In this case, the absence of cerebral and cardiovascular malformations probably allowed the long survival.

  18. Probability of noise- and rate-induced tipping.

    Science.gov (United States)

    Ritchie, Paul; Sieber, Jan

    2017-05-01

    We propose an approximation for the probability of tipping when the speed of parameter change and additive white noise interact to cause tipping. Our approximation is valid for small to moderate drift speeds and helps to estimate the probability of false positives and false negatives in early-warning indicators in the case of rate- and noise-induced tipping. We illustrate our approximation on a prototypical model for rate-induced tipping with additive noise using Monte Carlo simulations. The formula can be extended to close encounters of rate-induced tipping and is otherwise applicable to other forms of tipping. We also provide an asymptotic formula for the critical ramp speed of the parameter in the absence of noise for a general class of systems undergoing rate-induced tipping.
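    A Monte Carlo sketch of the phenomenon, using a generic saddle-node normal form with a linearly ramped parameter and additive noise (not necessarily the authors' exact prototype; ramp speeds and noise level are assumed):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # dx = (a(t) - x^2) dt + sigma dW; the stable branch sits at +sqrt(a).
    # Tipping = escaping past the unstable branch as a(t) is ramped down.
    def tipping_probability(ramp_speed, sigma, n_paths=2000, dt=1e-3, T=10.0):
        steps = int(T / dt)
        a0 = 1.0                                   # start well inside stable regime
        x = np.full(n_paths, np.sqrt(a0))          # start on the stable branch
        tipped = np.zeros(n_paths, dtype=bool)
        for k in range(steps):
            a = a0 - ramp_speed * k * dt           # linearly ramped parameter
            x += (a - x**2) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
            tipped |= x < -2.0                     # escape past the unstable branch
            x[tipped] = -2.0                       # freeze tipped paths
        return tipped.mean()

    for speed in (0.05, 0.1, 0.2):
        print(speed, tipping_probability(speed, sigma=0.2))
    ```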

  19. Ubiquitous Log Odds: A Common Representation of Probability and Frequency Distortion in Perception, Action, and Cognition

    Science.gov (United States)

    Zhang, Hang; Maloney, Laurence T.

    2012-01-01

    In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision-making but also in a wide variety of cognitive, perceptual, and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, frequency estimation based on memory, signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter, and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings. PMID:22294978
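    The linear-in-log-odds distortion model described above can be written in a few lines; the slope and crossover parameters below are illustrative, not fitted values from the paper.

    ```python
    import numpy as np

    def distort(p, slope, p0):
        """Linear-in-log-odds distortion: perceived log odds are a linear
        function of true log odds, lo(w) = slope*lo(p) + (1 - slope)*lo(p0),
        so the distorted and true scales cross at p0."""
        lo = np.log(p / (1 - p))
        lo0 = np.log(p0 / (1 - p0))
        lw = slope * lo + (1 - slope) * lo0
        return 1.0 / (1.0 + np.exp(-lw))

    p = np.array([0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99])
    print(distort(p, slope=0.6, p0=0.4))   # slope < 1: overweight small p,
                                           # underweight large p
    ```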

  20. Ubiquitous log odds: a common representation of probability and frequency distortion in perception, action and cognition

    Directory of Open Access Journals (Sweden)

    Hang eZhang

    2012-01-01

    Full Text Available In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision making but also in a wide variety of cognitive, perceptual and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, frequency estimation based on memory, signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings.

  1. Total probabilities of ensemble runoff forecasts

    Science.gov (United States)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2017-04-01

    Ensemble forecasting has a long history in meteorological modelling, as an indication of the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time, while giving a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for the whole of Europe, based on observations from around 700 catchments. As the target is flood forecasting, we are also more interested in improving the forecast skill for high flows rather than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to estimate the total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters. The penalty takes
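    As one example of the post-processing step described above, a minimal EMOS-style sketch on simulated data: the predictive distribution is Gaussian with mean and variance affine in the ensemble mean and ensemble variance, and the four parameters are fitted by minimizing the negative log score. The spatial penalty described in the record is omitted.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(7)

    # Simulated training data: "observed" runoff plus a biased, dispersed ensemble.
    truth = rng.gamma(2.0, 5.0, 300)
    ens = truth[:, None] + rng.normal(1.0, 3.0, (300, 20))
    m, v = ens.mean(axis=1), ens.var(axis=1)

    def nll(theta):
        a, b, c, d = theta
        sd = np.sqrt(np.maximum(c + d * v, 1e-6))
        return -norm.logpdf(truth, loc=a + b * m, scale=sd).sum()

    res = minimize(nll, x0=[0.0, 1.0, 1.0, 0.1], method="Nelder-Mead")
    a, b, c, d = res.x
    print("post-processed mean of first forecast:", a + b * m[0])
    print("post-processed std. dev.:", np.sqrt(c + d * v[0]))
    ```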

  2. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  3. Probability Theory as Logic: Data Assimilation for Multiple Source Reconstruction

    Science.gov (United States)

    Yee, Eugene

    2012-03-01

    Probability theory as logic (or Bayesian probability theory) is a rational inferential methodology that provides a natural and logically consistent framework for source reconstruction. This methodology fully utilizes the information provided by a limited number of noisy concentration data obtained from a network of sensors and combines it in a consistent manner with the available prior knowledge (mathematical representation of relevant physical laws), hence providing a rigorous basis for the assimilation of this data into models of atmospheric dispersion for the purpose of contaminant source reconstruction. This paper addresses the application of this framework to the reconstruction of contaminant source distributions consisting of an unknown number of localized sources, using concentration measurements obtained from a sensor array. To this purpose, Bayesian probability theory is used to formulate the full joint posterior probability density function for the parameters of the unknown source distribution. A simulated annealing algorithm, applied in conjunction with a reversible-jump Markov chain Monte Carlo technique, is used to draw random samples of source distribution models from the posterior probability density function. The methodology is validated against a real (full-scale) atmospheric dispersion experiment involving a multiple point source release.

  4. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
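    For reference, the Kaplan-Meier formulation mentioned above can be written as a short product-limit computation; the follow-up times below are toy values.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Minimal product-limit (Kaplan-Meier) estimator for right-censored
        data: S(t) is multiplied by (1 - d_i / n_i) at each observed event,
        where n_i is the number still at risk."""
        order = np.argsort(times)
        times = np.asarray(times, dtype=float)[order]
        events = np.asarray(events, dtype=bool)[order]
        n_at_risk = len(times)
        s = 1.0
        curve = []
        for t, is_event in zip(times, events):
            if is_event:
                s *= 1.0 - 1.0 / n_at_risk
            n_at_risk -= 1
            curve.append((t, s))
        return curve

    # Toy data: follow-up times, with 1 = event observed, 0 = right-censored.
    print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))
    ```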

  5. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  6. Experience matters: information acquisition optimizes probability gain.

    Science.gov (United States)

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.

  7. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  8. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logical organization and clear writing; a comprehensive choice of topics within the area of probability; and ample homework problems organized into chapter sections.

  9. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
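
    As a concrete illustration of the maximum entropy assignment mentioned above, the sketch below finds the discrete distribution of maximum entropy subject to a fixed mean. The outcome values and constraint are hypothetical, and the exponential-family form used is the standard consequence of the maxent variational problem, not something taken from this record.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def maxent_given_mean(values, target_mean):
        """Maximum-entropy pmf on fixed outcomes with a prescribed mean.

        The maxent solution has the exponential-family form p_i ~ exp(lam * v_i);
        we solve for the Lagrange multiplier lam that reproduces the mean.
        """
        values = np.asarray(values, dtype=float)

        def mean_gap(lam):
            w = np.exp(lam * (values - values.max()))   # shift for numerical stability
            p = w / w.sum()
            return p @ values - target_mean

        lam = brentq(mean_gap, -50.0, 50.0)             # mean_gap is monotone in lam
        w = np.exp(lam * (values - values.max()))
        return w / w.sum()

    # Hypothetical die-like example: outcomes 1..6 constrained to mean 4.5.
    p = maxent_given_mean(np.arange(1, 7), 4.5)
    print(np.round(p, 4), "mean =", round(p @ np.arange(1, 7), 3))
    ```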

  10. Fracture probability along a fatigue crack path

    Energy Technology Data Exchange (ETDEWEB)

    Makris, P. [Technical Univ., Athens (Greece)

    1995-03-01

    Long experience has shown that the strength of materials under fatigue load has a stochastic behavior, which can be expressed through the fracture probability. This paper deals with a new analytically derived law for the distribution of the fracture probability along a fatigue crack path. Knowledge of the distribution of the fatigue fracture probability along the crack path helps connect stress conditions with the expected fatigue life of a structure under stochastically varying loads. (orig.)

  11. Probability and statistics: models for research

    National Research Council Canada - National Science Library

    Bailey, Daniel Edgar

    1971-01-01

    This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...

  12. A new approach to the "apparent survival" problem: estimating true survival rates from mark-recapture studies.

    Science.gov (United States)

    Gilroy, James J; Virzi, Thomas; Boulton, Rebecca L; Lockwood, Julie L

    2012-07-01

    Survival estimates generated from live capture-mark-recapture studies may be negatively biased due to the permanent emigration of marked individuals from the study area. In the absence of a robust analytical solution, researchers typically sidestep this problem by simply reporting estimates using the term "apparent survival." Here, we present a hierarchical Bayesian multistate model designed to estimate true survival by accounting for predicted rates of permanent emigration. Initially we use dispersal kernels to generate spatial projections of dispersal probability around each capture location. From these projections, we estimate emigration probability for each marked individual and use the resulting values to generate bias-adjusted survival estimates from individual capture histories. When tested using simulated data sets featuring variable detection probabilities, survival rates, and dispersal patterns, the model consistently eliminated negative biases shown by apparent survival estimates from standard models. When applied to a case study concerning juvenile survival in the endangered Cape Sable Seaside Sparrow (Ammodramus maritimus mirabilis), bias-adjusted survival estimates increased more than twofold above apparent survival estimates. Our approach is applicable to any capture-mark-recapture study design and should be particularly valuable for organisms with dispersive juvenile life stages.
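
    To illustrate the first step described above, here is a minimal Monte Carlo sketch that estimates the probability a marked individual permanently emigrates from a rectangular study area, given an isotropic bivariate normal dispersal kernel. The kernel shape, plot bounds, and parameter values are hypothetical, not those of the published model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def emigration_probability(x, y, sigma, xmax, ymax, n=100_000):
        """Monte Carlo estimate of P(dispersal ends outside [0, xmax] x [0, ymax]).

        (x, y) : capture location; sigma : dispersal-kernel scale (same units).
        """
        dx, dy = rng.normal(0.0, sigma, (2, n))          # isotropic Gaussian draws
        inside = ((0 <= x + dx) & (x + dx <= xmax)
                  & (0 <= y + dy) & (y + dy <= ymax))
        return 1.0 - inside.mean()

    # Hypothetical 10 x 10 km plot: edge captures emigrate far more often.
    print(emigration_probability(5.0, 5.0, sigma=2.0, xmax=10, ymax=10))  # centre
    print(emigration_probability(0.5, 0.5, sigma=2.0, xmax=10, ymax=10))  # corner
    ```

    Under the simplest decomposition, apparent survival is roughly true survival times the probability of staying, so dividing an apparent estimate by (1 - emigration probability) gives a first-order bias adjustment; the published model handles this jointly within the multistate likelihood rather than as a post-hoc correction.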

  13. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    Energy Technology Data Exchange (ETDEWEB)

    Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

    2004-01-07

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Censoring was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen by both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's predictions. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model was more accurate.

  14. Advantages of the probability amplitude over the probability density in quantum mechanics

    OpenAIRE

    Kurihara, Yoshimasa; Quach, Nhi My Uyen

    2013-01-01

    We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...

  15. Datamining approaches for modeling tumor control probability.

    Science.gov (United States)

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) in radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, the mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
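
    The sketch below shows the kind of leave-one-out comparison the record describes, scoring an RBF-kernel SVM against logistic regression by the Spearman correlation between predicted scores and observed tumor control. The synthetic features and outcome are stand-ins for the dosimetric variables named above (e.g., GTV volume and V75), and the scoring setup is our assumption, not the authors' exact pipeline.

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    n = 56                                   # cohort size from the record
    X = rng.normal(size=(n, 2))              # stand-ins for GTV volume and V75
    y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.8, n) > 0).astype(int)

    def loo_scores(model):
        """Leave-one-out decision scores, one per held-out patient."""
        scores = np.empty(n)
        for train, test in LeaveOneOut().split(X):
            scores[test] = model.fit(X[train], y[train]).decision_function(X[test])
        return scores

    for name, model in [("SVM (RBF)", SVC(kernel="rbf", C=1.0)),
                        ("logistic ", LogisticRegression())]:
        rs, _ = spearmanr(loo_scores(model), y)
        print(f"{name}: rs = {rs:.2f}")
    ```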

  16. A web-based tool for eliciting probability distributions from experts

    OpenAIRE

    Morris, David E.; Oakley, Jeremy E.; Crowe, John A.

    2014-01-01

    We present a web-based probability distribution elicitation tool: The MATCH Uncertainty Elicitation Tool. The Tool is designed to help elicit probability distributions about uncertain model parameters from experts, in situations where suitable data is either unavailable or sparse. The Tool is free to use, and offers five different techniques for eliciting univariate probability distributions. A key feature of the Tool is that users can log in from different sites and view and interact with th...

  17. Combining gene signatures improves prediction of breast cancer survival.

    Directory of Open Access Journals (Sweden)

    Xi Zhao

    Full Text Available BACKGROUND: Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. PRINCIPAL FINDINGS: To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and the Adjuvant! Online for survival prediction. CONCLUSION: Combining the predictive strength of multiple gene signatures improves
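
    A minimal sketch of the penalized Cox step described above, using the lifelines library. The dataframe columns, simulated survival times, and penalty grid are hypothetical, and cross-validated concordance stands in for the record's procedure for choosing the L2 penalty weight.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import k_fold_cross_validation

    rng = np.random.default_rng(2)
    n = 123                                    # training-set size from the record
    df = pd.DataFrame(rng.normal(size=(n, 5)),
                      columns=[f"gene{i}" for i in range(5)])
    df["T"] = rng.exponential(np.exp(-df["gene0"]))   # times tied to gene0
    df["E"] = rng.random(n) < 0.7                     # ~70% observed events

    best = None
    for penalty in [0.01, 0.1, 1.0, 10.0]:
        cph = CoxPHFitter(penalizer=penalty, l1_ratio=0.0)   # pure L2 (ridge)
        scores = k_fold_cross_validation(cph, df, duration_col="T", event_col="E",
                                         k=5, scoring_method="concordance_index")
        if best is None or np.mean(scores) > best[1]:
            best = (penalty, np.mean(scores))

    print("chosen L2 penalty:", best[0], "cv concordance:", round(best[1], 3))
    ```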

  18. Sex Disparity in Survival of Patients With Uveal Melanoma: Better Survival Rates in Women Than in Men in South Korea.

    Science.gov (United States)

    Park, San Jun; Oh, Chang-Mo; Yeon, Bora; Cho, Hyunsoon; Park, Kyu Hyung

    2017-03-01

    The purpose of this study was to determine the survival rate of patients with uveal melanoma and the sex disparity in this rate in South Korea. We extracted incident uveal melanoma patients using the Korea Central Cancer Registry (KCCR) database, which covered the entire population from 1999 to 2012 in South Korea. We estimated all-cause survival probabilities and cancer-specific survival probabilities of patients with uveal melanoma and compared these probabilities between subgroups (sex, tumor site, age at diagnosis, etc.) using Kaplan-Meier methods and log-rank tests. We fitted Cox proportional hazards models for all-cause death and cancer death to determine sex disparities in survival. A total of 344 uveal melanoma patients (175 women, 51%) were ascertained. They comprised 283 patients with choroidal melanoma (82%) and 61 patients with ciliary body/iris melanoma (18%). The observed 5-year survival probability from all-cause death was 75% (95% confidence interval [CI]: 69%-79%); women with uveal melanoma showed a higher survival probability (83% [95% CI: 76%-89%]) than men (66% [95% CI: 58%-73%]). Women thus had better survival rates than men in South Korea, which requires further investigation of the mechanism of the sex disparity in uveal melanoma.

  19. Formation and survival of Population III stellar systems

    Science.gov (United States)

    Hirano, Shingo; Bromm, Volker

    2017-09-01

    The initial mass function of the first, Population III (Pop III), stars plays a vital role in shaping galaxy formation and evolution in the early Universe. One key remaining issue is the final fate of secondary protostars formed in the accretion disc, specifically whether they merge or survive. We perform a suite of hydrodynamic simulations of the complex interplay among fragmentation, protostellar accretion and merging inside dark matter minihaloes. Instead of the traditional sink particle method, we employ a stiff equation of state approach, so that we can more robustly ascertain the viscous transport inside the disc. The simulations show inside-out fragmentation because the gas collapses faster in the central region. Fragments migrate on the viscous time-scale, over which angular momentum is lost, enabling them to move towards the disc centre, where merging with the primary protostar can occur. This process depends on the fragmentation scale, such that there is a maximum scale of (1-5) × 10^4 au, inside which fragments can migrate to the primary protostar. Viscous transport is active until radiative feedback from the primary protostar destroys the accretion disc. The final mass spectrum and multiplicity thus crucially depend on the effect of viscosity in the disc. The entire disc is subjected to efficient viscous transport in the primordial case with viscous parameter α ≤ 1. An important aspect of this question is the survival probability of Pop III binary systems, possible gravitational wave sources to be probed with the Advanced LIGO detectors.

  20. Analysis of the probability of channel satisfactory state in P2P live ...

    African Journals Online (AJOL)

    In this paper a model based on user behaviour of P2P live streaming systems was developed in order to analyse one of the key QoS parameters of such systems, i.e. the probability of channel-satisfactory state. The impact of upload bandwidths and channels' popularity on the probability of channel-satisfactory state was also ...

  1. Maximum likelihood estimation for predicting the probability of obtaining variable shortleaf pine regeneration densities

    Science.gov (United States)

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2003-01-01

    A logistic equation is the basis for a model that predicts the probability of obtaining regeneration at specified densities. The density of regeneration (trees/ha) for which an estimate of probability is desired can be specified by means of independent variables in the model. When estimating parameters, the dependent variable is set to 1 if the regeneration density (...

  2. Undetected error probability for data services in a terrestrial DAB single frequency network

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.; Veldhuis, Raymond N.J.; Veldhuis, R.N.J.; Cronie, H.S.

    2007-01-01

    DAB (Digital Audio Broadcasting) is the European successor of FM radio. Besides audio services, other services such as traffic information can be provided. An important parameter for data services is the probability of non-recognized or undetected errors in the system. To derive this probability, we

  3. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  4. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  5. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  6. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  7. Analytical Study of Thermonuclear Reaction Probability Integrals

    OpenAIRE

    Chaudhry, M.A.; Haubold, H. J.; Mathai, A. M.

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.

  8. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  9. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  10. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  11. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  12. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  13. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  14. prep misestimates the probability of replication

    NARCIS (Netherlands)

    Iverson, G.; Lee, M.D.; Wagenmakers, E.-J.

    2009-01-01

    The probability of "replication," prep, has been proposed as a means of identifying replicable and reliable effects in the psychological sciences. We conduct a basic test of prep that reveals that it misestimates the true probability of replication, especially for small effects. We show how these

  15. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  16. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  17. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  18. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
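
    For readers unfamiliar with the probability weighting functions discussed in these two records, the sketch below evaluates one commonly used one-parameter family of (inverted) S-shaped weighting functions, the Tversky-Kahneman form. The parameter values are illustrative, not estimates from the study.

    ```python
    import numpy as np

    def tk_weight(p, gamma):
        """Tversky-Kahneman weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).

        gamma < 1 gives the inverted-S shape: small probabilities are
        overweighted and large probabilities underweighted.
        """
        p = np.asarray(p, dtype=float)
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1.0 / gamma)

    for gamma in (0.5, 0.7, 1.0):        # gamma = 1.0 recovers linear weighting
        w = tk_weight([0.01, 0.1, 0.5, 0.9, 0.99], gamma)
        print(gamma, np.round(w, 3))
    ```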

  19. Probability distributions of the electroencephalogram envelope of preterm infants.

    Science.gov (United States)

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
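
    The analysis pipeline described here (Hilbert-transform envelope extraction followed by distribution fitting) can be sketched as below. The synthetic signal is a stand-in for a preterm EEG trace, and the lognormal-versus-gamma comparison by Kolmogorov-Smirnov statistic mirrors the record's model choice under assumptions of our own.

    ```python
    import numpy as np
    from scipy.signal import hilbert
    from scipy import stats

    rng = np.random.default_rng(3)
    fs = 200                                         # Hz, hypothetical sampling rate
    t = np.arange(0, 60, 1 / fs)                     # one minute of signal
    x = np.sin(2 * np.pi * 1.5 * t) * rng.lognormal(0, 0.5, t.size)  # toy trace

    envelope = np.abs(hilbert(x))                    # amplitude envelope

    for name, dist in [("lognormal", stats.lognorm), ("gamma", stats.gamma)]:
        params = dist.fit(envelope, floc=0)          # fix the location at zero
        ks = stats.kstest(envelope, dist.cdf, args=params).statistic
        print(f"{name}: KS statistic = {ks:.3f}")    # smaller = better fit
    ```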

  20. Partial Generalized Probability Weighted Moments for Exponentiated Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Neema Mohamed El Haroun

    2015-09-01

    Full Text Available The generalized probability weighted moments are widely used in hydrology for estimating parameters of flood distributions from complete samples. The method of partial generalized probability weighted moments was used to estimate the parameters of distributions from censored samples. This article offers a new method, called partial generalized probability weighted moments (PGPWMs), for the analysis of censored data. The method of PGPWMs is an extended class of the generalized probability weighted moments. To illustrate the new method, estimation of the unknown parameters of the exponentiated exponential distribution based on a doubly censored sample is considered. PGPWM estimators for right- and left-censored samples are obtained as special cases. A simulation study is conducted to investigate the performance of the estimates for the exponentiated exponential distribution. Comparison between estimators is made through simulation via their biases and mean square errors. An illustration with real data is provided.

  1. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
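
    To make the point-estimate arithmetic concrete: in a 29-of-29 demonstration the procedure passes only if all 29 flaws are found, so the probability of passing is POD^29. The sketch below tabulates this probability of passing demonstration (PPD) over a grid of true POD values; the 29-flaw design comes from the record, while the grid itself is illustrative.

    ```python
    from scipy.stats import binom

    n_flaws = 29                    # standard point-estimate demonstration set
    required_hits = 29              # every flaw must be detected to pass

    for true_pod in (0.80, 0.90, 0.95, 0.99):
        # P(pass) = P(X >= 29) with X ~ Binomial(29, POD), i.e. POD**29 here.
        ppd = binom.sf(required_hits - 1, n_flaws, true_pod)
        print(f"true POD {true_pod:.2f}: probability of passing = {ppd:.3f}")
    ```

    The familiar "29 of 29" rule follows from this arithmetic: if the true POD were only 0.90, the chance of detecting all 29 flaws is about 0.9^29 ≈ 0.047, so a clean pass demonstrates 90% POD at roughly 95% confidence.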

  2. Challenges in the estimation of Net SURvival: The CENSUR working survival group.

    Science.gov (United States)

    Giorgi, R

    2016-10-01

    Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order

  3. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National ... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  4. Parameter Uncertainty in Exponential Family Tail Estimation

    OpenAIRE

    Landsman, Z.; Tsanakas, A.

    2012-01-01

    Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter uncertainty. We study the bias and MSE of estimators of tail probabilities and percentiles, with focus on 1-parameter exponential families. Using asymptotic arguments it is shown that tail estimates are subject to significant positi...

  5. Multinationals and plant survival

    DEFF Research Database (Denmark)

    Bandick, Roger

    2010-01-01

    The aim of this paper is twofold: first, to investigate how different ownership structures affect plant survival, and second, to analyze how the presence of foreign multinational enterprises (MNEs) affects domestic plants’ survival. Using a unique and detailed data set on the Swedish manufacturing sector, I am able to separate plants into those owned by foreign MNEs, domestic MNEs, exporting non-MNEs, and purely domestic firms. In line with previous findings, the result, when conditioned on other factors affecting survival, shows that foreign MNE plants have lower survival rates than non-MNE plants. However, separating the non-MNEs into exporters and non-exporters, the result shows that foreign MNE plants have higher survival rates than non-exporting non-MNEs, while the survival rates of foreign MNE plants and exporting non-MNE plants do not seem to differ. Moreover, the simple non

  6. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  7. Robust Model-Free Multiclass Probability Estimation

    Science.gov (United States)

    Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng

    2010-01-01

    Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386

  8. A model to assess dust explosion occurrence probability.

    Science.gov (United States)

    Hassan, Junaid; Khan, Faisal; Amyotte, Paul; Ferdous, Refaul

    2014-03-15

    Dust handling poses a potential explosion hazard in many industrial facilities. The consequences of a dust explosion are often severe and similar to a gas explosion; however, its occurrence is conditional on the presence of five elements: combustible dust, ignition source, oxidant, mixing and confinement. Dust explosion researchers have conducted experiments to study the characteristics of these elements and generate data on explosibility. These experiments are often costly, but the generated data has significant scope in estimating the probability of a dust explosion occurrence. This paper attempts to use existing information (experimental data) to develop a predictive model to assess the probability of a dust explosion occurrence in a given environment. The proposed model considers six key parameters of a dust explosion: dust particle diameter (PD), minimum ignition energy (MIE), minimum explosible concentration (MEC), minimum ignition temperature (MIT), limiting oxygen concentration (LOC) and explosion pressure (Pmax). A conditional probabilistic approach has been developed and embedded in the proposed model to generate a nomograph for assessing dust explosion occurrence. The generated nomograph provides a quick assessment technique to map the occurrence probability of a dust explosion for a given environment defined by the six parameters. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Survival analysis of orthodontic mini-implants.

    Science.gov (United States)

    Lee, Shin-Jae; Ahn, Sug-Joon; Lee, Jae Won; Kim, Seong-Hun; Kim, Tae-Woo

    2010-02-01

    Survival analysis is useful in clinical research because it focuses on comparing the survival distributions and the identification of risk factors. Our aim in this study was to investigate the survival characteristics and risk factors of orthodontic mini-implants with survival analyses. One hundred forty-one orthodontic patients (treated from October 1, 2000, to November 29, 2007) were included in this survival study. A total of 260 orthodontic mini-implants that had sandblasted (large grit) and acid-etched screw parts were placed between the maxillary second premolar and the first molar. Failures of the implants were recorded as event data, whereas implants that were removed because treatment ended and those that were not removed during the study period were recorded as censored data. A nonparametric life table method was used to visualize the hazard function, and Kaplan-Meier survival curves were generated to identify the variables associated with implant failure. Prognostic variables associated with implant failure were identified with the Cox proportional hazard model. Of the 260 implants, 22 failed. The hazard function for implant failure showed that the risk is highest immediately after placement. The survival function showed that the median survival time of orthodontic mini-implants is sufficient for relatively long orthodontic treatments. The Cox proportional hazard model identified that increasing age is a decisive factor for implant survival. The decreasing pattern of the hazard function suggested gradual osseointegration of orthodontic mini-implants. When implants are placed in a young patient, special caution is needed to lessen the increased probability of failure, especially immediately after placement.
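
    A compact sketch of the workflow in this record (nonparametric hazard, Kaplan-Meier curve, and a Cox model with age as covariate) using the lifelines library. The implant durations, failure indicators, and age values are simulated stand-ins, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter, KaplanMeierFitter, NelsonAalenFitter

    rng = np.random.default_rng(4)
    n = 260                                            # implant count from the record
    age = rng.uniform(12, 40, n)
    T = rng.exponential(24 / (1 + 0.03 * (age - 12)))  # months; hazard rises with age
    E = rng.random(n) < 0.1                            # roughly 10% observed failures

    kmf = KaplanMeierFitter().fit(T, event_observed=E)
    print("median survival (months):", kmf.median_survival_time_)

    naf = NelsonAalenFitter().fit(T, event_observed=E)  # cumulative hazard H(t)
    print(naf.cumulative_hazard_.tail(3))

    cph = CoxPHFitter().fit(pd.DataFrame({"T": T, "E": E, "age": age}),
                            duration_col="T", event_col="E")
    print("hazard ratio per year of age:", float(np.exp(cph.params_["age"])))
    ```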

  10. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
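
    The updating point generalizes: if the definitive number q (the long-run failure probability) has a distribution, the probability of the next failure is E[q], and after observing one failure it updates to E[q²]/E[q], which exceeds E[q] whenever q is uncertain. A sketch with a hypothetical Beta distribution for q (the prior parameters are invented):

    ```python
    from scipy import stats

    # Hypothetical state of information about a plant's failure probability q.
    q = stats.beta(a=0.5, b=99.5)          # mean 0.005, but substantial spread

    m1, m2 = q.moment(1), q.moment(2)      # first two moments of q's distribution
    p_first = m1                           # P(failure) = E[q]
    p_second_given_first = m2 / m1         # P(2nd failure | 1st) = E[q^2]/E[q]

    print(f"P(failure)              = {p_first:.4f}")
    print(f"P(failure | one failed) = {p_second_given_first:.4f}")  # strictly larger
    ```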

  11. Methods for fitting a parametric probability distribution to most probable number data.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2012-07-02

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
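
    For background, the MPN estimate for a single sample maximizes a binomial likelihood in which the probability that a tube at a given dilution is positive is 1 - exp(-c·v), for concentration c and inoculum volume v. The sketch below recovers c from a hypothetical dilution series; the tube counts and volumes are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical series: 3 dilutions, 5 tubes each, inoculum volume in mL.
    volumes   = np.array([10.0, 1.0, 0.1])   # mL per tube at each dilution
    n_tubes   = np.array([5, 5, 5])
    positives = np.array([5, 3, 1])

    def neg_log_lik(log_c):
        """Binomial likelihood; P(tube positive) = 1 - exp(-c * volume)."""
        c = np.exp(log_c)                     # optimize on log scale so c > 0
        p = 1.0 - np.exp(-c * volumes)
        p = np.clip(p, 1e-12, 1 - 1e-12)      # guard the log terms
        return -np.sum(positives * np.log(p)
                       + (n_tubes - positives) * np.log(1 - p))

    res = minimize_scalar(neg_log_lik, bounds=(-10, 10), method="bounded")
    print(f"MPN estimate: {np.exp(res.x):.3f} organisms per mL")
    ```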

  12. Estimating parameters of hidden Markov models based on marked individuals: use of robust design data

    Science.gov (United States)

    Kendall, William L.; White, Gary C.; Hines, James E.; Langtimm, Catherine A.; Yoshizaki, Jun

    2012-01-01

    Development and use of multistate mark-recapture models, which provide estimates of parameters of Markov processes in the face of imperfect detection, have become common over the last twenty years. Recently, estimating parameters of hidden Markov models, where the state of an individual can be uncertain even when it is detected, has received attention. Previous work has shown that ignoring state uncertainty biases estimates of survival and state transition probabilities, thereby reducing the power to detect effects. Efforts to adjust for state uncertainty have included special cases and a general framework for a single sample per period of interest. We provide a flexible framework for adjusting for state uncertainty in multistate models, while utilizing multiple sampling occasions per period of interest to increase precision and remove parameter redundancy. These models also produce direct estimates of state structure for each primary period, even for the case where there is just one sampling occasion. We apply our model to expected value data, and to data from a study of Florida manatees, to provide examples of the improvement in precision due to secondary capture occasions. We also provide user-friendly software to implement these models. This general framework could also be used by practitioners to consider constrained models of particular interest, or model the relationship between within-primary period parameters (e.g., state structure) and between-primary period parameters (e.g., state transition probabilities).

  13. Pathogen survival in chorizos: ecological factors.

    Science.gov (United States)

    Hew, Carrie M; Hajmeer, Maha N; Farver, Thomas B; Riemann, Hans P; Glover, James M; Cliver, Dean O

    2006-05-01

    This study addressed health risks from ethnic sausages produced on a small scale, without inspection, in California and elsewhere. Mexican-style chorizo, a raw pork sausage that is not cured, fermented, or smoked, was contaminated experimentally in the batter with Escherichia coli O157:H7, Listeria monocytogenes, or Salmonella serotypes and stuffed into natural casings. Formulations were based on a market survey in California. Physical parameters that were controlled were pH, water activity (a(w)), and storage temperature. The pH was adjusted with vinegar, stabilizing at 5.0 within 24 h. Initial a(w) levels adjusted with salt were 0.97, 0.95, 0.93, 0.90, and 0.85; levels declined with time because of evaporation. Pathogen numbers declined with storage up to 7 days, with few brief exceptions. Main effects and interactions of constant temperature and pH with declining a(w) on survival of the pathogens were determined. Maximum death rates occurred at higher a(w) for E. coli O157:H7 and Salmonella than for L. monocytogenes. Salt used to adjust a(w) affected palatability. Spices (black pepper, chili pepper, chili powder, cumin, garlic, guajillo pepper, oregano, and paprika) comprised another, potentially significant aspect of the sausage formulation. Some (notably black pepper and cumin) carried an indigenous microflora that contributed significantly to the microbial load of the sausage batter. Only undiluted fresh and powdered garlic exhibited a significant antimicrobial effect on the pathogens. Although each of the tested formulations caused death of the inoculated pathogens, none of the death rates was sufficiently rapid to ensure safety within the probable shelf life of the product.

  14. Panspermia Survival Scenarios for Organisms that Survive Typical Hypervelocity Solar System Impact Events.

    Science.gov (United States)

    Pasini, D.

    2014-04-01

    Previous experimental studies have demonstrated the survivability of living cells during hypervelocity impact events, testing the panspermia and litho-panspermia hypotheses [1]. It has been demonstrated by the authors that Nannochloropsis oculata Phytoplankton, a eukaryotic photosynthesizing autotroph found in the 'euphotic zone' (sunlit surface layers of oceans [2]), survive impacts up to 6.93 km s-1 (approx. shock pressure 40 GPa) [3, 4]. Also shown to survive impacts up to 5.49 km s-1 is the tardigrade species Hypsibius dujardini (a complex micro-animal consisting of 40,000 cells) [5, 6]. It has also been shown that they can survive sustained pressures up to 600 MPa using a water filled pressure capsule [7]. Additionally bacteria can survive impacts up to 5.4 km s-1 (~30 GPa) - albeit with a low probability of survival [1], and the survivability of yeast spores in impacts up to 7.4 km s-1 (~30 GPa) has also recently been demonstrated [8]. Other groups have also reported that the lichen Xanthoria elegans is able to survive shocks in similar pressure ranges (~40 GPa) [9]. Here we present various simulated impact regimes to show which scenarios are conducive to the panspermia hypothesis of the natural transfer of life (via an icy body) through space to an extraterrestrial environment.

  15. Mixture probability distribution functions to model wind speed distributions

    Energy Technology Data Exchange (ETDEWEB)

    Kollu, Ravindra; Rayapudi, Srinivasa Rao; Pakkurthi, Krishna Mohan [J.N.T. Univ., Kakinada (India). Dept. of Electrical and Electronics Engineering; Narasimham, S.V.L. [J.N.T. Univ., Andhra Pradesh (India). Computer Science and Engineering Dept.

    2012-11-01

    Accurate wind speed modeling is critical in estimating wind energy potential for harnessing wind power effectively. The quality of wind speed assessment depends on the capability of the chosen probability density function (PDF) to describe the measured wind speed frequency distribution. The objective of this study is to describe (model) wind speed characteristics using three mixture probability density functions, Weibull-generalized extreme value (GEV), Weibull-lognormal, and GEV-lognormal, which have not been tried before. Statistical parameters such as maximum error in the Kolmogorov-Smirnov test, root mean square error, Chi-square error, coefficient of determination, and power density error are considered as judgment criteria to assess the fitness of the probability density functions. Results indicate that the Weibull-GEV PDF is able to describe unimodal as well as bimodal wind distributions accurately, whereas the GEV-lognormal PDF is able to describe the familiar bell-shaped unimodal distribution well. Results show that mixture probability functions are better alternatives to the conventional Weibull, two-component mixture Weibull, gamma, and lognormal PDFs for describing wind speed characteristics. (orig.)
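
    A minimal sketch of fitting one such two-component mixture (here Weibull plus lognormal) to wind-speed data by maximum likelihood. The synthetic sample and starting values are hypothetical, and scipy's generic optimizer stands in for whatever fitting procedure the authors used.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    # Synthetic bimodal wind speeds (m/s): a calm regime plus a storm regime.
    v = np.concatenate([
        stats.weibull_min.rvs(2.0, scale=4.0, size=700, random_state=rng),
        stats.lognorm.rvs(0.3, scale=12.0, size=300, random_state=rng),
    ])

    def neg_log_lik(theta):
        w, k, lam, s, scale = theta               # mixture weight + component params
        if not (0 < w < 1 and k > 0 and lam > 0 and s > 0 and scale > 0):
            return np.inf                          # reject invalid parameters
        pdf = (w * stats.weibull_min.pdf(v, k, scale=lam)
               + (1 - w) * stats.lognorm.pdf(v, s, scale=scale))
        return -np.sum(np.log(pdf + 1e-300))

    res = minimize(neg_log_lik, x0=[0.5, 1.5, 5.0, 0.5, 10.0], method="Nelder-Mead")
    print("weight, k, lambda, sigma, scale =", np.round(res.x, 3))
    ```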

  16. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
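
    For illustration, the truncated exponential law on [0, s_max] has density f(s) = exp(-s/β) / (β(1 - exp(-s_max/β))). The sketch below fits β by maximum likelihood to hypothetical slip values with an assumed truncation point; none of the numbers come from SRCMOD.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(6)
    s_max = 8.0                          # assumed physical upper bound on slip (m)
    # Hypothetical slip sample drawn by rejection from a truncated exponential.
    raw = rng.exponential(2.0, 5000)
    slip = raw[raw <= s_max][:1000]

    def neg_log_lik(beta):
        # log f(s) = -s/beta - log(beta * (1 - exp(-s_max/beta)))
        return (np.sum(slip / beta)
                + slip.size * np.log(beta * (1 - np.exp(-s_max / beta))))

    res = minimize_scalar(neg_log_lik, bounds=(0.01, 50.0), method="bounded")
    print(f"MLE beta = {res.x:.3f} m (true value 2.0)")
    ```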

  17. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for estimating forest fire burn probability on the basis of the Poisson distribution. The λ parameter is taken to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; λ was therefore calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot spot map were the input data for the statistical analysis. The major result of the study is the generation of a database of forest fire burn probabilities. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. Daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
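
    Under the Poisson law used here, the daily burn probability for a danger class with mean daily fire count λ is P(N ≥ 1) = 1 − exp(−λ). A minimal sketch, with hypothetical per-class λ values standing in for the tabulated ones:

```python
import math

# Hypothetical mean daily fire counts per Forest Fire Danger Index class;
# in the paper, lambda is tabulated per class and per season from detections
lam_by_class = {"I": 0.02, "II": 0.10, "III": 0.35, "IV": 0.80, "V": 1.60}

for cls, lam in lam_by_class.items():
    # Poisson law: P(at least one fire) = 1 - P(N = 0) = 1 - exp(-lambda)
    p_fire = 1.0 - math.exp(-lam)
    print(f"class {cls}: daily burn probability = {p_fire:.3f}")
```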

  18. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events, by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
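
    The Poisson-Gamma conjugacy at the heart of the method makes the posterior of the rate λ available in closed form: with a Gamma(α, β) prior, n dated events in an observation window T give a Gamma(α + n, β + T) posterior. A minimal sketch with assumed inputs (not the Santa Barbara Channel or Port Valdez data):

```python
from scipy import stats

# Hypothetical inputs: n dated landslides over an observation window T (kyr),
# with a Gamma(alpha, beta) prior on the Poisson rate (rate parametrization)
n, T = 4, 150.0
alpha, beta = 0.5, 10.0

# Conjugacy: posterior of lambda is Gamma(alpha + n, beta + T)
post = stats.gamma(a=alpha + n, scale=1.0 / (beta + T))
print(f"posterior mean rate: {post.mean():.4f} events/kyr")
print("95% credible interval:", post.ppf([0.025, 0.975]))
```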

  19. Efficient simulation of tail probabilities of sums of correlated lognormals

    DEFF Research Database (Denmark)

    Asmussen, Søren; Blanchet, José; Juneja, Sandeep

    We consider the problem of efficient estimation of tail probabilities of sums of correlated lognormals via simulation. This problem is motivated by the tail analysis of portfolios of assets driven by correlated Black-Scholes models. We propose two estimators that can be rigorously shown to be efficient as the tail probability of interest decreases to zero. The first estimator, based on importance sampling, involves a scaling of the whole covariance matrix and can be shown to be asymptotically optimal. A further study, based on the Cross-Entropy algorithm, is also performed in order to adaptively optimize the scaling parameter of the covariance. The second estimator decomposes the probability of interest into two contributions and takes advantage of the fact that large deviations for a sum of correlated lognormals are (asymptotically) caused by the largest increment.
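
    A bare-bones version of the covariance-scaling idea can be written in a few lines: sample from a normal law with an inflated covariance and reweight by the density ratio. The threshold, dimension, and scaling factor below are arbitrary choices for illustration, not the tuned values of the paper:

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(2)
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.4, 0.2],
                  [0.4, 1.0, 0.4],
                  [0.2, 0.4, 1.0]])
t = 60.0          # tail threshold for the sum of lognormals (arbitrary)
n = 200_000
theta = 1.8       # covariance scaling factor (tuning parameter)

# Naive Monte Carlo estimate of P(sum_i exp(X_i) > t)
x = rng.multivariate_normal(mu, Sigma, size=n)
p_naive = np.mean(np.exp(x).sum(axis=1) > t)

# Importance sampling: draw from N(mu, theta*Sigma), reweight by density ratio
y = rng.multivariate_normal(mu, theta * Sigma, size=n)
w = mvn.pdf(y, mu, Sigma) / mvn.pdf(y, mu, theta * Sigma)
p_is = np.mean(w * (np.exp(y).sum(axis=1) > t))

print(f"naive: {p_naive:.2e}, importance sampling: {p_is:.2e}")
```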

  20. A Course on Elementary Probability Theory

    OpenAIRE

    Lo, Gane Samb

    2017-01-01

    This book introduces the theory of probability from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip them with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then, the theory of probability is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related t...

  1. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex, uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate computations, supporting the kind of multidimensional data analysis considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  2. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  3. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  4. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
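
    The quadrature itself is compact: with Hermite nodes x_k and weights w_k, w(z) ≈ (i/π) Σ_k w_k/(z − x_k) for Im(z) > 0. The sketch below compares a 40-node approximation against scipy's Faddeeva implementation; accuracy degrades as z approaches the real axis, which is one of the shortcomings the report discusses.

```python
import numpy as np
from scipy.special import wofz  # reference Faddeeva implementation

# Gauss-Hermite nodes and weights for integrals of exp(-t^2) * f(t)
nodes, weights = np.polynomial.hermite.hermgauss(40)

def w_gh(z):
    """Gauss-Hermite approximation of the complex probability function,
    w(z) ~ (i/pi) * sum_k w_k / (z - x_k), valid for Im(z) > 0."""
    return 1j / np.pi * np.sum(weights / (z - nodes))

z = 1.5 + 0.8j
print("Gauss-Hermite:", w_gh(z))
print("scipy wofz   :", wofz(z))
```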

  5. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h...

  6. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  7. Survival paths through the forest

    DEFF Research Database (Denmark)

    Mogensen, Ulla Brasch

    when the information is high-dimensional, e.g. when there are many thousands of genes or markers. In these situations, machine learning methods such as the random forest can still be applied and provide reasonable prediction accuracy. The main focus in this talk is the performance of random forest, in particular when the response is three-dimensional. In a diagnostic study of inflammatory bowel disease, three classes of patients have to be diagnosed based on microarray gene-expression data. The performance of random forest is compared on a probability scale and on a classification scale to the elastic net. In survival analysis with competing risks, I present an extension of random forest using time-dependent pseudo-values to build event risk prediction models. This approach is evaluated with data from the Copenhagen stroke study. Further, I will explain how to use the R-package "pec" to evaluate random forests using...

  8. Multi-parton interactions and rapidity gap survival probability in jet-gap-jet processes

    Science.gov (United States)

    Babiarz, Izabela; Staszewski, Rafał; Szczurek, Antoni

    2017-08-01

    We discuss the application of a dynamical multi-parton interaction (MPI) model, tuned to measurements of the underlying event topology, to the description of rapidity-gap destruction in jet-gap-jet processes at the LHC. We concentrate on the dynamical origin of the mechanism that destroys the rapidity gap. The cross section for jet-gap-jet production is calculated within the leading-logarithm (LL) BFKL approximation. We discuss the topology of final states with and without the MPI effects, considering selected kinematical situations (fixed jet rapidities and transverse momenta) as well as distributions averaged over the dynamics of the jet-gap-jet scattering. The colour-singlet ladder exchange amplitude for the partonic subprocess is implemented into the PYTHIA 8 generator, which is then used for hadronisation and for the simulation of the MPI effects. Several differential distributions are shown and discussed. We present the ratio of cross sections calculated with and without MPI effects as a function of the rapidity gap between the jets.

  9. Increasing the probability of surviving loss of consciousness underwater when using a rebreather.

    Science.gov (United States)

    Haynes, Paul

    2016-12-01

    Re-circulating underwater breathing apparatus (rebreathers) have become increasingly popular amongst sport divers. In comparison to open-circuit scuba, rebreathers are complex life support equipment incorporating many inherent failure modes and potential for human error; these, individually or in combination, can lead to an inappropriate breathing gas. Analysis of rebreather diving incidents suggests that inappropriate breathing gas is the most prevalent disabling agent. This can result in spontaneous loss of consciousness (LoC), water aspiration and drowning. Protecting the airway by maintaining the diver/rebreather oral interface may delay water aspiration following LoC underwater; the possibility of a successful rescue is, thus, increased. One means of protecting the airway following LoC underwater is the use of a full-face mask (FFM). However, such masks are complex and expensive; therefore, they have not been widely adopted by the sport diving community. An alternative to the FFM, used extensively throughout the global military diving community, is the mouthpiece retaining strap (MRS). A recent study documented 54 LoC events in military rebreather diving with only three consequent drownings; all divers were reported to be using a MRS. Even allowing for the concomitant use of a tethered diving partner system in most cases, the low number of fatalities in this large series is circumstantially supportive of the efficacy of the MRS. Despite drowning featuring as a final common pathway in the vast majority of rebreather fatalities, the MRS has not been widely adopted by the sport rebreather diving community.

  10. Surviving probability indicators of landing juvenile magellanic penguins arriving along the southern Brazilian coast

    OpenAIRE

    Sandra Carvalho Rodrigues; Andréa Corrado Adornes; Euclydes Antônio dos Santos Filho; Rodolfo Pinho Silva Filho; Elton Pinto Colares

    2010-01-01

    The aim of this work was to monitor and study the hematocrit and weight of juvenile penguins, with and without oil cover, found alive along the southern coast of Brazil, after capture, as well as before eventual death or release. Released juvenile penguins showed higher weight and hematocrit (3.65 ± 0.06 kg and 44.63 ± 0.29%, respectively) than those that died (2.88 ± 0.08 kg and 34.42 ± 1.70%, respectively). Penguins with higher hematocrit and weight after capture had higher mean weight gain...

  11. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  12. MORBIDITY AND SURVIVAL PROBABILITY IN BURN PATIENTS IN MODERN BURN CARE

    Science.gov (United States)

    Jeschke, Marc G.; Pinto, Ruxandra; Kraft, Robert; Nathens, Avery B.; Finnerty, Celeste C.; Gamelli, Richard L.; Gibran, Nicole S.; Klein, Matthew B.; Arnoldo, Brett D.; Tompkins, Ronald G.; Herndon, David N.

    2014-01-01

    Objective Characterizing burn sizes that are associated with an increased risk of mortality and morbidity is critical because it would allow identifying patients who might derive the greatest benefit from individualized, experimental, or innovative therapies. Although scores have been established to predict mortality, few data addressing other outcomes exist. The objective of this study was to determine burn sizes that are associated with increased mortality and morbidity after burn. Design and Patients Burn patients were prospectively enrolled as part of the multicenter prospective cohort study, Inflammation and the Host Response to Injury Glue Grant, with the following inclusion criteria: 0-99 years of age, admission within 96 hours after injury, and >20% total body surface area burns requiring at least one surgical intervention. Setting Six major burn centers in North America. Measurements and Main Results Burn size cutoff values were determined for mortality, burn wound infection (at least two infections), sepsis (as defined by ABA sepsis criteria), pneumonia, acute respiratory distress syndrome, and multiple organ failure (DENVER2 score >3) for both children and adults. Of the patients enrolled, 226 were children. Twenty-three patients were older than 65 years and were excluded from the cutoff analysis. In children, the cutoff burn size for mortality, sepsis, infection, and multiple organ failure was approximately 60% total body surface area burned. In adults, the cutoff for these outcomes was lower, at approximately 40% total body surface area burned. Conclusions In the modern burn care setting, adults with over 40% total body surface area burned and children with over 60% total body surface area burned are at high risk for morbidity and mortality, even in highly specialized centers. PMID:25559438

  13. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    David S. Pilliod; Erin Muths; Rick D. Scherer; Paul E. Bartelt; Paul Stephen Corn; Blake R. Hossack; Brad A. Lambert; Rebecca McCaffery; Christopher Gaughan

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect...

  14. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of the long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
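
    Under the Rayleigh assumption stated in the abstract, the exceedence probability of an individual height h given significant wave height Hs is exp(−2(h/Hs)²), and the encounter probability over a lifetime of N waves follows from independence. A minimal sketch with hypothetical design numbers:

```python
import math

# Hypothetical design inputs
Hs = 8.0         # design significant wave height (m) over the lifetime
N = 2_000_000    # number of individual waves expected within the lifetime
h = 15.0         # candidate design individual wave height (m)

# Rayleigh-distributed individual heights: P(H > h) = exp(-2 (h/Hs)^2)
p_single = math.exp(-2.0 * (h / Hs) ** 2)

# Encounter (exceedence) probability of h within a lifetime of N waves
p_encounter = 1.0 - (1.0 - p_single) ** N
print(f"single-wave exceedence: {p_single:.2e}, "
      f"lifetime encounter probability: {p_encounter:.3f}")
```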

  15. Probability Analysis of a Quantum Computer

    OpenAIRE

    Einarsson, Göran

    2003-01-01

    The quantum computer algorithm by Peter Shor for the factorization of integers is studied. The quantum nature of a QC makes its outcome random. The output probability distribution is investigated, and the chances of a successful operation are determined.

  16. Nanoformulations and Clinical Trial Candidates as Probably ...

    African Journals Online (AJOL)

    Nanoformulations and Clinical Trial Candidates as Probably Effective and Safe Therapy for Tuberculosis. Madeeha Laghari, Yusrida Darwis, Abdul Hakeem Memon, Arshad Ali Khan, Ibrahim Mohammed Tayeb Abdulbaqi, Reem Abou Assi ...

  17. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

    Mohan Delampady, V. R. Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  18. Zika Probably Not Spread Through Saliva: Study

    Science.gov (United States)

    https://medlineplus.gov/news/fullstory_167531.html (HealthDay News) -- Scientists have some interesting news about Zika: You're unlikely to get the virus from ...

  19. Perfect Precision Detecting Probability Of An Atom Via Sgc Mechanism

    Science.gov (United States)

    Hamedi, H. R.

    2015-06-01

    This letter investigates a highly efficient scheme for two-dimensional (2D) atom localization via scanning probe absorption in a Y-type four-level atomic system with two orthogonal standing waves. It is shown that, because of the position-dependent atom-field interaction, the spatial probability distribution of the atom can be directly determined by monitoring the probe absorption and gain spectra. The impact of different controlling parameters of the system on 2D localization is studied. We find that, owing to the effect of spontaneously generated coherence (SGC), the atom can be localized at a particular position, and the maximal probability of detecting the atom within the sub-wavelength domain of the two orthogonal standing waves reaches one hundred percent. Phase control of the position-dependent probe absorption is then discussed. The presented scheme may be helpful in laser cooling or atom nanolithography via high-precision and high-resolution atom localization.

  18. On $\varphi$-families of probability distributions

    OpenAIRE

    Rui F. Vigelis; Cavalcante, Charles C.

    2011-01-01

    We generalize the exponential family of probability distributions. In our approach, the exponential function is replaced by a $\varphi$-function, resulting in a $\varphi$-family of probability distributions. We show how $\varphi$-families are constructed. In a $\varphi$-family, the analogue of the cumulant-generating function is a normalizing function. We define the $\varphi$-divergence as the Bregman divergence associated to the normalizing function, providing a generalization of the Kullback-Leibler divergence.

  1. An Illustrative Problem in Computational Probability.

    Science.gov (United States)

    1980-06-01

    easily evaluated. In general, the probabilities P_j^(n)(t) may be computed by the numerical solution of simple differential equations ... algorithmically tractable solutions to problems in probability adds an interesting new dimension to their analysis. In the construction of efficient ... significance. This serves to illustrate our first point: mathematically equivalent solutions may be vastly different in their suitability for ...

  2. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  3. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  4. Survivability of systems under multiple factor impact

    Energy Technology Data Exchange (ETDEWEB)

    Korczak, Edward [Telecommunications Research Institute, Warsaw (Poland); Levitin, Gregory [Israel Electric Corporation Ltd., Haifa (Israel)]. E-mail: levitin@iec.co.il

    2007-02-15

    The paper considers vulnerable multi-state series-parallel systems operating under the influence of external impacts. Both the external impacts and internal failures affect system survivability, which is defined as the probability of meeting a given demand. The external impacts are characterized by several destructive factors affecting the system or its parts simultaneously. In order to increase the system's survivability, multilevel protection against the destructive factors can be applied to its subsystems. In such systems, a protected subsystem can be destroyed only if all of the levels of its protection are destroyed. The paper presents an algorithm for evaluating the survivability of series-parallel systems with an arbitrary configuration of multilevel protection against multiple destructive factor impacts. The algorithm is based on a composition of Boolean and Universal Generating Function techniques. Illustrative examples are presented.
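
    The composition step of the Universal Generating Function technique is easy to sketch: a u-function maps each performance level to its probability, and series/parallel composition combines two u-functions with min/sum operators. The toy element states below are assumptions, not the paper's examples:

```python
from collections import defaultdict

def combine(u1, u2, op):
    """Compose two u-functions (dicts: performance level -> probability)."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

parallel = lambda u1, u2: combine(u1, u2, lambda a, b: a + b)  # capacities add
series = lambda u1, u2: combine(u1, u2, min)                   # bottleneck

# Toy element states: two parallel generators feeding one line in series
gen1 = {0: 0.10, 50: 0.90}
gen2 = {0: 0.20, 60: 0.80}
line = {0: 0.05, 100: 0.95}

system = series(parallel(gen1, gen2), line)
demand = 60
survivability = sum(p for g, p in system.items() if g >= demand)
print(f"P(performance >= {demand}) = {survivability:.4f}")
```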

  5. Estimation of genetic parameters for probability of pregnancy at 14 months and hip height in Nelore beef cattle

    Directory of Open Access Journals (Sweden)

    Josineudson Augusto II de Vasconcelos Silva

    2003-10-01

    Full Text Available Records of 24,703 Nelore cattle from six herds were used to estimate heritability coefficients and the genetic correlation for the traits probability of pregnancy at 14 months (PP14) and hip height at 450 days of age (HH). For PP14, the mathematical model included the fixed effects of contemporary group (181 groups) and dam age class at calving (7 classes); for HH, only contemporary group (584 groups). The random effects included in both models were the additive genetic effect of the sire and the residual. Variance and covariance components were obtained by method Â. Heritability estimates were 0.73 ± 0.01 and 0.30 ± 0.00 for PP14 and HH, respectively. The genetic correlation estimate between PP14 and HH was 0.10 ± 0.01. The results show that PP14, given its high estimated heritability, can be used in sire selection programs aimed at increasing heifer precocity. The low genetic correlation between PP14 and HH suggests that selection for growth, measured as hip height, will have little effect on heifer precocity as measured by PP14.

  6. Ill-posed problem and regularization in reconstruction of radiobiological parameters from serial tumor imaging data

    Science.gov (United States)

    Chvetsov, Alevei V.; Sandison, George A.; Schwartz, Jeffrey L.; Rengan, Ramesh

    2015-11-01

    The main objective of this article is to improve the stability of reconstruction algorithms that estimate radiobiological parameters from serial tumor imaging data acquired during radiation therapy. Serial images of tumor response to radiation therapy represent a complex summation of several exponential processes, such as treatment-induced cell inactivation, tumor growth, and cell loss. Accurate assessment of treatment response requires separating these processes, because they define the radiobiological determinants of treatment response and, correspondingly, tumor control probability. However, the estimation of radiobiological parameters from imaging data is an inverse ill-posed problem: a sum of several exponentials produces a Fredholm integral equation of the first kind, which is ill posed. Therefore, the stability of reconstructing radiobiological parameters presents a problem even for the simplest models of tumor response. To study the stability of the parameter reconstruction problem, we used a set of serial CT imaging data for head and neck cancer and the simplest case of a two-level cell population model of tumor response. Inverse reconstruction was performed using a simulated annealing algorithm to minimize a least-squares objective function. Results show that the reconstructed values of cell surviving fractions and cell doubling time exhibit significant nonphysical fluctuations if no stabilization algorithms are applied. However, after applying a stabilization algorithm based on variational regularization, the reconstruction produces statistical distributions for surviving fractions and doubling time that are comparable to published in vitro data. This algorithm is an advance over our previous work, in which only cell surviving fractions were reconstructed. We conclude that variational regularization allows for an increase in the number of free parameters in our model, which enables development of more...
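
    A generic version of the stabilization idea is a Tikhonov-style penalty that pulls the exponential-model parameters toward prior estimates. The sketch below is not the authors' simulated annealing plus variational regularization scheme; it only illustrates, with synthetic data and assumed priors, how a penalty term is appended to the least-squares objective:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy serial "tumor volume" measurements at weekly scans (arbitrary units)
t = np.arange(8, dtype=float)
y = 1.0 * np.exp(-0.45 * t) + 0.15 * np.exp(0.10 * t)
y += np.random.default_rng(3).normal(0.0, 0.02, t.size)  # imaging noise

def residuals(theta, lam):
    s, a, c, b = theta  # amplitudes and rates of the two exponentials
    model = s * np.exp(-a * t) + c * np.exp(b * t)
    prior = np.array([1.0, 0.5, 0.1, 0.1])  # assumed prior parameter guess
    # Tikhonov-style penalty: pulls the parameters toward the prior
    return np.concatenate([model - y, np.sqrt(lam) * (theta - prior)])

plain = least_squares(residuals, x0=[1.0, 0.5, 0.1, 0.1], args=(0.0,))
regularized = least_squares(residuals, x0=[1.0, 0.5, 0.1, 0.1], args=(0.05,))
print("no penalty  :", plain.x)
print("with penalty:", regularized.x)
```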

  7. Age-specific survival of tundra swans on the lower Alaska Peninsula

    Science.gov (United States)

    Meixell, Brandt W.; Lindberg, Mark S.; Conn, Paul B.; Dau, Christian P.; Sarvis, John E.; Sowl, Kristine M.

    2013-01-01

    The population of Tundra Swans (Cygnus columbianus columbianus) breeding on the lower Alaska Peninsula represents the southern extremity of the species' range and is uniquely nonmigratory. We used data on recaptures, resightings, and recoveries of neck-collared Tundra Swans on the lower Alaska Peninsula to estimate collar loss, annual apparent survival, and other demographic parameters for the years 1978–1989. Annual collar loss was greater for adult males fitted with either the thinner collar type (0.34) or the thicker collar type (0.15) than for other age/sex classes (thinner: 0.10, thicker: 0.04). The apparent mean probability of survival of adults (0.61) was higher than that of immatures (0.41) and for both age classes varied considerably by year (adult range: 0.44–0.95, immature range: 0.25–0.90). To assess effects of permanent emigration by age and breeding class, we analyzed post hoc the encounter histories of swans known to breed in our study area. The apparent mean survival of known breeders (0.65) was generally higher than that of the entire marked sample but still varied considerably by year (range 0.26–1.00) and indicated that permanent emigration of breeding swans was likely. We suggest that reductions in apparent survival probability were influenced primarily by high and variable rates of permanent emigration and that immigration by swans from elsewhere may be important in sustaining a breeding population at and near Izembek National Wildlife Refuge.

  8. Effect of VDRA on survival in incident hemodialysis patients: results of the FARO-2 observational study.

    Science.gov (United States)

    Messa, Piergiorgio; Cozzolino, Mario; Brancaccio, Diego; Cannella, Giuseppe; Malberti, Fabio; Costanzo, Anna Maria; di Luzio Paparatti, Umberto; Festa, Vincenzo; Gualberti, Giuliana; Mazzaferro, Sandro

    2015-02-06

    The mortality rate among patients with stage five chronic kidney disease (CKD) maintained on hemodialysis (HD) is high. Although evidence suggests that use of Vitamin D Receptor Activators (VDRA) in CKD patients increases survival, few studies have examined the effect of VDRA in incident HD patients. The FARO-2 study evaluated the clinical outcome of VDRA therapy on mortality in incident HD patients. FARO-2 was a longitudinal epidemiological study of 568 incident HD patients followed prospectively at 26 dialysis centers over a 3-year period. Data were collected every 6 months using a questionnaire, obtaining clinical, biochemical and therapeutic parameters. Kaplan-Meier curves and Cox proportional hazards regression models were used to determine the cumulative probability of time-to-death and adjusted hazard ratios. 568 patients (68% male) with an average age of 65.5 years were followed up. Mean dialysis duration at study entry was 3 months. VDRA use increased from 46% at 6 months to 54.7% at 36 months of follow-up (p = 0.08). No difference was observed in the presence of comorbid diseases at baseline between patients with and without VDRA therapy. The cumulative probability of survival at 24 months was 74.5% (95% CI: 70.2-78.3). Patients receiving VDRA therapy showed a significant increase in survival at 24 months (80.7%; 95% CI: 75.7-84.8) compared to those without (63.3%; 95% CI: 54.8-70.7). The results of FARO-2 indicate that, in incident HD patients, VDRA therapy was associated with increased survival.
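
    Analyses of this type are straightforward to reproduce on one's own data. The sketch below assumes the Python lifelines package and uses a tiny hypothetical table (column names invented) to fit Kaplan-Meier curves by exposure group and a Cox proportional hazards model:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Tiny hypothetical follow-up table (column names invented for illustration)
df = pd.DataFrame({
    "months": [6, 12, 24, 30, 36, 8, 14, 20, 24, 36],
    "died":   [1,  0,  1,  0,  0, 1,  1,  0,  1,  0],
    "vdra":   [0,  1,  0,  1,  1, 0,  0,  1,  0,  1],
})

# Kaplan-Meier survival curve per exposure group
km = KaplanMeierFitter()
for flag, grp in df.groupby("vdra"):
    km.fit(grp["months"], grp["died"], label=f"VDRA={flag}")
    print(km.survival_function_.tail(1))

# Cox proportional hazards model with VDRA exposure as covariate
cox = CoxPHFitter()
cox.fit(df, duration_col="months", event_col="died")
print(cox.hazard_ratios_)
```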

  9. Prostate Cancer Probability Prediction By Machine Learning Technique.

    Science.gov (United States)

    Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena

    2017-11-26

    The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients, it is essential to build suitable prediction models for the disease: with a relevant prediction in hand, it is easier to design a suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for creating such predictive models. Therefore, in this study several machine learning techniques were applied and compared, and the obtained results were analyzed and discussed. It was concluded that machine learning techniques can be used for relevant prediction of prostate cancer.

  10. A propagation model of computer virus with nonlinear vaccination probability

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi

    2014-01-01

    This paper is intended to examine the effect of vaccination on the spread of computer viruses. For that purpose, a novel computer virus propagation model, which incorporates a nonlinear vaccination probability, is proposed. A qualitative analysis of this model reveals that, depending on the value of the basic reproduction number, either the virus-free equilibrium or the viral equilibrium is globally asymptotically stable. The results of simulation experiments not only demonstrate the validity of our model, but also show the effectiveness of nonlinear vaccination strategies. Through parameter analysis, some effective strategies for eradicating viruses are suggested.
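
    The abstract does not give the model's equations, so the sketch below is a generic compartmental system with a saturating (nonlinear) vaccination term, integrated with scipy, meant only to show how such a model is simulated; the rates and the form of the vaccination probability are assumptions:

```python
import numpy as np
from scipy.integrate import odeint

# S: susceptible, I: infected, R: vaccinated/recovered computer fractions
beta, gamma, delta = 0.5, 0.2, 0.05  # infection, cure, immunity-loss rates
k, a = 0.3, 2.0                      # vaccination scale and saturation

def rhs(y, t):
    S, I, R = y
    p_vac = k * S / (1.0 + a * S)    # saturating (nonlinear) vaccination term
    dS = -beta * S * I - p_vac + delta * R
    dI = beta * S * I - gamma * I
    dR = p_vac + gamma * I - delta * R
    return [dS, dI, dR]

t = np.linspace(0.0, 200.0, 2001)
S, I, R = odeint(rhs, [0.95, 0.05, 0.0], t).T
print(f"final infected fraction: {I[-1]:.4f}")  # approaches an equilibrium
```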

  11. Continuation of probability density functions using a generalized Lyapunov approach

    Energy Technology Data Exchange (ETDEWEB)

    Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)

    2017-05-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. Key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.

  12. Meta-analysis of survival prediction with Palliative Performance Scale.

    Science.gov (United States)

    Downing, Michael; Lau, Francis; Lesperance, Mary; Karlson, Nicholas; Shaw, Jack; Kuziemsky, Craig; Bernard, Steve; Hanson, Laura; Olajide, Lola; Head, Barbara; Ritchie, Christine; Harrold, Joan; Casarett, David

    2007-01-01

    This paper aims to reconcile the use of Palliative Performance Scale (PPSv2) for survival prediction in palliative care through an international collaborative study by five research groups. The study involves an individual patient data meta-analysis on 1,808 patients from four original datasets to reanalyze their survival patterns by age, gender, cancer status, and initial PPS score. Our findings reveal a strong association between PPS and survival across the four datasets. The Kaplan-Meier survival curves show each PPS level as distinct, with a strong ordering effect in which higher PPS levels are associated with increased length of survival. Using a stratified Cox proportional hazard model to adjust for study differences, we found females lived significantly longer than males, with a further decrease in hazard for females not diagnosed with cancer. Further work is needed to refine the reporting of survival times/probabilities and to improve prediction accuracy with the inclusion of other variables in the models.

  13. Body size and survival in premenopausal breast cancer.

    Science.gov (United States)

    Greenberg, E. R.; Vessey, M. P.; McPherson, K.; Doll, R.; Yeates, D.

    1985-01-01

    The survival experience of 582 women with premenopausal breast cancer was examined to determine whether prognosis was related to body size or to demographic and reproductive factors. During the follow-up period 228 patients died and 18 emigrated or were lost to follow-up. Usual body weight, reported at the time of diagnosis, was a strong predictor of survival, with a statistically significant trend towards lower survival with increasing weight. Height and obesity (Quetelet index) were not significantly related to survival, although the tallest women and the most obese women appeared to fare worst. Other characteristics of prognostic importance were disease stage and reproductive history (women who were older when their first child was born fared better). Women aged 46-50 when diagnosed also appeared more likely to survive but no clear trend with age was evident. Other characteristics of the women including social class, cigarette use and oral contraceptive use were not significantly related to survival probability. PMID:3994912

  14. Probability of pregnancy in beef heifers

    Directory of Open Access Journals (Sweden)

    D.P. Faria

    2014-12-01

    Full Text Available This study aimed to evaluate the influence of initial weight, initial age, average daily gain in initial weight, average daily gain in total weight, and genetic group on the probability of pregnancy in primiparous females of the Nellore, 1/2 Simmental + 1/2 Nellore, and 3/4 Nellore + 1/4 Simmental genetic groups. Data were collected from the livestock file of the Farpal Farm, located in the municipality of Jaíba, Minas Gerais State, Brazil. The pregnancy diagnosis results (success = 1 and failure = 0) were used to determine the probability of pregnancy, which was modeled using logistic regression with the Proc Logistic procedure available in SAS (Statistical..., 2004) software, from the regressor variables initial weight, average daily gain in initial weight, average daily gain in total weight, and genetic group. Initial weight (IW) was the most important variable in the probability of pregnancy in heifers, and 1-kg increments in IW allowed for increases of 5.8, 9.8 and 3.4% in the probability of pregnancy in Nellore, 1/2 Simmental + 1/2 Nellore, and 3/4 Nellore + 1/4 Simmental heifers, respectively. Initial age influenced the probability of pregnancy in Nellore heifers. From the estimates of the effects of each variable it was possible to determine the minimum initial weights for each genetic group. This information can be used to monitor the development of heifers until the breeding season and increase the pregnancy rate.
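
    The SAS Proc Logistic fit described here corresponds to an ordinary logistic regression of the pregnancy indicator on the regressors. A minimal Python equivalent, with a synthetic sample and assumed coefficients standing in for the farm records:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic heifer records: initial weight (kg) and pregnancy outcome (0/1)
rng = np.random.default_rng(4)
iw = rng.normal(290.0, 25.0, 300)
logit_true = -14.0 + 0.05 * iw  # assumed underlying relation
preg = (rng.random(300) < 1.0 / (1.0 + np.exp(-logit_true))).astype(float)

# Logistic regression of pregnancy on initial weight
X = sm.add_constant(iw)
fit = sm.Logit(preg, X).fit(disp=0)
print(fit.params)  # intercept and initial-weight slope

# Predicted probability of pregnancy for a 300 kg heifer
print(fit.predict([[1.0, 300.0]]))
```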

  15. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Michael C. Wittmann

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  16. Failure probability of regional flood defences

    Directory of Open Access Journals (Sweden)

    Lendering Kasper

    2016-01-01

    Full Text Available Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This paper proposes a methodology to quantify the probability of flooding of regional flood defence systems, which requires several additions to the methodology used for the primary flood defence system. These additions focus on a method to account for regulation of regional water levels, the possibility of reduced intrusion resistance due to maintenance dredging in regional waters, the probability of traffic loads, and the influence of dependence between regional water levels and the phreatic surface of a regional flood defence. In addition, reliability updating is used to demonstrate the potential for updating the probability of failure of regional flood defences with performance observations. The results demonstrate that the proposed methodology can be used to determine the probability of flooding of a regional flood defence system. In doing so, the methodology contributes to improving flood risk management in these systems.

  17. The role of probabilities in physics.

    Science.gov (United States)

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  19. Actuarial survival of a large Canadian cohort of preterm infants

    Directory of Open Access Journals (Sweden)

    Ohlsson Arne

    2005-11-01

    Full Text Available Abstract Background The increased survival of preterm and very low birth weight infants in recent years has been well documented, but continued surveillance is required in order to monitor the effects of new therapeutic interventions. Gestation and birth weight specific survival rates most accurately reflect the outcome of perinatal care. Our aims were to determine survival to discharge for a large Canadian cohort of preterm infants admitted to the neonatal intensive care unit (NICU), and to examine the effect of gender on survival and the effect of increasing postnatal age on predicted survival. Methods Outcomes for all 19,507 infants admitted to 17 NICUs throughout Canada between January 1996 and October 1997 were collected prospectively. Babies with congenital anomalies were excluded from the study population. Gestation and birth weight specific survival was calculated for all infants with birth weight below ... Results Survival to discharge at 24 weeks gestation was 54%, compared to 82% at 26 weeks and 95% at 30 weeks. In infants with birth weights of 600-699 g, survival to discharge was 62%, compared to 79% at 700-799 g and 96% at 1,000-1,099 g. In infants born at 24 weeks gestational age, survival was higher in females, but there were no significant gender differences above 24 weeks gestation. Actuarial analysis showed that the risk of death was highest in the first 5 days. For infants born at 24 weeks gestation, estimated survival probabilities to 48 hours, 7 days and 4 weeks were 88% (CI 84, 92), 70% (CI 64, 76) and 60% (CI 53, 66) respectively. For smaller birth weights, female survival probabilities were higher than males for the first 40 days of life. Conclusion Actuarial analysis provides useful information when counseling parents and highlights the importance of frequently revising the prediction for long term survival, particularly after the first few days of life.

  20. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.

  1. Classicality versus quantumness in Born's probability

    Science.gov (United States)

    Luo, Shunlong

    2017-11-01

    Born's rule, which postulates the probability of a measurement outcome in a quantum state, is pivotal to interpretations and applications of quantum mechanics. By exploiting the departure of the product of two Hermitian operators in Born's rule from Hermiticity, we prescribe an intrinsic and natural scheme to decompose Born's probability into a classical part and a quantum part, which have significant implications in quantum information theory. The classical part constitutes the information compatible with the associated measurement operator, while the quantum part represents the quantum coherence of the state with respect to the measurement operator. Fundamental properties of the decomposition are revealed. As applications, we establish several trade-off relations for the classicality and quantumness in Born's probability, which may be interpreted as alternative realizations of Heisenberg's uncertainty principle. The results shed physical light on related issues concerning the quantification of complementarity, coherence, and uncertainty, as well as the classical-quantum interplay.

  2. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  3. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  4. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  5. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  6. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star graphs is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is also discussed.
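
    For intuition about the quantity being computed, here is a minimal Monte Carlo sketch (my assumptions, not the paper's code) for the baseline well-mixed case, where the fixation probability of a single mutant of fitness r in a population of size N has the closed form (1 - 1/r)/(1 - 1/r^N); graph families like those in the paper modify this baseline.

    ```python
    # Sketch: Monte Carlo fixation probability of one mutant (fitness r) in a
    # well-mixed Moran process of N individuals, vs. the exact formula.
    import random

    def fixation_mc(N, r, runs=10_000):
        fixed = 0
        for _ in range(runs):
            mutants = 1
            while 0 < mutants < N:
                # Reproducer chosen proportionally to fitness...
                total = r * mutants + (N - mutants)
                birth_mutant = random.random() < r * mutants / total
                # ...offspring replaces a uniformly random individual.
                death_mutant = random.random() < mutants / N
                mutants += birth_mutant - death_mutant
            fixed += (mutants == N)
        return fixed / runs

    N, r = 20, 1.1
    exact = (1 - 1 / r) / (1 - 1 / r ** N)
    print(fixation_mc(N, r), "vs exact", exact)  # both ~0.107
    ```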

  7. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of information geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
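
    The central object of information geometry is the Fisher information metric on a parametric family. The sketch below (my toy example, unrelated to the book's accompanying software) estimates the Fisher information of the Bernoulli family as the variance of the score and checks it against the closed form 1/(p(1-p)).

    ```python
    # Sketch: Monte Carlo estimate of the Fisher information of Bernoulli(p),
    # computed as the variance of the score d/dp log f(x; p).
    import random

    def fisher_info_mc(p, n=200_000):
        # Score of Bernoulli(p): x/p - (1 - x)/(1 - p)
        scores = []
        for _ in range(n):
            x = 1 if random.random() < p else 0
            scores.append(x / p - (1 - x) / (1 - p))
        mean = sum(scores) / n
        return sum((s - mean) ** 2 for s in scores) / n

    p = 0.3
    print(fisher_info_mc(p), "vs", 1 / (p * (1 - p)))  # both ~4.76
    ```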

  8. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
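
    In the spirit of the book's scikit-learn examples (though this is a sketch of mine, not the book's actual code), the following shows cross-validation used to compare ridge regularization strengths on synthetic data.

    ```python
    # Sketch: choosing a ridge regularization strength by 5-fold cross-validation.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))                       # synthetic features
    y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=100)  # linear signal + noise

    for alpha in (0.01, 1.0, 100.0):
        scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5)  # R^2 per fold
        print(f"alpha={alpha:>6}: mean CV R^2 = {scores.mean():.3f}")
    ```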

  9. Transition probability generating function of a transitionless quantum parametric oscillator

    Science.gov (United States)

    Mishima, Hiroaki; Izumida, Yuki

    2017-07-01

    The transitionless tracking (TT) algorithm enables the exact tracking of quantum adiabatic dynamics in an arbitrarily short time by adding a counterdiabatic Hamiltonian to the original adiabatic Hamiltonian. By applying Husimi's method, originally developed for a quantum parametric oscillator (QPO), to the transitionless QPO achieved using the TT algorithm, we obtain the transition probability generating function with a time-dependent parameter constructed from solutions of the corresponding classical parametric oscillator (CPO). By obtaining the explicit solutions of this CPO using the phase-amplitude method, we find that the time-dependent parameter can be reduced to the frequency ratio between the Hamiltonians without and with the counterdiabatic Hamiltonian, from which we can easily characterize the result achieved by the TT algorithm. We illustrate our theory by showing the trajectories of the CPO on the classical phase space, which elucidate the effect of the counterdiabatic Hamiltonian of the QPO.
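
    A classical parametric oscillator of the kind invoked here obeys x'' + ω(t)² x = 0 with a time-dependent frequency. The sketch below (my toy frequency sweep, not the paper's setup) integrates such a trajectory numerically so its phase-space behavior can be inspected.

    ```python
    # Sketch: numerically integrating a classical parametric oscillator
    # x'' + omega(t)^2 x = 0 with an assumed slow linear frequency sweep.
    import numpy as np
    from scipy.integrate import solve_ivp

    def omega(t):
        return 1.0 + 0.05 * t  # hypothetical frequency sweep

    def rhs(t, y):
        x, v = y
        return [v, -omega(t) ** 2 * x]

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], dense_output=True, rtol=1e-8)
    for ti in np.linspace(0.0, 10.0, 5):
        x, v = sol.sol(ti)
        print(f"t={ti:5.2f}  x={x:+.4f}  v={v:+.4f}")  # phase-space trajectory samples
    ```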

  10. Explosion probability of unexploded ordnance: expert beliefs.

    Science.gov (United States)

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies...
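
    The "average correlation between rankings of any pair of experts" is an average pairwise Spearman rank correlation. The sketch below shows the computation on made-up rankings (the study's actual data are not reproduced here).

    ```python
    # Sketch: average pairwise Spearman rank correlation across expert rankings
    # of 20 items, with hypothetical rankings standing in for the study's data.
    from itertools import combinations
    from scipy.stats import spearmanr

    rankings = [
        list(range(1, 21)),                                  # hypothetical expert 1
        [2, 1, 4, 3, 6, 5, 8, 7, 10, 9,
         12, 11, 14, 13, 16, 15, 18, 17, 20, 19],            # hypothetical expert 2
        list(range(20, 0, -1)),                              # hypothetical expert 3
    ]
    pairs = list(combinations(rankings, 2))
    avg = sum(spearmanr(a, b)[0] for a, b in pairs) / len(pairs)
    print(f"average pairwise Spearman correlation: {avg:.2f}")
    ```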

  11. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different, distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
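
    For reference, the Benjamini-Hochberg procedure mentioned here is the standard step-up rule on sorted p-values; the sketch below implements it directly (this is the textbook procedure, not the article's order-statistics result).

    ```python
    # Sketch: Benjamini-Hochberg step-up procedure. Given p-values and FDR
    # level q, reject the k smallest p-values, where k is the largest i with
    # p_(i) <= q * i / m.
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        p = np.asarray(pvals)
        order = np.argsort(p)                      # indices of ascending p-values
        m = len(p)
        thresh = q * np.arange(1, m + 1) / m       # BH thresholds q*i/m
        below = p[order] <= thresh
        if not below.any():
            return np.array([], dtype=int)         # nothing rejected
        k = np.nonzero(below)[0].max()
        return order[: k + 1]                      # original indices rejected

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
    print(benjamini_hochberg(pvals, q=0.05))       # rejects the two smallest
    ```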

  12. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of probability theory stimulated further research into harmonic analysis. Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro...

  13. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St...

  14. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
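
    One elementary instance of such a power series in t (my illustrative example, not the paper's general formula) is a compound Poisson subordinator: conditioning on the number of jumps expands the distribution function in powers of t, which the sketch below checks against simulation.

    ```python
    # Sketch: for a subordinator with Poisson(lam) jump arrivals and Exp(1)
    # jump sizes, P(X_t <= x) = sum_n e^{-lam t} (lam t)^n / n! * P(Gamma(n,1) <= x),
    # an explicit power series in t. We compare the series to Monte Carlo.
    import math
    import numpy as np
    from scipy.stats import gamma

    lam, t, x = 2.0, 0.7, 1.5  # assumed rate, time, evaluation point

    series = sum(
        math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
        * (1.0 if n == 0 else gamma.cdf(x, a=n))   # n = 0 means X_t = 0 <= x
        for n in range(40)
    )

    rng = np.random.default_rng(1)
    n_jumps = rng.poisson(lam * t, size=100_000)
    samples = np.array([rng.exponential(size=k).sum() for k in n_jumps])
    print(series, "vs", (samples <= x).mean())      # series matches simulation
    ```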

  15. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  16. Probability Inequalities for a Gladiator Game

    OpenAIRE

    Yosef Rinott; Marco Scarsini; Yaming Yu

    2011-01-01

    Based on a model introduced by Kaminsky, Luks, and Nelson (1984), we consider a zero-sum allocation game called the Gladiator Game, where two teams of gladiators engage in a sequence of one-to-one fights in which the probability of winning is a function of the gladiators' strengths. Each team's strategy consists of the allocation of its total strength among its gladiators. We find the Nash equilibria of the game and compute its value. To do this, we study interesting majorization-type probability...
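
    To see the model in action, here is a Monte Carlo sketch of one common version of the game (the winning rule a/(a+b) and the convention that winners stay on are my assumptions about the setup, not details quoted from the paper).

    ```python
    # Sketch: sequential one-on-one fights between two teams; a gladiator of
    # strength a beats one of strength b with probability a / (a + b), the
    # winner stays on, and the team with gladiators remaining wins.
    import random

    def team_a_wins(a, b):
        a, b = list(a), list(b)
        while a and b:
            if random.random() < a[0] / (a[0] + b[0]):
                b.pop(0)   # A's gladiator wins; B sends in the next one
            else:
                a.pop(0)   # B's gladiator wins; A sends in the next one
        return bool(a)

    def win_prob(a, b, runs=100_000):
        return sum(team_a_wins(a, b) for _ in range(runs)) / runs

    # Equal total strength allocated differently across gladiators:
    print(win_prob([3.0], [1.0, 1.0, 1.0]))  # one strong vs. three weak
    ```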

  17. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    …of prior experiences, recommendations from a trusted entity, or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification, and prediction procedures. This mechanism uses an estimator of risk probability, which is based...
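
    The following sketch illustrates the general idea of a classification-based risk probability estimator with a small neural network; it is a stand-in of my own (hypothetical features and labels, scikit-learn instead of the paper's hybrid architecture).

    ```python
    # Sketch: estimating the risk probability of an interaction from past
    # outcomes with a small neural network, reading the predicted class
    # probability as the risk score. All data here are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    # Hypothetical features of past interactions (e.g. prior-experience score,
    # recommendation score) and whether each interaction ended badly (1 = loss).
    X = rng.uniform(size=(500, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.normal(size=500) < 0.8).astype(int)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, y)
    print(clf.predict_proba([[0.3, 0.6]])[0, 1])  # estimated risk probability
    ```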

  18. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  19. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of ""The Unfair Subway""? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary ad advanced aspects of probability, each problem designed to chall

  20. Concepts of probability in radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    Bernhard Weninger

    2011-12-01

    In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology and for an adequate interpretation of their reliability.
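
    The basic mechanics of assigning a dating probability to calibrated ages can be sketched as follows (with a made-up calibration curve; real work uses the IntCal curves, and the Gaussian-error model here is a simplifying assumption).

    ```python
    # Sketch: the dating probability assigned to calendar age t, given a measured
    # 14C age m with error sigma and a calibration curve mu(t), is proportional
    # to exp(-(m - mu(t))^2 / (2 sigma^2)), normalized over candidate ages.
    import numpy as np

    def mu(t):
        # Hypothetical calibration curve: 14C age as a function of calendar age.
        return 1.05 * t + 30.0 * np.sin(t / 50.0)

    t = np.arange(2000.0, 3000.0)   # candidate calendar ages (years BP)
    m, sigma = 2650.0, 30.0          # measured 14C age and its 1-sigma error
    density = np.exp(-(m - mu(t)) ** 2 / (2 * sigma ** 2))
    density /= density.sum()         # discrete normalization over the grid

    peak = t[np.argmax(density)]
    print(f"most probable calendar age: {peak:.0f} BP")
    ```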