WorldWideScience

Sample records for estimated extinction probabilities

  1. Estimating extinction using unsupervised machine learning

    Science.gov (United States)

    Meingast, Stefan; Lombardi, Marco; Alves, João

    2017-05-01

    Dust extinction is the most robust tracer of the gas distribution in the interstellar medium, but measuring extinction is limited by the systematic uncertainties involved in estimating the intrinsic colors of background stars. In this paper we present a new technique, Pnicer, that estimates intrinsic colors and extinction for individual stars using unsupervised machine learning algorithms. This new method aims to be free from any priors with respect to the column density and intrinsic color distribution. It is applicable to any combination of parameters and works in arbitrary numbers of dimensions. Furthermore, it is not restricted to color space. Extinction toward single sources is determined by fitting Gaussian mixture models along the extinction vector to (extinction-free) control field observations. In this way it becomes possible to describe the extinction for observed sources with probability densities, rather than a single value. Pnicer effectively eliminates known biases found in similar methods and outperforms them in cases of deep observational data where the number of background galaxies is significant, or when a large number of parameters is used to break degeneracies in the intrinsic color distributions. This new method remains computationally competitive, making it possible to correctly de-redden millions of sources within a matter of seconds. With the ever-increasing number of large-scale high-sensitivity imaging surveys, Pnicer offers a fast and reliable way to efficiently calculate extinction for arbitrary parameter combinations without prior information on source characteristics. The Pnicer software package also offers access to the well-established Nicer technique in a simple unified interface and is capable of building extinction maps including the Nicest correction for cloud substructure. Pnicer is offered to the community as an open-source software solution and is entirely written in Python.
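
    The core of this idea, fitting a mixture model to extinction-free control-field colors and sliding an observed color back along the extinction vector, can be sketched generically. The snippet below is an illustration built on scikit-learn's GaussianMixture, not the Pnicer API; the reddening coefficient k and all data are placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative sketch of the Pnicer idea (not the actual Pnicer API):
# model the intrinsic color distribution of an extinction-free control
# field with a Gaussian mixture, then score an observed (reddened) color
# de-reddened by a grid of trial extinctions A_V.

rng = np.random.default_rng(0)
control_colors = np.concatenate([rng.normal(0.3, 0.05, 500),
                                 rng.normal(0.8, 0.10, 500)])  # toy control field
gmm = GaussianMixture(n_components=2).fit(control_colors.reshape(-1, 1))

k = 0.3                    # assumed reddening coefficient dE/dA_V (placeholder)
observed_color = 1.1       # color of a science-field star
a_v_grid = np.linspace(0.0, 5.0, 501)

# Likelihood of each trial extinction: slide the observed color back
# along the extinction vector and score it against the control-field GMM.
log_like = gmm.score_samples((observed_color - k * a_v_grid).reshape(-1, 1))
pdf = np.exp(log_like)
pdf /= pdf.sum() * (a_v_grid[1] - a_v_grid[0])  # normalized density over A_V

print("most probable A_V:", a_v_grid[np.argmax(pdf)])
```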

  2. On the probability of extinction of the Haiti cholera epidemic

    Science.gov (United States)

    Bertuzzo, Enrico; Finger, Flavio; Mari, Lorenzo; Gatto, Marino; Rinaldo, Andrea

    2014-05-01

    Nearly 3 years after its appearance in Haiti, cholera has already caused more than 8,200 deaths and 670,000 reported cases, and it is feared that it may become endemic. However, no clear evidence of a stable environmental reservoir of pathogenic Vibrio cholerae, the infective agent of the disease, has emerged so far, suggesting that the transmission cycle of the disease is being maintained by bacteria freshly shed by infected individuals. Thus, in principle, cholera could be eradicated from Haiti. Here, we develop a framework for the estimation of the probability of extinction of the epidemic based on current epidemiological dynamics and health-care practice. Cholera spreading is modelled by an individual-based spatially-explicit stochastic model that accounts for the dynamics of susceptible, infected and recovered individuals hosted in different local communities connected through hydrologic and human mobility networks. Our results indicate that the probability that the epidemic goes extinct before the end of 2016 is of the order of 1%. This low probability of extinction highlights the need for more targeted and effective interventions to possibly stop cholera in Haiti.
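
    The headline quantity, the probability that transmission dies out before a deadline, can be illustrated with a far simpler, non-spatial stochastic model simulated with the Gillespie algorithm. The sketch below uses an SIS caricature of an endemic disease; all parameter values are illustrative placeholders, not the fitted Haiti values.

```python
import numpy as np

# Minimal sketch (not the paper's spatially explicit model): a stochastic
# SIS caricature of an endemic disease, simulated with the Gillespie
# algorithm, used to estimate the probability that infection dies out
# before a deadline. All parameter values are illustrative.

rng = np.random.default_rng(1)
beta, gamma = 0.25, 0.20        # per-day transmission and recovery rates
N, i0 = 200, 10                 # population size, initial infections
deadline = 365.0                # days

def dies_out_by_deadline():
    s, i, t = N - i0, i0, 0.0
    while i > 0:
        infection = beta * s * i / N
        total = infection + gamma * i
        t += rng.exponential(1.0 / total)
        if t > deadline:
            return False        # infection still circulating at deadline
        if rng.random() < infection / total:
            s, i = s - 1, i + 1  # transmission event
        else:
            s, i = s + 1, i - 1  # recovery: back to susceptible (SIS)
    return True                  # stochastic fade-out occurred in time

runs = 200
p_ext = sum(dies_out_by_deadline() for _ in range(runs)) / runs
print(f"estimated P(extinction before deadline) = {p_ext:.2f}")
```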

  3. Evaluating herbivore extinction probabilities in Addo Elephant ...

    African Journals Online (AJOL)

    Abstract. Population extinction evaluations, based on the model developed by Dennis et al. (1991) that did not take density dependence into account and that were based on census data, suggest that many of the herbivore species in Addo Elephant National Park (AENP) are vulnerable to local extinction. As a result of low ...

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    …either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still…

  5. DETERMINING TYPE Ia SUPERNOVA HOST GALAXY EXTINCTION PROBABILITIES AND A STATISTICAL APPROACH TO ESTIMATING THE ABSORPTION-TO-REDDENING RATIO R_V

    Energy Technology Data Exchange (ETDEWEB)

    Cikota, Aleksandar [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching b. München (Germany); Deustua, Susana [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Marleau, Francine, E-mail: acikota@eso.org [Institute for Astro- and Particle Physics, University of Innsbruck, Technikerstrasse 25/8, A-6020 Innsbruck (Austria)

    2016-03-10

    We investigate limits on the extinction values of Type Ia supernovae (SNe Ia) to statistically determine the most probable color excess, E(B – V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, R_V, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-infrared Survey with Herschel (KINGFISH). We use SN Ia spectral templates to develop a Monte Carlo simulation of color excess E(B – V) with R_V = 3.1 and investigate the color excess probabilities E(B – V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo, and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa–Sap, Sab–Sbp, Sbc–Scp, Scd–Sdm, S0, and irregular galaxy classes as a function of R/R_25. We find that the largest expected reddening probabilities are in Sab–Sb and Sbc–Sc galaxies, while S0 and irregular galaxies are very dust poor. We present a new approach for determining the absorption-to-reddening ratio R_V using color excess probability functions and find values of R_V = 2.71 ± 1.58 for 21 SNe Ia observed in Sab–Sbp galaxies, and R_V = 1.70 ± 0.38 for 34 SNe Ia observed in Sbc–Scp galaxies.

  6. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.

  7. The extinction probability in systems randomly varying in time

    Directory of Open Access Journals (Sweden)

    Imre Pázsit

    2017-09-01

    The extinction probability of a branching process (a neutron chain in a multiplying medium is calculated for a system randomly varying in time. The evolution of the first two moments of such a process was calculated previously by the authors in a system randomly shifting between two states of different multiplication properties. The same model is used here for the investigation of the extinction probability. It is seen that the determination of the extinction probability is significantly more complicated than that of the moments, and it can only be achieved by pure numerical methods. The numerical results indicate that for systems fluctuating between two subcritical or two supercritical states, the extinction probability behaves as expected, but for systems fluctuating between a supercritical and a subcritical state, there is a crucial and unexpected deviation from the predicted behaviour. The results bear some significance not only for neutron chains in a multiplying medium, but also for the evolution of biological populations in a time-varying environment.
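
    For a time-homogeneous system, the extinction probability of such a branching process is the smallest root in [0, 1] of q = g(q), where g is the per-neutron offspring probability generating function; fixed-point iteration recovers it. A minimal sketch with made-up multiplicity data follows (the paper's time-varying case requires the numerical methods it describes):

```python
# The extinction probability q of a branching process is the smallest
# root in [0, 1] of q = g(q), where g is the offspring probability
# generating function for a single neutron. Iterating q <- g(q) from
# q = 0 converges to that root. The numbers below are illustrative,
# not evaluated nuclear data.

p_nu = [0.03, 0.16, 0.33, 0.30, 0.13, 0.05]  # P(nu = 0..5), sums to 1
p_fis = 0.45                                  # prob. a neutron causes fission

def g(q):
    # One neutron is either absorbed/leaked (no progeny) or causes a
    # fission releasing nu new neutrons.
    return (1.0 - p_fis) + p_fis * sum(pk * q**k for k, pk in enumerate(p_nu))

k_eff = p_fis * sum(k * pk for k, pk in enumerate(p_nu))

q = 0.0
for _ in range(2000):
    q = g(q)

print(f"k = {k_eff:.3f}, extinction probability = {q:.4f}, POI = {1 - q:.4f}")
```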

  8. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  9. Time dependent non-extinction probability for prompt critical systems

    International Nuclear Information System (INIS)

    Gregson, M. W.; Prinja, A. K.

    2009-01-01

    The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)

  10. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
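
    The central idea, that any consistent nonparametric regression machine applied to a 0/1 response estimates the conditional probability, is easy to demonstrate. The sketch below uses scikit-learn's regression forest on synthetic data; the paper itself points to R implementations, and all data and settings here are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of a "probability machine": a nonparametric regression machine
# fit to a 0/1 outcome estimates P(Y = 1 | X) directly. Synthetic data
# with a known logistic truth let us check the estimates.

rng = np.random.default_rng(2)
n = 5000
X = rng.normal(size=(n, 4))
true_p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))  # logistic truth
y = rng.random(n) < true_p        # binary outcome drawn from true_p

rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=25, random_state=0)
rf.fit(X, y.astype(float))        # regression on {0,1} -> conditional probability

p_hat = rf.predict(X[:5])
print(np.c_[true_p[:5], p_hat])   # estimated vs true probabilities
```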

  11. A stochastic model for the probability of malaria extinction by mass drug administration.

    Science.gov (United States)

    Pemberton-Ross, Peter; Chitnis, Nakul; Pothin, Emilie; Smith, Thomas A

    2017-09-18

    Mass drug administration (MDA) has been proposed as an intervention to achieve local extinction of malaria. Although its effect on the reproduction number is short lived, extinction may subsequently occur in a small population due to stochastic fluctuations. This paper examines how the probability of stochastic extinction depends on population size, MDA coverage and the reproduction number under control, R_c. A simple compartmental model is developed which is used to compute the probability of extinction using probability generating functions. The expected time to extinction in small populations after MDA for various scenarios in this model is calculated analytically. The results indicate that MDA can lead to elimination only under restrictive conditions: R_c must be sustained below 1, and coverage must exceed 95% to have a non-negligible probability of successful elimination. Stochastic fluctuations only significantly affect the probability of extinction in populations of about 1000 individuals or less. The expected time to extinction via stochastic fluctuation is less than 10 years only in populations less than about 150 individuals. Clustering of secondary infections and of MDA distribution both contribute positively to the potential probability of success, indicating that MDA would most effectively be administered at the household level. There are very limited circumstances in which MDA will lead to local malaria elimination with a substantial probability.

  12. Extinction probability in a birth-death process with killing

    NARCIS (Netherlands)

    van Doorn, Erik A.; Zeifman, Alexander I.

    2005-01-01

    We study birth-death processes on the nonnegative integers, where {1,2,...} is an irreducible class and 0 an absorbing state, with the additional feature that a transition to state 0 may occur from any state. We give a condition for absorption (extinction) to be certain and obtain the eventual absorption probabilities.

  13. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws must be reliably detected by these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
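
    The arithmetic behind the 29-flaw demonstration is a one-liner: the probability of passing (detecting all n flaws) given a true POD p is p^n, and 0.9^29 ≈ 0.047 is why detecting 29 of 29 demonstrates 90% POD at roughly 95% confidence. A quick check, with illustrative POD values:

```python
# Point-estimate demonstration: the system passes only if all n flaws
# are detected, so P(pass | true POD p) = p**n. With n = 29, a system
# whose true POD is only 0.90 passes with probability 0.9**29 ~ 0.047,
# i.e. 29-of-29 demonstrates 90% POD at ~95% confidence.

n = 29
for p in (0.90, 0.95, 0.98, 0.995):
    ppd = p ** n
    print(f"true POD = {p:.3f}  ->  probability of passing {n}/{n} = {ppd:.3f}")
```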

  14. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  15. Quantifying the severity of hurricanes on extinction probabilities of a primate population: Insights into "Island" extirpations.

    Science.gov (United States)

    Ameca y Juárez, Eric I; Ellis, Edward A; Rodríguez-Luna, Ernesto

    2015-07-01

    Long-term studies quantifying impacts of hurricane activity on growth and trajectory of primate populations are rare. Using a 14-year monitored population of Alouatta palliata mexicana as a study system, we developed a modeling framework to assess the relative contribution of hurricane disturbance and two types of human impacts, habitat loss, and hunting, on quasi-extinction risk. We found that the scenario with the highest level of disturbance generated a 21% increase in quasi-extinction risk by 40 years compared to scenarios of intermediate disturbance, and around 67% increase relative to that found in low disturbance scenarios. We also found that the probability of reaching quasi-extinction due to human disturbance alone was below 1% by 40 years, although such scenarios reduced population size by 70%, whereas the risk of quasi-extinction ranged between 3% and 65% for different scenarios of hurricane severity alone, in absence of human impacts. Our analysis moreover found that the quasi-extinction risk driven by hunting and hurricane disturbance was significantly lower than the quasi-extinction risk posed by human-driven habitat loss and hurricane disturbance. These models suggest that hurricane disturbance has the potential to exceed the risk posed by human impacts, and, in particular, to substantially increase the speed of the extinction vortex driven by habitat loss relative to that driven by hunting. Early mitigation of habitat loss constituted the best method for reducing quasi-extinction risk: the earlier habitat loss is halted, the less vulnerable the population becomes to hurricane disturbance. By using a well-studied population of A. p. mexicana, we help understand the demographic impacts that extreme environmental disturbance can trigger on isolated populations of taxa already endangered in other systems where long-term demographic data are not available. For those experiencing heavy anthropogenic pressure and lacking sufficiently evolved coping

  16. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    …of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  17. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt, by a factor of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease.
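
    The update described is a textbook conjugate calculation. A sketch with a Gamma prior on the core-melt rate per reactor-year and an observation of zero events follows; the prior and exposure numbers are illustrative placeholders, not the Rasmussen-study values.

```python
# Conjugate Bayesian update in the spirit of the abstract: a Gamma prior
# on the core-melt rate per reactor-year, conditioned on zero core melts
# observed over T reactor-years (Poisson likelihood). All numbers are
# illustrative, not the Rasmussen-study values.

a, b = 0.5, 1000.0        # Gamma(shape, rate) prior -> prior mean a/b = 5e-4
T = 2000.0                # accumulated reactor-years with zero core melts

post_a, post_b = a + 0, b + T   # zero observed events
prior_mean = a / b
post_mean = post_a / post_b

print(f"prior mean rate  = {prior_mean:.2e} per reactor-year")
print(f"posterior mean   = {post_mean:.2e} per reactor-year")
print(f"reduction factor = {prior_mean / post_mean:.1f}")
```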

  18. Probability of initiation and extinction in the Mercury Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    McKinley, M. S.; Brantley, P. S. [Lawrence Livermore National Laboratory, 7000 East Ave., Livermore, CA 94551 (United States)

    2013-07-01

    A Monte Carlo method for computing the probability of initiation has previously been implemented in Mercury. Recently, a new method based on the probability of extinction has been implemented as well. The methods have similarities from counting progeny to cycling in time, but they also have differences such as population control and statistical uncertainty reporting. The two methods agree very well for several test problems. Since each method has advantages and disadvantages, we currently recommend that both methods are used to compute the probability of criticality. (authors)

  19. On the regularity of the extinction probability of a branching process in varying and random environments

    International Nuclear Information System (INIS)

    Alili, Smail; Rugh, Hans Henrik

    2008-01-01

    We consider a supercritical branching process in a time-dependent environment ξ. We assume that the offspring distributions depend regularly (C^k or real-analytically) on real parameters λ. We show that the extinction probability q_λ(ξ), given the environment ξ, 'inherits' this regularity whenever the offspring distributions satisfy a condition of contraction-type. Our proof makes use of the Poincaré metric on the complex unit disc and a real-analytic implicit function theorem.

  20. Estimating Age-Dependent Extinction: Contrasting Evidence from Fossils and Phylogenies.

    Science.gov (United States)

    Hagen, Oskar; Andermann, Tobias; Quental, Tiago B; Antonelli, Alexandre; Silvestro, Daniele

    2018-05-01

    The estimation of diversification rates is one of the most vividly debated topics in modern systematics, with considerable controversy surrounding the power of phylogenetic and fossil-based approaches in estimating extinction. Van Valen's seminal work from 1973 proposed the "Law of constant extinction," which states that the probability of extinction of taxa is not dependent on their age. This assumption of age-independent extinction has prevailed for decades with its assessment based on survivorship curves, which, however, do not directly account for the incompleteness of the fossil record, and have rarely been applied at the species level. Here, we present a Bayesian framework to estimate extinction rates from the fossil record accounting for age-dependent extinction (ADE). Our approach, unlike previous implementations, explicitly models unobserved species and accounts for the effects of fossil preservation on the observed longevity of sampled lineages. We assess the performance and robustness of our method through extensive simulations and apply it to a fossil data set of terrestrial Carnivora spanning the past 40 myr. We find strong evidence of ADE, as we detect the extinction rate to be highest in young species and declining with increasing species age. For comparison, we apply a recently developed analogous ADE model to a dated phylogeny of extant Carnivora. Although the phylogeny-based analysis also infers ADE, it indicates that the extinction rate, instead, increases with increasing taxon age. The estimated mean species longevity also differs substantially, with the fossil-based analyses estimating 2.0 myr, in contrast to 9.8 myr derived from the phylogeny-based inference. Scrutinizing these discrepancies, we find that both fossil and phylogeny-based ADE models are prone to high error rates when speciation and extinction rates increase or decrease through time. However, analyses of simulated and empirical data show that fossil-based inferences are more
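
    Age-dependent extinction of this kind is commonly parameterized with a Weibull model of species longevities, in which a shape parameter below 1 makes extinction risk highest for young species and a shape above 1 makes it rise with age. The sketch below works under that assumption; it is not the authors' implementation, and all numbers are illustrative.

```python
import numpy as np
from math import gamma

# Weibull sketch of age-dependent extinction (ADE): the hazard
# h(t) = (k/L) * (t/L)**(k-1) is decreasing for shape k < 1 (risk
# highest in young species, the fossil-based result), constant for
# k = 1 (Van Valen's law), and increasing for k > 1 (the
# phylogeny-based result). Scale L and shapes are illustrative.

def weibull_hazard(t, shape, scale):
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def mean_longevity(shape, scale):
    # Mean of a Weibull distribution: L * Gamma(1 + 1/k)
    return scale * gamma(1.0 + 1.0 / shape)

ages = np.array([0.5, 1.0, 5.0, 10.0])       # species ages in myr
for k in (0.6, 1.0, 1.8):
    h = weibull_hazard(ages, k, 3.0)
    print(f"shape {k}: hazard at {ages} myr -> {np.round(h, 3)}; "
          f"mean longevity = {mean_longevity(k, 3.0):.1f} myr")
```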

  1. Comparison of density estimators. [Estimation of probability density functions

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.; Monahan, J.F.

    1977-09-01

    Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to change in their parameters, and to attempt to discover at what point a sample is so small that density estimation can no longer be worthwhile. (RWR)

  2. The Probability of Extinction of Infectious Salmon Anemia Virus in One and Two Patches.

    Science.gov (United States)

    Milliken, Evan

    2017-12-01

    Single-type and multitype branching processes have been used to study the dynamics of a variety of stochastic birth-death type phenomena in biology and physics. Their use in epidemiology goes back to Whittle's study of a susceptible-infected-recovered (SIR) model in the 1950s. In the case of an SIR model, the presence of only one infectious class allows for the use of single-type branching processes. Multitype branching processes allow for multiple infectious classes and have latterly been used to study metapopulation models of disease. In this article, we develop a continuous time Markov chain (CTMC) model of infectious salmon anemia virus in two patches, two CTMC models in one patch and companion multitype branching process (MTBP) models. The CTMC models are related to deterministic models which inform the choice of parameters. The probability of extinction is computed for the CTMC via numerical methods and approximated by the MTBP in the supercritical regime. The stochastic models are treated as toy models, and the parameter choices are made to highlight regions of the parameter space where CTMC and MTBP agree or disagree, without regard to biological significance. Partial extinction events are defined and their relevance discussed. A case is made for calculating the probability of such events, noting that MTBPs are not suitable for making these calculations.

  3. On the choice of statistical models for estimating occurrence and extinction from animal surveys

    Science.gov (United States)

    Dorazio, R.M.

    2007-01-01

    In surveys of natural animal populations the number of animals that are present and available to be detected at a sample location is often low, resulting in few or no detections. Low detection frequencies are especially common in surveys of imperiled species; however, the choice of sampling method and protocol also may influence the size of the population that is vulnerable to detection. In these circumstances, probabilities of animal occurrence and extinction will generally be estimated more accurately if the models used in data analysis account for differences in abundance among sample locations and for the dependence between site-specific abundance and detection. Simulation experiments are used to illustrate conditions wherein these types of models can be expected to outperform alternative estimators of population site occupancy and extinction. © 2007 by the Ecological Society of America.

  4. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities

  5. Estimating the ecology of extinct species with paleoecological data assimilation

    Science.gov (United States)

    Raiho, A.; McLachlan, J. S.; Dietze, M.

    2017-12-01

    In order to understand long-term, unobservable ecosystem processes, ecologists must use both paleoecological data and ecosystem models. Models parameterize species' competitive interactions using modern data, but modern ecological or physiological observations are not available for extinct species, making it difficult for models to conceptualize their ecology. For instance, American chestnut (Castanea dentata), which played a large role in the forests of the northeastern US, was driven to virtual extinction by disease. Since chestnut's demise, defining its ecology has been controversial. Models typically assume that chestnut's ecology was very similar to that of oak, and parameterize chestnut like the oak species. These assumptions are drawn from paleoecological data, but because such data are usually reported without uncertainty, they have never been directly assimilated into ecosystem models. We developed a Bayesian statistical model to estimate fractional composition from paleoecological data with uncertainty. We then assimilated this data product into an ecosystem model of long-term forest succession, using a generalized ensemble adjustment filter to determine which species demographic parameters lead to changes in species composition over the last 2,000 years at Harvard Forest. We found that chestnut was strongly negatively correlated with white pine (Pinus strobus) and red oak (Quercus rubra) in the process covariance matrix, suggesting a strong competitive interaction that is not currently captured by models of forest succession. These findings support using a data assimilation framework to interpret paleoecological data or data products ecologically and to learn about the ecology of extinct species.

  6. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  7. Estimation of Apollo Lunar Dust Transport using Optical Extinction Measurements

    Science.gov (United States)

    Lane, John E.; Metzger, Philip T.

    2015-04-01

    A technique to estimate mass erosion rate of surface soil during landing of the Apollo Lunar Module (LM) and total mass ejected due to the rocket plume interaction is proposed and tested. The erosion rate is proportional to the product of the second moment of the lofted particle size distribution N(D), and third moment of the normalized soil size distribution S(D), divided by the integral of S(D)·D²/v(D), where D is particle diameter and v(D) is the vertical component of particle velocity. The second moment of N(D) is estimated by optical extinction analysis of the Apollo cockpit video. Because of the similarity between mass erosion rate of soil as measured by optical extinction and rainfall rate as measured by radar reflectivity, traditional NWS radar/rainfall correlation methodology can be applied to the lunar soil case where various S(D) models are assumed corresponding to specific lunar sites.
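
    The relation stated in prose may be easier to read written out. The symbols are as defined in the abstract; the notation ṁ for the mass erosion rate and the moment operator M_n are added here for clarity, and the proportionality constant is omitted:

```latex
% Erosion-rate relation as stated in the abstract, with D the particle
% diameter, v(D) the vertical particle velocity, N(D) the lofted particle
% size distribution and S(D) the normalized soil size distribution:
\[
  \dot{m} \;\propto\; \frac{M_2[N]\, M_3[S]}
                           {\displaystyle\int \frac{S(D)\, D^2}{v(D)}\, dD},
  \qquad
  M_n[f] \equiv \int f(D)\, D^n\, dD .
\]
```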

  8. Adaptive estimation of binomial probabilities under misclassification

    NARCIS (Netherlands)

    Albers, Willem/Wim; Veldman, H.J.

    1984-01-01

    If misclassification occurs the standard binomial estimator is usually seriously biased. It is known that an improvement can be achieved by using more than one observer in classifying the sample elements. Here it will be investigated which number of observers is optimal given the total number of

  9. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  10. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
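
    One common formalization of this kind of account, consistent with the model described, has each remembered instance of an event read out with flip probability d, so the mean estimate becomes (1 − 2d)p + d, regressing toward 0.5. A simulation sketch under that assumption, with an illustrative noise level:

```python
import numpy as np

# Probability-theory-plus-noise sketch: an estimator counts remembered
# instances of an event, but each instance is read with flip probability
# d, giving E[estimate] = (1 - 2d) * p + d. The noise level d = 0.15 is
# an illustrative value, not a fitted parameter.

rng = np.random.default_rng(3)
d, n_events, n_reps = 0.15, 200, 20_000

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    events = rng.random((n_reps, n_events)) < p    # true occurrences
    flips = rng.random((n_reps, n_events)) < d     # noisy readout
    estimates = (events ^ flips).mean(axis=1)      # counts with flipped reads
    print(f"p = {p:.1f}: mean estimate = {estimates.mean():.3f}, "
          f"theory = {(1 - 2 * d) * p + d:.3f}")
```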

  11. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." © 2015 APA, all rights reserved.

  12. Accounting Fraud: an estimation of detection probability

    Directory of Open Access Journals (Sweden)

    Artur Filipe Ewald Wuerges

    2014-12-01

    Financial statement fraud (FSF) is costly for investors and can damage the credibility of the audit profession. To prevent and detect fraud, it is helpful to know its causes. The binary choice models (e.g., logit and probit) commonly used in the extant literature, however, fail to account for undetected cases of fraud and thus present unreliable hypothesis tests. Using a sample of 118 companies accused of fraud by the Securities and Exchange Commission (SEC), we estimated a logit model that corrects the problems arising from undetected frauds in U.S. companies. To avoid multicollinearity problems, we extracted seven factors from 28 variables using the principal factors method. Our results indicate that only 1.43 percent of the instances of FSF were publicized by the SEC. Of the six significant variables included in the traditional, uncorrected logit model, three were found to be actually non-significant in the corrected model. The likelihood of FSF is 5.12 times higher when the firm's auditor issues an adverse or qualified report.

  13. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.

  14. Internal Medicine residents use heuristics to estimate disease probability

    Directory of Open Access Journals (Sweden)

    Sen Phang

    2015-12-01

    Conclusions: Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.

  15. Fisher classifier and its probability of error estimation

    Science.gov (United States)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.

  16. The estimation of collision probabilities in complicated geometries

    International Nuclear Information System (INIS)

    Roth, M.J.

    1969-04-01

    This paper demonstrates how collision probabilities in complicated geometries may be estimated. It is assumed that the reactor core may be divided into a number of cells each with simple geometry so that a collision probability matrix can be calculated for each cell by standard methods. It is then shown how these may be joined together. (author)

  17. Estimated probability of the number of buildings damaged by the ...

    African Journals Online (AJOL)

    The analysis shows that the probability estimator of the building damage ... and homeowners) should reserve the cost of repair at least worth the risk of loss, to face ... Keywords: Citarum River; logistic regression; genetic algorithm; losses risk; ...

  18. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS is beneficial for collision prevention, especially in the case of inattentive drivers or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors

  20. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is one often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing and neural networks.

  1. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.

  2. Estimates of the magnitudes of major marine mass extinctions in earth history.

    Science.gov (United States)

    Stanley, Steven M

    2016-10-18

    Procedures introduced here make it possible, first, to show that background (piecemeal) extinction is recorded throughout geologic stages and substages (not all extinction has occurred suddenly at the ends of such intervals); second, to separate out background extinction from mass extinction for a major crisis in earth history; and third, to correct for clustering of extinctions when using the rarefaction method to estimate the percentage of species lost in a mass extinction. Also presented here is a method for estimating the magnitude of the Signor-Lipps effect, which is the incorrect assignment of extinctions that occurred during a crisis to an interval preceding the crisis because of the incompleteness of the fossil record. Estimates for the magnitudes of mass extinctions presented here are in most cases lower than those previously published. They indicate that only ∼81% of marine species died out in the great terminal Permian crisis, whereas levels of 90-96% have frequently been quoted in the literature. Calculations of the latter numbers were incorrectly based on combined data for the Middle and Late Permian mass extinctions. About 90 orders and more than 220 families of marine animals survived the terminal Permian crisis, and they embodied an enormous amount of morphological, physiological, and ecological diversity. Life did not nearly disappear at the end of the Permian, as has often been claimed.

  3. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  4. Estimating rates of local species extinction, colonization and turnover in animal communities

    Science.gov (United States)

    Nichols, James D.; Boulinier, T.; Hines, J.E.; Pollock, K.H.; Sauer, J.R.

    1998-01-01

    Species richness has been identified as a useful state variable for conservation and management purposes. Changes in richness over time provide a basis for predicting and evaluating community responses to management, to natural disturbance, and to changes in factors such as community composition (e.g., the removal of a keystone species). Probabilistic capture-recapture models have been used recently to estimate species richness from species count and presence-absence data. These models do not require the common assumption that all species are detected in sampling efforts. We extend this approach to the development of estimators useful for studying the vital rates responsible for changes in animal communities over time; rates of local species extinction, turnover, and colonization. Our approach to estimation is based on capture-recapture models for closed animal populations that permit heterogeneity in detection probabilities among the different species in the sampled community. We have developed a computer program, COMDYN, to compute many of these estimators and associated bootstrap variances. Analyses using data from the North American Breeding Bird Survey (BBS) suggested that the estimators performed reasonably well. We recommend estimators based on probabilistic modeling for future work on community responses to management efforts as well as on basic questions about community dynamics.
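
    A simple member of this family of estimators, though not necessarily the one COMDYN implements, is the first-order jackknife, which corrects observed richness upward using the count of species detected on exactly one occasion. A minimal sketch with toy detection histories:

```python
import numpy as np

# Illustration of richness estimation that does not assume all species
# are detected: the first-order jackknife, one member of the
# capture-recapture family (not necessarily COMDYN's exact estimator).
# Rows: sampling occasions; columns: species; entries: detected or not.

detections = np.array([
    [1, 1, 0, 1, 0, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 0, 0],
    [1, 1, 0, 1, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 0, 0, 1],
])
k = detections.shape[0]                  # number of sampling occasions
seen = detections.sum(axis=0)
s_obs = int((seen > 0).sum())            # species detected at least once
f1 = int((seen == 1).sum())              # species detected exactly once

s_jack1 = s_obs + f1 * (k - 1) / k       # first-order jackknife estimate
print(f"observed richness = {s_obs}, jackknife estimate = {s_jack1:.1f}")
```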

  5. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
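
    As a baseline against which the diversification effect can be compared, the classical linear birth-death process gives a closed-form extinction probability for a family without diversification: starting from n elements with per-element birth rate b and death rate d, it is min(1, d/b)^n. A sketch with illustrative rates (this is the textbook result, not the paper's full model):

```python
# Baseline without diversification: a family of MGEs following a linear
# birth-death process with per-element birth rate b and death rate d
# goes extinct with probability min(1, d/b)**n starting from n copies.
# The paper's model adds diversification on top of this; the rates
# below are illustrative.

def p_extinction(b, d, n):
    if b <= d:
        return 1.0          # subcritical or critical: extinction is certain
    return (d / b) ** n

for b, d, n in [(1.2, 1.0, 1), (1.2, 1.0, 5), (0.9, 1.0, 5)]:
    print(f"b={b}, d={d}, n={n}: extinction probability = "
          f"{p_extinction(b, d, n):.3f}")
```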

  6. The probability of reinforcement per trial affects posttrial responding and subsequent extinction but not within-trial responding.

    Science.gov (United States)

    Harris, Justin A; Kwok, Dorothy W S

    2018-01-01

    During magazine approach conditioning, rats do not discriminate between a conditional stimulus (CS) that is consistently reinforced with food and a CS that is occasionally (partially) reinforced, as long as the CSs have the same overall reinforcement rate per second. This implies that rats are indifferent to the probability of reinforcement per trial. However, in the same rats, the per-trial reinforcement rate will affect subsequent extinction: responding extinguishes more rapidly for a CS that was consistently reinforced than for a partially reinforced CS. Here, we trained rats with consistently and partially reinforced CSs that were matched for overall reinforcement rate per second. We measured conditioned responding both during and immediately after the CSs. Differences in the per-trial probability of reinforcement did not affect the acquisition of responding during the CS but did affect subsequent extinction of that responding, and also affected the post-CS response rates during conditioning. Indeed, CSs with the same probability of reinforcement per trial evoked the same amount of post-CS responding even when they differed in overall reinforcement rate and thus evoked different amounts of responding during the CS. We conclude that reinforcement rate per second controls rats' acquisition of responding during the CS, but at the same time, rats also learn specifically about the probability of reinforcement per trial. The latter learning affects the rats' expectation of reinforcement as an outcome of the trial, which influences their ability to detect retrospectively that an opportunity for reinforcement was missed, and, in turn, drives extinction. (PsycINFO Database Record © 2018 APA, all rights reserved).

  7. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events, by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
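
    The Poisson-Gamma machinery in miniature: a Gamma prior on the occurrence rate λ, a Poisson count of dated landslides in the observation window, and a closed-form posterior predictive for the chance of at least one event over a forecast horizon. The numbers below are illustrative, not the data from the study sites:

```python
# Poisson-Gamma sketch of the landslide-probability calculation: with a
# Gamma(a, b) prior on the occurrence rate (events per kyr) and n dated
# events in an observation window of T kyr, the posterior is
# Gamma(a + n, b + T). P(no events in the next t kyr), marginalizing
# over lambda, is (post_b / (post_b + t)) ** post_a.

a, b = 1.0, 1.0        # weakly informative prior
n, T = 4, 10.0         # four dated landslides in a 10 kyr record
t = 1.0                # forecast horizon, kyr

post_a, post_b = a + n, b + T
rate_mean = post_a / post_b

p_at_least_one = 1.0 - (post_b / (post_b + t)) ** post_a
print(f"posterior mean rate = {rate_mean:.2f} events/kyr")
print(f"P(>=1 landslide in next {t:g} kyr) = {p_at_least_one:.3f}")
```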

  9. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure when no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
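
    The closing approximation is easy to operationalize. A minimal sketch, assuming the estimator is used exactly as quoted (1/(2.5n) for zero events, with the MLE k/n otherwise):

```python
# Zero-failure estimate per the abstract's approximation: with n trials and
# no events, use ~1/(2.5 n); otherwise fall back to the MLE k/n.
def event_probability_estimate(failures: int, trials: int) -> float:
    if failures == 0:
        return 1.0 / (2.5 * trials)
    return failures / trials

for n in (10, 100, 1000):
    print(n, event_probability_estimate(0, n))  # 0.04, 0.004, 0.0004
```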

  10. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. Standard Monte Carlo (SMC) simulation may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy. [...] is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated [...]

  11. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods, such as importance sampling, importance splitting, or extreme value theory, have been proposed to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It yields a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems.

  12. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representativeness heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics versus Bayesian reasoning, or the possibility that residents in clinical practice use gist traces rather than precise probability estimates when diagnosing.
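
    For contrast, here is the normative Bayes update in odds form: a finding with likelihood ratio 1 (non-discriminating) should leave the post-test probability unchanged, unlike the heuristic-driven increases the study observed. The numbers are illustrative.

```python
# Normative post-test probability via Bayes' rule in odds form.
def post_test_probability(pre_test_p: float, likelihood_ratio: float) -> float:
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

print(post_test_probability(0.20, 1.0))  # 0.20 -- prototypical but non-discriminating feature
print(post_test_probability(0.20, 8.0))  # ~0.67 -- a genuinely discriminating finding
```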

  13. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised about it. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model so that it can be used for practical forensic genetics, and to stimulate further discussion. We also discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.

  14. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner, thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections, have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  15. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner, thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections, have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  16. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold.

  17. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared [...]

  18. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory. Many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the aim of enabling rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.

  19. Methods for estimating drought streamflow probabilities for Virginia streams

    Science.gov (United States)

    Austin, Samuel H.

    2014-01-01

    Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million daily streamflow values collected over the period of record (January 1, 1900, through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded 46,704 equations with statistically significant fit statistics and parameter ranges, published in two tables in this report. These model equations produce summer-month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.

  20. COMPARATIVE ANALYSIS OF ESTIMATION METHODS OF PHARMACY ORGANIZATION BANKRUPTCY PROBABILITY

    Directory of Open Access Journals (Sweden)

    V. L. Adzhienko

    2014-01-01

    The purpose of this study was to determine the probability of bankruptcy by various methods in order to predict a financial crisis in a pharmacy organization. The probability of pharmacy organization bankruptcy was estimated using W. Beaver's method as adopted in the Russian Federation, together with an integrated assessment of financial stability based on scoring analysis. The results obtained by the different methods are comparable and show that the risk of bankruptcy of the pharmacy organization is small.

  1. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates, and other considerations to estimate the pipe large-break frequency. This frequency represents the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given that a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given that a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of the several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results, and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break.

  2. On the extinction paradox, the finiteness of resources, and the nature of probability

    International Nuclear Information System (INIS)

    Munera, H.

    1991-01-01

    A talk by Lewins is discussed. Lewins addressed the subject of randomness in nature and some implications for nuclear reactors. In particular, Lewins described the extinction paradox: a critical reactor ''is self-sustaining in the mean, nevertheless it will shut down with certainty!'' According to Lewins, the paradox arises because the idealized chain process leaves out the fact that resources are always finite (i.e., the number of initial neutrons is finite, and the amount of fissile atoms in a reactor depends upon Avogadro's number). This explanation, however, only implies that the model used to reach the paradox is too naive to represent a real reactor (indeed, Lewins immediately describes more realistic models); the paradox still remains in the context of the idealized chain reaction. Two ways of explaining the paradox are considered. (author)

  3. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probability matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform, taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows better contact with other fields such as foraging, mate selection, neurobiology, and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
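
    A minimal sketch of the two-stage rule described above: a logistic Bayesian estimate of which of two behaviors is better, followed by probability matching rather than picking the argmax. The reliability values, counts, and logistic form are illustrative stand-ins for the paper's parameterization.

```python
# Two-stage decision rule: Bayesian estimation, then probability matching.
import numpy as np

rng = np.random.default_rng(0)

def choose_behavior(n_a: int, n_b: int, social_reliability: float,
                    personal_logodds: float) -> str:
    """n_a, n_b: other animals seen performing behaviors A and B;
    social_reliability (>1) is the weight given to each observed animal;
    personal_logodds encodes non-social information favoring A."""
    logodds = personal_logodds + (n_a - n_b) * np.log(social_reliability)
    p_a = 1.0 / (1.0 + np.exp(-logodds))   # Bayesian-estimated P(A is best)
    # Probability matching: choose A with probability p_a, not deterministically.
    return "A" if rng.random() < p_a else "B"

choices = [choose_behavior(n_a=3, n_b=1, social_reliability=2.0,
                           personal_logodds=0.0) for _ in range(1000)]
print(choices.count("A") / 1000)   # ~0.80 = 1 / (1 + 2**-2)
```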

  4. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  5. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary-to-secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the number of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected, and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  6. Estimating the exceedance probability of rain rate by logistic regression

    Science.gov (United States)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
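
    As a sketch of the approach described above, the fragment below fits a logistic regression for the conditional probability that rain rate exceeds a threshold given a covariate. The synthetic data and single covariate are placeholders for the radiometer observations and covariates in the study.

```python
# Exceedance probability via logistic regression on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
covariate = rng.gamma(shape=2.0, scale=1.5, size=500)          # e.g., area-averaged rain rate
true_logit = -3.0 + 1.2 * covariate
exceeds = rng.random(500) < 1.0 / (1.0 + np.exp(-true_logit))  # pixel exceeds threshold?

model = LogisticRegression().fit(covariate.reshape(-1, 1), exceeds)
p = model.predict_proba([[4.0]])[0, 1]   # P(exceedance | covariate = 4.0)
print(f"{p:.2f}")
```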

  7. Probability Density Estimation Using Neural Networks in Monte Carlo Calculations

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Cho, Jin Young; Song, Jae Seung; Kim, Chang Hyo

    2008-01-01

    The Monte Carlo neutronics analysis requires the capability to estimate a tally distribution, such as an axial power distribution or a flux gradient in a fuel rod. This problem can be regarded as estimating a probability density function from an observation set. We apply a neural-network-based density estimation method to an observation and sampling weight set produced by Monte Carlo calculations. The neural network method is compared with the histogram and the functional expansion tally methods for estimating a non-smooth density, a fission source distribution, and the gradient of the absorption rate in a burnable absorber rod. The application results show that the neural network method can approximate a tally distribution quite well. (authors)

  8. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated [...]

  9. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs). These HEPs are then used in probabilistic risk activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report is from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data or human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use.

  10. Estimation of the probability of success in petroleum exploration

    Science.gov (United States)

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum.

  11. Estimating Model Probabilities using Thermodynamic Markov Chain Monte Carlo Methods

    Science.gov (United States)

    Ye, M.; Liu, P.; Beerli, P.; Lu, D.; Hill, M. C.

    2014-12-01

    Markov chain Monte Carlo (MCMC) methods are widely used to evaluate model probability for quantifying model uncertainty. In a general procedure, MCMC simulations are first conducted for each individual model, and the MCMC parameter samples are then used to approximate the marginal likelihood of the model by calculating the geometric mean of the joint likelihood of the model and its parameters. It has been found that this geometric-mean method suffers from the numerical problem of a low convergence rate. A simple test case shows that even millions of MCMC samples are insufficient to yield an accurate estimate of the marginal likelihood. To resolve this problem, a thermodynamic method is used in which multiple MCMC runs are performed with different values of a heating coefficient between zero and one. When the heating coefficient is zero, the MCMC run is equivalent to a random walk MC in the prior parameter space; when the heating coefficient is one, the MCMC run is the conventional one. For a simple case with an analytical form of the marginal likelihood, the thermodynamic method yields a more accurate estimate than the method using the geometric mean. This is also demonstrated for a groundwater modeling case with four alternative models postulated based on different conceptualizations of a confining layer. This groundwater example shows that model probabilities estimated using the thermodynamic method are more reasonable than those obtained using the geometric method. The thermodynamic method is general and can be used for a wide range of environmental problems for model uncertainty quantification.
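
    A minimal sketch of the thermodynamic (power-posterior) idea on a conjugate toy model where the marginal likelihood is known in closed form. The temperature ladder, sample sizes, and the use of importance-reweighted prior draws in place of per-temperature MCMC runs are all simplifying assumptions for illustration.

```python
# Thermodynamic-integration estimate of a log marginal likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.normal(0.5, 1.0, size=20)   # y_i ~ N(mu, 1), true mu = 0.5
prior = stats.norm(0.0, 1.0)           # prior: mu ~ N(0, 1)

def log_lik(mu):
    """Vectorized log-likelihood of the data for an array of mu values."""
    return stats.norm.logpdf(data[:, None], loc=mu, scale=1.0).sum(axis=0)

# Power posterior at heating coefficient t is proportional to prior * L**t;
# log Z = integral over t in [0, 1] of E_t[log L]. Each E_t[log L] is
# approximated by importance-reweighting prior draws (a cheap stand-in for
# the per-temperature MCMC runs the abstract describes).
temps = np.linspace(0.0, 1.0, 21)
mu_draws = prior.rvs(size=20000, random_state=rng)
ll = log_lik(mu_draws)
expect = []
for t in temps:
    logw = t * ll
    w = np.exp(logw - logw.max())          # stabilized weights ~ L**t
    expect.append(np.average(ll, weights=w))

# Trapezoidal rule over the temperature ladder:
log_Z = sum(0.5 * (e0 + e1) * (t1 - t0)
            for e0, e1, t0, t1 in zip(expect, expect[1:], temps, temps[1:]))

# Closed-form marginal likelihood for this conjugate model, for comparison:
n, S, Q = len(data), data.sum(), (data ** 2).sum()
exact = -0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1) - 0.5 * (Q - S ** 2 / (n + 1))
print(f"thermodynamic estimate: {log_Z:.2f}   exact: {exact:.2f}")
```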

  12. Estimating the extinction date of the thylacine with mixed certainty data.

    Science.gov (United States)

    Carlson, Colin J; Bond, Alexander L; Burgio, Kevin R

    2018-04-01

    The thylacine (Thylacinus cynocephalus), one of Australia's most characteristic megafauna, was the largest marsupial carnivore until hunting, and potentially disease, drove it to extinction in 1936. Although thylacines were restricted to Tasmania for 2 millennia prior to their extinction, recent so-called plausible sightings on the Cape York Peninsula in northern Queensland have emerged, leading some to speculate the species may have persisted undetected. We compiled a data set that included physical evidence, expert-validated sightings, and unconfirmed sightings up to the present day and implemented a range of extinction models (focusing on a Bayesian approach that incorporates all 3 types of data by modeling valid and invalid sightings as independent processes) to evaluate the likelihood of the thylacine's persistence. Although the last captive individual died in September 1936, our results suggested that the most likely extinction date would be 1940. Our other extinction models estimated the thylacine's extinction date between 1936 and 1943, and the most optimistic scenario indicated that the species did not persist beyond 1956. The search for the thylacine, much like similar efforts to rediscover other recently extinct charismatic taxa, is likely to be fruitless, especially given that persistence on Tasmania would have been no guarantee the species could reappear in regions that had been unoccupied for millennia. The search for the thylacine may become a rallying point for conservation and wildlife biology and could indirectly help fund and support critical research in understudied areas such as Cape York. However, our results suggest that attempts to rediscover the thylacine will be unsuccessful and that the continued survival of the thylacine is entirely implausible based on most current mathematical theories of extinction. © 2017 Society for Conservation Biology.
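
    For intuition, here is a much simpler sighting-record test than the paper's mixed-certainty Bayesian model: Solow's stationary-Poisson result, under which the probability that a persisting species yields no sightings after its last record is (t_n / T)^n. The sighting years below are illustrative stand-ins for the physical-evidence record, not the study's dataset.

```python
# Solow-style persistence probability from a sighting record (illustrative).
def solow_p_value(sighting_years, query_year):
    t0 = min(sighting_years)            # start of the observation window
    t_n = max(sighting_years) - t0      # time of the last sighting
    T = query_year - t0                 # length of the whole window
    n = len(sighting_years)
    return (t_n / T) ** n               # P(no sighting after t_n | extant)

sightings = [1910, 1914, 1917, 1921, 1924, 1928, 1930, 1933, 1936]  # assumed
for year in (1940, 1956, 2018):
    print(year, round(solow_p_value(sightings, year), 6))
```

    Consistent with the abstract, persistence remains plausible only a few years past the last confirmed record and becomes vanishingly unlikely by the present day.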

  13. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Our development of a Fast Mutual Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical domains such as bioinformatics, signal processing, and econometrics.
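
    The flavor of point-adaptive smoothing can be sketched with a generic k-nearest-neighbor balloon estimator: the bandwidth at each evaluation point scales with the distance to its k-th nearest sample, so sparse regions get more smoothing. This is a stand-in illustration, not the BADE algorithm itself.

```python
# Generic k-NN adaptive (balloon) kernel density estimate in 1-D.
import numpy as np

def adaptive_kde(samples: np.ndarray, grid: np.ndarray, k: int = 20) -> np.ndarray:
    n = len(samples)
    dists = np.abs(grid[:, None] - samples[None, :])     # |x - x_i|
    h = np.sort(dists, axis=1)[:, k]                     # local bandwidth per grid point
    kernels = np.exp(-0.5 * (dists / h[:, None]) ** 2)   # Gaussian kernels
    return kernels.sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(3)
samples = np.concatenate([rng.normal(0, 1, 900), rng.normal(6, 0.3, 100)])  # uneven sample
grid = np.linspace(-4, 8, 200)
density = adaptive_kde(samples, grid)
print(f"total mass ~ {(density * (grid[1] - grid[0])).sum():.2f}")  # roughly 1
```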

  14. ESTIMATING SPECIATION AND EXTINCTION RATES FROM DIVERSITY DATA AND THE FOSSIL RECORD

    NARCIS (Netherlands)

    Etienne, Rampal S.; Apol, M. Emile F.

    Understanding the processes that underlie biodiversity requires insight into the evolutionary history of the taxa involved. Accurate estimation of speciation, extinction, and diversification rates is a prerequisite for gaining this insight. Here, we develop a stochastic birth-death model of [...]

  15. Site Specific Probable Maximum Precipitation Estimates and Professional Judgement

    Science.gov (United States)

    Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.

    2015-12-01

    State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized due to their limitations on basin size, questionable applicability in regions affected by orographic effects, their lack of consistent methods, and generally by their age. HMR-51, which provides generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on site-specific PMP estimates that have been commercially developed. As such, NRC has recently investigated key areas of expert judgement via a generic audit and one in-depth site-specific review as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of the HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment developed for the Electric Power Research Institute by North American Weather Consultants in 1993 in order to harmonize historic storms, for which only 12-hour dew point data were available, with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially [...]

  16. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, the validity of the results can be ensured by collecting additional data through probability sampling. The precision of the π estimator that uses the probability sample can be [...]

  17. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on user-provided fault trees and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on user-provided fault trees and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  19. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Chrisos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of the many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been [...]

  20. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required by other techniques. In the first scheme, the prior is obtained through a true frequency-based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique.

  1. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    Our objective is to show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss the Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced by the Kaplan-Meier method.
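
    The overestimation is easy to reproduce by simulation. The sketch below, with illustrative hazard rates, compares the Kaplan-Meier complement (deaths treated as censoring) against the multiple-decrement (cumulative incidence) estimate and the analytic truth:

```python
# Competing risks: 1 - Kaplan-Meier overestimates the probability of
# rejection when deaths are treated as censoring; the multiple-decrement
# (cumulative incidence) estimator does not. Rates are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
t_reject = rng.exponential(1 / 0.10, n)   # rejection hazard 0.10 per year
t_death = rng.exponential(1 / 0.05, n)    # competing death hazard 0.05 per year
time = np.minimum(t_reject, t_death)
rejected = t_reject < t_death

horizon = 5.0
# Kaplan-Meier complement with deaths censored (all event times distinct,
# no other censoring, so the product-limit estimate is a simple cumprod):
order = np.argsort(time)
time_o, rej_o = time[order], rejected[order]
at_risk = n - np.arange(n)
surv = np.cumprod(np.where(rej_o, 1.0 - 1.0 / at_risk, 1.0))
km = 1.0 - surv[np.searchsorted(time_o, horizon) - 1]

# Multiple-decrement estimate: fraction with a rejection observed by t.
md = np.mean((time <= horizon) & rejected)

lam_r, lam_d = 0.10, 0.05
truth = lam_r / (lam_r + lam_d) * (1.0 - np.exp(-(lam_r + lam_d) * horizon))
print(f"1-KM: {km:.3f}   multiple-decrement: {md:.3f}   true: {truth:.3f}")
```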

  2. Estimation of Apollo lunar dust transport using optical extinction measurements

    OpenAIRE

    Lane, John E.; Metzger, Philip T.

    2015-01-01

    A technique to estimate the mass erosion rate of surface soil during landing of the Apollo Lunar Module (LM), and the total mass ejected due to the rocket plume interaction, is proposed and tested. The erosion rate is proportional to the product of the second moment of the lofted particle size distribution N(D) and the third moment of the normalized soil size distribution S(D), divided by the integral of S(D)D^2/v(D), where D is particle diameter and v(D) is the vertical component of particle velocity. [...]

  3. Estimating Stellar Parameters and Interstellar Extinction from Evolutionary Tracks

    Directory of Open Access Journals (Sweden)

    Sichevsky S.

    2016-03-01

    Developing methods for analyzing and extracting information from modern sky surveys is a challenging task in astrophysical studies. We study the possibilities of parameterizing stars and the interstellar medium from multicolor photometry performed in three modern photometric surveys: GALEX, SDSS, and 2MASS. For this purpose, we have developed a method to estimate stellar radius from effective temperature and gravity with the help of evolutionary tracks and model stellar atmospheres. In accordance with the evolution rate at every point of the evolutionary track, the star formation rate, and the initial mass function, a weight is assigned to the resulting value of the radius, which allows us to estimate the radius more accurately. The method is verified for the most populated areas of the Hertzsprung-Russell diagram, main-sequence stars and red giants, and it was found to be rather precise (for main-sequence stars, the average relative error of the radius and its standard deviation are 0.03% and 3.87%, respectively).

  4. Selection of anchor values for human error probability estimation

    International Nuclear Information System (INIS)

    Buffardi, L.C.; Fleishman, E.A.; Allen, J.A.

    1989-01-01

    There is a need for more dependable information to assist in the prediction of human errors in nuclear power environments. The major objective of the current project is to establish guidelines for using error probabilities from other task settings to estimate errors in the nuclear environment. This involves: (1) identifying critical nuclear tasks, (2) discovering similar tasks in non-nuclear environments, (3) finding error data for non-nuclear tasks, and (4) establishing error-rate values for the nuclear tasks based on the non-nuclear data. A key feature is the application of a classification system to nuclear and non-nuclear tasks to evaluate their similarities and differences in order to provide a basis for generalizing human error estimates across tasks. During the first eight months of the project, several classification systems have been applied to a sample of nuclear tasks. They are discussed in terms of their potential for establishing task equivalence and the transferability of human error rates across situations.

  5. The estimation of probable maximum precipitation: the case of Catalonia.

    Science.gov (United States)

    Casas, M Carmen; Rodríguez, Raül; Nieto, Raquel; Redaño, Angel

    2008-12-01

    A brief overview of the different techniques used to estimate the probable maximum precipitation (PMP) is presented. As a particular case, the 1-day PMP over Catalonia has been calculated and mapped with a high spatial resolution. For this purpose, the annual maximum daily rainfall series from 145 pluviometric stations of the Instituto Nacional de Meteorología (Spanish Weather Service) in Catalonia have been analyzed. In order to obtain values of PMP, an enveloping frequency factor curve based on the actual rainfall data of stations in the region has been developed. This enveloping curve has been used to estimate 1-day PMP values for all 145 stations. Applying the Cressman method, a spatial analysis of these values has been achieved. Monthly precipitation climatological data, obtained from the application of Geographic Information Systems techniques, have been used as the initial field for the analysis. The 1-day PMP at 1 km² spatial resolution over Catalonia has been objectively determined, varying from 200 to 550 mm. Structures with wavelengths longer than approximately 35 km can be identified and, despite their general concordance, the obtained 1-day PMP spatial distribution shows remarkable differences compared to the annual mean precipitation arrangement over Catalonia.
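
    Frequency-factor PMP estimates of this kind are typically of the Hershfield form PMP = mean + k_m · std of the annual-maximum daily series, with k_m read from an enveloping curve. A minimal sketch, with an invented series and an invented enveloping factor (not the Catalonia values):

```python
# Hershfield-type frequency-factor PMP estimate (illustrative numbers).
import numpy as np

annual_max_daily = np.array([48., 55., 62., 71., 80., 95., 60., 52., 110., 67.])  # mm
k_m = 15.0   # enveloping frequency factor (assumed; station/curve dependent)

pmp_1day = annual_max_daily.mean() + k_m * annual_max_daily.std(ddof=1)
print(f"1-day PMP ~ {pmp_1day:.0f} mm")
```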

  6. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  7. Structural health monitoring and probability of detection estimation

    Science.gov (United States)

    Forsyth, David S.

    2016-02-01

    Structural health monitoring (SHM) methods are often based on nondestructive testing (NDT) sensors and are often proposed as replacements for NDT to lower cost and/or improve reliability. In order to take advantage of SHM for life cycle management, it is necessary to determine the Probability of Detection (POD) of the SHM system just as for traditional NDT to ensure that the required level of safety is maintained. Many different possibilities exist for SHM systems, but one of the attractive features of SHM versus NDT is the ability to take measurements very simply after the SHM system is installed. Using a simple statistical model of POD, some authors have proposed that very high rates of SHM system data sampling can result in high effective POD even in situations where an individual test has low POD. In this paper, we discuss the theoretical basis for determining the effect of repeated inspections, and examine data from SHM experiments against this framework to show how the effective POD from multiple tests can be estimated.
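
    The simple repeated-measurement model alluded to above: if single measurements detected a flaw independently with probability p, then n measurements would detect it with probability 1 - (1 - p)^n. This is the optimistic assumption the paper examines against experimental data; the numbers below are illustrative.

```python
# Effective POD from n nominally independent measurements (illustrative).
def effective_pod(p_single: float, n_tests: int) -> float:
    return 1.0 - (1.0 - p_single) ** n_tests

print(f"{effective_pod(0.30, 10):.3f}")   # ~0.972 from ten low-POD measurements
```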

  8. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided, along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made.

  9. Extinction Correction Significantly Influences the Estimate of the Lyα Escape Fraction

    Science.gov (United States)

    An, Fang Xia; Zheng, Xian Zhong; Hao, Cai-Na; Huang, Jia-Sheng; Xia, Xiao-Yang

    2017-02-01

    The Lyα escape fraction is a key measure for constraining the neutral state of the intergalactic medium and thus for understanding how the universe was fully reionized. We combine deep narrowband imaging data from the custom-made filter NB393 and the H₂S(1) filter centered at 2.14 μm to examine the Lyα emitters and Hα emitters at the same redshift z = 2.24. The combination of these two populations allows us to determine the Lyα escape fraction at z = 2.24. Over an area of 383 arcmin² in the Extended Chandra Deep Field South (ECDFS), 124 Lyα emitters are detected down to NB393 = 26.4 mag at the 5σ level, and 56 Hα emitters come from An et al. Of these, four have both Lyα and Hα emission (LAHAEs). We also collect the Lyα emitters and Hα emitters at z = 2.24 in the COSMOS field from the literature, increasing the number of LAHAEs to 15 in total. About one-third of them are AGNs. We measure the individual/volumetric Lyα escape fraction by comparing the observed Lyα luminosity/luminosity density to the extinction-corrected Hα luminosity/luminosity density. We revisit the extinction correction for Hα emitters using the Galactic extinction law with the color excess for nebular emission. We also adopt the Calzetti extinction law, together with an identical color excess for stellar and nebular regions, to explore how uncertainties in the extinction correction affect the estimate of individual and global Lyα escape fractions. In both cases, an anti-correlation between the Lyα escape fraction and dust attenuation is found among the LAHAEs, suggesting that dust absorption is responsible for the suppression of the escaping Lyα photons. However, the estimated Lyα escape fraction of individual LAHAEs varies by up to ~3 percentage points between the two methods of extinction correction. We find the global Lyα escape fraction at z = 2.24 to be (3.7 ± 1.4)% in the ECDFS. The variation in the color excess of the extinction causes a discrepancy of ~1 percentage point.

  10. Dental age estimation: the role of probability estimates at the 10 year threshold.

    Science.gov (United States)

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18 year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple-to-use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10 year threshold were reused to estimate the probability of developing teeth being above or below the 10 year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age compared. The dental age was compared with the chronological age to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10 year threshold. The probability estimates correctly identified children as above or below the threshold on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over-10 group. This study indicates the very high accuracy of assignment at the 10 year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
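
    The calculation itself is a normal tail area per tooth, averaged across the assessed teeth. A minimal sketch, with invented stage means and standard deviations (the study's reference data are not reproduced here); Python's NormalDist plays the role of Excel's NORMDIST:

```python
# Per-tooth P(age > 10) as a normal tail area, averaged over assessed teeth.
from statistics import NormalDist

# (mean age, SD) for the observed development stage of each tooth -- invented
tooth_stage_age = [(9.4, 0.8), (10.1, 0.9), (9.8, 0.7), (10.4, 1.0)]

p_over_10 = [1 - NormalDist(mu, sd).cdf(10.0) for mu, sd in tooth_stage_age]
subject_p = sum(p_over_10) / len(p_over_10)
print(f"P(subject older than 10) = {subject_p:.2f}")
```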

  11. First hitting probabilities for semi markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain...

  12. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears.

    Science.gov (United States)

    Laufenberg, Jared S; Clark, Joseph D; Chandler, Richard B

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016, and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture-mark-recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that annual apparent survival rates for adult females averaged over 5 years ([Formula: see text]) was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when [Formula: see text], suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and was cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.
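
    The threshold-finding step can be sketched generically: simulate persistence outcomes as a function of a demographic rate, fit a one-split classification tree, and read the split point off as the monitoring threshold. The toy persistence model and all numbers below are placeholders for the study's stochastic population projections.

```python
# Reading an extinction threshold off a one-split classification tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
survival = rng.uniform(0.85, 1.0, 5000)   # candidate mean adult-female survival rates
# Toy model: persistence probability rises steeply around survival ~ 0.93.
p_persist = 1.0 / (1.0 + np.exp(-120.0 * (survival - 0.93)))
persists = rng.random(5000) < p_persist

tree = DecisionTreeClassifier(max_depth=1).fit(survival.reshape(-1, 1), persists)
threshold = tree.tree_.threshold[0]       # split point of the root node
print(f"monitoring threshold: survival >= {threshold:.3f}")
```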

  13. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears.

    Directory of Open Access Journals (Sweden)

    Jared S Laufenberg

    Full Text Available Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016 and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture mark recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that annual apparent survival rates for adult females averaged over 5 years ([Formula: see text]) was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when [Formula: see text], suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.

  14. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears

    Science.gov (United States)

    Laufenberg, Jared S.; Clark, Joseph D.; Chandler, Richard B.

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016 and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture mark recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that annual apparent survival rates for adult females averaged over 5 years ([Formula: see text]) was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when [Formula: see text], suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.

  15. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials (ASTM) strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation
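
    The two-parameter Weibull form behind such risk-of-rupture calculations is compact enough to show directly. In the sketch below the characteristic strength and Weibull modulus are hypothetical round numbers, not values from the ASTM specification.

```python
import numpy as np

def weibull_fracture_probability(stress, sigma_0, m, volume_ratio=1.0):
    """Two-parameter Weibull risk of rupture:
    P_f = 1 - exp(-(V/V0) * (stress/sigma_0)**m)."""
    return 1.0 - np.exp(-volume_ratio * (stress / sigma_0) ** m)

# Hypothetical characteristic strength (MPa) and Weibull modulus for a
# graphite grade; real values come from the material specification.
sigma_0, m = 20.0, 10.0

for s in (5.0, 10.0, 15.0):
    pf = weibull_fracture_probability(s, sigma_0, m)
    print(f"service stress {s:4.1f} MPa: P_f = {pf:.2e}, "
          f"reliability = {1.0 - pf:.6f}")
```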

  16. Average probability of failure on demand estimation for burner

    African Journals Online (AJOL)

    Pij – probability of transition from state i to state j. ... the numerical value of the PFD as a result of components, sub-system ... ignored in probabilistic risk assessment it may lead to ... Markov chains for a holistic modeling of SIS.

  17. Estimated probability of stroke among medical outpatients in Enugu ...

    African Journals Online (AJOL)

    Risk factors for stroke were evaluated using a series of laboratory tests, medical history and physical examinations. The 10‑year probability of stroke was determined by applying the Framingham stroke risk equation. Statistical analysis was performed with the use of the SPSS 17.0 software package (SPSS Inc., Chicago, IL, ...

  18. Naive Probability: Model-based Estimates of Unique Events

    Science.gov (United States)

    2014-05-04

    [Only fragments of this report survive extraction: reference entries (e.g., Khemlani, S., & Johnson-Laird, P.N. (2012b). Theories of the syllogism: A meta-analysis) and sample probability-judgment items of the form "What is the probability that space tourism will achieve widespread popularity in the next 50 years?", "... that governments dedicate more resources to contacting extra-terrestrials?", and "... that the United States adopts an open border policy of universal acceptance?"]

  19. Estimating the Probability of Wind Ramping Events: A Data-driven Approach

    OpenAIRE

    Wang, Cheng; Wei, Wei; Wang, Jianhui; Qiu, Feng

    2016-01-01

    This letter proposes a data-driven method for estimating the probability of wind ramping events without exploiting the exact probability distribution function (PDF) of wind power. Actual wind data validates the proposed method.
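
    One distribution-free reading of this idea: with enough measured data, the ramp-event probability can be taken as an empirical frequency, with no PDF assumed. In the sketch below the series, window, and threshold are all invented stand-ins for actual wind farm data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for measured wind power (per-unit, hourly); a real study
# would use actual wind farm output data.
t = np.linspace(0.0, 60.0, 5000)
wind_power = np.clip(0.5 + 0.4 * np.sin(t)
                     + rng.normal(0.0, 0.05, t.size), 0.0, 1.0)

ramp_threshold = 0.15  # per-unit change that defines a ramp event
window = 4             # hours over which the change is measured

# A ramp occurs when power changes by more than the threshold within the
# window; its probability is simply the empirical event frequency.
delta = wind_power[window:] - wind_power[:-window]
p_ramp = np.mean(np.abs(delta) > ramp_threshold)
print(f"estimated ramp probability: {p_ramp:.4f}")
```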

  20. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
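
    A bare-bones version of the pairwise definition, P(higher risk | shorter survival) with risk ties counted as one half, can be written in a few lines. This sketch ignores censoring (the paper's estimators handle it via inverse probability weighting) and uses simulated discrete risk groups.

```python
import itertools
import numpy as np

def concordance_discrete(risk, time):
    """Pairwise concordance for discrete risk groups with fully observed
    survival times; pairs tied on risk count as one half."""
    num = den = 0.0
    for i, j in itertools.combinations(range(len(time)), 2):
        if time[i] == time[j]:
            continue
        lo, hi = (i, j) if time[i] < time[j] else (j, i)  # lo dies first
        den += 1
        if risk[lo] > risk[hi]:
            num += 1
        elif risk[lo] == risk[hi]:
            num += 0.5
    return num / den

rng = np.random.default_rng(2)
risk = rng.integers(1, 4, size=200)   # three discrete risk groups
time = rng.exponential(1.0 / risk)    # higher risk, shorter survival
print("concordance:", round(concordance_discrete(risk, time), 3))
```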

  1. Improved Variable Window Kernel Estimates of Probability Densities

    OpenAIRE

    Hall, Peter; Hu, Tien Chung; Marron, J. S.

    1995-01-01

    Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...

  2. Sharp probability estimates for Shor's order-finding algorithm

    OpenAIRE

    Bourdon, P. S.; Williams, H. T.

    2006-01-01

    Let N be a large positive integer, let b > 1 be an integer relatively prime to N, and let r be the order of b modulo N. Finally, let QC be a quantum computer whose input register has the size specified in Shor's original description of his order-finding algorithm. We prove that when Shor's algorithm is implemented on QC, the probability P of obtaining a (nontrivial) divisor of r exceeds 0.7 whenever N exceeds 2^{11}-1 and r exceeds 39, and we establish that 0.7736 is an asymptotic lower...

  3. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
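
    The consequence this record describes is easy to reproduce numerically: draw lognormal data, then estimate a high percentile as if the data were normal. The shape parameter and sample sizes below are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(3)
shape = 0.6                          # lognormal shape (sigma of log-values)
true_p95 = lognorm.ppf(0.95, shape)  # true 95th percentile

# Draw lognormal samples but estimate P95 assuming normality.
errors = []
for _ in range(1000):
    x = lognorm.rvs(shape, size=100, random_state=rng)
    p95_normal = x.mean() + norm.ppf(0.95) * x.std(ddof=1)
    errors.append(p95_normal - true_p95)

print(f"true P95 = {true_p95:.3f}, "
      f"mean bias under normality = {np.mean(errors):+.3f}")
```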

  4. METAPHOR: Probability density estimation for machine learning based photometric redshifts

    Science.gov (United States)

    Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-06-01

    We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results of a validation test of the workflow on the galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template fitting method (Le Phare).

  5. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  6. Fusion probability and survivability in estimates of heaviest nuclei production

    International Nuclear Information System (INIS)

    Sagaidak, Roman

    2012-01-01

    A number of theoretical models have been recently developed to predict production cross sections for the heaviest nuclei in fusion-evaporation reactions. All the models reproduce cross sections obtained in experiments quite well. At the same time they give fusion probability values P_fus ≡ P_CN that differ by several orders of magnitude. This difference implies a corresponding distinction in the calculated values of survivability. The production of the heaviest nuclei (from Cm to the region of superheavy elements (SHE) close to Z = 114 and N = 184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing (fusion) model coupled with the standard statistical model (SSM) of the compound nucleus (CN) decay. Both models are incorporated into the HIVAP code. Available data on the excitation functions for fission and evaporation residues (ER) produced in very asymmetric combinations can be described rather well within the framework of HIVAP. Cross-section data obtained in these reactions allow one to choose model parameters quite definitively. Thus one can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the evaporation-fission cascade. In less asymmetric combinations (with 22Ne and heavier projectiles) effects of fusion suppression caused by quasi-fission are starting to appear in the entrance channel of reactions. The P_fus values derived from the capture-fission and fusion-fission cross sections obtained at energies above the Bass barrier were plotted as a function of the Coulomb parameter. For more symmetric combinations one can deduce the P_fus values semi-empirically, using the ER and fission excitation functions measured in experiments, and applying the SSM with parameters obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN, as was done for reactions leading to the formation of pre-actinide nuclei

  7. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
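
    Literature-derived comparison values in a study like this come from Bayes' theorem in odds form: posterior odds = prior odds × likelihood ratio. A minimal sketch, with an invented pretest probability and likelihood ratio:

```python
def post_test_probability(pretest_p, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds * LR."""
    prior_odds = pretest_p / (1.0 - pretest_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical vignette: pretest probability 10%, positive test with LR+ = 8.
print(f"post-test probability: {post_test_probability(0.10, 8.0):.2f}")  # ~0.47
```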

  8. Fusion probability and survivability in estimates of heaviest nuclei production

    Directory of Open Access Journals (Sweden)

    Sagaidak Roman N.

    2012-02-01

    Full Text Available Production of the heavy and heaviest nuclei (from Po to the region of superheavy elements close to Z=114 and N=184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing model coupled with the statistical model (SM) of de-excitation of a compound nucleus (CN). Excitation functions for fission and evaporation residues (ER) measured in very asymmetric combinations can be described rather well. One can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the calculation of survivability with SM. In less asymmetric combinations, effects of fusion suppression caused by quasi-fission (QF) are starting to appear in the entrance channel of reactions. QF effects could be semi-empirically taken into account using fusion probabilities deduced as the ratio of measured ER cross sections to the ones obtained in the assumption of absence of the fusion suppression in corresponding reactions. SM parameters (fission barriers) obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN should be used for this evaluation.

  9. Impact of alternative metrics on estimates of extent of occurrence for extinction risk assessment.

    Science.gov (United States)

    Joppa, Lucas N; Butchart, Stuart H M; Hoffmann, Michael; Bachman, Steve P; Akçakaya, H Resit; Moat, Justin F; Böhm, Monika; Holland, Robert A; Newton, Adrian; Polidoro, Beth; Hughes, Adrian

    2016-04-01

    In International Union for Conservation of Nature (IUCN) Red List assessments, extent of occurrence (EOO) is a key measure of extinction risk. However, the way assessors estimate EOO from maps of species' distributions is inconsistent among assessments of different species and among major taxonomic groups. Assessors often estimate EOO from the area of mapped distribution, but these maps often exclude areas that are not habitat in idiosyncratic ways and are not created at the same spatial resolutions. We assessed the impact on extinction risk categories of applying different methods (minimum convex polygon, alpha hull) for estimating EOO for 21,763 species of mammals, birds, and amphibians. Overall, the percentage of threatened species requiring downlisting to a lower category of threat (taking into account other Red List criteria under which they qualified) spanned 11-13% for all species combined (14-15% for mammals, 7-8% for birds, and 12-15% for amphibians). These downlistings resulted from larger estimates of EOO and depended on the EOO calculation method. Using birds as an example, we found that 14% of threatened and near threatened species could require downlisting based on the minimum convex polygon (MCP) approach, an approach that is now recommended by IUCN. Other metrics (such as alpha hull) had marginally smaller impacts. Our results suggest that uniformly applying the MCP approach may lead to a one-time downlisting of hundreds of species but ultimately ensure consistency across assessments and realign the calculation of EOO with the theoretical basis on which the metric was founded. © 2015 Society for Conservation Biology.
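
    The MCP metric itself is just the area of the convex hull of the occurrence points. A minimal sketch with invented coordinates (real assessments project longitude/latitude to an equal-area system first):

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(4)
# Hypothetical occurrence points (x, y in km on an equal-area projection).
points = rng.uniform(0, 500, size=(40, 2))

# Minimum convex polygon (MCP): EOO is the area of the convex hull.
# Note: for 2-D input, ConvexHull.volume is the area; .area is the perimeter.
hull = ConvexHull(points)
print(f"EOO (minimum convex polygon): {hull.volume:.0f} km^2")
```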

  10. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-mega-watt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  11. Use of probabilistic methods for estimating failure probabilities and directing ISI-efforts

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, F; Brickstad, B [University of Uppsala (Sweden)]

    1988-12-31

    Some general aspects of the role of Non-Destructive Testing (NDT) efforts on the resulting probability of core damage are discussed. A simple model for the estimation of the pipe break probability due to IGSCC is discussed. It is partly based on analytical procedures, partly on service experience from the Swedish BWR program. Estimates of the break probabilities indicate that further studies are urgently needed. It is found that the uncertainties about the initial crack configuration are large contributors to the total uncertainty. Some effects of the in-service inspection are studied and it is found that the detection probabilities influence the failure probabilities. (authors).

  12. Artificial neural networks can learn to estimate extinction rates from molecular phylogenies

    NARCIS (Netherlands)

    Bokma, Folmer

    2006-01-01

    Molecular phylogenies typically consist of only extant species, yet they allow inference of past rates of extinction, because recently originated species are less likely to be extinct than ancient species. Despite the simple structure of the assumed underlying speciation-extinction process,

  13. Does adaptive strategy for delayed seed dispersion affect extinction probability of a desert species? an assessment using the population viability analysis and glass house experiment

    Directory of Open Access Journals (Sweden)

    Manish Mathur

    2014-10-01

    Full Text Available The canopy seed bank is an important adaptive evolutionary trait that provides various types of protection to the seeds. However, the cost of such an evolutionary trait to plant survival is largely unknown. The present investigation provides new insight into the serotinous habit of Blepharis sindica in relation to its endangerment status. Extinction probabilities of the two available populations of B. sindica were quantified using two types of census data, i.e., fruiting body number and actual population size. Population Viability Analysis (PVA) revealed that the delayed seed release tendency (higher fruiting body number) was not synchronized with actual ground conditions (lower population size). PVA based on actual population size indicated that both of the available populations would vanish within 20 years. The mean time to extinction calculated from both types of census data indicated extinction within 48 years. For assessing the conservation criteria, a glass house experiment was carried out with different soil types and compositions. Pure sand and higher proportions of sand-silt were more suitable compared to clay; further, a gravelly surface was the most unsuitable habitat for this species. Collection of the seeds from mature fruits/capsules and their sowing with moderate moisture availability in sandy soil could be recommended.

  14. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower

  15. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional

  16. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    Science.gov (United States)

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.

  17. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation based on Markov chain Monte Carlo was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. The probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
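
    The core idea, writing a small failure probability as a product of larger conditional probabilities with MCMC repopulating each intermediate level, can be sketched compactly. A minimal illustration for a standard normal input and a toy limit-state function (not the AP1000 model):

```python
import numpy as np
from scipy.stats import norm

def subset_simulation(g, dim, b, n=2000, p0=0.1, step=1.0, rng=None):
    """Estimate the rare-event probability P(g(X) > b) for X ~ N(0, I)
    as a product of conditional probabilities (a minimal sketch of
    subset simulation with a Metropolis conditional sampler)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.standard_normal((n, dim))
    gx = np.apply_along_axis(g, 1, x)
    p_f = 1.0
    for _ in range(30):                      # cap on the number of levels
        b_lvl = np.quantile(gx, 1.0 - p0)    # adaptive intermediate threshold
        if b_lvl >= b:
            return p_f * np.mean(gx > b)     # final level reaches the target
        p_f *= p0
        seeds = x[gx > b_lvl]
        per_seed = int(np.ceil(n / len(seeds)))
        xs, gs = [], []
        for s in seeds:                      # repopulate by MCMC from each seed
            cur, gcur = s.copy(), g(s)
            for _ in range(per_seed):
                cand = cur + step * rng.standard_normal(dim)
                # Metropolis acceptance for the standard normal target,
                # restricted to the intermediate failure domain.
                if rng.random() < np.exp(0.5 * (cur @ cur - cand @ cand)):
                    gcand = g(cand)
                    if gcand > b_lvl:
                        cur, gcur = cand, gcand
                xs.append(cur.copy())
                gs.append(gcur)
        x, gx = np.array(xs)[:n], np.array(gs)[:n]
    return p_f * np.mean(gx > b)

# Toy check: g(x) = x[0], so P(g > 4) = 1 - Phi(4) ~ 3.2e-5.
print(subset_simulation(lambda v: v[0], dim=2, b=4.0), 1.0 - norm.cdf(4.0))
```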

  18. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
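
    The estimator this record builds corresponds to Bayes' theorem applied to the answer-copying statistic. A sketch with invented ingredients: a 2% population copying rate and normal densities for the statistic under the two hypotheses.

```python
from scipy.stats import norm

def posterior_copy_probability(stat_value, pi, f_copy, f_nocopy):
    """Bayes' theorem for the posterior probability of copying:
    P(copy | s) = pi * f1(s) / (pi * f1(s) + (1 - pi) * f0(s))."""
    num = pi * f_copy(stat_value)
    return num / (num + (1.0 - pi) * f_nocopy(stat_value))

# Hypothetical densities: statistic ~ N(0, 1) under no copying and
# N(3, 1) under copying; 2% population copying rate assumed.
p = posterior_copy_probability(
    2.5, pi=0.02, f_copy=norm(3, 1).pdf, f_nocopy=norm(0, 1).pdf)
print(f"posterior probability of copying: {p:.2f}")
```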

  19. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcomes using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
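
    In practice these probability machines share one interface. A minimal sketch comparing logistic regression, k-NN, and random forest probability estimates on synthetic data (all settings here are arbitrary):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each model returns class-membership probabilities via predict_proba;
# k-NN and RF estimate them nonparametrically, logistic parametrically.
for model in (LogisticRegression(max_iter=1000),
              KNeighborsClassifier(n_neighbors=25),
              RandomForestClassifier(n_estimators=200, random_state=0)):
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, "first five P(y=1):", np.round(proba[:5], 3))
```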

  20. Estimating Bird / Aircraft Collision Probabilities and Risk Utilizing Spatial Poisson Processes

    Science.gov (United States)

    2012-06-10

    Graduate research paper presented to the Faculty, Department of Operational Sciences, by Brady J. Vaira, BS, MS, Major, USAF.

  1. A procedure for estimation of pipe break probabilities due to IGSCC

    International Nuclear Information System (INIS)

    Bergman, M.; Brickstad, B.; Nilsson, F.

    1998-06-01

    A procedure has been developed for estimation of the failure probability of weld joints in nuclear piping susceptible to intergranular stress corrosion cracking. The procedure aims at a robust and rapid estimate of the failure probability for a specific weld with a known stress state. The random properties of the crack initiation rate, the initial crack length, the in-service inspection efficiency and the leak rate are taken into account. A computer realization of the procedure has been developed for user-friendly applications by design engineers. Some examples are considered to investigate the sensitivity of the failure probability to different input quantities. (au)

  2. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.

  3. Two-step estimation in ratio-of-mediator-probability weighted causal mediation analysis.

    Science.gov (United States)

    Bein, Edward; Deutsch, Jonah; Hong, Guanglei; Porter, Kristin E; Qin, Xu; Yang, Cheng

    2018-04-15

    This study investigates appropriate estimation of estimator variability in the context of causal mediation analysis that employs propensity score-based weighting. Such an analysis decomposes the total effect of a treatment on the outcome into an indirect effect transmitted through a focal mediator and a direct effect bypassing the mediator. Ratio-of-mediator-probability weighting estimates these causal effects by adjusting for the confounding impact of a large number of pretreatment covariates through propensity score-based weighting. In step 1, a propensity score model is estimated. In step 2, the causal effects of interest are estimated using weights derived from the prior step's regression coefficient estimates. Statistical inferences obtained from this 2-step estimation procedure are potentially problematic if the estimated standard errors of the causal effect estimates do not reflect the sampling uncertainty in the estimation of the weights. This study extends to ratio-of-mediator-probability weighting analysis a solution to the 2-step estimation problem by stacking the score functions from both steps. We derive the asymptotic variance-covariance matrix for the indirect effect and direct effect 2-step estimators, provide simulation results, and illustrate with an application study. Our simulation results indicate that the sampling uncertainty in the estimated weights should not be ignored. The standard error estimation using the stacking procedure offers a viable alternative to bootstrap standard error estimation. We discuss broad implications of this approach for causal analysis involving propensity score-based weighting. Copyright © 2018 John Wiley & Sons, Ltd.

  4. Psychological scaling of expert estimates of human error probabilities: application to nuclear power plant operation

    International Nuclear Information System (INIS)

    Comer, K.; Gaddy, C.D.; Seaver, D.A.; Stillwell, W.G.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories sponsored a project to evaluate psychological scaling techniques for use in generating estimates of human error probabilities. The project evaluated two techniques: direct numerical estimation and paired comparisons. Expert estimates were found to be consistent across and within judges. Convergent validity was good, in comparison to estimates in a handbook of human reliability. Predictive validity could not be established because of the lack of actual relative frequencies of error (which will be a difficulty inherent in validation of any procedure used to estimate HEPs). Application of expert estimates in probabilistic risk assessment and in human factors is discussed

  5. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects and failure data (involved in the model via application of the Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain: the critical crack length and depth, the failure probability of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. Model uncertainty and uncertainty propagation analyses are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by the Bayesian method.

  6. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it can provide an unbiased estimate of the probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract exactly multifractal behaviors from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools displays significant performance advantages over the other methods. The factorial-moment-based estimation can evaluate correctly the scaling behaviors in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.

  7. Steam generator tubes rupture probability estimation - study of the axially cracked tube case

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.; Roussel, G.

    1992-01-01

    The objective of the present study is to estimate the probability of a steam generator tube rupture due to the unstable propagation of axial through-wall cracks during a hypothetical accident. For this purpose a probabilistic fracture mechanics model was developed taking into account statistical distributions of the influencing parameters. A numerical example considering a typical steam generator seriously affected by axial stress corrosion cracking in the roll transition area is presented; it indicates how the rupture probability changes under different assumptions, focusing mostly on the tubesheet reinforcing factor, the crack propagation rate and the crack detection probability. 8 refs., 4 figs., 4 tabs

  8. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    Science.gov (United States)

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  9. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety critical applications and methods for estimation of human error probability have been a topic of research for over a decade. The scarce data available on human errors and large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety critical functions of a nuclear power plant. The developed model using BBN would help to estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and subsequent evaluation is provided with a relevant case study and the model is expected to provide useful insights into risk assessment studies

  10. First Passage Probability Estimation of Wind Turbines by Markov Chain Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2013-01-01

    Markov Chain Monte Carlo simulation has received considerable attention within the past decade as reportedly one of the most powerful techniques for the first passage probability estimation of dynamic systems. A very popular method in this direction capable of estimating the probability of rare events is the subset simulation (SS); recent work has improved the performance of the method by modifying the conditional sampler. In this paper, the applicability of the original SS is compared to the recently introduced modifications of the method on a wind turbine model. The model incorporates a PID pitch controller which aims at keeping the rotational speed of the wind turbine rotor equal to its nominal value. Finally, Monte Carlo simulations are performed which allow assessment of the accuracy of the first passage probability estimation by the SS methods.

  11. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
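
    The model structure described, a distribution of lifetime partner counts combined with a per-partnership acquisition probability, reduces to a one-line expectation. All numbers below are illustrative placeholders, not the paper's inputs:

```python
import numpy as np

# Hypothetical distribution of lifetime number of opposite-sex partners
# (counts and weights are invented for illustration).
partners = np.array([1, 2, 4, 8, 15, 30])
weights = np.array([0.25, 0.20, 0.25, 0.15, 0.10, 0.05])

t = 0.40  # assumed per-partnership probability of acquiring HPV

# Lifetime probability: 1 minus the chance of escaping infection in
# every partnership, averaged over the partner-number distribution.
p_lifetime = 1.0 - np.sum(weights * (1.0 - t) ** partners)
print(f"lifetime probability of acquiring HPV: {p_lifetime:.3f}")
```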

  12. EMPIRICALLY ESTIMATED FAR-UV EXTINCTION CURVES FOR CLASSICAL T TAURI STARS

    Energy Technology Data Exchange (ETDEWEB)

    McJunkin, Matthew; France, Kevin [Laboratory for Atmospheric and Space Physics, University of Colorado, 600 UCB, Boulder, CO 80303-7814 (United States); Schindhelm, Eric [Southwest Research Institute, 1050 Walnut Street, Suite 300, Boulder, CO 80302 (United States); Herczeg, Gregory [Kavli Institute for Astronomy and Astrophysics, Peking University, Yi He Yuan Lu 5, Haidian Qu, 100871 Beijing (China); Schneider, P. Christian [ESA/ESTEC, Keplerlaan 1, 2201 AZ Noordwijk (Netherlands); Brown, Alex, E-mail: matthew.mcjunkin@colorado.edu [Center for Astrophysics and Space Astronomy, University of Colorado, 593 UCB, Boulder, CO 80309-0593 (United States)

    2016-09-10

    Measurements of extinction curves toward young stars are essential for calculating the intrinsic stellar spectrophotometric radiation. This flux determines the chemical properties and evolution of the circumstellar region, including the environment in which planets form. We develop a new technique using H2 emission lines pumped by stellar Lyα photons to characterize the extinction curve by comparing the measured far-ultraviolet H2 line fluxes with model H2 line fluxes. The difference between model and observed fluxes can be attributed to the dust attenuation along the line of sight through both the interstellar and circumstellar material. The extinction curves are fit by a Cardelli et al. (1989) model and the A_V(H2) for the 10 targets studied with good extinction fits range from 0.5 to 1.5 mag, with R_V values ranging from 2.0 to 4.7. A_V and R_V are found to be highly degenerate, suggesting that one or the other needs to be calculated independently. Column densities and temperatures for the fluorescent H2 populations are also determined, with averages of log10(N(H2)) = 19.0 and T = 1500 K. This paper explores the strengths and limitations of the newly developed extinction curve technique in order to assess the reliability of the results and improve the method in the future.

  13. Allelic drop-out probabilities estimated by logistic regression--Further considerations and practical implementation

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model, so that it can be used for practical forensic genetics and stimulate further discussions. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.
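
    The underlying model is a logistic regression of drop-out on (log) peak height. A minimal sketch with placeholder coefficients (not the fitted values from [7]):

```python
import numpy as np

def dropout_probability(peak_height, beta0=6.0, beta1=-1.5):
    """Logistic model for allelic drop-out: the log-odds of drop-out
    decline with the log peak height of the partner allele. The
    coefficients here are illustrative placeholders, not fitted values."""
    logit = beta0 + beta1 * np.log(peak_height)
    return 1.0 / (1.0 + np.exp(-logit))

for h in (50, 150, 500, 1500):  # peak heights in RFU
    print(f"H = {h:5d} RFU: P(drop-out) = {dropout_probability(h):.3f}")
```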

  14. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    Full Text Available In recent years, more and more wireless communications systems are required to provide also a positioning measurement. In code division multiple access (CDMA communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet, MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC, while all other users receive a more reliable data during these quiet periods. Previous research had shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system, in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous, CDMA system.

  15. A framework to estimate probability of diagnosis error in NPP advanced MCR

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Kim, Jong Hyun; Jang, Inseok; Seong, Poong Hyun

    2018-01-01

    Highlights: •As a new type of MCR has been installed in NPPs, the work environment is considerably changed. •A new framework to estimate operators’ diagnosis error probabilities should be proposed. •Diagnosis error data were extracted from the full-scope simulator of the advanced MCR. •Using Bayesian inference, a TRC model was updated for use in advanced MCR. -- Abstract: Recently, a new type of main control room (MCR) has been adopted in nuclear power plants (NPPs). The new MCR, known as the advanced MCR, consists of digitalized human-system interfaces (HSIs), computer-based procedures (CPS), and soft controls, while the conventional MCR includes many alarm tiles, analog indicators, hard-wired control devices, and paper-based procedures. These changes significantly affect the generic activities of the MCR operators, in relation to diagnostic activities. The aim of this paper is to suggest a framework to estimate the probabilities of diagnosis errors in the advanced MCR by updating a time reliability correlation (TRC) model. Using Bayesian inference, the TRC model was updated with the probabilities of diagnosis errors. Here, the diagnosis error data were collected from a full-scope simulator of the advanced MCR. To do this, diagnosis errors were determined based on an information processing model and their probabilities were calculated. However, these calculated probabilities of diagnosis errors were largely affected by context factors such as procedures, HSI, training, and others, known as PSFs (Performance Shaping Factors). In order to obtain the nominal diagnosis error probabilities, the weightings of PSFs were also evaluated. Then, with the nominal diagnosis error probabilities, the TRC model was updated. This led to the proposal of a framework to estimate the nominal probabilities of diagnosis errors in the advanced MCR.

  16. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    Science.gov (United States)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to the occurrence of extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best fit distribution models for the annual, seasonal and monthly time series based on maximum rank with minimum value of test statistics, three statistical goodness of fit tests, viz. the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the chi-square test (χ²), were employed. The best fit probability distribution was identified from the highest overall score obtained from the three goodness of fit tests. Results revealed that the normal probability distribution was best fitted for annual, post-monsoon and summer season MDR, while the lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3 % levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85 %) for MDR of >100 mm and moderate probabilities (37 to 46 %) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
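
    The fit-and-rank workflow (candidate distributions, goodness-of-fit ranking, return levels from the winner) can be reproduced with scipy. The synthetic series below stands in for the observed 1982-2010 annual maxima; the candidate set and seed are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Stand-in for 29 years of annual maximum daily rainfall (mm).
mdr = stats.lognorm.rvs(0.5, scale=60, size=29, random_state=rng)

candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
              "weibull": stats.weibull_min}

# Rank candidate distributions by the Kolmogorov-Smirnov statistic,
# then read off return levels from the best fit.
fits = {}
for name, dist in candidates.items():
    params = dist.fit(mdr)
    ks = stats.kstest(mdr, dist.cdf, args=params).statistic
    fits[name] = (ks, dist, params)

best = min(fits, key=lambda k: fits[k][0])
ks, dist, params = fits[best]
print("best fit:", best, f"(KS = {ks:.3f})")
for T in (2, 5, 10, 20, 25):  # return periods in years
    level = dist.ppf(1.0 - 1.0 / T, *params)
    print(f"{T:2d}-year MDR: {level:.0f} mm")
```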

  17. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Directory of Open Access Journals (Sweden)

    Michael F Sloma

    2017-11-01

    Full Text Available Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  18. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2017-11-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.
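
    The partition-function idea in the two records above can be illustrated at toy scale. The sketch below is an illustration only, not the NCM-based model or the RNAstructure implementation: it brute-forces all pseudoknot-free structures of a short hypothetical sequence, Boltzmann-weights them under a crude one-unit-per-pair energy, and reports the resulting base pairing probabilities. Real partition-function algorithms obtain the same quantities by dynamic programming rather than enumeration.

        import math
        from collections import defaultdict
        from functools import lru_cache

        SEQ = "GGGAAAUCCC"            # hypothetical toy sequence
        RT = 0.001987 * 310.15        # kcal/mol at 37 C
        MIN_LOOP = 3                  # minimum hairpin loop length

        def can_pair(a, b):
            return {a, b} in ({"A", "U"}, {"G", "C"}, {"G", "U"})

        @lru_cache(maxsize=None)
        def structures(i, j):
            # All pseudoknot-free structures (frozensets of pairs) on SEQ[i..j].
            if j - i < MIN_LOOP + 1:
                return (frozenset(),)
            out = list(structures(i, j - 1))            # base j unpaired
            for k in range(i, j - MIN_LOOP):            # base j paired with base k
                if can_pair(SEQ[k], SEQ[j]):
                    for left in structures(i, k - 1):
                        for right in structures(k + 1, j - 1):
                            out.append(left | right | {(k, j)})
            return tuple(out)

        # Boltzmann weight with a toy energy of -1 kcal/mol per pair.
        ensemble = structures(0, len(SEQ) - 1)
        weights = [math.exp(len(s) / RT) for s in ensemble]
        Z = sum(weights)

        # Pair probability = weighted fraction of structures containing the pair.
        prob = defaultdict(float)
        for s, w in zip(ensemble, weights):
            for pair in s:
                prob[pair] += w / Z

        for (i, j), p in sorted(prob.items(), key=lambda kv: -kv[1])[:5]:
            print(f"P({SEQ[i]}{i+1}-{SEQ[j]}{j+1}) = {p:.3f}")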

  19. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field, which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods that consider this type of uncertainty are usually computationally intensive and impractical for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user defined structures. A case study illustrates an application in test case prioritization
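
    The equivalent-failures idea can be shown on a drastically simplified case; this sketch is not the paper's bipartite-graph model. Assume a two-component series system where each failed demand was caused by exactly one component, but the failed component is unknown (masked) for some failures. The E-step splits the masked failures in proportion to the current estimates; the M-step re-estimates the per-demand probabilities:

        import numpy as np

        # Hypothetical masked test data: failures attributed to component 1,
        # to component 2, and failures whose cause was not identified.
        n_tests = 1000
        k1, k2, km = 12, 5, 8

        p = np.array([0.01, 0.01])                    # initial guesses
        for _ in range(200):
            # E-step: credit each component with its expected share of the
            # masked failures ("equivalent failures").
            e = km * p / p.sum()
            # M-step: update per-demand failure probabilities.
            p_new = (np.array([k1, k2]) + e) / n_tests
            if np.allclose(p_new, p, atol=1e-12):
                p = p_new
                break
            p = p_new

        print("estimated component failure probabilities:", p)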

  20. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature and an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted, which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions by using a fuel balance and a lambda sensor, and differences below 1% were found.
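
    A minimal Monte Carlo sketch of the central idea, with purely illustrative constants (the paper's calibrated model and resonance-based trapped-mass estimate are not reproduced here): Gaussian noise on the end-gas temperature is propagated through an Arrhenius-type induction time, and knock is declared when that time is shorter than the time available before flame arrival.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative values only: nominal end-gas temperature (K), exogenous
        # Gaussian noise, Arrhenius constants, and the available time window.
        T_nom, sigma_T = 950.0, 25.0      # K
        A, B = 2.0e-6, 6500.0             # s, K
        t_avail = 1.8e-3                  # s

        # Propagate the temperature uncertainty through the induction-time
        # model: knock when the induction time is shorter than t_avail.
        T = rng.normal(T_nom, sigma_T, size=200_000)
        tau = A * np.exp(B / T)
        print(f"estimated knock probability: {np.mean(tau < t_avail):.3f}")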

  1. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz. 1/N and 1/T, where N is the number of demands during the failure-free test period and T is the failure-free test period. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary
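
    The conjugate mechanics behind such estimates are easy to reproduce. This is a sketch of the standard gamma-Poisson update, not the authors' specific posterior: with a Gamma(a0, b0) prior on the failure rate and x observed failures in cumulative time T, the posterior is Gamma(a0 + x, b0 + T), and the 1/T rule of thumb appears as a posterior mean.

        from scipy import stats

        # Zero failures observed over a cumulative test time T (hypothetical).
        T = 5000.0   # hours

        # Flat-ish prior (a0 = 1, b0 -> 0) with x = 0 failures gives a
        # Gamma(1, T) posterior, whose mean reproduces the 1/T rule.
        a0, b0, x = 1.0, 0.0, 0
        post = stats.gamma(a=a0 + x, scale=1.0 / (b0 + T))
        print("posterior mean failure rate:", post.mean())   # = 1/T
        print("95% upper bound:", post.ppf(0.95))             # ~ 3/T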

  2. Estimation of the defect detection probability for ultrasonic tests on thick sections steel weldments. Technical report

    International Nuclear Information System (INIS)

    Johnson, D.P.; Toomay, T.L.; Davis, C.S.

    1979-02-01

    An inspection uncertainty analysis of published PVRC Specimen 201 data is reported to obtain an estimate of the probability of recording an indication as a function of imperfection height for ASME Section XI Code ultrasonic inspections of the nuclear reactor vessel plate seams, and to demonstrate the advantages of inspection uncertainty analysis over conventional detection/nondetection counting analysis. This analysis found the probability of recording a significant defect with an ASME Section XI Code ultrasonic inspection to be very high, if such a defect should exist in the plate seams of a nuclear reactor vessel. For a one-inch high crack, for example, this analysis gives a best estimate recording probability of .985 and a 90% lower confidence bound recording probability of .937. It is also shown that inspection uncertainty analysis gives more accurate estimates, and gives estimates over a much greater flaw size range, than is possible with conventional analysis. There is reason to believe that the estimation procedure used is conservative: the estimation is based on data generated several years ago, on very small defects, in an environment that is different from the actual in-service inspection environment
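
    A hit/miss analysis of this kind is commonly done with a logistic probability-of-detection curve. The sketch below uses invented inspection data and fits POD(h) = expit(b0 + b1 ln h) by maximum likelihood; it illustrates the general approach, not the PVRC-201 analysis itself.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        # Invented hit/miss data: flaw height (inches) and whether an
        # ultrasonic indication was recorded.
        h = np.array([0.05, 0.08, 0.1, 0.15, 0.2, 0.3, 0.4, 0.5, 0.7, 1.0])
        hit = np.array([0, 0, 1, 0, 1, 1, 1, 1, 1, 1])

        # Negative log-likelihood of the log-logistic POD curve.
        def nll(beta):
            p = expit(beta[0] + beta[1] * np.log(h))
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

        res = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
        b0, b1 = res.x
        print("POD(1-inch flaw) =", expit(b0 + b1 * np.log(1.0)))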

  3. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted by pre-sampling points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
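
    The adaptive importance sampling step can be sketched on a cheap analytic stand-in for the thermal-hydraulic model (failure when g < 0 in standardized variables; this is not the AP1000 model): pre-sample to locate failure points, fit a Gaussian importance density to them, then re-estimate with likelihood-ratio weights.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical limit state in two standardized variables.
        def g(x):
            return 4.0 - x[:, 0] - x[:, 1]

        # Step 1: crude pre-sampling to locate the failure region.
        x_pre = rng.standard_normal((200_000, 2))
        fail = x_pre[g(x_pre) < 0]

        # Step 2: importance density fitted to the failure-region samples.
        q = stats.multivariate_normal(mean=fail.mean(axis=0), cov=np.cov(fail.T))
        f = stats.multivariate_normal(mean=[0, 0], cov=np.eye(2))

        # Step 3: importance-sampling estimate with likelihood-ratio weights.
        x_is = q.rvs(size=20_000, random_state=rng)
        w = f.pdf(x_is) / q.pdf(x_is)
        p_fail = np.mean(w * (g(x_is) < 0))
        print(f"IS estimate: {p_fail:.2e}  (exact: {stats.norm.cdf(-4/np.sqrt(2)):.2e})")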

  4. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    Science.gov (United States)

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  5. Estimating the Probability of a Rare Event Over a Finite Time Horizon

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; L'Ecuyer, Pierre; Rubino, Gerardo; Tuffin, Bruno

    2007-01-01

    We study an approximation for the zero-variance change of measure to estimate the probability of a rare event in a continuous-time Markov chain. The rare event occurs when the chain reaches a given set of states before some fixed time limit. The jump rates of the chain are expressed as functions of

  6. Estimating success probability of a rugby goal kick and developing a ...

    African Journals Online (AJOL)

    The objective of this study was, firstly, to derive a formula to estimate the success probability of a particular rugby goal kick and, secondly, to derive a goal kicker rating measure that could be used to rank rugby union goal kickers. Various factors that could influence the success of a particular goal kick were considered.

  7. Current extinction rates of reptiles and amphibians.

    Science.gov (United States)

    Alroy, John

    2015-10-20

    There is broad concern that a mass extinction of amphibians and reptiles is now underway. Here I apply an extremely conservative Bayesian method to estimate the number of recent amphibian and squamate extinctions in nine important tropical and subtropical regions. The data stem from a combination of museum collection databases and published site surveys. The method computes an extinction probability for each species by considering its sighting frequency and last sighting date. It infers hardly any extinction when collection dates are randomized and it provides underestimates when artificial extinction events are imposed. The method also appears to be insensitive to trends in sampling; therefore, the counts it provides are absolute minimums. Extinctions or severe population crashes have accumulated steadily since the 1970s and 1980s, and at least 3.1% of frog species have already disappeared. Based on these data and this conservative method, the best estimate of the global grand total is roughly 200 extinctions. Consistent with previous results, frog losses are heavy in Latin America, which has been greatly affected by the pathogenic chytrid fungus Batrachochytrium dendrobatidis. Extinction rates are now four orders of magnitude higher than background, and at least another 6.9% of all frog species may be lost within the next century, even if there is no acceleration in the growth of environmental threats.
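
    For comparison with the Bayesian approach above, the classical sighting-record test of Solow (1993) fits in a few lines; one common variant conditions on the first sighting, as below. The sighting years are invented.

        # Invented sighting years for one species and the end of the
        # observation window.
        sightings = [1952, 1958, 1963, 1969, 1974, 1977, 1981]
        T = 2015

        t1, tn, n = sightings[0], sightings[-1], len(sightings)
        # Under a stationary sighting rate, if the species were still extant,
        # the probability that the most recent sighting is this old is:
        p_value = ((tn - t1) / (T - t1)) ** (n - 1)
        print(f"p-value that the species is still extant: {p_value:.4f}")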

  8. An optimized Line Sampling method for the estimation of the failure probability of nuclear passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2010-01-01

    The quantitative reliability assessment of a thermal-hydraulic (T-H) passive safety system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. In this work, Line Sampling (LS) is adopted for efficient MC sampling. In the LS method, an 'important direction' pointing towards the failure domain of interest is determined and a number of conditional one-dimensional problems are solved along such direction; this allows for a significant reduction of the variance of the failure probability estimator, with respect, for example, to standard random sampling. Two issues are still open with respect to LS: first, the method relies on the determination of the 'important direction', which requires additional runs of the T-H code; second, although the method has been shown to improve the computational efficiency by reducing the variance of the failure probability estimator, no evidence has been given yet that accurate and precise failure probability estimates can be obtained with a number of samples reduced to below a few hundreds, which may be required in case of long-running models. The work presented in this paper addresses the first issue by (i) quantitatively comparing the efficiency of the methods proposed in the literature to determine the LS important direction; (ii) employing artificial neural network (ANN) regression models as fast-running surrogates of the original, long-running T-H code to reduce the computational cost associated to the
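
    The mechanics of Line Sampling are easy to demonstrate on an analytic limit state (a sketch, not the thermal-hydraulic model of the paper): each random point is projected onto the hyperplane orthogonal to the important direction alpha, the one-dimensional problem along alpha is solved by root finding, and each line contributes a conditional failure probability Phi(-c_i).

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(2)

        # Hypothetical limit state in standard normal space: failure when g < 0.
        def g(u):
            return 3.0 - (u[0] + u[1]) / np.sqrt(2.0)

        alpha = np.array([1.0, 1.0]) / np.sqrt(2.0)   # "important direction"

        # One root-finding problem per line, along the important direction.
        estimates = []
        for _ in range(50):
            u = rng.standard_normal(2)
            u_perp = u - (u @ alpha) * alpha           # project onto hyperplane
            c = optimize.brentq(lambda t: g(u_perp + t * alpha), -10.0, 10.0)
            estimates.append(stats.norm.sf(c))

        print(f"LS estimate: {np.mean(estimates):.2e}  (exact: {stats.norm.sf(3.0):.2e})")

    For this linear limit state every line returns the exact answer, which is precisely the variance-reduction effect the record describes; nonlinear limit states yield some spread across lines.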

  9. Estimation of the common cause failure probabilities on the component group with mixed testing scheme

    International Nuclear Information System (INIS)

    Hwang, Meejeong; Kang, Dae Il

    2011-01-01

    Highlights: ► This paper presents a method to estimate the common cause failure probabilities on the common cause component group with mixed testing schemes. ► The CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing. ► There are many CCCGs with specific mixed testing schemes in real plant operation. ► Therefore, a general formula which is applicable to both alternate periodic testing scheme and train level mixed testing scheme was derived. - Abstract: This paper presents a method to estimate the common cause failure (CCF) probabilities on the common cause component group (CCCG) with mixed testing schemes such as the train level mixed testing scheme or the alternate periodic testing scheme. In the train level mixed testing scheme, the components are tested in a non-staggered way within the same train, but the components are tested in a staggered way between the trains. The alternate periodic testing scheme indicates that all components in the same CCCG are tested in a non-staggered way during the planned maintenance period, but they are tested in a staggered way during normal plant operation. Since the CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing, CCF estimators have two kinds of formulas in accordance with the testing schemes. Thus, there are general formulas to estimate the CCF probability on the staggered testing scheme and non-staggered testing scheme. However, in real plant operation, there are many CCCGs with specific mixed testing schemes. Recently, Barros () and Kang () proposed a CCF factor estimation method to reflect the alternate periodic testing scheme and the train level mixed testing scheme. In this paper, a general formula which is applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived.

  10. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations

    International Nuclear Information System (INIS)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use

  11. Probability of Neutralization Estimation for APR1400 Physical Protection System Design Effectiveness Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Lim, Heoksoon; Na, Janghwan; Chi, Moongoo [Korea Hydro and Nuclear Power Co. Ltd. Central Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    This work focuses on the development of a new design process compatible with international standards such as those suggested by the IAEA and the NRC. Evaluation of design effectiveness was identified as one of the areas to improve: if a design does not meet a certain level of effectiveness, it should be re-designed accordingly. The effectiveness can be calculated from the combination of the probability of interruption and the probability of neutralization. System Analysis of Vulnerability to Intrusion (SAVI) has been developed by Sandia National Laboratories for that purpose. With SNL's timely detection methodology, SAVI has been used by U.S. nuclear utilities to meet the NRC requirements for PPS design effectiveness evaluation. For the SAVI calculation, the probability of neutralization is a vital input element that must be supplied. This paper describes the elements to consider for neutralization, the probability estimation methodology, and the estimation for the APR1400 PPS design effectiveness evaluation process. Markov chains and Monte Carlo simulation are often used as simple numerical methods to estimate P_N; the results from the two methods are not always identical, even for the same situation. The P_N values for the APR1400 evaluation were calculated with the Markov chain method and modified to be applicable to a guards-to-adversaries ratio based analysis.

  12. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  13. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    System degradation is usually caused by the degradation of multiple parameters. For a degraded system, reliability assessment by the universal generating function is less accurate than Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on probability density evolution with multiple parameters is therefore presented for complex degraded systems. First, the system output function is formulated according to the transitive relation between the component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. The method does not require discretization of the performance parameters and can establish a continuous probability density function of the system output performance with high computational efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  14. A method for the estimation of the probability of damage due to earthquakes

    International Nuclear Information System (INIS)

    Alderson, M.A.H.G.

    1979-07-01

    The available information on seismicity within the United Kingdom has been combined with building damage data from the United States to produce a method of estimating the probability of damage to structures due to the occurrence of earthquakes. The analysis has been based on the use of site intensity as the major damage producing parameter. Data for structural, pipework and equipment items have been assumed and the overall probability of damage calculated as a function of the design level. Due account is taken of the uncertainties of the seismic data. (author)

  15. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
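
    A sketch of the AS idea on an analytic test function, assuming the Bucher-style scaling law beta(f) = A·f + B/f (constants and sample sizes are illustrative): inflating the standard deviation by f > 1 makes failures observable at modest sample sizes, and the safety index is then extrapolated back to the original scale f = 1.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Hypothetical limit state with exact safety index beta = 4.5, far too
        # rare for crude Monte Carlo with modest sample sizes.
        beta_exact = 4.5
        def g(u):
            return beta_exact - u.sum(axis=1) / np.sqrt(u.shape[1])

        # Estimate beta(f) for several inflation factors f > 1.
        fs, betas = np.array([2.0, 2.5, 3.0, 3.5]), []
        for f in fs:
            u = f * rng.standard_normal((100_000, 6))
            betas.append(-stats.norm.ppf(np.mean(g(u) < 0)))

        # Fit beta(f) = A*f + B/f by least squares and extrapolate to f = 1.
        X = np.column_stack([fs, 1.0 / fs])
        A, B = np.linalg.lstsq(X, np.array(betas), rcond=None)[0]
        print(f"extrapolated beta(1) = {A + B:.2f}  (exact {beta_exact})")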

  16. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Background: In the case of an autosomal locus, four transmission events from the parents to progeny are possible, specified by the grandparental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential for QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  17. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

    Human Reliability Analysis (HRA) currently comprises at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each of HRA's methods has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology, or the improper application of techniques, can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  18. Real time estimation of generation, extinction and flow of muscle fibre action potentials in high density surface EMG.

    Science.gov (United States)

    Mesin, Luca

    2015-02-01

    Developing a real time method to estimate generation, extinction and propagation of muscle fibre action potentials from bi-dimensional and high density surface electromyogram (EMG). A multi-frame generalization of an optical flow technique including a source term is considered. A model describing generation, extinction and propagation of action potentials is fit to epochs of surface EMG. The algorithm is tested on simulations of high density surface EMG (inter-electrode distance equal to 5mm) from finite length fibres generated using a multi-layer volume conductor model. The flow and source term estimated from interference EMG reflect the anatomy of the muscle, i.e. the direction of the fibres (2° of average estimation error) and the positions of innervation zone and tendons under the electrode grid (mean errors of about 1 and 2mm, respectively). The global conduction velocity of the action potentials from motor units under the detection system is also obtained from the estimated flow. The processing time is about 1 ms per channel for an epoch of EMG of duration 150 ms. A new real time image processing algorithm is proposed to investigate muscle anatomy and activity. Potential applications are proposed in prosthesis control, automatic detection of optimal channels for EMG index extraction and biofeedback. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    Science.gov (United States)

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.

  20. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges, and the reliability and plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  1. Annotated corpus and the empirical evaluation of probability estimates of grammatical forms

    Directory of Open Access Journals (Sweden)

    Ševa Nada

    2003-01-01

    The aim of the present study is to demonstrate the usage of an annotated corpus in the field of experimental psycholinguistics. Specifically, we demonstrate how the manually annotated Corpus of Serbian Language (Kostić, Đ. 2001) can be used for probability estimates of grammatical forms, which allow the control of independent variables in psycholinguistic experiments. We address the issue of processing Serbian inflected forms within two subparadigms of feminine nouns. In regression analysis, almost all processing variability of the inflected forms is accounted for by the amount of information (i.e., bits) carried by the presented forms. In spite of the fact that the probability distributions of inflected forms for the two paradigms differ, it was shown that the best prediction of processing variability is obtained from the probabilities derived from the predominant subparadigm, which encompasses about 80% of feminine nouns. The relevance of annotated corpora in experimental psycholinguistics is discussed in more detail.

  2. Estimating migratory connectivity of birds when re-encounter probabilities are heterogeneous

    Science.gov (United States)

    Cohen, Emily B.; Hostelter, Jeffrey A.; Royle, J. Andrew; Marra, Peter P.

    2014-01-01

    Understanding the biology and conducting effective conservation of migratory species requires an understanding of migratory connectivity – the geographic linkages of populations between stages of the annual cycle. Unfortunately, for most species, we are lacking such information. The North American Bird Banding Laboratory (BBL) houses an extensive database of marking, recaptures and recoveries, and such data could provide migratory connectivity information for many species. To date, however, few species have been analyzed for migratory connectivity largely because heterogeneous re-encounter probabilities make interpretation problematic. We accounted for regional variation in re-encounter probabilities by borrowing information across species and by using effort covariates on recapture and recovery probabilities in a multistate capture–recapture and recovery model. The effort covariates were derived from recaptures and recoveries of species within the same regions. We estimated the migratory connectivity for three tern species breeding in North America and over-wintering in the tropics, common (Sterna hirundo), roseate (Sterna dougallii), and Caspian terns (Hydroprogne caspia). For western breeding terns, model-derived estimates of migratory connectivity differed considerably from those derived directly from the proportions of re-encounters. Conversely, for eastern breeding terns, estimates were merely refined by the inclusion of re-encounter probabilities. In general, eastern breeding terns were strongly connected to eastern South America, and western breeding terns were strongly linked to the more western parts of the nonbreeding range under both models. Through simulation, we found this approach is likely useful for many species in the BBL database, although precision improved with higher re-encounter probabilities and stronger migratory connectivity. We describe an approach to deal with the inherent biases in BBL banding and re-encounter data to demonstrate

  3. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Science.gov (United States)

    Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.

  4. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Directory of Open Access Journals (Sweden)

    Tomoaki Chiba

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.
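
    A simplified stand-in for the estimation step in the two records above (a time-constant P rather than the paper's time-varying matrix; the data are synthetic): given share trajectories obeying x_{t+1} = x_t P, ordinary least squares recovers P, and the rows are renormalized to keep the matrix stochastic.

        import numpy as np

        # Synthetic brand-share trajectories from a known stochastic matrix.
        P_true = np.array([[0.90, 0.05, 0.05],
                           [0.10, 0.85, 0.05],
                           [0.05, 0.10, 0.85]])
        X, Y = [], []
        for start in np.eye(3):                 # three initial share vectors
            x = start
            for _ in range(15):
                X.append(x)
                x = x @ P_true
                Y.append(x)
        X, Y = np.array(X), np.array(Y)

        # Least-squares estimate of P, projected back onto row-stochastic form.
        P_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
        P_hat = np.clip(P_hat, 0, None)
        P_hat /= P_hat.sum(axis=1, keepdims=True)
        print(np.round(P_hat, 3))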

  5. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique to estimate critical probabilities with better accuracy than Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which has to be able to simulate more rare random events. The optimisation of this auxiliary distribution is often very difficult in practice. In this article, we propose to approach the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of the spatial launcher impact position, since this has become an increasingly important issue in the field of aeronautics.

  6. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
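
    The probability-generating step reduces to predict_proba on a fitted classifier. A self-contained sketch with synthetic site data (the labeling rule below is invented, not the Nevada data):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(5)

        # Synthetic sites: water table depth (m) and climatic aridity index.
        depth = rng.uniform(0, 50, 500)
        aridity = rng.uniform(0.05, 1.0, 500)
        X = np.column_stack([depth, aridity])

        # Invented labeling rule: shallow water tables favor GDE presence.
        y = rng.random(500) < 1 / (1 + np.exp(0.25 * depth - 2))

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        # Probability that new sites support a groundwater dependent ecosystem.
        new_sites = np.array([[2.0, 0.2], [35.0, 0.8]])
        print(rf.predict_proba(new_sites)[:, 1])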

  7. Estimating the Probabilities of Default for Callable Bonds: A Duffie-Singleton Approach

    OpenAIRE

    David Wang

    2005-01-01

    This paper presents a model for estimating the default risks implicit in the prices of callable corporate bonds. The model considers three essential ingredients in the pricing of callable corporate bonds: stochastic interest rate, default risk, and call provision. The stochastic interest rate is modeled as a square-root diffusion process. The default risk is modeled as a constant spread, with the magnitude of this spread impacting the probability of a Poisson process governing the arrival of ...

  8. Dictionary-Based Stochastic Expectation–Maximization for SAR Amplitude Probability Density Function Estimation

    OpenAIRE

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B.

    2006-01-01

    In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cov...

  9. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

    In this paper we propose a straightforward, flexible and intuitive computational framework for the multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, the estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from the interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
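
    The multi-period mechanics can be sketched with an absorbing-state Markov chain (the matrix below is invented; the paper additionally scales transitions with an economic adjustment coefficient built from the central-bank macro forecasts):

        import numpy as np

        # Invented annual rating transition matrix (states: performing,
        # watch, default); default is absorbing.
        P = np.array([[0.90, 0.08, 0.02],
                      [0.30, 0.55, 0.15],
                      [0.00, 0.00, 1.00]])

        # Lifetime view: cumulative PD over t years for an exposure starting
        # in the performing state is the (0, default) entry of P^t.
        Pt = np.eye(3)
        for t in range(1, 6):
            Pt = Pt @ P
            print(f"year {t}: cumulative PD = {Pt[0, 2]:.4f}")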

  10. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.

  11. Estimation of the common cause failure probabilities of the components under mixed testing schemes

    International Nuclear Information System (INIS)

    Kang, Dae Il; Hwang, Mee Jeong; Han, Sang Hoon

    2009-01-01

    For the case where trains or channels of standby safety systems consisting of more than two redundant components are tested in a staggered manner, the standby safety components within a train can be tested simultaneously or consecutively. In this case, mixed testing schemes, staggered and non-staggered testing schemes, are used for testing the components. Approximate formulas, based on the basic parameter method, were developed for the estimation of the common cause failure (CCF) probabilities of the components under mixed testing schemes. The developed formulas were applied to the four redundant check valves of the auxiliary feed water system as a demonstration study for their appropriateness. For a comparison, we estimated the CCF probabilities of the four redundant check valves for the mixed, staggered, and non-staggered testing schemes. The CCF probabilities of the four redundant check valves for the mixed testing schemes were estimated to be higher than those for the staggered testing scheme, and lower than those for the non-staggered testing scheme.

  12. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  13. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against

  14. Improved method for estimating particle scattering probabilities to finite detectors for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Mickael, M.; Gardner, R.P.; Verghese, K.

    1988-01-01

    An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude

  15. Evaluation and comparison of estimation methods for failure rates and probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, Jussi K. [Fortum Power and Heat Oy, P.O. Box 23, 07901 Loviisa (Finland)]. E-mail: jussi.vaurio@fortum.com; Jaenkaelae, Kalle E. [Fortum Nuclear Services, P.O. Box 10, 00048 Fortum (Finland)

    2006-02-01

    An updated parametric robust empirical Bayes (PREB) estimation methodology is presented as an alternative to several two-stage Bayesian methods used to assimilate failure data from multiple units or plants. PREB is based on prior-moment matching and avoids multi-dimensional numerical integrations. The PREB method is presented for failure-truncated and time-truncated data. Erlangian and Poisson likelihoods with gamma prior are used for failure rate estimation, and Binomial data with beta prior are used for failure probability per demand estimation. Combined models and assessment uncertainties are accounted for. One objective is to compare several methods with numerical examples and show that PREB works as well as, if not better than, the alternative, more complex methods, especially in demanding problems of small samples, identical data and zero failures. False claims and misconceptions are straightened out, and practical applications in risk studies are presented.
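
    The moment-matching step at the core of such empirical Bayes schemes can be sketched as follows (invented multi-plant data; a full PREB implementation also corrects the between-plant variance for Poisson sampling noise before matching):

        import numpy as np

        # Invented multi-plant data: failures x_i in operating times T_i (h).
        x = np.array([0, 1, 3, 2, 0, 5])
        T = np.array([8760.0, 12000.0, 20000.0, 15000.0, 6000.0, 30000.0])

        # Match gamma-prior moments (mean a/b, variance a/b^2) to the
        # between-plant spread of the raw rate estimates x_i / T_i.
        lam = x / T
        m, v = lam.mean(), lam.var(ddof=1)
        b = m / v
        a = m * b

        # Conjugate update: the posterior for plant i is Gamma(a + x_i, b + T_i),
        # shrinking each plant toward the pooled prior mean a/b.
        post_mean = (a + x) / (b + T)
        print("prior mean rate:", a / b)
        print("posterior mean rates:", np.round(post_mean, 6))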

  16. Digital Cover Photography for Estimating Leaf Area Index (LAI in Apple Trees Using a Variable Light Extinction Coefficient

    Directory of Open Access Journals (Sweden)

    Carlos Poblete-Echeverría

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. Leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAID), which was compared with the LAI estimated by the proposed digital photography method (LAIM). Results showed that the LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results have shown that using a proxy k value, estimated by Ic, helped to increase the accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions.

  17. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient.

    Science.gov (United States)

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-28

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. Leaf area of 40 single apple trees were measured destructively to obtain real leaf area index (LAI(D)), which was compared with LAI estimated by the proposed digital photography method (LAI(M)). Results showed that the LAI(M) was able to estimate LAI(D) with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (f(f)) derived from images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results have shown that by using a proxy k value, estimated by Ic, helped to increase accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions.

  18. Digital Cover Photography for Estimating Leaf Area Index (LAI) in Apple Trees Using a Variable Light Extinction Coefficient

    Science.gov (United States)

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. Leaf area of 40 single apple trees were measured destructively to obtain real leaf area index (LAID), which was compared with LAI estimated by the proposed digital photography method (LAIM). Results showed that the LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results have shown that by using a proxy k value, estimated by Ic, helped to increase accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions. PMID:25635411
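
    The estimation step in these three records is the inverted Beer-Lambert law, LAI = -ln(I/Io)/k. A sketch with invented readings (the variable-k coefficients below are illustrative, not the fitted values from the paper):

        import numpy as np

        # Invented paired radiation readings above (Io) and below (I) an
        # apple tree canopy, plus the image-derived fraction of foliage cover.
        Io, I = 850.0, 180.0
        ff = 0.62

        # Fixed k versus a variable k modeled as an exponential function of
        # foliage cover (illustrative coefficients).
        k_fixed = 0.68
        k_var = 0.55 * np.exp(0.35 * ff)

        for label, k in (("fixed k", k_fixed), ("variable k", k_var)):
            lai = -np.log(I / Io) / k
            print(f"{label:>10}: k = {k:.2f}, LAI = {lai:.2f}")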

  19. Verification of “Channel-Probability Model” of Grain Yield Estimation

    Directory of Open Access Journals (Sweden)

    ZHENG Hong-yan

    2016-07-01

    Full Text Available The "channel-probability model" of grain yield estimation was verified and discussed systematically using grain production data from 1949 to 2014 for 16 typical counties, 6 typical districts, and 31 provinces of China. The results showed the following: (1) Because the geographical spatial scale was large enough, different climate zones and different meteorological conditions could compensate for one another, and the grain yield estimation error was small at the national scale. Therefore, it was not necessary to modify the grain yield estimation error by micro-trend or climate year type at the national scale. However, grain yield estimation at the provincial scale takes place within a single climate zone; because the scale is smaller, the impacts of meteorological conditions on grain yield were less complementary than at the national scale. The spatial scale of districts and counties is smaller still, so the compensation of the impacts of meteorological conditions on grain yield was least. Therefore, it was necessary to use the micro-trend amendment and the climate year type amendment to modify the grain yield estimation in districts and counties. (2) The micro-trend modification had two formulas: generally, when the error of grain yield estimation was less than 10%, it could be modified by Y×(1-K); when the error of grain yield estimation was more than 10%, it could be modified by Y/(1+K). (3) Generally, the grain estimation had 5 grades, and some had 7 grades because of large error fluctuation. The parameters modified for super-high-yield years and super-low-yield years must depend on the real-time crop growth and the meteorological conditions. (4) Extensive demonstration analysis proved that the theory and method of the "channel-probability model" are scientific and practical. In order to improve the accuracy of grain yield estimation, the parameters could be modified with the micro-trend amendment and the climate year type amendment. If the
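
    The two micro-trend modification formulas quoted in point (2) above are simple enough to state directly. A minimal sketch, with hypothetical numbers for the raw estimate Y and the correction factor K:

```python
def micro_trend_correction(Y, K, error):
    """Micro-trend modification of a raw grain yield estimate Y.

    Per the abstract: when the relative estimation error is below 10%,
    modify by Y * (1 - K); when it exceeds 10%, modify by Y / (1 + K).
    """
    if error < 0.10:
        return Y * (1 - K)
    return Y / (1 + K)

# Hypothetical values: raw estimate 500, correction factor K = 0.05.
print(micro_trend_correction(500.0, 0.05, error=0.08))  # 475.0
print(micro_trend_correction(500.0, 0.05, error=0.15))  # ~476.19
```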

  20. Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2015-09-01

    Full Text Available Flood design values are key to sizing new water works and to reviewing the hydrological security of existing ones. The most reliable method for estimating their magnitudes associated with certain return periods is to fit a probabilistic model to the available records of maximum annual flows. Since such a model is a priori unknown, several models need to be tested in order to select the most appropriate one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records, and therefore their application has been established as a norm or precept. The Johnson system has three families of distributions, one of which is the Log-Normal model with three fit parameters, which is also the border between the bounded distributions and those with no upper limit. These families of distributions have four adjustment parameters and converge to the standard normal distribution, so that their predictions are obtained with such a model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared to the optimal results from the three distributions. It was found that the predictions of the SJU distribution are similar to those obtained with the other models for low return periods (< 1000 years). Because of its theoretical support, the SJU model is recommended for flood estimation.
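
    For readers who want to try the SJU family on their own records, SciPy exposes it as johnsonsu. The sketch below fits the distribution to synthetic annual maxima (the study's 31 historical records are not reproduced here) and reads off design floods as quantiles. Note that the fit here is by maximum likelihood, simply because that is what SciPy provides; the paper selects models by the standard error of fit.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for a record of annual maximum flows (m^3/s).
ams = stats.lognorm.rvs(s=0.6, scale=800.0, size=60, random_state=rng)

# Fit the unbounded Johnson (SU) distribution by maximum likelihood.
a, b, loc, scale = stats.johnsonsu.fit(ams)

# Design flood for return period T years = quantile at 1 - 1/T.
for T in (10, 100, 1000):
    q = stats.johnsonsu.ppf(1.0 - 1.0 / T, a, b, loc=loc, scale=scale)
    print(f"T = {T:4d} yr -> {q:8.1f} m^3/s")
```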

  1. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    Science.gov (United States)

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.

  2. Survival chance in papillary thyroid cancer in Hungary: individual survival probability estimation using the Markov method

    International Nuclear Information System (INIS)

    Esik, Olga; Tusnady, Gabor; Daubner, Kornel; Nemeth, Gyoergy; Fuezy, Marton; Szentirmay, Zoltan

    1997-01-01

    Purpose: The typically benign, but occasionally rapidly fatal clinical course of papillary thyroid cancer has raised the need for individual survival probability estimation, to tailor the treatment strategy exclusively to a given patient. Materials and methods: A retrospective study was performed on 400 papillary thyroid cancer patients with a median follow-up time of 7.1 years to establish a clinical database for uni- and multivariate analysis of the prognostic factors related to survival (Kaplan-Meier product limit method and Cox regression). For a more precise prognosis estimation, the effects of the most important clinical events were then investigated on the basis of a Markov renewal model. The basic concept of this approach is that each patient has an individual disease course which (besides the initial clinical categories) is affected by special events, e.g. internal covariates (local/regional/distant relapses). On the supposition that these events and cause-specific death are influenced by the same biological processes, the parameters of transient survival probability characterizing the speed of the course of the disease for each clinical event and their sequence were determined. The individual survival curves for each patient were calculated by using these parameters and the independent significant clinical variables selected from the multivariate studies, summation of which resulted in a mean cause-specific survival function valid for the entire group. On the basis of this Markov model, prediction of the cause-specific survival probability is possible for extra-study cases, if it is supposed that the clinical events occur in new patients in the same manner and with similar probability as in the study population. Results: The patient's age, a distant metastasis at presentation, the extent of the surgical intervention, the primary tumor size and extent (pT), the external irradiation dosage and the degree of TSH suppression proved to be

  3. Estimation of delayed neutron emission probability by using the gross theory of nuclear β-decay

    International Nuclear Information System (INIS)

    Tachibana, Takahiro

    1999-01-01

    The delayed neutron emission probabilities (Pn values) of fission products are necessary in the study of reactor physics, e.g. in the calculation of total delayed neutron yields and in the summation calculation of decay heat. In this report, the Pn values estimated by the gross theory for some fission products are compared with experiment, and it is found that, on average, the semi-gross theory somewhat underestimates the experimental Pn values. A modification of the β-decay strength function is briefly discussed to obtain more reasonable Pn values. (author)

  4. Estimating the probability of allelic drop-out of STR alleles in forensic genetics

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2009-01-01

    In crime cases with available DNA evidence, the amount of DNA is often sparse due to the setting of the crime. In such cases, allelic drop-out of one or more true alleles in STR typing is possible. We present a statistical model for estimating the per-locus and overall probability of allelic drop-out using the results of all STR loci in the case sample as reference. The methodology of logistic regression is appropriate for this analysis, and we demonstrate how to incorporate this in a forensic genetic framework.
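
    A minimal sketch of the kind of logistic-regression fit the abstract describes, on simulated data with a single covariate (the actual model conditions on the results of all STR loci in the case sample); the covariate, coefficients, and sample below are all hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated training data: log10 peak height (a proxy for the amount
# of DNA) and whether the allele dropped out.
log_height = rng.uniform(1.5, 4.0, size=500)
p_drop = 1.0 / (1.0 + np.exp(-(6.0 - 2.5 * log_height)))
dropped = rng.random(500) < p_drop

model = LogisticRegression().fit(log_height.reshape(-1, 1), dropped)

# Estimated drop-out probability at a weak locus (log10 height = 2.0).
print(model.predict_proba([[2.0]])[0, 1])
```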

  5. Estimating Probable Maximum Precipitation by Considering Combined Effect of Typhoon and Southwesterly Air Flow

    Directory of Open Access Journals (Sweden)

    Cheng-Chin Liu

    2016-01-01

    Full Text Available Typhoon Morakot hit southern Taiwan in 2009, bringing 48 hr of heavy rainfall, close to the probable maximum precipitation (PMP), to the Tsengwen Reservoir catchment. This extreme rainfall event resulted from the combined (co-movement) effect of two climate systems, a typhoon and southwesterly air flow. Based on the traditional PMP estimation method (i.e., the storm transposition method, STM), two PMP estimation approaches that consider the combined effect are proposed in this work: the Amplification Index (AI) and Independent System (IS) approaches. The AI approach assumes that the southwesterly air flow precipitation in a typhoon event could reach its maximum value. The IS approach assumes that the typhoon and the southwesterly air flow are independent weather systems. Based on these assumptions, calculation procedures for the two approaches were constructed for a case study on the Tsengwen Reservoir catchment. The results show that the PMP estimates for 6- to 60-hr durations using the two approaches are approximately 30% larger than the PMP estimates using the traditional STM without considering the combined effect. This work pioneers a PMP estimation method that considers the combined effect of a typhoon and southwesterly air flow. Further studies on this issue are essential and encouraged.

  6. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
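
    To make the MoLC idea concrete, the sketch below applies it to the gamma family (one of the families the paper treats): the first two sample log-cumulants are matched to their analytic counterparts, which are digamma and trigamma functions of the shape parameter. This illustrates the general recipe only, not the paper's consistency derivations:

```python
import numpy as np
from scipy import stats
from scipy.special import digamma, polygamma
from scipy.optimize import brentq

rng = np.random.default_rng(1)
x = stats.gamma.rvs(a=3.0, scale=2.0, size=20000, random_state=rng)

# Sample log-cumulants: c1 = mean(ln x), c2 = var(ln x).
logx = np.log(x)
c1, c2 = logx.mean(), logx.var()

# MoLC equations for Gamma(shape k, scale theta):
#   c1 = digamma(k) + ln(theta),   c2 = trigamma(k)
k_hat = brentq(lambda k: polygamma(1, k) - c2, 1e-3, 1e3)
theta_hat = np.exp(c1 - digamma(k_hat))
print(k_hat, theta_hat)   # close to the true values (3.0, 2.0)
```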

  7. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head, producing an extension-flexion motion that deforms the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values, producing an injury transfer function (ITF). An injury event scenario (IES) was constructed in which crew member 1, moving through a primary or standard translation path while transferring large-volume equipment, impacts stationary crew member 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainties in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a
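
    The uncertainty-propagation step can be illustrated with a toy Monte Carlo in the spirit of the abstract, though CSIM itself runs a multibody biomechanical model in SimMechanics/Simulink. All distributions and the injury transfer function below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical uncertain input factors (illustrative, not CSIM's):
speed = rng.normal(1.0, 0.2, n)           # impact speed (m/s), aleatory
threshold = rng.lognormal(0.0, 0.3, n)    # injury-threshold scale, epistemic

# Toy injury transfer function: risk rises with speed relative to threshold.
p_injury = 1.0 / (1.0 + np.exp(-4.0 * (speed / threshold - 1.0)))

print("mean incidence:", p_injury.mean())
print("95% interval  :", np.percentile(p_injury, [2.5, 97.5]))
```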

  8. Estimation method for first excursion probability of secondary system with impact and friction using maximum response

    International Nuclear Information System (INIS)

    Shigeru Aoki

    2005-01-01

    Secondary systems, such as piping, tanks and other mechanical equipment, are installed in a primary system such as a building. Important secondary systems should be designed to maintain their function even when subjected to destructive earthquake excitations. Secondary systems have many nonlinear characteristics. Impact and friction, observed in mechanical supports and joints, are common nonlinear characteristics; they are also exploited in impact dampers and friction dampers to reduce seismic response. In this paper, analytical methods for the first excursion probability of a secondary system with impact and friction subjected to earthquake excitation are proposed. Using these methods, the effects of impact force, gap size and friction force on the first excursion probability are examined. When the tolerance level is normalized by the maximum response of the secondary system without impact or friction characteristics, the variation of the first excursion probability is very small for various values of the natural period. To examine the effectiveness of the proposed method, the obtained results are compared with those obtained by the simulation method. Some estimation methods for the maximum response of secondary systems with nonlinear characteristics have been developed. (author)

  9. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, this paper introduces a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(²³⁶U(d,pf))/P(²³⁸U(d,pf)), which serves as a surrogate for the known cross-section ratio ²³⁶U(n,f)/²³⁸U(n,f). In addition, the ratio P(²³⁸U(d,d′f))/P(²³⁶U(d,d′f)) was measured for the first time, over an unprecedented range of excitation energies, as a surrogate for the ²³⁷U(n,f)/²³⁵U(n,f) cross-section ratio.

  10. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance that leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross-section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  11. First-passage Probability Estimation of an Earthquake Response of Seismically Isolated Containment Buildings

    International Nuclear Information System (INIS)

    Hahm, Dae-Gi; Park, Kwan-Soon; Koh, Hyun-Moo

    2008-01-01

    Awareness of seismic hazard and risk has increased rapidly following the frequent occurrence of huge earthquakes, such as the 2008 Sichuan earthquake, which caused about 70,000 confirmed casualties and an economic loss of about 20 billion U.S. dollars. Since an earthquake load naturally contains various uncertainties, the safety of a structural system under earthquake excitation has been assessed by probabilistic approaches. In many structural applications for probabilistic safety assessment, it is often considered that the failure of a system occurs when the response of the structure first crosses a limit barrier within a specified interval of time. The determination of such a failure probability is usually called the 'first-passage problem' and has been extensively studied during the last few decades. However, especially for structures that show significant nonlinear dynamic behavior, an effective and accurate method for the estimation of such a failure probability is not yet fully established. In this study, we present a new approach to evaluate the first-passage probability of the earthquake response of seismically isolated structures. The proposed method is applied to the seismic isolation system for the containment buildings of a nuclear power plant. From the numerical example, we verified that the proposed method shows accurate results with more efficient computational effort compared to conventional approaches.

  12. Climate change, elevational range shifts, and bird extinctions.

    Science.gov (United States)

    Sekercioglu, Cagan H; Schneider, Stephen H; Fay, John P; Loarie, Scott R

    2008-02-01

    Limitations imposed on species ranges by the climatic, ecological, and physiological effects of elevation are important determinants of extinction risk. We modeled the effects of elevational limits on the extinction risk of landbirds, 87% of all bird species. Elevational limitation of range size explained 97% of the variation in the probability of being in a World Conservation Union category of extinction risk. Our model that combined elevational ranges, four Millennium Assessment habitat-loss scenarios, and an intermediate estimate of surface warming of 2.8 degrees C, projected a best guess of 400-550 landbird extinctions, and that approximately 2150 additional species would be at risk of extinction by 2100. For Western Hemisphere landbirds, intermediate extinction estimates based on climate-induced changes in actual distributions ranged from 1.3% (1.1 degrees C warming) to 30.0% (6.4 degrees C warming) of these species. Worldwide, every degree of warming projected a nonlinear increase in bird extinctions of about 100-500 species. Only 21% of the species predicted to become extinct in our scenarios are currently considered threatened with extinction. Different habitat-loss and surface-warming scenarios predicted substantially different futures for landbird species. To improve the precision of climate-induced extinction estimates, there is an urgent need for high-resolution measurements of shifts in the elevational ranges of species. Given the accelerating influence of climate change on species distributions and conservation, using elevational limits in a tested, standardized, and robust manner can improve conservation assessments of terrestrial species and will help identify species that are most vulnerable to global climate change. Our climate-induced extinction estimates are broadly similar to those of bird species at risk from other factors, but these estimates largely involve different sets of species.

  13. Estimation of probability of coastal flooding: A case study in the Norton Sound, Alaska

    Science.gov (United States)

    Kim, S.; Chapman, R. S.; Jensen, R. E.; Azleton, M. T.; Eisses, K. J.

    2010-12-01

    Along the Norton Sound, Alaska, coastal communities have been exposed to flooding induced by extra-tropical storms. A lack of observational data, especially on long-term variability, makes it difficult to assess the probability of coastal flooding, which is critical in planning for the development and evacuation of the coastal communities. We estimated the probability of coastal flooding with the help of an existing storm surge model using ADCIRC and a wave model using WAM for Western Alaska, which includes the Norton Sound as well as the adjacent Bering Sea and Chukchi Sea. The surface pressure and winds as well as ice coverage were analyzed and put in a gridded format at a 3-hour interval over the entire Alaskan Shelf by Ocean Weather Inc. (OWI) for the period between 1985 and 2009. OWI also analyzed the surface conditions for the storm events over the 31-year period between 1954 and 1984. The correlation between water levels recorded by the NOAA tide gage and local meteorological conditions at Nome between 1992 and 2005 suggested that strong local winds with prevailing southerly components are good proxies for high water events. We also heuristically selected local winds with prevailing westerly components at Shaktoolik, located at the eastern end of the Norton Sound, to provide additional flood events during the continuous meteorological record between 1985 and 2009. The frequency analyses were performed using the simulated water levels and wave heights for the 56-year period between 1954 and 2009. Different methods of estimating return periods were compared, including the method according to the FEMA guideline, extreme value statistics, and fitting to statistical distributions such as Weibull and Gumbel. The estimates are, as expected, similar but show some variation.
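
    As one example of the distribution-fitting step, the sketch below fits a Gumbel distribution to synthetic annual maxima and converts return periods into return levels; the data are invented, standing in for the 56 years of simulated water levels:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic stand-in for 56 years of annual maximum surge levels (m).
annual_max = stats.gumbel_r.rvs(loc=1.2, scale=0.4, size=56,
                                random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max)

# Return level for return period T years = quantile at 1 - 1/T.
for T in (10, 50, 100):
    level = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:3d}-yr return level: {level:.2f} m")
```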

  14. A classification scheme of erroneous behaviors for human error probability estimations based on simulator data

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2017-01-01

    Because it has been indicated that empirical data supporting the estimates used in human reliability analysis (HRA) is insufficient, several databases have been constructed recently. To generate quantitative estimates from human reliability data, it is important to appropriately sort the erroneous behaviors found in the reliability data. Therefore, this paper proposes a scheme to classify the erroneous behaviors identified by the HuREX (Human Reliability data Extraction) framework through a review of the relevant literature. A case study of the human error probability (HEP) calculations is conducted to verify that the proposed scheme can be successfully implemented for the categorization of the erroneous behaviors and to assess whether the scheme is useful for the HEP quantification purposes. Although continuously accumulating and analyzing simulator data is desirable to secure more reliable HEPs, the resulting HEPs were insightful in several important ways with regard to human reliability in off-normal conditions. From the findings of the literature review and the case study, the potential and limitations of the proposed method are discussed. - Highlights: • A taxonomy of erroneous behaviors is proposed to estimate HEPs from a database. • The cognitive models, procedures, HRA methods, and HRA databases were reviewed. • HEPs for several types of erroneous behaviors are calculated as a case study.

  15. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods for analyzing human error are in use, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), and new methods for human reliability analysis (HRA) are currently under development. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze an FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for human error estimation, and it can be applied to any kind of operator action, including the severe accident management strategy.

  16. Estimating landholders' probability of participating in a stewardship program, and the implications for spatial conservation priorities.

    Directory of Open Access Journals (Sweden)

    Vanessa M Adams

    Full Text Available The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements--conservation covenants and management agreements--based on payment level and proportion of properties required to be managed. We then spatially predicted landholders' probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation.

  17. Estimation of the nuclear fuel assembly eigenfrequencies in the probability sense

    Directory of Open Access Journals (Sweden)

    Zeman V.

    2014-12-01

    Full Text Available The paper deals with the estimation of upper and lower limits of the eigenfrequencies of a nuclear fuel assembly whose design and operation parameters are random variables. Each parameter is defined by its mean value and standard deviation or by a range of values. The gradient and three-sigma criterion approach is applied to the calculation of the upper and lower limits of the fuel assembly eigenfrequencies in the probability sense. The presented analytical approach used for the calculation of eigenfrequency sensitivities is based on the modal synthesis method and the decomposition of the fuel assembly into six identical revolved fuel rod segments, a centre tube and a load-bearing skeleton linked by spacer grids. The method is applied to the Russian TVSA-T fuel assembly in the WWER1000/320-type reactor core in the Czech nuclear power plant Temelín.
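
    A minimal sketch of the gradient plus three-sigma step for independent random parameters: the eigenfrequency variance is accumulated from first-order sensitivities, and the limits are taken at ±3σ. The numbers below are hypothetical, not TVSA-T values:

```python
import numpy as np

def eigenfrequency_bounds(f0, grads, sigmas):
    """Three-sigma limits on an eigenfrequency.

    f0     : nominal eigenfrequency (Hz)
    grads  : first-order sensitivities df/dp_i at the mean parameters
    sigmas : standard deviations of the (assumed independent) parameters
    """
    sigma_f = np.sqrt(np.sum((np.asarray(grads) * np.asarray(sigmas)) ** 2))
    return f0 - 3.0 * sigma_f, f0 + 3.0 * sigma_f

# Hypothetical mode at 8 Hz with sensitivities to two parameters.
print(eigenfrequency_bounds(8.0, grads=[0.9, -0.4], sigmas=[0.1, 0.2]))
```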

  18. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions.

    Science.gov (United States)

    Wenger, Seth J; Freeman, Mary C

    2008-10-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a better fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicated count data are available.
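
    A minimal sketch of the zero-inflated idea: a zero-inflated Poisson mixes structural zeros (unoccupied sites, probability 1 - psi) with Poisson counts at occupied sites. Detection is omitted here (the full model estimates it from replicated counts), and the data are simulated:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson."""
    psi, lam = params                        # occurrence prob., mean abundance
    p0 = (1 - psi) + psi * poisson.pmf(0, lam)   # structural + Poisson zeros
    ll = np.where(y == 0, np.log(p0),
                  np.log(psi) + poisson.logpmf(y, lam))
    return -ll.sum()

rng = np.random.default_rng(5)
occupied = rng.random(300) < 0.6                    # true psi = 0.6
y = np.where(occupied, rng.poisson(2.5, 300), 0)    # true lambda = 2.5

res = minimize(zip_negloglik, x0=[0.5, 1.0], args=(y,),
               bounds=[(1e-4, 1 - 1e-4), (1e-4, None)])
print(res.x)   # approximately [0.6, 2.5]
```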

  19. An Estimation of a Passive Infra-Red Sensor Probability of Detection

    International Nuclear Information System (INIS)

    Osman, E.A.; El-Gazar, M.I.; Shaat, M.K.; El-Kafas, A.A.; Zidan, W.I.; Wadoud, A.A.

    2009-01-01

    Passive Infra-Red (PIR) sensors are among the many detection sensors used to detect intrusion into nuclear sites. In this work, an estimation of a PIR sensor's probability of detection at a hypothetical facility is presented. Sensor performance testing is performed to determine whether a particular sensor will be acceptable in a proposed design. We had access to a sensor test field in which the sensor of interest was already properly installed and whose parameters had been set to optimal levels by preliminary testing. The PIR sensor construction, operation and design for the investigated nuclear site are explained. Walking and running intrusion tests were carried out inside the field areas of the PIR sensor to evaluate the sensor performance during the intrusion process. Ten intrusion trials were performed experimentally against the PIR sensor network system. The performance and intrusion detections of the PIR sensors inside the internal zones were recorded and evaluated.
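
    With only 10 trials, a detection probability estimate carries wide uncertainty; a Wilson score interval makes that explicit. A sketch with a hypothetical outcome of 9 detections in 10 trials:

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials
                                   + z**2 / (4 * trials**2))
    return centre - half, centre + half

# Hypothetical outcome: 9 of 10 intrusion trials detected.
print(wilson_interval(9, 10))   # roughly (0.60, 0.98)
```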

  20. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric…-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing… for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately…

  1. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data.
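
    Of the five formulation methods compared, the method of moments is the easiest to sketch: match a beta prior's mean and variance to the past failure data, then update conjugately with new observations. The numbers below are hypothetical:

```python
def beta_prior_mom(mean, var):
    """Method-of-moments beta prior for a failure probability."""
    common = mean * (1 - mean) / var - 1
    return mean * common, (1 - mean) * common   # (alpha, beta)

# Hypothetical past data: mean failure probability 0.02, variance 1e-4.
a, b = beta_prior_mom(0.02, 1e-4)

# Conjugate update with new data: 3 failures in 60 demands.
a_post, b_post = a + 3, b + (60 - 3)
print("prior mean    :", a / (a + b))                  # 0.02
print("posterior mean:", a_post / (a_post + b_post))   # about 0.027
```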

  2. Joint Bayesian Estimation of Quasar Continua and the Lyα Forest Flux Probability Distribution Function

    Science.gov (United States)

    Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan

    2017-08-01

    We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ -1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃ 7 % and ≃ 10 % at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ -1 of the IGM temperature-density relation with a precision of +/- 8.6 % at z = 3 and +/- 6.1 % at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.

  3. Agricultural net primary production in relation to that liberated by the extinction of Pleistocene mega-herbivores: an estimate of agricultural carrying capacity?

    Energy Technology Data Exchange (ETDEWEB)

    Doughty, Christopher E; Field, Christopher B, E-mail: chris.doughty@ouce.ox.ac.uk, E-mail: cfield@ciw.edu [Department of Global Ecology, Carnegie Institution, Stanford, CA 94305 (United States)

    2010-10-15

    Mega-fauna (defined as animals > 44 kg) experienced a global extinction, with 97 of 150 genera going extinct by ∼10,000 years ago. We estimate the net primary production (NPP) that was liberated following the global extinction of these mega-herbivores. We then explore how humans, through agriculture, gradually appropriated this liberated NPP, with specific calculations for 800, 1850, and 2000 AD. By 1850, most of the liberated NPP had been appropriated by people, but NPP was still available in the Western US, South America and Australia. NPP liberated following the extinction of the mega-herbivores was ∼2.5% (∼1.4 (between 1.2 and 1.6) Pg yr⁻¹ of 56 Pg C yr⁻¹; Pg: petagrams) of global terrestrial NPP. Liberated NPP peaked during the onset of agriculture and was sufficient for sustaining human agriculture until ∼320 (250-500) years ago. Humans currently use ∼6 times more NPP than was utilized by the extinct Pleistocene mega-herbivores.

  4. Agricultural net primary production in relation to that liberated by the extinction of Pleistocene mega-herbivores: an estimate of agricultural carrying capacity?

    International Nuclear Information System (INIS)

    Doughty, Christopher E; Field, Christopher B

    2010-01-01

    Mega-fauna (defined as animals > 44 kg) experienced a global extinction with 97 of 150 genera going extinct by ∼10,000 years ago. We estimate the net primary production (NPP) that was liberated following the global extinction of these mega-herbivores. We then explore how humans, through agriculture, gradually appropriated this liberated NPP, with specific calculations for 800, 1850, and 2000 AD. By 1850, most of the liberated NPP had been appropriated by people, but NPP was still available in the Western US, South America and Australia. NPP liberated following the extinction of the mega-herbivores was ∼2.5% (∼1.4 (between 1.2 and 1.6) Pg yr⁻¹ of 56 Pg C yr⁻¹; Pg: petagrams) of global terrestrial NPP. Liberated NPP peaked during the onset of agriculture and was sufficient for sustaining human agriculture until ∼320 (250-500) years ago. Humans currently use ∼6 times more NPP than was utilized by the extinct Pleistocene mega-herbivores.

  5. Estimation of peak discharge quantiles for selected annual exceedance probabilities in northeastern Illinois

    Science.gov (United States)

    Over, Thomas M.; Saito, Riki J.; Veilleux, Andrea G.; Sharpe, Jennifer B.; Soong, David T.; Ishii, Audrey L.

    2016-06-28

    This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively) for watersheds in Illinois based on annual maximum peak discharge data from 117 watersheds in and near northeastern Illinois. One set of equations was developed through a temporal analysis with a two-step least squares-quantile regression technique that measures the average effect of changes in the urbanization of the watersheds used in the study. The resulting equations can be used to adjust rural peak discharge quantiles for the effect of urbanization, and in this study the equations also were used to adjust the annual maximum peak discharges from the study watersheds to 2010 urbanization conditions. The other set of equations was developed by a spatial analysis. This analysis used generalized least-squares regression to fit the peak discharge quantiles computed from the urbanization-adjusted annual maximum peak discharges from the study watersheds to drainage-basin characteristics. The peak discharge quantiles were computed by using the Expected Moments Algorithm following the removal of potentially influential low floods defined by a multiple Grubbs-Beck test. To improve the quantile estimates, regional skew coefficients were obtained from a newly developed regional skew model in which the skew increases with the urbanized land use fraction. The drainage-basin characteristics used as explanatory variables in the spatial analysis include drainage area, the fraction of developed land, the fraction of land with poorly drained soils or likely water, and the basin slope estimated as the ratio of the basin relief to basin perimeter. This report also provides the following: (1) examples to illustrate the use of the spatial and urbanization-adjustment equations for estimating peak discharge quantiles at ungaged

  6. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  7. Estimating Recovery Failure Probabilities in Off-normal Situations from Full-Scope Simulator Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    As part of this effort, KAERI developed the Human Reliability data EXtraction (HuREX) framework and is collecting full-scope simulator-based human reliability data into the OPERA (Operator PErformance and Reliability Analysis) database. In this study, continuing a series of estimation research on HEPs and PSF effects, significant information for quantitative HRA analysis, namely recovery failure probabilities (RFPs), was produced from the OPERA database. Unsafe acts can occur at any time in safety-critical systems, and operators often manage the systems by discovering their errors and eliminating or mitigating them. Several studies have categorized recovery behaviors in order to model recovery processes and strategies. Because recent human error trends need to be considered in a human reliability analysis, the work of Jang et al. can be seen as an essential data collection effort. However, since those empirical results regarding soft controls were produced in a controlled laboratory environment with student participants, it is necessary to analyze a wide range of operator behaviors using full-scope simulators. This paper presents the statistics related to human error recovery behaviors obtained from full-scope simulations in which on-site operators participated. In this study, recovery effects by shift changes or technical support centers were not considered owing to a lack of simulation data.

  8. Estimating factors influencing the detection probability of semiaquatic freshwater snails using quadrat survey methods

    Science.gov (United States)

    Roesler, Elizabeth L.; Grabowski, Timothy B.

    2018-01-01

    Developing effective monitoring methods for elusive, rare, or patchily distributed species requires extra considerations, such as imperfect detection. Although detection is frequently modeled, the opportunity to assess it empirically is rare, particularly for imperiled species. We used Pecos assiminea (Assiminea pecos), an endangered semiaquatic snail, as a case study to test detection and accuracy issues surrounding quadrat searches. Quadrats (9 × 20 cm; n = 12) were placed in suitable Pecos assiminea habitat and randomly assigned a treatment, defined as the number of empty snail shells (0, 3, 6, or 9). Ten observers rotated through each quadrat, conducting 5-min visual searches for shells. The probability of detecting a shell when present was 67.4 ± 3.0%, but it decreased with increasing litter depth and with fewer shells present. The mean (± SE) observer accuracy was 25.5 ± 4.3%. Accuracy was positively correlated with the number of shells in the quadrat and negatively correlated with the number of times a quadrat was searched. The results indicate quadrat surveys likely underrepresent true abundance but accurately determine presence or absence. Understanding the detection and accuracy of elusive, rare, or imperiled species improves density estimates and aids in monitoring and conservation efforts.

  9. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    Science.gov (United States)

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from the release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal-mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: an exponential decline phase, a linear decline phase, an equilibrium phase, and a postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
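
    For contrast with the equilibrium-probability estimator developed above, the classic equal-mixing estimate is the Lincoln-Petersen index, shown here as a generic sketch (it is not the authors' P(e)-based estimator) with hypothetical counts:

```python
def lincoln_petersen(marked_released, captured, recaptured_marked):
    """Classic equal-mixing mark-recapture estimate of population size N."""
    return marked_released * captured / recaptured_marked

# Hypothetical counts: 1000 marked termites released; a later sample
# captures 800 workers, 40 of which carry marks.
print(lincoln_petersen(1000, 800, 40))   # N estimate: 20000.0
```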

  10. Estimation of long-term probabilities for inadvertent intrusion into radioactive waste management areas

    International Nuclear Information System (INIS)

    Eedy, W.; Hart, D.

    1988-05-01

    The risk to human health from radioactive waste management sites can be calculated as the product of the probability of accidental exposure (intrusion) times the probability of a health effect from such exposure. This report reviews the literature and evaluates methods used to predict the probabilities for unintentional intrusion into radioactive waste management areas in Canada over a 10,000-year period. Methods to predict such probabilities are available. They generally assume a long-term stability in terms of existing resource uses and society in the management area. The major potential for errors results from the unlikeliness of these assumptions holding true over such lengthy periods of prediction.

  11. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    Science.gov (United States)

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

    Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contains the support of the lifetime distribution, which is not often the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless of the Markov condition and the referred assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed. © 2015, The International Biometric Society.

  12. Estimation of the four-wave mixing noise probability-density function by the multicanonical Monte Carlo method.

    Science.gov (United States)

    Neokosmidis, Ioannis; Kamalakis, Thomas; Chipouras, Aristides; Sphicopoulos, Thomas

    2005-01-01

    The performance of high-powered wavelength-division multiplexed (WDM) optical networks can be severely degraded by four-wave-mixing- (FWM-) induced distortion. The multicanonical Monte Carlo method (MCMC) is used to calculate the probability-density function (PDF) of the decision variable of a receiver, limited by FWM noise. Compared with the conventional Monte Carlo method previously used to estimate this PDF, the MCMC method is much faster and can accurately estimate smaller error probabilities. The method takes into account the correlation between the components of the FWM noise, unlike the Gaussian model, which is shown not to provide accurate results.

  13. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    Science.gov (United States)

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically less than one, and accounting for the probability of detection is important for obtaining robust estimates of population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater than in Torres Strait, and the effect of correcting for depth-specific detection probability was much smaller. The methodology has application to visual surveys of coastal megafauna, including surveys using Unmanned
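
    The correction itself is a Horvitz-Thompson-style division of the raw count by the detection probability; a minimal sketch with hypothetical availability values (the study derives these from the TDR depth records):

```python
def corrected_abundance(count, availability, perception=1.0):
    """Correct a raw survey count for imperfect detection.

    availability : probability an animal is in the detectable depth zone
    perception   : probability observers see an available animal
    """
    return count / (availability * perception)

# Hypothetical strata: the same raw count corrects very differently
# in shallow (high availability) vs deep (low availability) habitat.
print(corrected_abundance(120, availability=0.40))  # shallow: 300
print(corrected_abundance(120, availability=0.06))  # deep: 2000
```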

  14. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably, a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default in the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: 1) loan term length and 2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
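
    A minimal sketch of the two steps named in the abstract, on simulated loan cohorts, assuming the third-party lifelines library (the paper does not specify its software): a Kaplan-Meier fit yields the 12-month probability of default, and a log-rank test compares term-length cohorts:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(11)

# Simulated months-to-default, censored at 36 months (illustrative only).
t_short = np.minimum(rng.exponential(60, 400), 36)   # shorter-term loans
t_long = np.minimum(rng.exponential(40, 400), 36)    # longer-term loans
d_short = t_short < 36                               # default observed?
d_long = t_long < 36

kmf = KaplanMeierFitter().fit(t_long, event_observed=d_long)
print("12m PD, long-term:", 1.0 - float(kmf.predict(12)))

# Log-rank test: do the cohorts share one survival curve?
res = logrank_test(t_short, t_long,
                   event_observed_A=d_short, event_observed_B=d_long)
print("log-rank p-value:", res.p_value)
```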

  15. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of vessel steel embrittlement as a result of neutron irradiation can be measured by the increase in the ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated by using the method of fracture mechanics, incorporating the effect of the DBTT change. The life of the vessel is limited by the requirement that the failure probability of the HFIR vessel not exceed the reactor core melt probability of 10⁻⁴. The operating safety of the reactor is ensured by periodic hydrostatic pressure tests (hydrotests). The hydrotest is performed in order to determine a safe vessel static pressure. The fracture probability as a result of the hydrostatic pressure test is calculated and is used to determine the life of the vessel. Failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE code and the FAVOR code developed in this Laboratory, are based on Monte Carlo simulation and require heavy computation. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers a simple and expedient way to obtain numerical results without losing generality. In this paper, numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained.
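
    A minimal sketch of fracture probability by direct probability integration under the common load-resistance formulation, P_f = ∫ f_applied(k) · P(K_Ic < k) dk (the normal and lognormal inputs below are illustrative, not the HFIR data):

      import numpy as np
      from scipy import stats

      # Failure occurs when the applied stress intensity exceeds the fracture
      # toughness K_Ic. Both distributions below are illustrative only.
      k_applied = stats.norm(loc=40.0, scale=5.0)     # MPa*sqrt(m), hydrotest load
      k_ic      = stats.lognorm(s=0.25, scale=80.0)   # toughness, degraded by DBTT shift

      # Direct integration instead of Monte Carlo sampling:
      k = np.linspace(0.0, 200.0, 4001)
      p_fail = np.trapz(k_applied.pdf(k) * k_ic.cdf(k), k)
      print("fracture probability:", p_fail)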

  16. Rethinking Extinction

    Science.gov (United States)

    Dunsmoor, Joseph E.; Niv, Yael; Daw, Nathaniel; Phelps, Elizabeth A.

    2015-01-01

    Extinction serves as the leading theoretical framework and experimental model to describe how learned behaviors diminish through absence of anticipated reinforcement. In the past decade, extinction has moved beyond the realm of associative learning theory and behavioral experimentation in animals and has become a topic of considerable interest in the neuroscience of learning, memory, and emotion. Here, we review research and theories of extinction, both as a learning process and as a behavioral technique, and consider whether traditional understandings warrant a re-examination. We discuss the neurobiology, cognitive factors, and major computational theories, and revisit the predominant view that extinction results in new learning that interferes with expression of the original memory. Additionally, we reconsider the limitations of extinction as a technique to prevent the relapse of maladaptive behavior, and discuss novel approaches, informed by contemporary theoretical advances, that augment traditional extinction methods to target and potentially alter maladaptive memories. PMID:26447572

  17. The probability estimate of the defects of the asynchronous motors based on the complex method of diagnostics

    Science.gov (United States)

    Zhukovskiy, Yu L.; Korolev, N. A.; Babanova, I. S.; Boikov, A. V.

    2017-10-01

    This article is devoted to the development of a method for estimating the probability of failure of an asynchronous motor as part of an electric drive with a frequency converter. The proposed method is based on a comprehensive method of diagnostics of vibration and electrical characteristics that takes into account the quality of the supply network and the operating conditions. The developed diagnostic system increases the accuracy and quality of diagnoses by determining the probability of failure-free operation of the electromechanical equipment when parameters deviate from the norm. The system uses artificial neural networks (ANNs). The outputs of the system for estimating the technical condition are probability diagrams of the technical state and a quantitative evaluation of the defects of the asynchronous motor and its components.

  18. Actions and Beliefs : Estimating Distribution-Based Preferences Using a Large Scale Experiment with Probability Questions on Expectations

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2005-01-01

    We combine the choice data of proposers and responders in the ultimatum game, their expectations elicited in the form of subjective probability questions, and the choice data of proposers ("dictator") in a dictator game to estimate a structural model of decision making under uncertainty. We use a

  19. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri.

    Science.gov (United States)

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to ann...

  1. Metapopulation extinction risk: dispersal's duplicity.

    Science.gov (United States)

    Higgins, Kevin

    2009-09-01

    Metapopulation extinction risk is the probability that all local populations are simultaneously extinct during a fixed time frame. Dispersal may reduce a metapopulation's extinction risk by raising its average per-capita growth rate. By contrast, dispersal may raise a metapopulation's extinction risk by reducing its average population density. Which effect prevails is controlled by habitat fragmentation. Dispersal in mildly fragmented habitat reduces a metapopulation's extinction risk by raising its average per-capita growth rate without causing any appreciable drop in its average population density. By contrast, dispersal in severely fragmented habitat raises a metapopulation's extinction risk because the rise in its average per-capita growth rate is more than offset by the decline in its average population density. The metapopulation model used here shows several other interesting phenomena. Dispersal in sufficiently fragmented habitat reduces a metapopulation's extinction risk to that of a constant environment. Dispersal between habitat fragments reduces a metapopulation's extinction risk insofar as local environments are asynchronous. Grouped dispersal raises the effective habitat fragmentation level. Dispersal search barriers raise metapopulation extinction risk. Nonuniform dispersal may reduce the effective fraction of suitable habitat fragments below the extinction threshold. Nonuniform dispersal may make demographic stochasticity a more potent metapopulation extinction force than environmental stochasticity.
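
    A minimal Monte Carlo sketch of the dispersal trade-off described above, with asynchronous local environments and a quasi-extinction floor (all parameters are hypothetical):

      import numpy as np

      rng = np.random.default_rng(1)

      def extinction_risk(n_patches=10, dispersal=0.2, years=50, reps=2000,
                          mean_r=0.0, sd_r=0.5, n0=20.0, floor=1.0):
          """Fraction of replicates in which every local population dies out.
          Local environments are asynchronous (independent noise per patch)."""
          extinct = 0
          for _ in range(reps):
              n = np.full(n_patches, n0)
              for _ in range(years):
                  r = rng.normal(mean_r, sd_r, n_patches)  # local log growth rates
                  n = n * np.exp(r)
                  pool = dispersal * n.sum()               # migrants mix globally
                  n = (1 - dispersal) * n + pool / n_patches
                  n[n < floor] = 0.0                       # quasi-extinction floor
              if n.sum() == 0.0:
                  extinct += 1
          return extinct / reps

      for d in (0.0, 0.1, 0.5):
          print(d, extinction_risk(dispersal=d))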

  2. Assessing the Adequacy of Probability Distributions for Estimating the Extreme Events of Air Temperature in Dabaa Region

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2015-01-01

    Assessing the adequacy of probability distributions for estimating the extreme events of air temperature in the Dabaa region is one of the prerequisites for any design purpose at the Dabaa site, which can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature. These distributions include the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated as performance indicators to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable and accurate predictions for estimating the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the estimation of the extreme monthly and annual maximum temperature except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October.
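
    A minimal sketch of a method-of-moments Gumbel fit with a Kolmogorov-Smirnov adequacy check and a return-period design value (synthetic annual maxima; the Lieblein technique and the station records are not reproduced):

      import numpy as np
      from scipy import stats

      EULER_GAMMA = 0.5772156649

      # Synthetic annual maximum temperatures (deg C), illustrative only.
      rng = np.random.default_rng(0)
      ann_max = rng.gumbel(loc=38.0, scale=2.0, size=40)

      # Method-of-moments estimators for the Gumbel (EV-I) distribution:
      # scale beta = s*sqrt(6)/pi, location mu = mean - gamma*beta.
      beta = ann_max.std(ddof=1) * np.sqrt(6.0) / np.pi
      mu = ann_max.mean() - EULER_GAMMA * beta

      # Adequacy check: KS test against the fitted Gumbel.
      ks = stats.kstest(ann_max, "gumbel_r", args=(mu, beta))
      print("KS statistic, p-value:", ks.statistic, ks.pvalue)

      # Design value with return period T years: quantile of 1 - 1/T.
      T = 100.0
      x_T = mu - beta * np.log(-np.log(1.0 - 1.0 / T))
      print("100-year maximum temperature:", x_T)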

  3. Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems

    International Nuclear Information System (INIS)

    Helin, T; Burger, M

    2015-01-01

    A demanding challenge in Bayesian inversion is to efficiently characterize the posterior distribution. This task is problematic especially in high-dimensional non-Gaussian problems, where the structure of the posterior can be very chaotic and difficult to analyse. The current inverse problem literature often approaches the problem by considering suitable point estimators for the task. Typically the choice is made between the maximum a posteriori (MAP) or the conditional mean (CM) estimate. The benefits of either choice are not well understood from the perspective of infinite-dimensional theory. Most importantly, there exists no general scheme regarding how to connect the topological description of a MAP estimate to a variational problem. The recent results by Dashti and others (Dashti et al 2013 Inverse Problems 29 095017) resolve this issue for nonlinear inverse problems in the Gaussian framework. In this work we improve the current understanding by introducing a novel concept called the weak MAP (wMAP) estimate. We show that any MAP estimate in the sense of Dashti et al (2013 Inverse Problems 29 095017) is a wMAP estimate and, moreover, show how the wMAP estimate connects to a variational formulation in general infinite-dimensional non-Gaussian problems. The variational formulation makes it possible to study many properties of the infinite-dimensional MAP estimate that could not be studied before. In a recent work by the authors (Burger and Lucka 2014 Maximum a posteriori estimates in linear inverse problems with log-concave priors are proper Bayes estimators, preprint) the MAP estimator was studied in the context of the Bayes cost method. Using Bregman distances, proper convex Bayes cost functions were introduced for which the MAP estimator is the Bayes estimator. Here, we generalize these results to the infinite-dimensional setting. Moreover, we discuss the implications of our results for some examples of prior models such as the Besov prior and hierarchical priors. (paper)

  4. Effects of population variability on the accuracy of detection probability estimates

    DEFF Research Database (Denmark)

    Ordonez Gloria, Alejandro

    2011-01-01

    Observing a constant fraction of the population over time, locations, or species is virtually impossible. Hence, quantifying this proportion (i.e. detection probability) is an important task in quantitative population ecology. In this study we determined, via computer simulations, the effect of

  5. Estimating the Probabilities of Low-Weight Differential and Linear Approximations on PRESENT-like Ciphers

    DEFF Research Database (Denmark)

    Abdelraheem, Mohamed Ahmed

    2012-01-01

    We use large but sparse correlation and transition-difference-probability submatrices to find the best linear and differential approximations respectively on PRESENT-like ciphers. This outperforms the branch and bound algorithm when the number of low-weight differential and linear characteristics...

  6. Quantitative estimation of the human error probability during soft control operations

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jung, Wondea

    2013-01-01

    Highlights: ► An HRA method to evaluate execution HEP for soft control operations was proposed. ► The soft control tasks were analyzed and design-related influencing factors were identified. ► An application to evaluate the effects of soft controls was performed. - Abstract: In this work, a method was proposed for quantifying human errors that can occur during operation executions using soft controls. Soft controls of advanced main control rooms have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to identify the human error modes and quantify the error probabilities in order to evaluate the reliability of the system and prevent errors. This work suggests an evaluation framework for quantifying the execution error probability when using soft controls. In the application, it was observed that the human error probabilities of soft controls were both higher and lower than those of conventional controls, depending on the design quality of the advanced main control rooms

  7. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    Science.gov (United States)

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process are dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small samples, allowing researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
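
    Under a normality assumption this probability has a closed form, P(|x̄ − μ| ≤ kσ) = 2Φ(k√n) − 1, since the sample mean has standard error σ/√n. A minimal sketch of that calculation (not necessarily the authors' exact procedure):

      import numpy as np
      from scipy.stats import norm

      def p_within(k, n):
          """P(|sample mean - true mean| <= k * sigma) for a normal population:
          the sample mean has standard error sigma/sqrt(n), so the probability
          is 2*Phi(k*sqrt(n)) - 1."""
          return 2.0 * norm.cdf(k * np.sqrt(n)) - 1.0

      for n in (3, 5, 10):
          print(n, round(p_within(0.5, n), 3))  # within half a standard deviation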

  8. Estimation of failure probability on real structure utilized by earthquake observation data

    International Nuclear Information System (INIS)

    Matsubara, Masayoshi

    1995-01-01

    The objective of this report is to propose a procedure for estimating the structural response of a real structure by utilizing earthquake observation data and a neural network system. We apply the neural network system to estimate the ground motion at the site, using the enormous volume of earthquake data published by the Japan Meteorological Agency. The proposed procedure shows promise for adequately estimating the correlation between earthquake and response. (author)

  9. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher's linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, which were superior to the results obtained by either the Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
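
    A minimal sketch of the kernel-density-estimation plus maximal-posterior decision rule on synthetic bivariate features (the paper's VAG features and kernel choices are not reproduced):

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)

      # Synthetic bivariate features for two signal classes (illustrative only).
      normal_feats   = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
      abnormal_feats = rng.normal([2.0, 1.5], 1.2, size=(60, 2))

      # Class-conditional densities p(x | class) via Gaussian kernel estimation.
      kde_normal   = gaussian_kde(normal_feats.T)
      kde_abnormal = gaussian_kde(abnormal_feats.T)

      # Priors from class frequencies.
      p_normal, p_abnormal = 100 / 160, 60 / 160

      def classify(x):
          """Maximal posterior probability decision: pick the class with the
          largest prior * likelihood (the shared evidence term cancels)."""
          post_n = p_normal * kde_normal(x)[0]
          post_a = p_abnormal * kde_abnormal(x)[0]
          return "abnormal" if post_a > post_n else "normal"

      print(classify(np.array([1.8, 1.4])))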

  10. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    Science.gov (United States)

    De Gregorio, Sofia; Camarda, Marco

    2016-07-26

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system, yet eruptive activity occurs only intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally carries the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing temporal series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.
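
    A minimal sketch of one way to quantify surplus magma input from a monitored time series, as the cumulative input above a steady baseline (synthetic flux values; the paper's actual processing of the Mt Etna CO2 record may differ):

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic daily soil CO2 flux proxy for magma supply (arbitrary units):
      # a steady background plus occasional input pulses.
      days = 3650
      flux = rng.normal(100.0, 5.0, days)
      flux[1200:1230] += 60.0   # illustrative surplus episodes
      flux[2800:2840] += 80.0

      baseline = np.median(flux)                      # steady supply level
      surplus = np.clip(flux - baseline, 0.0, None)   # input exceeding the baseline
      cumulative_surplus = np.cumsum(surplus)         # surplus stored in the system

      # Higher stored surplus -> higher eruptive probability in this reading.
      print("current surplus index:", cumulative_surplus[-1])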

  11. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  12. Methods for estimating the probability of cancer from occupational radiation exposure

    International Nuclear Information System (INIS)

    1996-04-01

    The aims of this TECDOC are to present the factors which are generally accepted as being responsible for cancer induction, to examine the role of radiation as a carcinogen, to demonstrate how the probability of cancer causation by radiation may be calculated and to inform the reader of the uncertainties that are associated with the use of various risk factors and models in such calculations. 139 refs, 2 tabs
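
    A minimal sketch of the standard probability-of-causation calculation such documents describe, PC = ERR/(1 + ERR), where ERR is the excess relative risk attributable to the dose (the linear risk coefficient below is illustrative, not a tabulated value):

      def probability_of_causation(dose_sv, err_per_sv):
          """PC = excess relative risk / total relative risk.
          err_per_sv is an illustrative linear excess-relative-risk coefficient."""
          err = err_per_sv * dose_sv
          return err / (1.0 + err)

      # Example: 0.1 Sv occupational dose with an assumed ERR of 0.5 per Sv.
      print(probability_of_causation(0.1, 0.5))   # ~0.048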

  13. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.

  14. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Full Text Available Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered as a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  15. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data is an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as historical NPP data and expert judgment, to modify human error data, can yield human error data that more truly reflects the real situation of an NPP. This paper, using a numerical computation program developed by the authors, presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the Bayesian parameter estimation. (authors)
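
    A minimal sketch of the kind of Bayesian update involved, combining an expert-judgment Beta prior on a human error probability with observed plant evidence via conjugacy (all numbers are hypothetical; the authors' program is not reproduced):

      from scipy import stats

      # Expert judgment encoded as a Beta prior on the HEP: mean ~0.01.
      a_prior, b_prior = 2.0, 198.0

      # Plant history: 3 errors observed in 500 task opportunities.
      errors, opportunities = 3, 500

      # Conjugacy: Beta(a, b) prior + binomial data -> Beta(a+k, b+n-k) posterior.
      a_post = a_prior + errors
      b_post = b_prior + opportunities - errors
      posterior = stats.beta(a_post, b_post)

      print("posterior mean HEP:", posterior.mean())
      print("90% credible interval:", posterior.ppf([0.05, 0.95]))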

  16. Inverse estimation of the spheroidal particle size distribution using Ant Colony Optimization algorithms in multispectral extinction technique

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Wang, Yuqing; Ruan, Liming

    2014-10-01

    Four improved Ant Colony Optimization (ACO) algorithms, i.e. the probability density function based ACO (PDF-ACO) algorithm, the Region ACO (RACO) algorithm, the Stochastic ACO (SACO) algorithm and the Homogeneous ACO (HACO) algorithm, are employed to estimate the particle size distribution (PSD) of spheroidal particles. The direct problems are solved by the extended Anomalous Diffraction Approximation (ADA) and the Lambert-Beer law. Three commonly used monomodal distribution functions, i.e. the Rosin-Rammler (R-R) distribution function, the normal (N-N) distribution function, and the logarithmic normal (L-N) distribution function, are estimated under the dependent model. The influence of random measurement errors on the inverse results is also investigated. All the results reveal that the PDF-ACO algorithm is more accurate than the other three ACO algorithms and can be used as an effective technique to investigate the PSD of spheroidal particles. Furthermore, the Johnson's SB (J-SB) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the PSD of spheroidal particles using the PDF-ACO algorithm. The investigation shows a reasonable agreement between the original distribution function and the general distribution function when only considering the variation of the length of the rotational semi-axis.

  17. Estimation of Extreme Responses and Failure Probability of Wind Turbines under Normal Operation by Controlled Monte Carlo Simulation

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri

    An alternative approach for estimating the first excursion probability of any system is based on calculating the evolution of the Probability Density Function (PDF) of the process and integrating it on the specified domain; this provides the most accurate results among the three classes of methods. The solution of the Fokker-Planck-Kolmogorov (FPK) equation for systems governed by a stochastic differential equation driven by Gaussian white noise gives the sought time variation of the probability density function. However, the analytical solution of the FPK is available for only a few dynamic systems. The method introduced here describes the evolution of the PDF of a stochastic process and is hence an alternative to the FPK. Its considerable advantage over the FPK is that its solution does not require high computational cost, which extends its range of applicability to high-order structural dynamic problems.

  18. Genetic sex determination and extinction.

    Science.gov (United States)

    Hedrick, Philip W; Gadau, Jürgen; Page, Robert E

    2006-02-01

    Genetic factors can affect the probability of extinction either by increasing the effect of detrimental variants or by decreasing the potential for future adaptive responses. In a recent paper, Zayed and Packer demonstrate that low variation at a specific locus, the complementary sex determination (csd) locus in Hymenoptera (ants, bees and wasps), can result in a sharply increased probability of extinction. Their findings illustrate situations in which there is a feedback process between decreased genetic variation at the csd locus owing to genetic drift and decreased population growth, resulting in an extreme type of extinction vortex for these ecologically important organisms.

  19. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of the Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEPs) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSFs), which have a direct effect on human behavior and thus shape HEPs according to the specific environmental conditions and the personal characteristics of the individuals responsible for the actions. This PSF dependence poses a serious data availability problem, as it renders the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs of actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference data tabulated in the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient exploration of potential improvements in plant safety conditions, operational procedures and local working conditions. (author)

  20. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    Science.gov (United States)

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two week long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory-observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre-existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible for hypocentral effective normal stresses of 10 MPa or more and increases by only 1.5 times for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model, for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  1. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of a general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for the complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
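
    A minimal sketch of the two-step idea: take an overall hazard from vital statistics, then split it across exposure levels using survey-based prevalences and relative hazards so the prevalence-weighted average reproduces the overall rate (all numbers are hypothetical):

      import numpy as np

      # Step 1: overall all-cause hazard for one gender-age cell, from vital
      # statistics (deaths per person-year; hypothetical value).
      h_overall = 0.012

      # Step 2: survey data with mortality follow-up give exposure prevalences
      # and relative hazards (never/former/current smoker; hypothetical values).
      prevalence = np.array([0.55, 0.25, 0.20])
      rel_hazard = np.array([1.0, 1.4, 2.3])

      # Exposure-specific hazards proportional to rel_hazard, constrained so the
      # prevalence-weighted average reproduces the overall hazard.
      h_by_level = h_overall * rel_hazard / np.dot(prevalence, rel_hazard)
      print(h_by_level, np.dot(prevalence, h_by_level))  # second value == h_overall

      # One-year survival probabilities by exposure level.
      print(np.exp(-h_by_level))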

  2. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    Science.gov (United States)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecasting (WRF-ARW) model. A persistent moisture flux convergence pattern, called the Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that the Pineapple Express causes extreme precipitation over the basin of interest. The average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for the 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first-ranked flood event (the 1997 case), whereas the WRF model is validated on all 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 combinations of microphysics, atmospheric boundary layer, and cumulus parameterization schemes. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and the YSU boundary layer (TGY), based on the 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. Moreover, the climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the

  3. Empirical investigation on using wind speed volatility to estimate the operation probability and power output of wind turbines

    International Nuclear Information System (INIS)

    Liu, Heping; Shi, Jing; Qu, Xiuli

    2013-01-01

    Highlights: ► Ten-minute wind speed and power generation data of an offshore wind turbine are used. ► An ARMA–GARCH-M model is built to simultaneously forecast wind speed mean and volatility. ► The operation probability and expected power output of the wind turbine are predicted. ► The integrated approach produces more accurate wind power forecasting than other conventional methods. - Abstract: In this paper, we introduce a quantitative methodology that performs the interval estimation of wind speed, calculates the operation probability of wind turbine, and forecasts the wind power output. The technological advantage of this methodology stems from the empowered capability of mean and volatility forecasting of wind speed. Based on the real wind speed and corresponding wind power output data from an offshore wind turbine, this methodology is applied to build an ARMA–GARCH-M model for wind speed forecasting, and then to compute the operation probability and the expected power output of the wind turbine. The results show that the developed methodology is effective, the obtained interval estimation of wind speed is reliable, and the forecasted operation probability and expected wind power output of the wind turbine are accurate
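
    A minimal sketch of the downstream step: given the conditional mean and volatility a model of this type supplies for wind speed, compute the turbine operation probability and expected power under a normal forecast distribution (the turbine parameters and forecast moments are illustrative):

      import numpy as np
      from scipy.stats import norm

      # Forecast conditional mean and standard deviation of 10-min wind speed,
      # as an ARMA-GARCH-type model would supply (illustrative values, m/s).
      mu, sigma = 9.0, 2.0
      v_in, v_rated, v_out = 3.5, 13.0, 25.0   # cut-in, rated, cut-out speeds
      p_rated = 2000.0                          # rated power, kW

      dist = norm(mu, sigma)

      # Operation probability: wind between cut-in and cut-out.
      p_operate = dist.cdf(v_out) - dist.cdf(v_in)

      # Expected power: integrate a simple power curve against the forecast pdf.
      v = np.linspace(0.0, 30.0, 3001)
      power = np.where(v < v_in, 0.0,
               np.where(v < v_rated, p_rated * ((v - v_in) / (v_rated - v_in)) ** 3,
                np.where(v < v_out, p_rated, 0.0)))
      expected_power = np.trapz(power * dist.pdf(v), v)

      print("operation probability:", p_operate)
      print("expected power (kW):", expected_power)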

  4. Olecranon orientation as an indicator of elbow joint angle in the stance phase, and estimation of forelimb posture in extinct quadruped animals.

    Science.gov (United States)

    Fujiwara, Shin-Ichi

    2009-09-01

    Reconstruction of limb posture is a challenging task in assessing the functional morphology and biomechanics of extinct tetrapods, mainly because of the wide range of motions possible at each limb joint and because of our poor knowledge of the relationship between posture and musculoskeletal structure, even in extant taxa. This is especially true for extinct mammals such as the desmostylian taxa Desmostylus and Paleoparadoxia. This study presents a procedure showing how the elbow joint angles of extinct quadruped mammals can be inferred from osteological characteristics. A survey of 67 dried skeletons and 113 step cycles of 32 extant genera, representing 25 families and 13 orders, showed that the olecranon of the ulna and the shaft of the humerus were oriented approximately perpendicular to each other during the stance phase. At this angle, the major extensor muscles maximize their torque at the elbow joint. Based on this survey, I suggest that olecranon orientation can be used for inferring the elbow joint angles of quadruped mammals with prominent olecranons, regardless of taxon, body size, and locomotor guild. By estimating the elbow joint angle, it is inferred that Desmostylus would have had more upright forelimbs than Paleoparadoxia, because their elbow joint angles during the stance phase were approximately 165 and 130 degrees, respectively. The difference in elbow joint angles between these two genera suggests possible differences in the stance and gait of these two mammals. Copyright 2009 Wiley-Liss, Inc.

  5. Uniform Estimate of the Finite-Time Ruin Probability for All Times in a Generalized Compound Renewal Risk Model

    Directory of Open Access Journals (Sweden)

    Qingwu Gao

    2012-01-01

    Full Text Available We discuss the uniform asymptotic estimate of the finite-time ruin probability for all times in a generalized compound renewal risk model, where the interarrival times of successive accidents and all the claim sizes caused by an accident are two sequences of random variables following a wide dependence structure. This wide dependence structure allows random variables to be either negatively dependent or positively dependent.
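
    A minimal Monte Carlo sketch of a finite-time ruin probability in a compound renewal risk model (independent exponential interarrivals and Pareto claims for illustration; the paper's wide dependence structure is not simulated):

      import numpy as np

      rng = np.random.default_rng(7)

      def ruin_probability(u, horizon, premium_rate, lam, reps=20000):
          """Monte Carlo P(ruin before `horizon`) with initial capital u,
          exponential(lam) interarrival times and Pareto claim sizes."""
          ruined = 0
          for _ in range(reps):
              t, claims = 0.0, 0.0
              while True:
                  t += rng.exponential(1.0 / lam)     # next accident time
                  if t > horizon:
                      break
                  claims += rng.pareto(2.5) + 1.0     # heavy-tailed claim size
                  if u + premium_rate * t - claims < 0.0:
                      ruined += 1
                      break
          return ruined / reps

      print(ruin_probability(u=20.0, horizon=50.0, premium_rate=2.0, lam=1.0))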

  6. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structures is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-squares method from available data on seismic input and structural response. The notion of the high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)

  7. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    Using the proportionator, the desired number of fields is sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noise in possibly realistic ranges. In all cases examined, the proportionator...
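
    A minimal sketch of sampling with probability proportional to size and the matching unbiased (Hansen-Hurwitz) estimator, where each expert count is divided by its selection probability (synthetic fields and weights; the image-analysis weighting is not reproduced):

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic fields of view: true counts and automatically derived weights
      # (weights only need to be positively related to the counts, not unbiased).
      true_counts = rng.poisson(4.0, size=200).astype(float)
      weights = true_counts + rng.uniform(0.5, 2.0, size=200)   # biased proxy

      p = weights / weights.sum()      # selection probability per field
      n = 20                           # fields shown to the observer

      # Sample with replacement proportional to weight; observer counts exactly.
      idx = rng.choice(len(p), size=n, replace=True, p=p)
      estimate = np.mean(true_counts[idx] / p[idx])   # Hansen-Hurwitz estimator

      print("estimated total:", estimate, " true total:", true_counts.sum())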

  8. A novel multi-model probability battery state of charge estimation approach for electric vehicles using H-infinity algorithm

    International Nuclear Information System (INIS)

    Lin, Cheng; Mu, Hao; Xiong, Rui; Shen, Weixiang

    2016-01-01

    Highlights: • A novel multi-model probability battery SOC fusion estimation approach was proposed. • The linear matrix inequality-based H∞ technique is employed to estimate the SOC. • The Bayes theorem has been employed to realize the optimal weight for the fusion. • The robustness of the proposed approach is verified by different batteries. • The results show that the proposed method can promote global estimation accuracy. - Abstract: Due to the strong nonlinearity and complex time-variant property of batteries, the existing state of charge (SOC) estimation approaches based on a single equivalent circuit model (ECM) cannot provide the accurate SOC for the entire discharging period. This paper aims to present a novel SOC estimation approach based on a multiple ECMs fusion method for improving the practical application performance. In the proposed approach, three battery ECMs, namely the Thevenin model, the double polarization model and the 3rd order RC model, are selected to describe the dynamic voltage of lithium-ion batteries and the genetic algorithm is then used to determine the model parameters. The linear matrix inequality-based H-infinity technique is employed to estimate the SOC from the three models and the Bayes theorem-based probability method is employed to determine the optimal weights for synthesizing the SOCs estimated from the three models. Two types of lithium-ion batteries are used to verify the feasibility and robustness of the proposed approach. The results indicate that the proposed approach can improve the accuracy and reliability of the SOC estimation against uncertain battery materials and inaccurate initial states.
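
    A minimal sketch of the Bayes-weighted fusion step: each model's SOC estimate is weighted by the Gaussian likelihood of its recent voltage residuals and the weights are renormalized (all numbers and the residual model are illustrative; the H-infinity filters themselves are not implemented):

      import numpy as np

      # Per-model SOC estimates from three equivalent-circuit-model filters.
      soc = np.array([0.62, 0.58, 0.60])

      # Recent terminal-voltage prediction residuals (V) for each model.
      residuals = np.array([[0.012, 0.015, 0.010],
                            [0.030, 0.028, 0.035],
                            [0.018, 0.020, 0.017]])

      sigma = 0.02  # assumed measurement-noise scale (V)

      # Bayes weights: uniform prior times Gaussian likelihood of the residuals.
      log_lik = -0.5 * np.sum((residuals / sigma) ** 2, axis=1)
      w = np.exp(log_lik - log_lik.max())
      w /= w.sum()

      soc_fused = np.dot(w, soc)   # probability-weighted fusion of the three models
      print("weights:", w, "fused SOC:", soc_fused)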

  9. Empirical estimates in stochastic programs with probability and second order stochastic dominance constraints

    Czech Academy of Sciences Publication Activity Database

    Omelchenko, Vadym; Kaňková, Vlasta

    2015-01-01

    Roč. 84, č. 2 (2015), s. 267-281 ISSN 0862-9544 R&D Projects: GA ČR GA13-14445S Institutional support: RVO:67985556 Keywords : Stochastic programming problems * empirical estimates * light and heavy tailed distributions * quantiles Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/E/omelchenko-0454495.pdf

  10. A Method for Estimating the Probability of Floating Gate Prompt Charge Loss in a Radiation Environment

    Science.gov (United States)

    Edmonds, L. D.

    2016-01-01

    Since advancing technology has been producing smaller structures in electronic circuits, the floating gates in modern flash memories are becoming susceptible to prompt charge loss from ionizing radiation environments found in space. A method for estimating the risk of a charge-loss event is given.

  11. Estimating the benefits of single value and probability forecasting for flood warning

    Directory of Open Access Journals (Sweden)

    J. S. Verkade

    2011-12-01

    Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead-times. However, it is also shown that provision of warning at increasing lead-time does not necessarily lead to an increasing reduction of flood risk, but rather that an optimal lead-time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
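
    A minimal sketch of the relative economic value (REV) calculation underlying this kind of benefit estimate, from hit/miss/false-alarm frequencies and a cost-loss ratio (all rates are hypothetical):

      def relative_economic_value(hit, miss, false_alarm, cost, loss):
          """REV = (E_climate - E_forecast) / (E_climate - E_perfect).
          hit/miss/false_alarm are relative frequencies per event-opportunity."""
          p = hit + miss                       # climatological event frequency
          e_climate = min(cost, p * loss)      # best fixed strategy: always or never protect
          e_perfect = p * cost                 # protect exactly when needed
          e_forecast = (hit + false_alarm) * cost + miss * loss
          return (e_climate - e_forecast) / (e_climate - e_perfect)

      # Hypothetical warning system with cost-loss ratio C/L = 0.2.
      print(relative_economic_value(hit=0.08, miss=0.02, false_alarm=0.05,
                                    cost=0.2, loss=1.0))   # ~0.68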

  12. Estimated Probability of Traumatic Abdominal Injury During an International Space Station Mission

    Science.gov (United States)

    Lewandowski, Beth E.; Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G., Jr.; McRae, Michael P.

    2013-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to spaceflight mission planners and medical system designers when assessing risks and optimizing medical systems. The IMM project maintains a database of medical conditions that could occur during a spaceflight. The IMM project is in the process of assigning an incidence rate, the associated functional impairment, and a best and a worst case end state for each condition. The purpose of this work was to develop the IMM Abdominal Injury Module (AIM). The AIM calculates an incidence rate of traumatic abdominal injury per person-year of spaceflight on the International Space Station (ISS). The AIM was built so that the probability of traumatic abdominal injury during one year on ISS could be predicted. This result will be incorporated into the IMM Abdominal Injury Clinical Finding Form and used within the parent IMM model.

  13. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, which would otherwise be infeasible owing to the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in the performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.

  14. Interstellar dust and extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.

    1990-01-01

    It is noted that the term interstellar dust refers to materials with rather different properties, and that the mean extinction law of Seaton (1979) or Savage and Mathis (1979) should be replaced by the expression given by Cardelli et al. (1989), using the appropriate value of total-to-selective extinction. The older laws were appropriate for the diffuse ISM but dust in clouds differs dramatically in its extinction law. Dust is heavily processed while in the ISM by being included within clouds and cycled back into the diffuse ISM many times during its lifetime. Hence, grains probably reflect only a trace of their origin, although meteoritic inclusions with isotopic anomalies demonstrate that some tiny particles survive intact from a supernova origin to the present. 186 refs

  15. Estimates of probability of severe accidents at European reactors potentially leading to fallout in the UK

    International Nuclear Information System (INIS)

    Mottram, P.R.; Goldemund, M.H.

    2001-08-01

    This study has examined a large number of reactors and data for Nuclear Power Plants (NPPs) in Western Europe, Russia, the seven Central and Eastern European Countries (CEECs) seeking membership of the European Union, and the Newly Independent States (NIS) with operable NPPs. The potential threats from severe accidents at these NPPs causing fallout in the UK have been estimated using IAEA guidelines and Probabilistic Safety Assessments carried out in the specified countries. (author)

  16. Probability and heritability estimates on primary osteoarthritis of the hip leading to total hip arthroplasty

    DEFF Research Database (Denmark)

    Skousgaard, Søren Glud; Hjelmborg, Jacob; Skytthe, Axel

    2015-01-01

    INTRODUCTION: Primary hip osteoarthritis, radiographic as well as symptomatic, is highly associated with increasing age in both genders. However, little is known about the mechanisms behind this, in particular if this increase is caused by genetic factors. This study examined the risk and heritability of primary osteoarthritis of the hip leading to a total hip arthroplasty, and if this heritability increased with increasing age. METHODS: In a nationwide population-based follow-up study, 118,788 twins from the Danish Twin Register and 90,007 individuals from the Danish Hip Arthroplasty Register ... not have had a total hip arthroplasty at the time of follow-up. RESULTS: There were 94,063 twins eligible for analyses, comprising 835 cases of 36 concordant and 763 discordant twin pairs. The probability increased particularly from 50 years of age. After sex and age adjustment a significant additive...

  17. ESTIMATION OF BANKRUPTCY PROBABILITIES BY USING FUZZY LOGIC AND MERTON MODEL: AN APPLICATION ON USA COMPANIES

    Directory of Open Access Journals (Sweden)

    Çiğdem ÖZARİ

    2018-01-01

    Full Text Available In this study, we have worked on developing a brand-new index called the Fuzzy-bankruptcy index. The aim of this index is to find the default probability of any company X, independent of the sector it belongs to. Fuzzy logic is used to express how financial ratio cut-offs change over time and across different sectors, and the new index is created to eliminate the relativity of financial ratios. Four of the five main input variables used for the fuzzy process are chosen from both factor analysis and clustering, and the last input variable is calculated from the Merton model. Analyses of past company defaults reveal different causes, such as managerial arrogance, fraud and managerial mistakes, that were responsible for the very poor endings of prestigious companies like Enron and K-Mart. Because of these kinds of situations, we try to design a model with which one can get a better view of a company's financial position, and which could prevent credit loan companies from investing in the wrong company, and possibly from losing their entire investment, using our Fuzzy-bankruptcy index.
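
    A minimal sketch of the Merton-model input mentioned above: the distance to default and the implied default probability from firm value, debt, drift and asset volatility (all inputs are illustrative):

      import numpy as np
      from scipy.stats import norm

      def merton_pd(V, D, mu, sigma, T=1.0):
          """Merton structural model: default if the asset value falls below the
          debt D at horizon T, assuming lognormal asset dynamics."""
          dd = (np.log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
          return norm.cdf(-dd)   # probability that assets end below the default point

      # Illustrative firm: assets 120, debt 100, 8% drift, 25% asset volatility.
      print(merton_pd(V=120.0, D=100.0, mu=0.08, sigma=0.25))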

  18. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Directory of Open Access Journals (Sweden)

    William H. Farmer

    2017-10-01

    New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.

  19. Probable mode prediction for H.264 advanced video coding P slices using removable SKIP mode distortion estimation

    Science.gov (United States)

    You, Jongmin; Jeong, Jechang

    2010-02-01

    The H.264/AVC (advanced video coding) is used in a wide variety of applications including digital broadcasting and mobile applications, because of its high compression efficiency. The variable block mode scheme in H.264/AVC contributes much to its high compression efficiency but causes a selection problem. In general, rate-distortion optimization (RDO) is the optimal mode selection strategy, but it is computationally intensive. For this reason, the H.264/AVC encoder requires a fast mode selection algorithm for use in applications that require low-power and real-time processing. A probable mode prediction algorithm for the H.264/AVC encoder is proposed. To reduce the computational complexity of RDO, the proposed method selects probable modes among all allowed block modes using removable SKIP mode distortion estimation. Removable SKIP mode distortion is used to estimate whether or not a further divided block mode is appropriate for a macroblock. It is calculated using a no-motion reference block with a few computations. Then the proposed method reduces complexity by performing the RDO process only for probable modes. Experimental results show that the proposed algorithm can reduce encoding time by an average of 55.22% without significant visual quality degradation and increased bit rate.
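
    A minimal sketch of the SKIP-mode distortion idea: measure a macroblock's distortion against its co-located no-motion reference cheaply (SAD here) and run full rate-distortion optimization only over the resulting probable modes (the threshold and block data are illustrative, not the paper's exact estimator):

      import numpy as np

      rng = np.random.default_rng(2)

      # Current macroblock and the co-located (no-motion) reference block.
      current = rng.integers(0, 256, size=(16, 16)).astype(np.int32)
      reference = current + rng.integers(-4, 5, size=(16, 16))  # small change

      # Cheap SKIP-mode distortion: sum of absolute differences, no motion search.
      skip_distortion = np.abs(current - reference).sum()

      THRESHOLD = 2000   # illustrative tuning constant
      if skip_distortion < THRESHOLD:
          probable_modes = ["SKIP", "16x16"]          # undivided modes suffice
      else:
          probable_modes = ["16x8", "8x16", "8x8"]    # consider finer partitions

      print(skip_distortion, probable_modes)  # run RDO only over probable_modes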

  20. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance that leads to a short circuit. From these data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).
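
    A hedged sketch of how such data can be turned into a probability-of-short curve as a function of voltage: fit a parametric curve (here logistic, one plausible choice; the paper's model form may differ) to the fraction of bridged whiskers that actually shorted at each voltage. The trial counts below are invented.

        # Fit P(short | whisker bridges conductors) as a function of applied voltage.
        # Invented data; the logistic form is an assumption, not the paper's model.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(v, v50, k):
            return 1.0 / (1.0 + np.exp(-k * (v - v50)))

        volts  = np.array([5, 10, 15, 20, 25, 30], float)  # applied voltage levels
        shorts = np.array([1, 3, 7, 12, 17, 19], float)    # shorts observed per 20 trials
        p_obs  = shorts / 20.0

        (v50, k), _ = curve_fit(logistic, volts, p_obs, p0=[15.0, 0.3])
        print(f"Estimated 50%-short voltage: {v50:.1f} V, slope {k:.2f}")
        print(f"P(short at 12 V) ~ {logistic(12.0, v50, k):.2f}")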

  1. Estimation of default probability for corporate entities in Republic of Serbia

    Directory of Open Access Journals (Sweden)

    Vujnović Miloš

    2016-01-01

    Full Text Available In this paper a quantitative PD model development has been exercised according to the Basel Capital Accord standards. The modeling dataset is based on financial statements information from the Republic of Serbia. The goal of the paper is to develop a credit scoring model capable of producing PD estimates with high predictive power on the sample of corporate entities. The modeling is based on 5 years of end-of-year financial statements data of available Serbian corporate entities. The weight of evidence (WOE) approach has been applied to quantitatively transform and prepare the financial ratios. Correlation analysis has been utilized to reduce the long list of variables and to remove highly interdependent variables from the training and validation datasets. In accordance with best banking practice and the academic literature, the final model is obtained using adjusted stepwise logistic regression. The finally proposed model and its financial ratio constituents have been discussed and benchmarked against examples from the relevant academic literature.
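
    The weight-of-evidence transform mentioned above replaces each bin of a financial ratio with the log-ratio of its share of non-defaulters to its share of defaulters, which makes ratios comparable across scales before logistic regression. A minimal sketch on hypothetical binned counts:

        # Weight-of-evidence (WOE) transform for one binned financial ratio.
        # WOE_bin = ln( (share of goods in bin) / (share of bads in bin) ); data hypothetical.
        import numpy as np

        goods = np.array([500, 300, 150, 50])   # non-defaulters per bin
        bads  = np.array([ 10,  20,  30, 40])   # defaulters per bin

        g_share, b_share = goods / goods.sum(), bads / bads.sum()
        woe = np.log(g_share / b_share)
        iv  = (g_share - b_share) @ woe         # information value, a variable-screening statistic

        for i, w in enumerate(woe):
            print(f"bin {i}: WOE = {w:+.3f}")
        print(f"Information value = {iv:.3f}")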

  2. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    In May and June 2012, the Whitewater-Baldy Fire burned approximately 1,200 square kilometers (300,000 acres) of the Gila National Forest, in southwestern New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 128 basins burned by the Whitewater-Baldy Fire. A pair of empirical hazard-assessment models developed by using data from recently burned basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and for selected drainage basins within the burned area. The models incorporate measures of areal burned extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. In response to the 2-year-recurrence, 30-minute-duration rainfall, modeling indicated that four basins have high probabilities of debris-flow occurrence (greater than or equal to 80 percent). For the 10-year-recurrence, 30-minute-duration rainfall, an additional 14 basins are included, and for the 25-year-recurrence, 30-minute-duration rainfall, an additional eight basins, 20 percent of the total, have high probabilities of debris-flow occurrence. In addition, probability analysis along the stream segments can identify specific reaches of greatest concern for debris flows within a basin. Basins with a high probability of debris-flow occurrence were concentrated in the west and central parts of the burned area, including tributaries to Whitewater Creek, Mineral Creek, and Willow Creek. Estimated debris-flow volumes ranged from about 3,000-4,000 cubic meters (m3) to greater than 500,000 m3 for all design storms modeled. Drainage basins with estimated volumes greater than 500,000 m3 included tributaries to Whitewater Creek and Willow Creek.
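
    Empirical models of this type typically express the debris-flow probability through a logistic link over burn-severity, topographic, soil, and rainfall terms. The sketch below shows the general form only, with made-up coefficients; the report's fitted values are not reproduced here.

        # Schematic logistic form for postwildfire debris-flow probability.
        # Coefficients and basin values are placeholders, not the report's fitted model.
        from math import exp

        def debris_flow_probability(x, beta0, betas):
            z = beta0 + sum(b * v for b, v in zip(betas, x))
            return 1.0 / (1.0 + exp(-z))

        # Hypothetical predictors: [fraction burned at high severity, mean gradient (%),
        # soil clay content (%), 30-minute rainfall intensity (mm/h)]
        basin = [0.6, 25.0, 12.0, 30.0]
        p = debris_flow_probability(basin, beta0=-6.0, betas=[3.0, 0.08, 0.05, 0.06])
        print(f"P(debris flow) ~ {p:.2f}")  # >= 0.80 would fall in the 'high' class above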

  3. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    The aim of this study was to develop a method to estimate the probability of low within-herd prevalence of paratuberculosis for Danish dairy herds. A stochastic simulation model was developed using the R programming environment. Features of this model included: use of age-specific estimates of test sensitivity and specificity; use of a distribution of observed values (rather than a fixed, low value) for design prevalence; and estimates of the probability of low prevalence (Pr-Low) based on a specific number of test-positive animals, rather than for a result less than or equal to a specified cut-point number of reactors. Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows <4 years old; (3) milk-ELISA on lactating cows >4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows.
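
    A stripped-down sketch of this kind of stochastic herd simulation: draw a low "design" prevalence, infect cows accordingly, apply imperfect test sensitivity and specificity, and tally how often a given number of test positives arises. All parameter values are invented stand-ins, and the age-specific structure of the actual model is omitted.

        # Stochastic sketch: P(k test-positives | herd at low prevalence).
        # Sensitivity, specificity and the prevalence distribution are invented.
        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_positives(n_cows=100, se=0.4, sp=0.98, prev_low=0.02, n_sims=20_000):
            prevalence = rng.uniform(0.0, prev_low, n_sims)   # draw of 'low' design prevalence
            infected = rng.binomial(n_cows, prevalence)       # truly infected cows per herd
            tp = rng.binomial(infected, se)                   # infected cows that test positive
            fp = rng.binomial(n_cows - infected, 1.0 - sp)    # healthy cows that test positive
            return tp + fp

        positives = simulate_positives()
        for k in range(4):
            print(f"P({k} test-positives | low prevalence) ~ {(positives == k).mean():.3f}")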

  4. Pelagic larval duration predicts extinction risk in a freshwater fish clade.

    Science.gov (United States)

    Douglas, Morgan; Keck, Benjamin P; Ruble, Crystal; Petty, Melissa; Shute, J R; Rakes, Patrick; Hulsey, C Darrin

    2013-01-01

    Pelagic larval duration (PLD) can influence evolutionary processes ranging from dispersal to extinction in aquatic organisms. Using estimates of PLD obtained from species of North American darters (Percidae: Etheostomatinae), we demonstrate that this freshwater fish clade exhibits surprising variation in PLD. Comparative analyses provide some evidence that higher stream gradients favour the evolution of shorter PLD. Additionally, similar to patterns in the marine fossil record in which lower PLD is associated with greater extinction probability, we found that reduced PLD in darter lineages was evolutionarily associated with extinction risk. Understanding the causes and consequences of PLD length could lead to better management and conservation of organisms in our increasingly imperiled aquatic environments.

  5. Required cavity HOM deQing calculated from probability estimates of coupled bunch instabilities in the APS ring

    International Nuclear Information System (INIS)

    Emery, L.

    1993-01-01

    A method of determining the deQing requirement of individual cavity higher-order modes (HOM) for a multi-cavity RF system is presented and applied to the APS ring. Since HOM resonator frequency values are to some degree uncertain, the HOM frequencies should be regarded as random variables in predicting the stability of the coupled bunch beam modes. A Monte Carlo simulation provides a histogram of the growth rates, from which one obtains an estimate of the probability of instability. The damping of each HOM type is determined such that the damping effort is economized, i.e., no single HOM dominates the specified growth rate histogram.
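
    A toy version of the Monte Carlo described above: sample the uncertain HOM frequency, evaluate a growth rate for each draw, and read the probability of instability off the resulting histogram. The resonator response below is a schematic stand-in, not the APS impedance model, and all numbers are illustrative.

        # Monte Carlo over an uncertain HOM frequency -> distribution of growth rates.
        # The growth-rate model is schematic, not the APS coupled-bunch calculation.
        import numpy as np

        rng = np.random.default_rng(1)

        def growth_rate(f_hom, f_line=352.0e6, q=5_000.0, scale=80.0):
            # Schematic: growth peaks when the HOM lands on a beam spectral line.
            detune = (f_hom - f_line) / f_line
            return scale / (1.0 + (2.0 * q * detune) ** 2)    # 1/s, placeholder units

        f_samples = rng.normal(352.2e6, 0.3e6, 100_000)       # uncertain HOM frequency (Hz)
        rates = growth_rate(f_samples)

        damping_rate = 20.0                                    # assumed available damping, 1/s
        print(f"P(instability) ~ {(rates > damping_rate).mean():.3f}")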

  6. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
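
    The multiplicative reading of such a model follows directly from the logistic link: each significant PSF shifts the log-odds of error additively, so its exponentiated coefficient multiplies the odds. A small sketch of that interpretation with invented coefficients (not the paper's fitted values):

        # Odds-ratio reading of a fitted logistic HEP model (coefficients invented).
        from math import exp

        beta0 = -4.0                    # baseline log-odds of a soft-control error
        betas = {"poor_procedure": 1.2, "low_practice": 0.9, "abnormal_operation": 0.7}

        def hep(active_psfs):
            z = beta0 + sum(betas[p] for p in active_psfs)
            return 1.0 / (1.0 + exp(-z))

        print(f"Baseline HEP        = {hep([]):.4f}")
        print(f"With poor procedure = {hep(['poor_procedure']):.4f}")
        # exp(beta) is the multiplicative factor each PSF applies to the odds of error:
        print(f"Odds multiplier for poor procedure = {exp(betas['poor_procedure']):.2f}x")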

  7. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  8. The Butterflies of Barro Colorado Island, Panama: Local Extinction since the 1930s.

    Directory of Open Access Journals (Sweden)

    Yves Basset

    Full Text Available Few data are available about the regional or local extinction of tropical butterfly species. When confirmed, local extinction was often due to the loss of host-plant species. We used published lists and recent monitoring programs to evaluate changes in butterfly composition on Barro Colorado Island (BCI), Panama, between an old (1923-1943) and a recent (1993-2013) period. Although 601 butterfly species have been recorded from BCI during the 1923-2013 period, we estimate that 390 species are currently breeding on the island, including 34 cryptic species, currently only known by their DNA Barcode Index Number. Twenty-three butterfly species that were considered abundant during the old period could not be collected during the recent period, despite a much higher sampling effort in recent times. We consider these species locally extinct from BCI and they conservatively represent 6% of the estimated local pool of resident species. Extinct species represent distant phylogenetic branches and several families. The butterfly traits most likely to influence the probability of extinction were host growth form, wing size and host specificity, independently of the phylogenetic relationships among butterfly species. On BCI, most likely candidates for extinction were small hesperiids feeding on herbs (35% of extinct species). However, contrary to our working hypothesis, extinction of these species on BCI cannot be attributed to loss of host plants. In most cases these host plants remain extant, but they probably subsist at lower or more fragmented densities. Coupled with low dispersal power, this reduced availability of host plants has probably caused the local extinction of some butterfly species. Many more bird than butterfly species have been lost from BCI recently, confirming that small preserves may be far more effective at conserving invertebrates than vertebrates and, therefore, should not necessarily be neglected from a conservation viewpoint.

  9. The Butterflies of Barro Colorado Island, Panama: Local Extinction since the 1930s.

    Science.gov (United States)

    Basset, Yves; Barrios, Héctor; Segar, Simon; Srygley, Robert B; Aiello, Annette; Warren, Andrew D; Delgado, Francisco; Coronado, James; Lezcano, Jorge; Arizala, Stephany; Rivera, Marleny; Perez, Filonila; Bobadilla, Ricardo; Lopez, Yacksecari; Ramirez, José Alejandro

    2015-01-01

    Few data are available about the regional or local extinction of tropical butterfly species. When confirmed, local extinction was often due to the loss of host-plant species. We used published lists and recent monitoring programs to evaluate changes in butterfly composition on Barro Colorado Island (BCI, Panama) between an old (1923-1943) and a recent (1993-2013) period. Although 601 butterfly species have been recorded from BCI during the 1923-2013 period, we estimate that 390 species are currently breeding on the island, including 34 cryptic species, currently only known by their DNA Barcode Index Number. Twenty-three butterfly species that were considered abundant during the old period could not be collected during the recent period, despite a much higher sampling effort in recent times. We consider these species locally extinct from BCI and they conservatively represent 6% of the estimated local pool of resident species. Extinct species represent distant phylogenetic branches and several families. The butterfly traits most likely to influence the probability of extinction were host growth form, wing size and host specificity, independently of the phylogenetic relationships among butterfly species. On BCI, most likely candidates for extinction were small hesperiids feeding on herbs (35% of extinct species). However, contrary to our working hypothesis, extinction of these species on BCI cannot be attributed to loss of host plants. In most cases these host plants remain extant, but they probably subsist at lower or more fragmented densities. Coupled with low dispersal power, this reduced availability of host plants has probably caused the local extinction of some butterfly species. Many more bird than butterfly species have been lost from BCI recently, confirming that small preserves may be far more effective at conserving invertebrates than vertebrates and, therefore, should not necessarily be neglected from a conservation viewpoint.

  10. Development and Validation of a Calculator for Estimating the Probability of Urinary Tract Infection in Young Febrile Children.

    Science.gov (United States)

    Shaikh, Nader; Hoberman, Alejandro; Hum, Stephanie W; Alberty, Anastasia; Muniz, Gysella; Kurs-Lasky, Marcia; Landsittel, Douglas; Shope, Timothy

    2018-06-01

    Accurately estimating the probability of urinary tract infection (UTI) in febrile preverbal children is necessary to appropriately target testing and treatment. To develop and test a calculator (UTICalc) that can first estimate the probability of UTI based on clinical variables and then update that probability based on laboratory results. Review of electronic medical records of febrile children aged 2 to 23 months who were brought to the emergency department of Children's Hospital of Pittsburgh, Pittsburgh, Pennsylvania. An independent training database comprising 1686 patients brought to the emergency department between January 1, 2007, and April 30, 2013, and a validation database of 384 patients were created. Five multivariable logistic regression models for predicting risk of UTI were trained and tested. The clinical model included only clinical variables; the remaining models incorporated laboratory results. Data analysis was performed between June 18, 2013, and January 12, 2018. Documented temperature of 38°C or higher in children aged 2 months to less than 2 years. With the use of culture-confirmed UTI as the main outcome, cutoffs for high and low UTI risk were identified for each model. The resultant models were incorporated into a calculation tool, UTICalc, which was used to evaluate medical records. A total of 2070 children were included in the study. The training database comprised 1686 children, of whom 1216 (72.1%) were female and 1167 (69.2%) white. The validation database comprised 384 children, of whom 291 (75.8%) were female and 200 (52.1%) white. Compared with the American Academy of Pediatrics algorithm, the clinical model in UTICalc reduced testing by 8.1% (95% CI, 4.2%-12.0%) and decreased the number of UTIs that were missed from 3 cases to none. Compared with empirically treating all children with a leukocyte esterase test result of 1+ or higher, the dipstick model in UTICalc would have reduced the number of treatment delays by 10.6% (95% CI
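
    The two-stage logic (a clinical pretest probability, then an update from laboratory results) is essentially a Bayesian update, which can be sketched with a likelihood ratio; the numbers below are invented for illustration and are not UTICalc's fitted models.

        # Schematic two-stage risk update: pretest probability, then a lab-result update.
        # All numbers are invented illustrations, not UTICalc's fitted values.
        def update_probability(pretest, likelihood_ratio):
            """Bayes via odds: posterior odds = pretest odds * likelihood ratio."""
            odds = pretest / (1.0 - pretest)
            post_odds = odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        pretest = 0.03              # from a clinical-variables model (hypothetical)
        lr_pos, lr_neg = 20.0, 0.2  # hypothetical LR+ / LR- for a dipstick result

        print(f"P(UTI | dipstick +) = {update_probability(pretest, lr_pos):.2f}")
        print(f"P(UTI | dipstick -) = {update_probability(pretest, lr_neg):.3f}")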

  11. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)

    2016-10-15

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
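
    One simple way to turn per-cycle damage-parameter values into probability density functions is kernel density estimation, sketched below on synthetic data; the paper's model-based estimation procedure is more elaborate.

        # Gaussian KDE of a damage parameter over many pump cycles (synthetic data).
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(7)
        healthy = rng.normal(0.02, 0.005, 300)   # damage parameter, normal valve (synthetic)
        leaking = rng.normal(0.10, 0.020, 300)   # damage parameter, leaking valve (synthetic)

        kde_h, kde_l = gaussian_kde(healthy), gaussian_kde(leaking)
        x = 0.05                                  # observed value for a new cycle
        print(f"density under 'healthy': {kde_h(x)[0]:.2f}, under 'leaking': {kde_l(x)[0]:.2f}")
        # Comparing the two densities (a likelihood ratio) supports diagnosis/prognosis.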

  12. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...

  13. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo

    2016-01-01

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage

  14. Estimation of Probability Density Functions of Damage Parameter for Valve Leakage Detection in Reciprocating Pump Used in Nuclear Power Plants

    Directory of Open Access Journals (Sweden)

    Jong Kyeom Lee

    2016-10-01

    Full Text Available This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  15. The currency and tempo of extinction.

    Science.gov (United States)

    Regan, H M; Lupia, R; Drinnan, A N; Burgman, M A

    2001-01-01

    This study examines estimates of extinction rates for the current purported biotic crisis and from the fossil record. Studies that compare current and geological extinctions sometimes use metrics that confound different sources of error and reflect different features of extinction processes. The per taxon extinction rate is a standard measure in paleontology that avoids some of the pitfalls of alternative approaches. Extinction rates reported in the conservation literature are rarely accompanied by measures of uncertainty, despite many elements of the calculations being subject to considerable error. We quantify some of the most important sources of uncertainty and carry them through the arithmetic of extinction rate calculations using fuzzy numbers. The results emphasize that estimates of current and future rates rely heavily on assumptions about the tempo of extinction and on extrapolations among taxa. Available data are unlikely to be useful in measuring magnitudes or trends in current extinction rates.

  16. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km2 of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and

  17. A case of lung cancer in a miner - An estimation of radon exposure and discussion of probable causes

    International Nuclear Information System (INIS)

    Snihs, J.O.; Walinder, Gunnar.

    1977-01-01

    One particular lung cancer case which was brought before the National Swedish Social Insurance Board as a possible case of industrial injury due to exposure to radon is described. The man concerned had worked in two mines during the period 1917-1944, and he was found to be suffering from lung cancer in 1961, when he was 69 years of age. He had been a moderate smoker for the previous 20 years, had healed lung tuberculosis, and had confirmed stage 1 silicosis. The mines in which he worked have been out of use for many years and have not been accessible for measurements of radon concentrations. The estimation of the radon concentrations is discussed on the basis of experience of the causes of radon occurrence in other mines with regard to their geology, ventilation and depth, and the extent to which mine water was present. The estimated exposure was 600 WLM. Given these conditions, the partial and combined probabilities of lung cancer in the above case are discussed, taking into account the type of lung cancer, the estimated exposure to radon, and his smoking, silicosis, tuberculosis and age

  18. Binomial distribution of Poisson statistics and tracks overlapping probability to estimate total tracks count with low uncertainty

    International Nuclear Information System (INIS)

    Khayat, Omid; Afarideh, Hossein; Mohammadnia, Meisam

    2015-01-01

    In solid state nuclear track detectors of the chemically etched type, nuclear tracks whose centers lie closer together than two track radii will emerge as overlapping tracks. Track overlapping in this type of detector causes track count losses, and it becomes rather severe at high track densities. Therefore, track counting in this condition should include a correction factor for the count losses of different track-overlapping orders, since a number of overlapping tracks may be counted as one track. Another aspect of the problem is the cases where imaging the whole area of the detector and counting all tracks are not possible. In these conditions, a statistical generalization method is desired that is applicable to counting a segmented area of the detector, so that the results can be generalized to the whole surface of the detector. There is also a challenge in counting densely overlapped tracks, because insufficient geometrical or contextual information is available. In this paper we present a statistical counting method which gives the user a relation between the track-overlapping probabilities on a segmented area of the detector surface and the total number of tracks. To apply the proposed method, one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions by approximating the average track area, the whole detector surface area, and some orders of track-overlapping probabilities. It will be shown that this method is applicable to high and ultra-high density track images and that the count loss error can be mitigated using a statistical generalization approach. - Highlights: • A correction factor for count losses of different tracks overlapping orders. • For the cases imaging the whole area of the detector is not possible. • Presenting a statistical generalization method for segmented areas. • Giving a relation between the tracks overlapping probabilities and the total tracks
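
    The count-loss effect itself is easy to reproduce by Monte Carlo: scatter N disc-shaped tracks at random, merge any that overlap, and compare the merged-cluster count with N. The sketch below illustrates only this concept; the paper's estimator works analytically from overlap probabilities rather than by simulation.

        # Monte Carlo illustration of track-overlap count loss on a detector segment.
        import numpy as np

        rng = np.random.default_rng(3)

        def count_clusters(n_tracks, radius=0.01, size=1.0):
            """Tracks closer than 2*radius merge; count merged clusters via union-find."""
            pts = rng.uniform(0, size, (n_tracks, 2))
            parent = list(range(n_tracks))
            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i
            d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
            for i in range(n_tracks):
                for j in range(i + 1, n_tracks):
                    if d2[i, j] < (2 * radius) ** 2:
                        parent[find(i)] = find(j)
            return len({find(i) for i in range(n_tracks)})

        true_n = 400
        observed = np.mean([count_clusters(true_n) for _ in range(20)])
        print(f"true tracks: {true_n}, mean counted clusters: {observed:.0f}")
        # The observed/true ratio is the count-loss factor such methods correct for.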

  19. Interstellar Extinction

    OpenAIRE

    Gontcharov, George

    2017-01-01

    This review describes our current understanding of interstellar extinction. This differs substantially from the ideas of the 20th century. With infrared surveys of hundreds of millions of stars over the entire sky, such as 2MASS, SPITZER-IRAC, and WISE, we have looked at the densest and most rarefied regions of the interstellar medium at distances of a few kpc from the sun. Observations at infrared and microwave wavelengths, where the bulk of the interstellar dust absorbs and radiates, have br...

  20. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013 : [summary].

    Science.gov (United States)

    2015-01-01

    Traditionally, the Iowa DOT has used the Iowa Runoff Chart and single-variable regional regression equations (RREs) from a USGS report (published in 1987) as the primary methods to estimate annual exceedance-probability discharge (AEPD) for small...

  1. Most probable dimension value and most flat interval methods for automatic estimation of dimension from time series

    International Nuclear Information System (INIS)

    Corana, A.; Bortolan, G.; Casaleggio, A.

    2004-01-01

    We present and compare two automatic methods for dimension estimation from time series. Both methods, based on conceptually different approaches, work on the derivative of the bi-logarithmic plot of the correlation integral versus the correlation length (log-log plot). The first method searches for the most probable dimension values (MPDV) and associates to each of them a possible scaling region. The second one searches for the most flat intervals (MFI) in the derivative of the log-log plot. The automatic procedures include the evaluation of the candidate scaling regions using two reliability indices. The data set used to test the methods consists of time series from known model attractors with and without the addition of noise, structured time series, and electrocardiographic signals from the MIT-BIH ECG database. Statistical analysis of results was carried out by means of paired t-test, and no statistically significant differences were found in the large majority of the trials. Consistent results are also obtained dealing with 'difficult' time series. In general for a more robust and reliable estimate, the use of both methods may represent a good solution when time series from complex systems are analyzed. Although we present results for the correlation dimension only, the procedures can also be used for the automatic estimation of generalized q-order dimensions and pointwise dimension. We think that the proposed methods, eliminating the need of operator intervention, allow a faster and more objective analysis, thus improving the usefulness of dimension analysis for the characterization of time series obtained from complex dynamical systems
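
    The "most flat interval" idea can be sketched compactly: numerically differentiate the log-log correlation integral and scan for the window in which the local slope varies least; that slope is the dimension estimate. The sketch below uses a synthetic power-law correlation integral, not the paper's datasets, and omits the reliability indices.

        # Most-flat-interval sketch: dimension from d(log C)/d(log r).
        import numpy as np

        rng = np.random.default_rng(5)
        log_r = np.linspace(-4, 0, 80)                    # synthetic C(r) ~ r^D, D = 1.26
        log_c = 1.26 * log_r + rng.normal(0, 0.02, log_r.size)

        slope = np.gradient(log_c, log_r)                 # local slope of the log-log plot

        def most_flat_interval(slope, width=15):
            """Mean slope over the window where the slope is most nearly constant."""
            stds = [slope[i:i + width].std() for i in range(slope.size - width)]
            i = int(np.argmin(stds))
            return slope[i:i + width].mean()

        print(f"Dimension estimate: {most_flat_interval(slope):.2f}")  # ~1.26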

  2. Call Arrival Rate Prediction and Blocking Probability Estimation for Infrastructure based Mobile Cognitive Radio Personal Area Network

    Directory of Open Access Journals (Sweden)

    Neeta Nathani

    2017-08-01

    Full Text Available Cognitive Radio usage has been estimated as a non-emergency service with low-volume traffic. The present work proposes an infrastructure-based Cognitive Radio network and the probability of success of CR traffic in a licensed band. The Cognitive Radio nodes form clusters. The cluster nodes communicate on the Industrial, Scientific and Medical band using an IPv6 over Low-Power Wireless Personal Area Network based protocol from sensor to Gateway Cluster Head. The Cognitive Radio-Media Access Control protocol for Gateway to Cognitive Radio-Base Station communication uses vacant channels of the licensed band. Standalone secondary users of the Cognitive Radio Network are considered as a Gateway with one user. The Gateway handles multi-channel multi-radio communication with the Base Station. Cognitive Radio Network operators define various traffic data accumulation counters at the Base Station for storing signal strength, Carrier-to-Interference and Noise Ratio, and other parameters, and for recording channel occupied/vacant status. In research done so far, an hour has been used as the polling interval, which is too long for parameters like holding time expressed in minutes; hence, channel vacant/occupied status time could only be calculated probabilistically. In the present work, an infrastructure-based architecture has been proposed which polls channel status each minute, in contrast to hourly polling of data. The Gateways of the Cognitive Radio Network monitor the status of each Primary User periodically inside their working range and inform the Cognitive Radio-Base Station for preparation of a minutewise database. For simulation, the occupancy data for all primary user channels were pulled at one-minute intervals from a live mobile network. Hourly traffic data and minutewise holding times have been analyzed to optimize the parameters of a Seasonal Auto Regressive Integrated Moving Average prediction model. The blocking probability of an incoming Cognitive Radio call has been
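
    For a fixed number of vacant channels and Poisson call arrivals, the classical blocking-probability result is the Erlang B formula; a minimal recursive sketch follows. How the paper couples this with SARIMA-predicted arrival rates is not reproduced here, and the traffic numbers are hypothetical.

        # Erlang B blocking probability via the standard recurrence:
        # B(0) = 1;  B(c) = a*B(c-1) / (c + a*B(c-1)), a = offered traffic in Erlangs.
        def erlang_b(channels, offered_erlangs):
            b = 1.0
            for c in range(1, channels + 1):
                b = offered_erlangs * b / (c + offered_erlangs * b)
            return b

        # Hypothetical: 12 predicted calls/hour with a 5-minute mean holding time.
        arrival_rate, holding_h = 12.0, 5.0 / 60.0
        a = arrival_rate * holding_h              # offered traffic = 1 Erlang
        for vacant in (1, 2, 4):
            print(f"{vacant} vacant channels: P(block) = {erlang_b(vacant, a):.3f}")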

  3. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  4. Impossible Extinction

    Science.gov (United States)

    Cockell, Charles S.

    2003-03-01

    Every 225 million years the Earth, and all the life on it, completes one revolution around the Milky Way Galaxy. During this remarkable journey, life is influenced by calamitous changes. Comets and asteroids strike the surface of the Earth, stars explode, enormous volcanoes erupt, and, more recently, humans litter the planet with waste. Many animals and plants become extinct during the voyage, but humble microbes, simple creatures made of a single cell, survive this journey. This book takes a tour of the microbial world, from the coldest and deepest places on Earth to the hottest and highest, and witnesses some of the most catastrophic events that life can face. Impossible Extinction tells this remarkable story to the general reader by explaining how microbes have survived on Earth for over three billion years. Charles Cockell received his doctorate from the University of Oxford, and is currently a microbiologist with the Search for Extraterrestrial Intelligence Institute (SETI), based at the British Antarctic Survey in Cambridge, UK. His research focusses on astrobiology, life in the extremes and the human exploration of Mars. Cockell has been on expeditions to the Arctic, Antarctic, Mongolia, and in 1993 he piloted a modified insect-collecting ultra-light aircraft over the Indonesian rainforests. He is Chair of the Twenty-one Eleven Foundation for Exploration, a charity that supports expeditions that forge links between space exploration and environmentalism.

  5. Estimation of probability for the presence of claw and digital skin diseases by combining cow- and herd-level information using a Bayesian network

    DEFF Research Database (Denmark)

    Ettema, Jehan Frans; Østergaard, Søren; Kristensen, Anders Ringgaard

    2009-01-01

    , the data has been used to estimate the random effect of herd on disease prevalence and to find conditional probabilities of cows being lame, given the presence of the three diseases. By considering the 50 herds representative for the Danish population, the estimates for risk factors, conditional...

  6. Methods for estimating annual exceedance-probability streamflows for streams in Kansas based on data through water year 2015

    Science.gov (United States)

    Painter, Colin C.; Heimann, David C.; Lanning-Rush, Jennifer L.

    2017-08-14

    A study was done by the U.S. Geological Survey in cooperation with the Kansas Department of Transportation and the Federal Emergency Management Agency to develop regression models to estimate peak streamflows of annual exceedance probabilities of 50, 20, 10, 4, 2, 1, 0.5, and 0.2 percent at ungaged locations in Kansas. Peak streamflow frequency statistics from selected streamgages were related to contributing drainage area and average precipitation using generalized least-squares regression analysis. The peak streamflow statistics were derived from 151 streamgages with at least 25 years of streamflow data through 2015. The developed equations can be used to predict peak streamflow magnitude and frequency within two hydrologic regions that were defined based on the effects of irrigation. The equations developed in this report are applicable to streams in Kansas that are not substantially affected by regulation, surface-water diversions, or urbanization. The equations are intended for use for streams with contributing drainage areas ranging from 0.17 to 14,901 square miles in the nonirrigation-effects region and from 1.02 to 3,555 square miles in the irrigation-affected region, corresponding to the range of drainage areas of the streamgages used in the development of the regional equations.

  7. Extinction rates in North American freshwater fishes, 1900-2010

    Science.gov (United States)

    Burkhead, Noel M.

    2012-01-01

    Widespread evidence shows that the modern rates of extinction in many plants and animals exceed background rates in the fossil record. In the present article, I investigate this issue with regard to North American freshwater fishes. From 1898 to 2006, 57 taxa became extinct, and three distinct populations were extirpated from the continent. Since 1989, the numbers of extinct North American fishes have increased by 25%. From the end of the nineteenth century to the present, modern extinctions varied by decade but significantly increased after 1950 (post-1950s mean = 7.5 extinct taxa per decade). In the twentieth century, freshwater fishes had the highest extinction rate worldwide among vertebrates. The modern extinction rate for North American freshwater fishes is conservatively estimated to be 877 times greater than the background extinction rate for freshwater fishes (one extinction every 3 million years). Reasonable estimates project that future increases in extinctions will range from 53 to 86 species by 2050.

  8. Multi-color light curves of type Ia supernovae on the color-magnitude diagram: A novel step toward more precise distance and extinction estimates

    International Nuclear Information System (INIS)

    Wang, Lifan; Goldhaber, Gerson; Aldering, Greg; Perlmutter, Saul

    2003-01-01

    We show empirically that fits to the color-magnitude relation of Type Ia supernovae after optical maximum can provide accurate relative extragalactic distances. We report the discovery of an empirical color relation for Type Ia light curves: during much of the first month past maximum, the magnitudes of Type Ia supernovae defined at a given value of color index have a very small magnitude dispersion; moreover, during this period the relation between B magnitude and B-V color (or B-R or B-I color) is strikingly linear, to the accuracy of existing well-measured data. These linear relations can provide robust distance estimates, in particular, by using the magnitudes when the supernova reaches a given color. After correction for light curve stretch factor or decline rate, the dispersion of the magnitudes taken at the intercept of the linear color-magnitude relation is found to be around 0.08 mag for the sub-sample of supernovae with (Bmax - Vmax) ≤ 0.05 mag, and around 0.11 mag for the sub-sample with (Bmax - Vmax) ≤ 0.2 mag. This small dispersion is consistent with being mostly due to observational errors. The method presented here and the conventional light curve fitting methods can be combined to further improve statistical dispersions of distance estimates. It can be combined with the magnitude at maximum to deduce dust extinction. The slopes of the color-magnitude relation may also be used to identify intrinsically different SN Ia systems. The method provides a tool that is fundamental to using SN Ia to estimate cosmological parameters such as the Hubble constant and the mass and dark energy content of the universe

  9. Estimation of flashover voltage probability of overhead line insulators under industrial pollution, based on maximum likelihood method

    International Nuclear Information System (INIS)

    Arab, M.N.; Ayaz, M.

    2004-01-01

    The performance of transmission line insulators is greatly affected by dust, fumes from industrial areas, and saline deposits near the coast. Such pollutants, in the presence of moisture, form a coating on the surface of the insulator, which in turn allows the passage of leakage current. This leakage builds up to a point where flashover develops. The flashover is often followed by permanent failure of insulation, resulting in prolonged outages. With the increase in system voltage owing to the greater demand for electrical energy over the past few decades, the importance of flashover due to pollution has received special attention. The objective of the present work was to study the performance of overhead line insulators in the presence of contaminants such as induced salts. A detailed review of the literature and the mechanisms of insulator flashover due to pollution are presented. Experimental investigations on the behavior of overhead line insulators under industrial salt contamination were carried out. A special fog chamber was designed in which the contamination testing of insulators was carried out. Flashover behavior under various degrees of contamination of insulators with the most common industrial fume components, such as nitrate and sulphate compounds, was studied. A statistical method is developed by substituting the normal distribution parameters, estimated by maximum likelihood, into the probability distribution function. The method gives high accuracy in the estimation of the 50% flashover voltage, which is then used to evaluate the critical flashover index at various contamination levels. The critical flashover index is a valuable parameter in insulation design for numerous applications. (author)

  10. Biological correlates of extinction risk in bats.

    Science.gov (United States)

    Jones, Kate E; Purvis, Andy; Gittleman, John L

    2003-04-01

    We investigated patterns and processes of extinction and threat in bats using a multivariate phylogenetic comparative approach. Of nearly 1,000 species worldwide, 239 are considered threatened by the International Union for Conservation of Nature and Natural Resources (IUCN) and 12 are extinct. Small geographic ranges and low wing aspect ratios are independently found to predict extinction risk in bats, which explains 48% of the total variance in IUCN assessments of threat. The pattern and correlates of extinction risk in the two bat suborders are significantly different. A higher proportion (4%) of megachiropteran species have gone extinct in the last 500 years than microchiropteran bats (0.3%), and a higher proportion is currently at risk of extinction (Megachiroptera: 34%; Microchiroptera: 22%). While correlates of microchiropteran extinction risk are the same as in the order as a whole, megachiropteran extinction is correlated more with reproductive rate and less with wing morphology. Bat extinction risk is not randomly distributed phylogenetically: closely related species have more similar levels of threat than would be expected if extinction risk were random. Given the unbalanced nature of the evolutionary diversification of bats, it is probable that the amount of phylogenetic diversity lost if currently threatened taxa disappear may be greater than in other clades with numerically more threatened species.

  11. NEW EXTINCTION AND MASS ESTIMATES FROM OPTICAL PHOTOMETRY OF THE VERY LOW MASS BROWN DWARF COMPANION CT CHAMAELEONTIS B WITH THE MAGELLAN AO SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Ya-Lin; Close, Laird M.; Males, Jared R.; Morzinski, Katie M.; Follette, Katherine B.; Bailey, Vanessa; Rodigas, Timothy J.; Hinz, Philip [Steward Observatory, University of Arizona, Tucson, AZ 85721 (United States); Barman, Travis S. [Lunar and Planetary Laboratory, University of Arizona, Tucson, AZ 85721 (United States); Puglisi, Alfio; Xompero, Marco; Briguglio, Runa, E-mail: yalinwu@email.arizona.edu [INAF-Osservatorio Astrofisico di Arcetri, Largo E. Fermi 5, I-50125 Firenze (Italy)

    2015-03-01

    We used the Magellan adaptive optics system and its VisAO CCD camera to image the young low mass brown dwarf companion CT Chamaeleontis B for the first time at visible wavelengths. We detect it at r', i', z', and Y_S. With our new photometry and T_eff ~ 2500 K derived from the shape of its K-band spectrum, we find that CT Cha B has A_V = 3.4 ± 1.1 mag, and a mass of 14-24 M_J according to the DUSTY evolutionary tracks and its 1-5 Myr age. The overluminosity of our r' detection indicates that the companion has significant Hα emission and a mass accretion rate of ~6 × 10^-10 M_⊙ yr^-1, similar to some substellar companions. Proper motion analysis shows that another point source within 2'' of CT Cha A is not physical. This paper demonstrates how visible wavelength adaptive optics photometry (r', i', z', Y_S) allows for a better estimate of the extinction, luminosity, and mass accretion rate of young substellar companions.

  12. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni; Nobile, Fabio; Tempone, Raul

    2015-01-01

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability

  13. A Method to Estimate the Probability That Any Individual Lightning Stroke Contacted the Surface Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.

    2010-01-01

    A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than relying on the error ellipses centered on the stroke. The process takes the bivariate Gaussian probability density provided by the lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
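
    A minimal numerical version of that integral: sample the stroke-location error ellipse as a bivariate Gaussian and count the fraction of samples falling within the radius of the facility. All coordinates and covariance values below are illustrative.

        # P(stroke within radius R of a facility), integrating the bivariate Gaussian
        # location-error density by Monte Carlo. Parameter values are illustrative.
        import numpy as np

        rng = np.random.default_rng(11)

        def prob_within_radius(stroke_xy, cov, facility_xy, radius, n=1_000_000):
            samples = rng.multivariate_normal(stroke_xy, cov, n)
            d = np.hypot(*(samples - facility_xy).T)      # distance of each sample to facility
            return (d <= radius).mean()

        stroke = np.array([0.0, 0.0])                     # most likely stroke location (km)
        cov = np.array([[0.25, 0.10], [0.10, 0.49]])      # error-ellipse covariance (km^2)
        facility = np.array([0.8, 0.3])                   # facility location (km)
        print(f"P(stroke within 1 km) ~ {prob_within_radius(stroke, cov, facility, 1.0):.3f}")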

  14. Estimation of failure probability of the end induced current depending on uncertain parameters of a transmission line

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper addresses the risk analysis of an EMC failure using a statistical approach based on reliability methods. A probability of failure (i.e., the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties in the input parameters influencing extreme levels of interference in the context of transmission lines. Results are compared to Monte Carlo simulation (MCS). (authors)

  15. Probability estimation of potential harm to human health and life caused by a hypothetical nuclear accident at the nuclear power plant

    International Nuclear Information System (INIS)

    Soloviov, Vladyslav; Pysmenniy, Yevgen

    2015-01-01

    This paper describes some general methodological aspects of the assessment of the damage to human life and health caused by a hypothetical nuclear accident at a nuclear power plant (NPP). Probability estimates of death (due to cancer and non-cancer effects of radiation injury), disability, and incapacity of individuals were made by taking into account the regulations of Ukraine. According to the assessment, the probability of death due to cancer and non-cancer effects of radiation damage for individuals who received a radiation dose of 1 Sv is equal to 0.09. The probabilities of disability of group 1, 2 or 3, regardless of the radiation dose, are 0.009, 0.0054 and 0.027, respectively. The probability of temporary disability of an individual who received a dose equal to 33 mSv (the level of potential exposure in a hypothetical nuclear accident at the NPP) is equal to 0.16. This probability estimation of potential harm to human health and life caused by a hypothetical nuclear accident can be used for NPPs in different countries, applying the regulatory requirements of those countries, and also to estimate the amount of insurance payments due to nuclear damage in the event of a nuclear accident at an NPP or other nuclear industry enterprise. (author)

  16. Seasonal atmospheric extinction

    International Nuclear Information System (INIS)

    Mikhail, J.S.

    1979-01-01

    Mean monochromatic extinction coefficients at various wavelengths at the Kottamia Observatory site have shown the existence of a seasonal variation of atmospheric extinction. The wavelength dependence of the aerosol extinction component in winter represents exceedingly good conditions. Spring gives the highest extinction due to aerosol. (orig.)

  17. Thermal Transgressions and Phanerozoic Extinctions

    Science.gov (United States)

    Worsley, T. R.; Kidder, D. L.

    2007-12-01

    A number of significant Phanerozoic extinctions are associated with marine transgressions that were probably driven by rapid ocean warming. The conditions associated with what we call thermal transgressions are extremely stressful to life on Earth. The Earth system setting associated with end-Permian extinction exemplifies an end-member case of our model. The conditions favoring extreme warmth and sea-level increases driven by thermal expansion are also conducive to changes in ocean circulation that foster widespread anoxia and sulfidic subsurface ocean waters. Equable climates are characterized by reduced wind shear and weak surface ocean circulation. Late Permian and Early Triassic thermohaline circulation differs considerably from today's world, with minimal polar sinking and intensified mid-latitude sinking that delivers sulfate from shallow evaporative areas to deeper water where it is reduced to sulfide. Reduced nutrient input to oceans from land at many of the extinction intervals results from diminished silicate weathering and weakened delivery of iron via eolian dust. The falloff in iron-bearing dust leads to minimal nitrate production, weakening food webs and rendering faunas and floras more susceptible to extinction when stressed. Factors such as heat, anoxia, ocean acidification, hypercapnia, and hydrogen sulfide poisoning would significantly affect these biotas. Intervals of tectonic quiescence set up preconditions favoring extinctions. Reductions in chemical silicate weathering lead to carbon dioxide buildup, oxygen drawdown, nutrient depletion, wind and ocean current abatement, long-term global warming, and ocean acidification. The effects of extinction triggers such as large igneous provinces, bolide impacts, and episodes of sudden methane release are more potent against the backdrop of our proposed preconditions. Extinctions that have characteristics we call for in the thermal transgressions include the Early Cambrian Sinsk event, as well as

  18. Performance Analysis of Secrecy Outage Probability for AF-Based Partial Relay Selection with Outdated Channel Estimates

    Directory of Open Access Journals (Sweden)

    Kyu-Sung Hwang

    2017-01-01

    Full Text Available We study the secrecy outage probability of the amplify-and-forward (AF) relaying protocol, which consists of one source, one destination, multiple relays, and multiple eavesdroppers. In this system, the aim is to transmit confidential messages from a source to a destination via a selected relay in the presence of eavesdroppers. Moreover, a partial relay selection scheme is utilized, in which relay selection is based on outdated channel state information, where only neighboring channel information (source-relays) is available, and passive eavesdroppers are considered, where the transmitter does not have any knowledge of the eavesdroppers' channels. Specifically, we offer the exact secrecy outage probability of the proposed system in single-integral form, as well as providing the asymptotic secrecy outage probability in closed form. Numerical examples are given to verify the provided analytical results for different system conditions.

  19. Estimation of PHI (γ,n) average probability for complex nuclei in the quasi-deuteron region

    International Nuclear Information System (INIS)

    Ferreira, M.C. da S.

    1977-01-01

    The average probabilities of (γ,n) reactions were determined for the complex nuclei ₆C¹², ₉F¹⁹, ₂₅Mn⁵⁵, ₇₉Au¹⁹⁷ and ₉₂U²³⁸ in the energy range from the end of the giant resonance to the photomesonic threshold (the quasi-deuteron region), using cross sections per equivalent quantum for 300 MeV bremsstrahlung photons produced in the Frascati and Orsay accelerators. The probabilities were also calculated using the nuclear transparencies for protons and neutrons resulting from quasi-deuteron disintegration. The transparency formulas were derived from the optical model. (M.C.K.) [pt

  20. An approach for estimating the breach probabilities of moraine-dammed lakes in the Chinese Himalayas using remote-sensing data

    Directory of Open Access Journals (Sweden)

    X. Wang

    2012-10-01

    Full Text Available To make first-order estimates of the probability of moraine-dammed lake outburst floods (MDLOFs) and prioritize the breach probabilities posed by potentially dangerous moraine-dammed lakes (PDMDLs) in the Chinese Himalayas, an objective approach is presented. We first select five indicators to identify PDMDLs according to four predesigned criteria. The climatic background is regarded as the climatic precondition of moraine-dam failure, and under different climatic preconditions we distinguish the trigger mechanisms of MDLOFs and subdivide them into 17 possible breach modes, each mode having three or four components; we combine the precondition, modes and components to construct a decision-making tree of moraine-dam failure. Conversion guidelines were established to quantify the probabilities of the components of a breach mode, employing the historic-performance method combined with expert knowledge and experience. The Chinese Himalayas were chosen as a study area because there have been frequent MDLOFs in recent decades. The results show that the breach probabilities (P) of 142 PDMDLs range from 0.037 to 0.345, and they can be further categorized as 43 lakes with very high breach probabilities (P ≥ 0.24), 47 lakes with high breach probabilities (0.18 ≤ P < 0.24), 24 lakes with mid-level breach probabilities (0.12 ≤ P < 0.18), 24 lakes with low breach probabilities (0.06 ≤ P < 0.12), and four lakes with very low breach probabilities (P < 0.06).

  1. First human-caused extinction of a cetacean species?

    Science.gov (United States)

    Turvey, Samuel T; Pitman, Robert L; Taylor, Barbara L; Barlow, Jay; Akamatsu, Tomonari; Barrett, Leigh A; Zhao, Xiujiang; Reeves, Randall R; Stewart, Brent S; Wang, Kexiong; Wei, Zhuo; Zhang, Xianfeng; Pusser, L T; Richlen, Michael; Brandon, John R; Wang, Ding

    2007-10-22

    The Yangtze River dolphin or baiji (Lipotes vexillifer), an obligate freshwater odontocete known only from the middle-lower Yangtze River system and neighbouring Qiantang River in eastern China, has long been recognized as one of the world's rarest and most threatened mammal species. The status of the baiji has not been investigated since the late 1990s, when the surviving population was estimated to be as low as 13 individuals. An intensive six-week multi-vessel visual and acoustic survey carried out in November-December 2006, covering the entire historical range of the baiji in the main Yangtze channel, failed to find any evidence that the species survives. We are forced to conclude that the baiji is now likely to be extinct, probably due to unsustainable by-catch in local fisheries. This represents the first global extinction of a large vertebrate for over 50 years, only the fourth disappearance of an entire mammal family since AD 1500, and the first cetacean species to be driven to extinction by human activity. Immediate and extreme measures may be necessary to prevent the extinction of other endangered cetaceans, including the sympatric Yangtze finless porpoise (Neophocaena phocaenoides asiaeorientalis).

  2. Estimation of Partial Safety Factors and Target Failure Probability Based on Cost Optimization of Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Kim, Seung-Woo; Suh, Kyung-Duck; Burcharth, Hans F.

    2010-01-01

    Breakwaters are designed by considering cost optimization because human risk is seldom considered. Most breakwaters, however, were constructed without considering cost optimization. In this study, the optimum return period, target failure probability and the partial safety factors...

  3. Estimates of mean consequences and confidence bounds on the mean associated with low-probability seismic events in total system performance assessments

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James

    2007-01-01

    An approach is described to estimate mean consequences and confidence bounds on the mean of seismic events with low probability of breaching components of the engineered barrier system. The approach is aimed at complementing total system performance assessment models used to understand consequences of scenarios leading to radionuclide releases in geologic nuclear waste repository systems. The objective is to develop an efficient approach to estimate mean consequences associated with seismic events of low probability, employing data from a performance assessment model with a modest number of Monte Carlo realizations. The derived equations and formulas were tested with results from a specific performance assessment model. The derived equations appear to be one method to estimate mean consequences without having to use a large number of realizations. (authors)

  4. Current recommendations on the estimation of transition probabilities in Markov cohort models for use in health care decision-making: a targeted literature review

    Directory of Open Access Journals (Sweden)

    Olariu E

    2017-09-01

    Full Text Available Elena Olariu,1 Kevin K Cadwell,1 Elizabeth Hancock,1 David Trueman,1 Helene Chevrou-Severac2 1PHMR Ltd, London, UK; 2Takeda Pharmaceuticals International AG, Zurich, Switzerland Objective: Although Markov cohort models represent one of the most common forms of decision-analytic models used in health care decision-making, correct implementation of such models requires reliable estimation of transition probabilities. This study sought to identify consensus statements or guidelines that detail how such transition probability matrices should be estimated. Methods: A literature review was performed to identify relevant publications in the following databases: Medline, Embase, the Cochrane Library, and PubMed. Electronic searches were supplemented by manual searches of health technology assessment (HTA) websites in Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and the UK. One reviewer assessed studies for eligibility. Results: Of the 1,931 citations identified in the electronic searches, no studies met the inclusion criteria for full-text review, and no guidelines on transition probabilities in Markov models were identified. Manual searching of the websites of HTA agencies identified ten guidelines on economic evaluations (Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and the UK). All identified guidelines provided general guidance on how to develop economic models, but none provided guidance on the calculation of transition probabilities. One relevant publication was identified following review of the reference lists of HTA agency guidelines: the International Society for Pharmacoeconomics and Outcomes Research task force guidance. This provided limited guidance on the use of rates and probabilities. Conclusions: There is limited formal guidance available on the estimation of transition probabilities for use in decision-analytic models. Given the increasing importance of cost
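
    Much of the guidance that does exist concerns the distinction between rates and probabilities. A minimal sketch of the standard constant-hazard conversion used when rescaling Markov cycle lengths, with illustrative numbers:

```python
import numpy as np

def prob_to_rate(p, t):
    """Probability over period t -> constant instantaneous rate."""
    return -np.log(1.0 - p) / t

def rate_to_prob(r, t):
    """Constant instantaneous rate -> probability over period t."""
    return 1.0 - np.exp(-r * t)

# Rescale an annual probability to a monthly Markov cycle:
# dividing p_year by 12 is wrong; convert through the rate instead.
p_year = 0.30
p_month = rate_to_prob(prob_to_rate(p_year, 1.0), 1.0 / 12.0)
print(round(p_month, 5))  # 0.02929; and 1 - (1 - p_month)**12 recovers 0.30
```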

  5. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
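
    A rough sketch of the prediction step, assuming the lifelines package: its elastic-net-penalized Cox fitter with l1_ratio=1.0 serves as a plain-lasso stand-in for the adaptive lasso described above, on a toy data frame whose column names, penalty value, and timepoint are invented:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame(rng.standard_normal((n, 5)), columns=[f"g{i}" for i in range(5)])
df["treat"] = rng.integers(0, 2, n)
df["g0_x_treat"] = df["g0"] * df["treat"]   # biomarker-by-treatment interaction
df["time"] = rng.exponential(20.0, n)       # toy follow-up times
df["event"] = rng.integers(0, 2, n)         # toy event indicator

# l1_ratio=1.0 gives a lasso-type penalty (plain lasso, not the adaptive variant)
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
cph.fit(df, duration_col="time", event_col="event")

# individual survival probabilities at a chosen timepoint, e.g. t = 24
print(cph.predict_survival_function(df.iloc[:5], times=[24.0]))
```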

  6. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    OpenAIRE

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective: To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods: We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European...

  7. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki; Park, Kihong; Alouini, Mohamed-Slim

    2017-01-01

    When assessing the performance of free space optical (FSO) communication systems, the outage probability encountered is generally very small, and the use of naive Monte Carlo simulations thereby becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach based on the exponential twisting technique, which offers fast and accurate results. We consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and efficiency of our approach compared to naive Monte Carlo.
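
    As an illustration of exponential twisting (not the authors' Beckmann-model implementation), the sketch below estimates a small tail probability for a sum of exponentials; the tilted sampling density is again exponential, and each sample is reweighted by the likelihood ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
n, gamma, N = 10, 25.0, 100_000   # estimate p = P(S_n >= gamma), S_n = sum of n Exp(1)

theta = 1 - n / gamma             # tilt chosen so that E_theta[S_n] = gamma
X = rng.exponential(1 / (1 - theta), size=(N, n))  # tilted density is Exp(1 - theta)
S = X.sum(axis=1)

w = np.exp(-theta * S) / (1 - theta) ** n          # likelihood ratio f / f_theta
p_hat = np.mean(w * (S >= gamma))
print(p_hat)  # ~2e-4 here; naive Monte Carlo would waste most of its samples
```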

  9. Superstar Extinction

    OpenAIRE

    Pierre Azoulay; Joshua S. Graff Zivin; Jialan Wang

    2008-01-01

    We estimate the magnitude of spillovers generated by 112 academic "superstars" who died prematurely and unexpectedly, thus providing an exogenous source of variation in the structure of their collaborators' coauthorship networks. Following the death of a superstar, we find that collaborators experience, on average, a lasting 5 to 8% decline in their quality-adjusted publication rates. By exploring interactions of the treatment effect with a variety of star, coauthor and star/coauthor dyad c...

  10. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys.

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-09-01

    To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  12. The accuracy of clinical and biochemical estimates in defining the pre-test probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Garvie, N.W.; Salehzahi, F.; Kuitert, L.

    2002-01-01

    Full text: The PIOPED survey confirmed the significance of the high probability ventilation/perfusion scan (HP V/Q scan) in establishing the diagnosis of pulmonary embolism (PE). In an interesting sentence, however, the authors indicated that 'the clinicians' assessment of the likelihood of PE (prior probability)' can substantially increase the predictive value of the investigation. The criteria used for this assessment were not published, and this statement conflicts with the belief that the clinical diagnosis of pulmonary embolism is unreliable. A medical history was obtained from 668 patients undergoing V/Q lung scans for suspected PE, and certain clinical features linked to PE were, when present, documented. These included pleuritic chest pain, haemoptysis, dyspnoea, clinical evidence of DVT, recent surgery and a right ventricular strain pattern on ECG. D-dimer levels and initial arterial oxygen saturation (PaO2) levels were also obtained. The prevalence of these clinical and biochemical criteria was then compared between HP (61) and normal (171) scans, after exclusion of all equivocal or intermediate scan outcomes (436), where lung scintigraphy was unable to provide a definite diagnosis. A true positive result was scored for each clinical or biochemical criterion when linked with a high probability scan and, conversely, a false positive score when the scan outcome was normal. In this fashion, the positive predictive value (PPV) and, when appropriate, the negative predictive value (NPV) were obtained for each risk factor. In the context of PE, DVT and post-operative status proved the more reliable predictors of a high probability outcome. Where both features were present, the PPV rose to 0.57. A normal D-dimer level was a better excluder of PE than a normal oxygen saturation level (NPV 0.78 vs 0.44). Conversely, a raised D-dimer, or reduced oxygen saturation, were both of little value in
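
    The scoring described above reduces to ordinary 2x2 predictive values; a minimal helper, with invented counts for illustration (only the 0.57 PPV in the abstract is a reported figure):

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive value from a 2x2 table."""
    return tp / (tp + fp), tn / (tn + fn)

# hypothetical example: a risk factor present in 20 of 61 high-probability
# scans and in 15 of 171 normal scans (counts invented for illustration)
ppv, npv = predictive_values(tp=20, fp=15, tn=171 - 15, fn=61 - 20)
print(f"PPV={ppv:.2f}, NPV={npv:.2f}")
```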

  13. FuzzyStatProb: An R Package for the Estimation of Fuzzy Stationary Probabilities from a Sequence of Observations of an Unknown Markov Chain

    Directory of Open Access Journals (Sweden)

    Pablo J. Villacorta

    2016-07-01

    Full Text Available Markov chains are well-established probabilistic models of a wide variety of real systems that evolve along time. Countless examples of applications of Markov chains that successfully capture the probabilistic nature of real problems include areas as diverse as biology, medicine, social science, and engineering. One interesting feature which characterizes certain kinds of Markov chains is their stationary distribution, which stands for the global fraction of time the system spends in each state. The computation of the stationary distribution requires precise knowledge of the transition probabilities. When the only information available is a sequence of observations drawn from the system, such probabilities have to be estimated. Here we review an existing method to estimate fuzzy transition probabilities from observations and, with them, obtain the fuzzy stationary distribution of the resulting fuzzy Markov chain. The method also works when the user directly provides fuzzy transition probabilities. We provide an implementation in the R environment that is the first available to the community and serves as a proof of concept. We demonstrate the usefulness of our proposal with computational experiments on a toy problem, namely a time-homogeneous Markov chain that guides the randomized movement of an autonomous robot that patrols a small area.
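
    For the crisp (non-fuzzy) core of that pipeline, a sketch in Python: estimate the transition matrix from an observed state sequence by maximum likelihood, then take the stationary distribution as the left eigenvector for eigenvalue 1 (the package's fuzzy-number machinery is not reproduced here):

```python
import numpy as np

def stationary_from_sequence(seq, n_states):
    # maximum-likelihood transition matrix from observed transitions
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1.0
    P = counts / counts.sum(axis=1, keepdims=True)
    # stationary distribution: left eigenvector of P for eigenvalue 1
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return pi / pi.sum()

rng = np.random.default_rng(0)
true_P = np.array([[0.9, 0.1], [0.4, 0.6]])
seq = [0]
for _ in range(5000):
    seq.append(rng.choice(2, p=true_P[seq[-1]]))
print(stationary_from_sequence(seq, 2))  # close to the exact [0.8, 0.2]
```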

  14. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    Science.gov (United States)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network, as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  15. Accelerated modern human-induced species losses: Entering the sixth mass extinction

    OpenAIRE

    Ceballos, Gerardo; Ehrlich, Paul R.; Barnosky, Anthony D.; García, Andrés; Pringle, Robert M.; Palmer, Todd M.

    2015-01-01

    The oft-repeated claim that Earth's biota is entering a sixth "mass extinction" depends on clearly demonstrating that current extinction rates are far above the "background" rates prevailing between the five previous mass extinctions. Earlier estimates of extinction rates have been criticized for using assumptions that might overestimate the severity of the extinction crisis. We assess, using extremely conservative assumptions, whether human activities are causing a mass extinction. First, we...

  16. Using Multiple and Logistic Regression to Estimate the Median Will-Cost and Probability of Cost and Schedule Overrun for Program Managers

    Science.gov (United States)

    2017-03-23

    Logistic Regression to Estimate the Median Will-Cost and Probability of Cost and Schedule Overrun for Program Managers. Ryan C. Trudelle, B.S. ... not the other. We are able to give logistic regression models to program managers that identify several program characteristics for either ... considered acceptable. We recommend the use of our logistic models as a tool to manage a portfolio of programs in order to gain potential elusive

  17. Modeling galactic extinction

    OpenAIRE

    Cecchi-Pestellini, C.; Mulas, G.; Casu, S.; Iatì, M. A.; Saija, R.; Cacciola, A.; Borghese, F.; Denti, P.

    2011-01-01

    We present a model for interstellar extinction in which we assume a bimodal distribution of extinction carriers: a dispersion of core-mantle grains, supplemented by a collection of PAHs in free molecular form. We use state-of-the-art methods to calculate the extinction due to macroscopic dust particles, and the absorption cross-sections of PAHs in four different charge states. While successful for most of the observed Galactic extinction curves, in a few cases the model cannot provide reliab...

  18. An allometric approach to quantify the extinction vulnerability of birds and mammals.

    Science.gov (United States)

    Hilbers, J P; Schipper, A M; Hendriks, A J; Verones, F; Pereira, H M; Huijbregts, M A J

    2016-03-01

    Methods to quantify the vulnerability of species to extinction are typically limited by the availability of species-specific input data pertaining to life-history characteristics and population dynamics. This lack of data hampers global biodiversity assessments and conservation planning. Here, we developed a new framework that systematically quantifies extinction risk based on allometric relationships between various wildlife demographic parameters and body size. These allometric relationships have a solid theoretical and ecological foundation. Extinction risk indicators included are (1) the probability of extinction, (2) the mean time to extinction, and (3) the critical patch size. We applied our framework to assess the global extinction vulnerability of terrestrial carnivorous and non-carnivorous birds and mammals. Irrespective of the indicator used, large-bodied species were found to be more vulnerable to extinction than their smaller counterparts. The patterns with body size were confirmed for all species groups by a comparison with IUCN data on the proportion of extant threatened species: the models correctly predicted a multimodal distribution with body size for carnivorous birds and a monotonic distribution for mammals and non-carnivorous birds. Carnivorous mammals were found to have higher extinction risks than non-carnivores, while birds were more prone to extinction than mammals. These results are explained by the allometric relationships, predicting the vulnerable species groups to have lower intrinsic population growth rates, smaller population sizes, lower carrying capacities, or larger dispersal distances, which, in turn, increase the importance of losses due to environmental stochastic effects and dispersal activities. Our study is the first to integrate population viability analysis and allometry into a novel, process-based framework that is able to quantify extinction risk of a large number of species without requiring data-intensive, species

  19. Modern examples of extinctions

    DEFF Research Database (Denmark)

    Lövei, Gabor L

    2013-01-01

    No species lives forever, and extinction is the ultimate fate of all living species. The fossil record indicates that a recent extinction wave affecting terrestrial vertebrates coincided with the arrival of modern humans in areas formerly uninhabited by them. These modern instances of extinction

  20. Mass Extinctions and Supernova Explosions

    Science.gov (United States)

    Korschinek, Gunther

    A nearby supernova (SN) explosion could have negatively influenced life on Earth, and may even have been responsible for mass extinctions. A mass extinction is a significant die-off of numerous species on Earth, as recorded in the paleontologic, paleoclimatic, and geological record of our planet. Depending on the distance between the Sun and the SN, different types of threats have to be considered, such as ozone depletion, which causes increased exposure to the Sun's ultraviolet radiation, or direct exposure to lethal X-rays. Another indirect effect is cloud formation induced by cosmic rays in the atmosphere, which results in a drop in the Earth's temperature, causing major glaciations of the Earth. The discovery of highly intensive gamma-ray bursts (GRBs), which could be connected to SNe, initiated further discussions on possible life-threatening events in the Earth's history. The probability that GRBs hit the Earth is very low. Nevertheless, a past interaction of Earth with GRBs and/or SNe cannot be excluded and might even have been responsible for past extinction events.

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Probabilistic measures of persistence and extinction in measles (meta)populations.

    Science.gov (United States)

    Gunning, Christian E; Wearing, Helen J

    2013-08-01

    Persistence and extinction are fundamental processes in ecological systems that are difficult to accurately measure due to stochasticity and incomplete observation. Moreover, these processes operate on multiple scales, from individual populations to metapopulations. Here, we examine an extensive new data set of measles case reports and associated demographics in pre-vaccine era US cities, alongside a classic England & Wales data set. We first infer the per-population quasi-continuous distribution of log incidence. We then use stochastic, spatially implicit metapopulation models to explore the frequency of rescue events and apparent extinctions. We show that, unlike critical community size, the inferred distributions account for observational processes, allowing direct comparisons between metapopulations. The inferred distributions scale with population size. We use these scalings to estimate extinction boundary probabilities. We compare these predictions with measurements in individual populations and random aggregates of populations, highlighting the importance of medium-sized populations in metapopulation persistence. © 2013 John Wiley & Sons Ltd/CNRS.

  3. Estimated probabilities, volumes, and inundation area depths of potential postwildfire debris flows from Carbonate, Slate, Raspberry, and Milton Creeks, near Marble, Gunnison County, Colorado

    Science.gov (United States)

    Stevens, Michael R.; Flynn, Jennifer L.; Stephens, Verlin C.; Verdin, Kristine L.

    2011-01-01

    During 2009, the U.S. Geological Survey, in cooperation with Gunnison County, initiated a study to estimate the potential for postwildfire debris flows to occur in the drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble, Colorado. Currently (2010), these drainage basins are unburned but could be burned by a future wildfire. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of postwildfire debris-flow occurrence and debris-flow volumes for drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble. Data for the postwildfire debris-flow models included drainage basin area; area burned and burn severity; percentage of burned area; soil properties; rainfall total and intensity for the 5- and 25-year-recurrence, 1-hour-duration rainfall; and topographic and soil property characteristics of the drainage basins occupied by the four creeks. A quasi-two-dimensional floodplain computer model (FLO-2D) was used to estimate the spatial distribution and the maximum instantaneous depth of the postwildfire debris-flow material during debris flow on the existing debris-flow fans that issue from the outlets of the four major drainage basins. The postwildfire debris-flow probabilities at the outlet of each drainage basin range from 1 to 19 percent for the 5-year-recurrence, 1-hour-duration rainfall, and from 3 to 35 percent for the 25-year-recurrence, 1-hour-duration rainfall. The largest probabilities for postwildfire debris flow are estimated for Raspberry Creek (19 and 35 percent), whereas estimated debris-flow probabilities for the three other creeks range from 1 to 6 percent. The estimated postwildfire debris-flow volumes at the outlet of each creek range from 7,500 to 101,000 cubic meters for the 5-year-recurrence, 1-hour-duration rainfall, and from 9,400 to 126,000 cubic meters for

  4. Influence of the level of fit of a density probability function to wind-speed data on the WECS mean power output estimation

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio

    2008-01-01

    Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases
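
    A sketch of the static method under invented assumptions: a Weibull wind-speed model and a generic 800 kW power curve with made-up cut-in, rated, and cut-out speeds:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import weibull_min

# hypothetical turbine: cut-in 3 m/s, rated 800 kW at 13 m/s, cut-out 25 m/s
def power_curve(v):
    ramp = 800.0 * ((v - 3.0) / (13.0 - 3.0)) ** 3
    p = np.where((v >= 3.0) & (v < 13.0), ramp, 0.0)
    return np.where((v >= 13.0) & (v <= 25.0), 800.0, p)

# hypothetical fitted wind regime: Weibull, shape k = 2, scale c = 8 m/s
v = np.linspace(0.0, 30.0, 3001)
pdf = weibull_min.pdf(v, 2.0, scale=8.0)

# static estimate: integral of (power curve x wind-speed density)
mean_power_kw = trapezoid(power_curve(v) * pdf, v)
print(round(mean_power_kw, 1))
```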

  6. Studies on the radioactive contamination due to nuclear detonations III. On the method of estimating the probable time of nuclear detonation from the measurements of gross-activity

    Energy Technology Data Exchange (ETDEWEB)

    Nishiwaki, Yasushi [Nuclear Reactor Laboratory, Tokyo Institute of Technology, Tokyo (Japan); Nuclear Reactor Laboratoroy, Kinki University, Fuse City, Osaka Precture (Japan)

    1961-11-25

    Since it was observed in the spring of 1954 that a considerable amount of fission-product mixture fell with the rain following a large-scale nuclear detonation conducted in the Bikini area in the South Pacific by the United States Atomic Energy Commission, it has become important, especially from the health physics standpoint, to estimate the effective average age of the fission-product mixture after the nuclear detonation. If the energy transferred to the atmospheric air at the time of nuclear detonation is large enough (of the order of megatons at a distance of about 4000 km), the probable time and test site of the nuclear detonation may be estimated with considerable accuracy from the records of the pressure wave caused by the detonation in the microbarographs at different meteorological stations. Even in this case, in order to estimate the possible correlation between the artificial radioactivity observed in the rain and the probable detonation, it is often desirable to estimate the effective age of the fission-product mixture in the rain from the decay measurement of the radioactivity.
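
    Age estimation from gross-activity decay conventionally rests on the Way-Wigner approximation, in which the gross beta activity of a fresh fission-product mixture decays roughly as (t - t0)^-1.2, with t0 the detonation time; a sketch with invented count rates, solving for t0 from two measurements:

```python
import numpy as np
from scipy.optimize import brentq

# hypothetical gross-activity measurements at two known clock times (hours)
A1, t1 = 100.0, 0.0
A2, t2 = 60.0, 12.0

def ratio_mismatch(t0):
    # Way-Wigner law A ~ (t - t0)**-1.2 implies A1/A2 = ((t1-t0)/(t2-t0))**-1.2
    return ((t1 - t0) / (t2 - t0)) ** -1.2 - A1 / A2

t0 = brentq(ratio_mismatch, -1e6, t1 - 1e-6)   # detonation time precedes t1
print(f"estimated age of mixture at first measurement: {t1 - t0:.1f} h")
```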

  8. Estimating inverse probability weights using super learner when weight-model specification is unknown in a marginal structural Cox model context.

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Platt, Robert W

    2017-06-15

    Correct specification of the inverse probability weighting (IPW) model is necessary for consistent inference from a marginal structural Cox model (MSCM). In practical applications, researchers are typically unaware of the true specification of the weight model. Nonetheless, IPWs are commonly estimated using parametric models, such as the main-effects logistic regression model. In practice, assumptions underlying such models may not hold, and data-adaptive statistical learning methods may provide an alternative. Many candidate statistical learning approaches are available in the literature. However, the optimal approach for a given dataset is impossible to predict. Super learner (SL) has been proposed as a tool for selecting an optimal learner from a set of candidates using cross-validation. In this study, we evaluate the usefulness of a SL in estimating IPW in four different MSCM simulation scenarios, in which we varied the specification of the true weight model (linear and/or additive). Our simulations show that, in the presence of weight model misspecification, with a rich and diverse set of candidate algorithms, SL can generally offer a better alternative to the commonly used statistical learning approaches in terms of MSE as well as the coverage probabilities of the estimated effect in an MSCM. The findings from the simulation studies guided the application of the MSCM in a multiple sclerosis cohort from British Columbia, Canada (1995-2008), to estimate the impact of beta-interferon treatment in delaying disability progression. Copyright © 2017 John Wiley & Sons, Ltd.
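
    A rough stand-in for this workflow, assuming scikit-learn: a small cross-validated stacking ensemble plays the role of the super learner for the treatment model, and stabilized inverse probability weights are formed from its predicted probabilities (the candidate library here is far poorer than the rich, diverse set the authors recommend):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.standard_normal((n, 4))
# treatment depends non-linearly on confounders, so a main-effects logit is misspecified
p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] ** 2 - 1.0)))
A = rng.binomial(1, p_true)

sl = StackingClassifier(
    estimators=[("logit", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=200, random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
sl.fit(X, A)
ps = np.clip(sl.predict_proba(X)[:, 1], 0.01, 0.99)  # estimated propensity scores

# stabilized inverse probability of treatment weights
pA = A.mean()
w = np.where(A == 1, pA / ps, (1 - pA) / (1 - ps))
print(w.mean())  # stabilized weights should average close to 1
```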

  9. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored ⁸⁵Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities
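
    A sketch of the likelihood setup for a lognormal with a point mass at zero under left-censoring at a detection limit L: a censored reading is either a true zero or a positive value below L; all parameter values below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def negloglik(params, detects, n_censored, L):
    """Delta-lognormal: P(zero) = delta, positives ~ LogNormal(mu, sigma).
    A censored reading (< L) is either a true zero or a positive below L."""
    delta, mu, sigma = params
    if not (0.0 < delta < 1.0) or sigma <= 0.0:
        return np.inf
    p_below = delta + (1.0 - delta) * norm.cdf((np.log(L) - mu) / sigma)
    z = (np.log(detects) - mu) / sigma
    ll = n_censored * np.log(p_below)
    ll += np.sum(np.log(1.0 - delta) - np.log(detects * sigma) + norm.logpdf(z))
    return -ll

rng = np.random.default_rng(0)
x = np.where(rng.random(500) < 0.2, 0.0, rng.lognormal(1.0, 0.8, 500))  # synthetic data
L = 1.0                                    # hypothetical detection limit
res = minimize(negloglik, x0=[0.3, 0.5, 1.0],
               args=(x[x >= L], int(np.sum(x < L)), L), method="Nelder-Mead")
print(res.x)  # estimates of (delta, mu, sigma)
```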

  10. Survival modeling for the estimation of transition probabilities in model-based economic evaluations in the absence of individual patient data: a tutorial.

    Science.gov (United States)

    Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J

    2014-02-01

    Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model
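
    The final step, deriving cycle transition probabilities from a fitted parametric survival curve, uses p(t) = 1 - S(t + u)/S(t); a sketch with scipy's log-logistic (fisk) distribution and invented shape and scale parameters (not the BOLERO-2 fit):

```python
from scipy.stats import fisk  # scipy's log-logistic distribution

c, scale = 1.5, 10.0   # hypothetical shape/scale, as if fitted to reconstructed IPD
u = 1.0                # model cycle length (e.g., months)

def transition_prob(t):
    # probability of the event during (t, t + u], conditional on being
    # event-free at time t:  p = 1 - S(t + u) / S(t)
    return 1.0 - fisk.sf(t + u, c, scale=scale) / fisk.sf(t, c, scale=scale)

print([round(transition_prob(t), 4) for t in (0.0, 6.0, 12.0)])
```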

  11. Trophic redundancy reduces vulnerability to extinction cascades.

    Science.gov (United States)

    Sanders, Dirk; Thébault, Elisa; Kehoe, Rachel; Frank van Veen, F J

    2018-03-06

    Current species extinction rates are at unprecedentedly high levels. While human activities can be the direct cause of some extinctions, it is becoming increasingly clear that species extinctions themselves can be the cause of further extinctions, since species affect each other through the network of ecological interactions among them. There is concern that the simplification of ecosystems, due to the loss of species and ecological interactions, increases their vulnerability to such secondary extinctions. It is predicted that more complex food webs will be less vulnerable to secondary extinctions due to greater trophic redundancy that can buffer against the effects of species loss. Here, we demonstrate in a field experiment with replicated plant-insect communities, that the probability of secondary extinctions is indeed smaller in food webs that include trophic redundancy. Harvesting one species of parasitoid wasp led to secondary extinctions of other, indirectly linked, species at the same trophic level. This effect was markedly stronger in simple communities than for the same species within a more complex food web. We show that this is due to functional redundancy in the more complex food webs and confirm this mechanism with a food web simulation model by highlighting the importance of the presence and strength of trophic links providing redundancy to those links that were lost. Our results demonstrate that biodiversity loss, leading to a reduction in redundant interactions, can increase the vulnerability of ecosystems to secondary extinctions, which, when they occur, can then lead to further simplification and run-away extinction cascades. Copyright © 2018 the Author(s). Published by PNAS.

  12. Innovative Methods for Estimating Densities and Detection Probabilities of Secretive Reptiles Including Invasive Constrictors and Rare Upland Snakes

    Science.gov (United States)

    2018-01-30

    home range maintenance or attraction to or avoidance of landscape features, including roads (Morales et al. 2004, McClintock et al. 2012). For example ... radiotelemetry and extensive road survey data are used to generate the first density estimates available for the species. The results show that southern ... secretive snakes that combines behavioral observations of snake road-crossing speed, systematic road survey data, and simulations of spatial

  13. Towards valid 'serious non-fatal injury' indicators for international comparisons based on probability of admission estimates

    DEFF Research Database (Denmark)

    Cryer, Colin; Miller, Ted R; Lyons, Ronan A

    2017-01-01

    in regions of Canada, Denmark, Greece, Spain and the USA. International Classification of Diseases (ICD)-9 or ICD-10 4-digit/character injury diagnosis-specific ED attendance and inpatient admission counts were provided, based on a common protocol. Diagnosis-specific and region-specific PrAs with 95% CIs ... diagnoses with high estimated PrAs. These diagnoses can be used as the basis for more valid international comparisons of life-threatening injury, based on hospital discharge data, for countries with well-developed healthcare and data collection systems.

  14. Assessing Extinction Risk: Integrating Genetic Information

    Directory of Open Access Journals (Sweden)

    Jason Dunham

    1999-06-01

    Full Text Available Risks of population extinction have been estimated using a variety of methods incorporating information from different spatial and temporal scales. We briefly consider how several broad classes of extinction risk assessments, including population viability analysis, incidence functions, and ranking methods, integrate information on different temporal and spatial scales. In many circumstances, data from surveys of neutral genetic variability within and among populations can provide information useful for assessing extinction risk. Patterns of genetic variability resulting from past and present ecological and demographic events can indicate risks of extinction that are otherwise difficult to infer from ecological and demographic analyses alone. We provide examples of how patterns of neutral genetic variability, both within and among populations, can be used to corroborate and complement extinction risk assessments.

  15. A Method to Estimate the Probability that Any Individual Cloud-to-Ground Lightning Stroke was Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa; Roeder, WIlliam P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
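
    A Monte Carlo rendering of that integral (simpler than, and not claimed to be, the operational implementation): draw strokes from the error-ellipse bivariate Gaussian and count the fraction falling within radius R of the point of interest; the ellipse, point, and radius are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical error ellipse: semi-major 0.8 km, semi-minor 0.3 km, rotated 30 deg
a, b, theta = 0.8, 0.3, np.deg2rad(30.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
cov = rot @ np.diag([a**2, b**2]) @ rot.T
mean = np.array([0.0, 0.0])                 # most likely stroke location (km)

point, radius = np.array([0.5, 0.4]), 0.6   # point of interest and radius (km)

xy = rng.multivariate_normal(mean, cov, size=1_000_000)
p = np.mean(np.hypot(xy[:, 0] - point[0], xy[:, 1] - point[1]) <= radius)
print(f"P(stroke within {radius} km of the point) ~ {p:.4f}")
```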

  16. Comparative studies of parameters based on the most probable versus an approximate linear extrapolation distance estimates for circular cylindrical absorbing rod

    International Nuclear Information System (INIS)

    Wassef, W.A.

    1982-01-01

    Estimates and techniques that are valid for calculating the linear extrapolation distance for an infinitely long circular cylindrical absorbing region are reviewed. Two estimates in particular are considered: the most probable value, and the value resulting from an approximate technique based on matching the integral transport equation inside the absorber with the diffusion approximation in the surrounding infinite scattering medium. Consequently, the effective diffusion parameters and the blackness of the cylinder are derived and subjected to comparative studies. A computer code is set up to calculate and compare the different parameters, which is useful in reactor analysis and serves to establish estimates that are amenable to direct application in reactor design codes

  17. Preliminary Evaluation of the Effects of Buried Volcanoes on Estimates of Volcano Probability for the Proposed Repository Site at Yucca Mountain, Nevada

    Science.gov (United States)

    Hill, B. E.; La Femina, P. C.; Stamatakos, J.; Connor, C. B.

    2002-12-01

    increases recurrence rates by 3 v/Myr, which essentially doubles most probability estimates. If the ten buried volcanoes formed in a single episode of intense activity at about 4 Ma, then recurrence rates may increase to 17 v/Myr. This recurrence rate increases the point-event probabilities up to a factor of five. Additional analyses are ongoing to evaluate alternative event definitions and construct numerical models of all relevant magnetic anomalies. This abstract is an independent product of the CNWRA and does not necessarily reflect the views or regulatory position of the NRC.

  18. Bimodal extinction without cross-modal extinction.

    OpenAIRE

    Inhoff, A W; Rafal, R D; Posner, M J

    1992-01-01

    Three patients with unilateral neurological injury were clinically examined. All showed consistent unilateral extinction in the tactile and visual modalities on simultaneous intramodal stimulation. There was virtually no evidence for cross-modal extinction, however, whereby contralateral stimulation in one modality would have extinguished perception of ipsilateral stimuli in the other modality. It is concluded that the attentional system controlling the encoding of tactile and visual stimuli ...

  19. Estimating rear-end accident probabilities at signalized intersections: a comparison study of intersections with and without green signal countdown devices.

    Science.gov (United States)

    Ni, Ying; Li, Keping

    2014-01-01

    Rear-end accidents are the most common accident type at signalized intersections, because the diversity of actions taken increases due to the signal change. Green signal countdown devices (GSCDs), which have been widely installed in Asia, are thought to have the potential to improve capacity and reduce accidents, but some negative effects on intersection safety have been observed in practice; for example, an increase in rear-end accidents. A microscopic modeling approach was applied to estimate rear-end accident probability during the phase transition interval. The rear-end accident probability is determined by two constituent probabilities: (1) that a leading vehicle makes a "stop" decision, which was formulated using a binary logistic model, and (2) that the following vehicle fails to stop in the available stopping distance, which is closely related to the critical deceleration used by the leading vehicle. Based on field observations carried out at 2 GSCD intersections and 2 NGSCD intersections (i.e., intersections without GSCD devices) along an arterial in Suzhou, the rear-end probabilities at GSCD and NGSCD intersections were calculated using Monte Carlo simulation. The results suggested that, on the one hand, GSCDs caused significantly negative safety effects during the flashing green interval, especially for vehicles in a zone ranging from 15 to 70 m; on the other hand, GSCDs were helpful in reducing rear-end accidents during the yellow interval, especially in a zone from 0 to 50 m. GSCDs helped shorten indecision zones and reduce rear-end collisions near the stop line during the yellow interval, but they easily resulted in risky car-following behavior and much higher rear-end collision probabilities in indecision zones during both the flashing green and yellow intervals. GSCDs should be installed cautiously, and education on safe driving behavior should be available.
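
    A toy Monte Carlo combining the two constituent probabilities as described, with a logit stop decision for the leader and a kinematic stopping-distance check for the follower; every coefficient and distribution below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
d = rng.uniform(5.0, 80.0, N)     # leader's distance to the stop line (m)
v = rng.normal(12.0, 2.0, N)      # approach speed, shared by both vehicles (m/s)

# (1) hypothetical binary-logit "stop" decision for the leading vehicle
p_stop = 1.0 / (1.0 + np.exp(-(-2.0 + 0.08 * d - 0.05 * v)))
leader_stops = rng.random(N) < p_stop

# (2) follower fails to stop within the available distance
headway = rng.uniform(0.8, 2.5, N)                   # time headway (s)
gap = v * headway                                    # spacing to the leader (m)
a_f = np.clip(rng.normal(3.4, 0.6, N), 1.5, None)    # follower's deceleration (m/s^2)
a_l, t_r = 4.5, 1.0        # leader's critical deceleration (m/s^2), reaction time (s)
follower_needs = v * t_r + v**2 / (2.0 * a_f)        # follower's stopping distance
available = gap + v**2 / (2.0 * a_l)                 # gap plus leader's stopping distance

p_rear_end = np.mean(leader_stops & (follower_needs > available))
print(f"estimated rear-end probability ~ {p_rear_end:.4f}")
```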

  20. Common Cause Case Study: An Estimated Probability of Four Solid Rocket Booster Hold-Down Post Stud Hang-ups

    Science.gov (United States)

    Cross, Robert

    2005-01-01

    Until Solid Rocket Motor ignition, the Space Shuttle is mated to the Mobile Launch Platform in part via eight (8) Solid Rocket Booster (SRB) hold-down bolts. The bolts are fractured using redundant pyrotechnics and are designed to drop through a hold-down post on the Mobile Launch Platform before the Space Shuttle begins movement. The Space Shuttle program has experienced numerous failures where a bolt has hung up; that is, it did not clear the hold-down post before liftoff and was caught by the SRBs. This places an additional structural load on the vehicle that was not included in the original certification requirements. The Space Shuttle is currently being certified to withstand the loads induced by up to three (3) of the eight (8) SRB hold-down studs experiencing a "hang-up". The results of loads analyses performed for four (4) stud hang-ups indicate that the internal vehicle loads exceed current structural certification limits at several locations. To determine the risk to the vehicle from four (4) stud hang-ups, the likelihood of the scenario occurring must first be evaluated. Prior to the analysis discussed in this paper, the likelihood of occurrence had been estimated assuming that the stud hang-ups were completely independent events; that is, it was assumed that no common causes or factors existed between the individual stud hang-up events. A review of the data associated with the hang-up events showed that a common factor (timing skew) was present. This paper summarizes a revised likelihood evaluation performed for the four (4) stud hang-up case, considering that there are common factors associated with the stud hang-ups. The results show that explicitly taking into account the common factor of timing skew (i.e., not using standard common cause methodologies such as beta factor or Multiple Greek Letter modeling) increases the estimated likelihood of four (4) stud hang-ups by an order of magnitude over the independent failure case.
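
    The effect of a shared common factor on a joint failure probability can be illustrated with a simple mixture calculation; all numbers below are hypothetical and are not the paper's values, but they reproduce the qualitative result that conditioning on a common factor can raise the four-event probability by roughly an order of magnitude over the independence assumption.

```python
# Hypothetical numbers (not the paper's values) contrasting an independence
# model with a common-factor (timing skew) mixture model for 4 hang-ups.
p_marginal = 0.01            # assumed per-stud hang-up probability
p_independent = p_marginal ** 4

p_skew = 0.05                # assumed probability that a timing skew occurs
p_given_skew = 0.04          # assumed per-stud hang-up probability given skew
p_given_no_skew = 0.002      # assumed per-stud probability otherwise
p_common = p_skew * p_given_skew ** 4 + (1.0 - p_skew) * p_given_no_skew ** 4

print(f"independent model  : {p_independent:.2e}")
print(f"common-factor model: {p_common:.2e} ({p_common / p_independent:.0f}x higher)")
```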

  1. Probability estimation of rare extreme events in the case of small samples: Technique and examples of analysis of earthquake catalogs

    Science.gov (United States)

    Pisarenko, V. F.; Rodkin, M. V.; Rukavishnikova, T. A.

    2017-11-01

    The most general approach to studying the recurrence law in the area of the rare largest events is associated with the use of limit theorems of the theory of extreme values. In this paper, we use the Generalized Pareto Distribution (GPD). The unknown GPD parameters are typically determined by the method of maximum likelihood (ML). However, ML estimation is only optimal for fairly large samples (>200-300 events), whereas in many practically important cases there are only dozens of large events. It is shown that in the case of a small number of events, the highest accuracy with the GPD is provided by the method of quantiles (MQ). To illustrate these methodological results, we compiled data sets characterizing the tails of the distributions for typical subduction zones, regions of intracontinental seismicity, and zones of midoceanic (MO) ridges. This approach paves the way for designing a new method for seismic risk assessment: instead of the unstable characteristic M_max, the uppermost possible magnitude, it is recommended to use the quantiles of the distribution of random maxima over a future time interval. The results of calculating such quantiles are presented.
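
    A minimal sketch of the method of quantiles for the GPD follows: the shape parameter is identified from the ratio of two sample quantiles (the scale cancels), and the scale is then recovered from either quantile. The synthetic 60-event sample and the chosen quantile levels are illustrative assumptions, not the authors' catalogs.

```python
import numpy as np
from scipy.stats import genpareto

# Small synthetic "catalog" of 60 exceedances, as is typical in regional
# seismology (true shape xi = 0.3, unit scale).
sample = genpareto.rvs(c=0.3, scale=1.0, size=60, random_state=0)

p1, p2 = 0.50, 0.90
q1, q2 = np.quantile(sample, [p1, p2])

def gpd_quantile(p: float, xi: float) -> float:
    """GPD quantile function with unit scale and threshold 0."""
    return -np.log(1.0 - p) if abs(xi) < 1e-9 else ((1.0 - p) ** (-xi) - 1.0) / xi

# Method of quantiles: the ratio of two quantiles depends on xi only
# (the scale cancels), so match it on a grid, then back out the scale.
grid = np.linspace(-0.9, 3.0, 4000)
ratios = np.array([gpd_quantile(p2, x) / gpd_quantile(p1, x) for x in grid])
xi_hat = grid[np.argmin(np.abs(ratios - q2 / q1))]
sigma_hat = q1 / gpd_quantile(p1, xi_hat)
print(f"MQ estimates: xi = {xi_hat:.3f}, sigma = {sigma_hat:.3f}")
```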

  2. A new method for estimating the probable maximum hail loss of a building portfolio based on hailfall intensity determined by radar measurements

    Science.gov (United States)

    Aller, D.; Hohl, R.; Mair, F.; Schiesser, H.-H.

    2003-04-01

    Extreme hailfall can cause massive damage to building structures. For the insurance and reinsurance industry it is essential to estimate the probable maximum hail loss of their portfolio. The probable maximum loss (PML) is usually defined with a return period of 1 in 250 years. Statistical extrapolation has a number of critical points, as historical hail loss data are usually available for only a few events, while insurance portfolios change over the years. At the moment, footprints are derived from historical hail damage data. These footprints (mean damage patterns) are then moved over a portfolio of interest to create scenario losses. However, damage patterns of past events are based on the specific portfolio that was damaged during that event and can be considerably different from the current spread of risks. A new method for estimating the probable maximum hail loss to a building portfolio is presented. It is shown that footprints derived from historical damages differ from footprints of hail kinetic energy calculated from radar reflectivity measurements. Based on the relationship between radar-derived hail kinetic energy and hail damage to buildings, scenario losses can be calculated. A systematic motion of the hail kinetic energy footprints over the underlying portfolio creates a loss set. It is difficult, however, to assign return periods to losses calculated by moving footprints derived from historical damages. To determine the return periods of the hail kinetic energy footprints over Switzerland, 15 years of radar measurements and 53 years of agricultural hail losses are available. Based on these data, return periods of several types of hailstorms were derived for different regions in Switzerland. The loss set is combined with the return periods of the event set to obtain an exceedance frequency curve, which can be used to derive the PML.
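
    The final step, turning a loss set with event return periods into a PML, reduces to reading an exceedance frequency curve at the 1-in-250-year rate. The sketch below uses a hypothetical loss set; interpolating on cumulative annual exceedance rates is one simple way to do this, not necessarily the authors' exact procedure.

```python
import numpy as np

# Hypothetical loss set: one simulated portfolio loss (million CHF) per
# footprint scenario, each with the annual occurrence rate of its storm type.
losses = np.array([120.0, 85.0, 60.0, 42.0, 30.0, 18.0, 9.0, 4.0])
rates = np.array([0.002, 0.008, 0.02, 0.04, 0.08, 0.2, 0.5, 1.0])  # events/yr

order = np.argsort(losses)[::-1]        # largest loss first
exceed_rate = np.cumsum(rates[order])   # annual rate of exceeding each loss

# PML at a 1-in-250-year return period: read the exceedance frequency
# curve at an annual rate of 1/250.
pml = np.interp(1.0 / 250.0, exceed_rate, losses[order])
print(f"PML(250 yr) ~ {pml:.0f} million")
```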

  3. A developmental study of risky decisions on the cake gambling task: age and gender analyses of probability estimation and reward evaluation.

    Science.gov (United States)

    Van Leijenhorst, Linda; Westenberg, P Michiel; Crone, Eveline A

    2008-01-01

    Decision making, or the process of choosing between competing courses of action, is highly sensitive to age-related change, showing development throughout adolescence. In this study, we tested whether the development of decision making under risk is related to changes in risk-estimation abilities. Participants (N = 93) between ages 8-30 performed a child-friendly gambling task, the Cake Gambling task, inspired by the Cambridge Gambling Task (Rogers et al., 1999), which has previously been shown to be sensitive to orbitofrontal cortex (OFC) damage. The task allowed comparisons of the contributions to risk perception of (1) the ability to estimate probabilities and (2) the ability to evaluate rewards. Adult performance patterns were highly similar to those found in previous reports, showing increased risk taking with increases in the probability of winning and the magnitude of potential reward. Behavioral patterns in children and adolescents did not differ from adult patterns, showing a similar ability for probability estimation and reward evaluation. These data suggest that participants 8 years and older perform like adults in a gambling task previously shown to depend on the OFC, in which all the information needed to make an advantageous decision is given on each trial and none needs to be inferred from previous behavior. Interestingly, at all ages, females were more risk-averse than males. These results suggest that the increase in real-life risky behavior seen in adolescence is not a consequence of changes in risk perception abilities. The findings are discussed in relation to theories about the protracted development of the prefrontal cortex.

  4. Uncertainty in estimating probability of causation in a cross-sectional study: joint effects of radiation and hepatitis-C virus on chronic liver disease

    Energy Technology Data Exchange (ETDEWEB)

    Cologne, John B [Department of Statistics, Radiation Effects Research Foundation, 5-2 Hijiyama Park, Minami-ku, Hiroshima 732-0815 (Japan); Pawel, David J [Office of Radiation and Indoor Air, US Environmental Protection Agency, 1200 Pennsylvania Ave NW, Washington DC 20460 (United States); Sharp, Gerald B [Department of Epidemiology, Radiation Effects Research Foundation, 5-2 Hijiyama Park, Minami-ku, Hiroshima 732-0815 (Japan); Fujiwara, Saeko [Department of Clinical Studies, Radiation Effects Research Foundation, 5-2 Hijiyama Park, Minami-ku, Hiroshima 732-0815 (Japan)

    2004-06-01

    Exposure to other risk factors is an important consideration in assessing the role played by radiation in producing disease. A cross-sectional study of atomic-bomb survivors suggested an interaction between whole-body radiation exposure and chronic hepatitis-C viral (HCV) infection in the etiology of chronic liver disease (chronic hepatitis and cirrhosis), but did not allow determination of the joint-effect mechanism. Different estimates of probability of causation (POC) conditional on HCV status resulted from additive and multiplicative models. We therefore estimated the risk for radiation conditional on HCV status using a more general, mixture model that does not require choosing between additivity or multiplicativity, or deciding whether there is interaction, in the face of the large uncertainty. The results support the conclusion that POC increases with radiation dose in persons without HCV infection, but are inconclusive regarding individuals with HCV infection, the lower confidence bound on estimated POC for radiation with HCV infection being zero over the entire dose range. Although the mixture model may not reflect the true joint-effect mechanism, it avoids restrictive model assumptions that cannot be validated using the available data yet have a profound influence on estimated POC. These considerations apply more generally, given that the additive and multiplicative models are often used in POC related work. We therefore consider that an empirical approach may be preferable to assuming a specific mechanistic model for estimating POC in epidemiological studies where the joint-effect mechanism is in doubt.
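
    The sensitivity of POC to the joint-effect assumption can be made concrete with a two-line calculation; the relative risks below are hypothetical, not the study's fitted values.

```python
# Hypothetical relative risks (not the study's fits) for a person exposed
# to both radiation and HCV; baseline disease rate normalized to 1.
rr_rad, rr_hcv = 1.5, 5.0

# Additive joint effect: excess relative risks add.
poc_additive = (rr_rad - 1.0) / (1.0 + (rr_rad - 1.0) + (rr_hcv - 1.0))

# Multiplicative joint effect: relative risks multiply; radiation's share
# of the total risk is the part exceeding the HCV-only risk.
poc_multiplicative = (rr_rad * rr_hcv - rr_hcv) / (rr_rad * rr_hcv)

print(f"POC (additive)       = {poc_additive:.2f}")        # ~0.09
print(f"POC (multiplicative) = {poc_multiplicative:.2f}")  # ~0.33
```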

  5. Uncertainty in estimating probability of causation in a cross-sectional study: joint effects of radiation and hepatitis-C virus on chronic liver disease

    International Nuclear Information System (INIS)

    Cologne, John B; Pawel, David J; Sharp, Gerald B; Fujiwara, Saeko

    2004-01-01

    Exposure to other risk factors is an important consideration in assessing the role played by radiation in producing disease. A cross-sectional study of atomic-bomb survivors suggested an interaction between whole-body radiation exposure and chronic hepatitis-C viral (HCV) infection in the etiology of chronic liver disease (chronic hepatitis and cirrhosis), but did not allow determination of the joint-effect mechanism. Different estimates of probability of causation (POC) conditional on HCV status resulted from additive and multiplicative models. We therefore estimated the risk for radiation conditional on HCV status using a more general, mixture model that does not require choosing between additivity or multiplicativity, or deciding whether there is interaction, in the face of the large uncertainty. The results support the conclusion that POC increases with radiation dose in persons without HCV infection, but are inconclusive regarding individuals with HCV infection, the lower confidence bound on estimated POC for radiation with HCV infection being zero over the entire dose range. Although the mixture model may not reflect the true joint-effect mechanism, it avoids restrictive model assumptions that cannot be validated using the available data yet have a profound influence on estimated POC. These considerations apply more generally, given that the additive and multiplicative models are often used in POC related work. We therefore consider that an empirical approach may be preferable to assuming a specific mechanistic model for estimating POC in epidemiological studies where the joint-effect mechanism is in doubt

  6. Adaptive Dynamics, Control, and Extinction in Networked Populations

    Science.gov (United States)

    2015-07-09

    network geometries. A density function is created from the pre-history of paths that go extinct, and a clear local ... density plots of Fig. 3b. Using the IAMM to compute the most probable path and comparing it to the pre-history of extinction events on stochastic networks

  7. Extinction with multiple excitors

    OpenAIRE

    McConnell, Bridget L.; Miguez, Gonzalo; Miller, Ralph R.

    2013-01-01

    Four conditioned suppression experiments with rats, using an ABC renewal design, investigated the effects of compounding the target conditioned excitor with additional, nontarget conditioned excitors during extinction. Experiment 1 showed stronger extinction, as evidenced by less renewal, when the target excitor was extinguished in compound with a second excitor, relative to when it was extinguished with associatively neutral stimuli. Critically, this deepened extinction effect was attenuated...

  8. Local population extinction and vitality of an epiphytic lichen in fragmented old-growth forest.

    Science.gov (United States)

    Ockinger, Erik; Nilsson, Sven G

    2010-07-01

    The population dynamics of organisms living in short-lived habitats will largely depend on the turnover of habitat patches. It has been suggested that epiphytes, whose host plants can be regarded as habitat patches, often form such patch-tracking populations. However, very little is known about the long-term fate of epiphyte individuals and populations. We estimated life span and assessed environmental factors influencing changes in vitality, fertility, abundance, and distribution of the epiphytic lichen species Lobaria pulmonaria on two spatial scales, individual trees and forest patches, over a period of approximately 10 years in 66 old-growth forest fragments. The lichen had gone extinct from 7 of the 66 sites (13.0%) where it was found 10 years earlier, even though the sites remained unchanged. The risk of local population extinction increased with decreasing population size. In contrast to the decrease in the number of occupied trees and sites, the mean area of the lichen per tree increased by 43.0%. The number of trees with fertile ramets of L. pulmonaria increased from 7 (approximately 1%) to 61 (approximately 10%) trees, and the number of forest fragments with fertile ramets increased from 4 to 23 fragments. The mean annual rate of L. pulmonaria extinction at the tree level was estimated to be 2.52%, translating into an expected lifetime of 39.7 years. This disappearance rate is higher than estimated mortality rates for potential host trees. The risk of extinction at the tree level was significantly positively related to tree circumference and differed between tree species. The probability of presence of fertile ramets increased significantly with local population size. Our results show a long expected lifetime of Lobaria pulmonaria ramets on individual trees and a recent increase in vitality, probably due to decreasing air pollution. The population is, however, declining slowly even though remaining stands are left uncut, which we interpret as an

  9. Extinction times of epidemic outbreaks in networks.

    Science.gov (United States)

    Holme, Petter

    2013-01-01

    In the Susceptible-Infectious-Recovered (SIR) model of disease spreading, the time to extinction of the epidemic peaks at an intermediate value of the per-contact transmission probability. Too contagious infections burn out fast in the population. Infections that are not contagious enough die out before they spread to a large fraction of people. We characterize how the maximal extinction time in SIR simulations on networks depends on the network structure. For example, we find that the average distance within isolated components, weighted by component size, is a good predictor of the maximal time to extinction. Furthermore, the transmission probability giving the longest outbreaks is larger than, but otherwise seemingly independent of, the epidemic threshold.
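
    A minimal simulation illustrating the claimed non-monotonicity (mean extinction time peaking at an intermediate transmission probability) might look like the following: a discrete-time SIR process with a unit infectious period on an Erdős–Rényi graph. These modeling choices are illustrative, not the paper's exact setup.

```python
import random
import networkx as nx

def sir_extinction_time(G: nx.Graph, beta: float, seed: int) -> int:
    """One discrete-time SIR outbreak with a unit infectious period;
    returns the number of steps until no infectious nodes remain."""
    rng = random.Random(seed)
    infectious = {rng.choice(list(G))}
    recovered: set = set()
    steps = 0
    while infectious:
        newly_infected = set()
        for u in infectious:
            for v in G[u]:
                if v not in infectious and v not in recovered and rng.random() < beta:
                    newly_infected.add(v)
        recovered |= infectious
        infectious = newly_infected - recovered
        steps += 1
    return steps

G = nx.erdos_renyi_graph(500, 0.01, seed=1)   # mean degree ~5
for beta in (0.05, 0.15, 0.30, 0.60):
    times = [sir_extinction_time(G, beta, seed=i) for i in range(200)]
    print(f"beta = {beta:.2f}: mean extinction time = {sum(times) / len(times):.1f}")
```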

  10. Extinction times of epidemic outbreaks in networks.

    Directory of Open Access Journals (Sweden)

    Petter Holme

    Full Text Available In the Susceptible-Infectious-Recovered (SIR) model of disease spreading, the time to extinction of the epidemic peaks at an intermediate value of the per-contact transmission probability. Too contagious infections burn out fast in the population. Infections that are not contagious enough die out before they spread to a large fraction of people. We characterize how the maximal extinction time in SIR simulations on networks depends on the network structure. For example, we find that the average distance within isolated components, weighted by component size, is a good predictor of the maximal time to extinction. Furthermore, the transmission probability giving the longest outbreaks is larger than, but otherwise seemingly independent of, the epidemic threshold.

  11. Long-term archives reveal shifting extinction selectivity in China's postglacial mammal fauna

    Science.gov (United States)

    Crees, Jennifer J.; Li, Zhipeng; Bielby, Jon; Yuan, Jing

    2017-01-01

    Ecosystems have been modified by human activities for millennia, and insights about ecology and extinction risk based only on recent data are likely to be both incomplete and biased. We synthesize multiple long-term archives (over 250 archaeological and palaeontological sites dating from the early Holocene to the Ming Dynasty and over 4400 historical records) to reconstruct the spatio-temporal dynamics of Holocene–modern range change across China, a megadiverse country experiencing extensive current-day biodiversity loss, for 34 mammal species over three successive postglacial time intervals. Our combined zooarchaeological, palaeontological, historical and current-day datasets reveal that both phylogenetic and spatial patterns of extinction selectivity have varied through time in China, probably in response both to cumulative anthropogenic impacts (an ‘extinction filter’ associated with vulnerable species and accessible landscapes being affected earlier by human activities) and also to quantitative and qualitative changes in regional pressures. China has experienced few postglacial global species-level mammal extinctions, and most species retain over 50% of their maximum estimated Holocene range despite millennia of increasing regional human pressures, suggesting that the potential still exists for successful species conservation and ecosystem restoration. Data from long-term archives also demonstrate that herbivores have experienced more historical extinctions in China, and carnivores have until recently displayed greater resilience. Accurate assessment of patterns of biodiversity loss and the likely predictive power of current-day correlates of faunal vulnerability and resilience is dependent upon novel perspectives provided by long-term archives. PMID:29167363

  12. A technique for estimating the probability of radiation-stimulated failures of integrated microcircuits in low-intensity radiation fields: Application to the Spektr-R spacecraft

    Science.gov (United States)

    Popov, V. D.; Khamidullina, N. M.

    2006-10-01

    In developing the radio-electronic devices (RED) of spacecraft operating in ionizing-radiation fields in space, one of the most important problems is the correct estimation of their radiation tolerance. The "weakest link" among the onboard microelectronic components subject to radiation effects is the integrated microcircuit (IMC), especially at large-scale (LSI) and very-large-scale (VLSI) degrees of integration. The main characteristic of an IMC, taken into account when deciding whether to use a particular IMC type in onboard RED, is its probability of non-failure operation (NFO) at the end of the spacecraft's lifetime. Until now, however, the NFO has been calculated only from reliability characteristics, disregarding radiation effects. This paper presents a "reliability" approach to determining the radiation tolerance of IMCs, which allows the NFO probability of various IMC types to be estimated with due account of radiation-induced dose failures. The described technique is applied to the RED onboard the Spektr-R spacecraft to be launched in 2007.
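
    A sketch of the general idea (multiplying a conventional reliability term by the probability that the accumulated dose stays below the part's failure-dose threshold) is shown below. The lognormal threshold model and all numbers are illustrative assumptions, not the paper's formulation.

```python
import math

def nfo_probability(lambda_per_hr: float, mission_hr: float,
                    dose_krad: float, median_fail_dose_krad: float,
                    sigma_log: float) -> float:
    """NFO = P(no random failure over the mission) * P(accumulated dose stays
    below a lognormally distributed failure-dose threshold). Illustrative
    combination only, not the paper's exact formulation."""
    p_reliability = math.exp(-lambda_per_hr * mission_hr)
    z = (math.log(median_fail_dose_krad) - math.log(dose_krad)) / sigma_log
    p_dose = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return p_reliability * p_dose

# Hypothetical 5-year mission: 1e-7 failures/hr, 10 krad accumulated dose,
# median failure dose 30 krad, lognormal sigma 0.4.
print(f"NFO = {nfo_probability(1e-7, 5 * 8760, 10.0, 30.0, 0.4):.4f}")
```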

  13. Estimating reach-specific fish movement probabilities in rivers with a Bayesian state-space model: application to sea lamprey passage and capture at dams

    Science.gov (United States)

    Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.

    2014-01-01

    Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic-tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21,828-29,300 adult sea lampreys in the river, 0%-2%, or 0-514 untagged lampreys, could have passed upstream of the dam, and 46%-61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%-96% of the population was vulnerable to existing traps. However, only 52%-69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.

  14. A scenario tree model for the Canadian Notifiable Avian Influenza Surveillance System and its application to estimation of probability of freedom and sample size determination.

    Science.gov (United States)

    Christensen, Jette; Stryhn, Henrik; Vallières, André; El Allaki, Farouk

    2011-05-01

    In 2008, Canada designed and implemented the Canadian Notifiable Avian Influenza Surveillance System (CanNAISS) with six surveillance activities in a phased-in approach. CanNAISS qualified as a surveillance system because it had more than one surveillance activity or component in 2008: passive surveillance; pre-slaughter surveillance; and voluntary enhanced notifiable avian influenza surveillance. Our objectives were to give a short overview of two active surveillance components in CanNAISS and to describe the CanNAISS scenario tree model and its application to estimating the probability of populations being free of NAI virus infection and to sample size determination. Our data from the pre-slaughter surveillance component included diagnostic test results from 6296 serum samples representing 601 commercial chicken and turkey farms collected from 25 August 2008 to 29 January 2009. In addition, we included data from a sub-population of farms with high biosecurity standards: 36,164 samples from 55 farms sampled repeatedly over the 24-month study period from January 2007 to December 2008. All submissions were negative for Notifiable Avian Influenza (NAI) virus infection. We developed the CanNAISS scenario tree model to estimate the surveillance component sensitivity and the probability of a population being free of NAI at the 0.01 farm-level and 0.3 within-farm design prevalences. We propose that a general model, such as the CanNAISS scenario tree model, may have a broader application than more detailed models that require disease-specific input parameters, such as relative risk estimates. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
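
    The core freedom-from-infection calculation behind scenario-tree models can be sketched in a few lines: a surveillance component sensitivity (SSe) at the design prevalence, followed by a Bayesian update after negative results. The farm-level test sensitivity and the prior below are assumed values; the farm count and the 0.01 design prevalence are taken from the abstract.

```python
def component_sensitivity(n_farms: int, farm_se: float, p_farm: float) -> float:
    """P(at least one positive | population infected at the design
    prevalence), i.e., the surveillance component sensitivity (SSe)."""
    return 1.0 - (1.0 - farm_se * p_farm) ** n_farms

def posterior_freedom(prior_free: float, sse: float) -> float:
    """P(free | all results negative), by Bayes' rule."""
    p_negative = prior_free + (1.0 - prior_free) * (1.0 - sse)
    return prior_free / p_negative

# 601 farms and 0.01 farm-level design prevalence are from the abstract;
# farm-level sensitivity 0.90 and prior 0.5 are assumed for illustration.
sse = component_sensitivity(n_farms=601, farm_se=0.90, p_farm=0.01)
print(f"SSe = {sse:.3f}; P(free | negative surveillance) = {posterior_freedom(0.5, sse):.3f}")
```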

  15. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
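
    For the classical compound Poisson (Cramér-Lundberg) model treated in the book, a finite-horizon ruin probability is easy to approximate by simulation; the sketch below, with arbitrary parameter values, is an illustration of the model, not material from the book.

```python
import random

def ruin_probability(u: float, premium_rate: float, claim_rate: float,
                     mean_claim: float, horizon: float,
                     n_paths: int = 20_000) -> float:
    """Monte Carlo finite-horizon ruin probability for the classical
    Cramer-Lundberg model: Poisson claim arrivals, exponential claim sizes."""
    rng = random.Random(42)
    ruined = 0
    for _ in range(n_paths):
        reserve, t = u, 0.0
        while True:
            wait = rng.expovariate(claim_rate)
            t += wait
            if t > horizon:
                break
            reserve += premium_rate * wait                 # premiums accrue between claims
            reserve -= rng.expovariate(1.0 / mean_claim)   # pay the claim
            if reserve < 0:
                ruined += 1
                break
    return ruined / n_paths

# Safety loading 25%: premium_rate = 1.25 * claim_rate * mean_claim.
print(f"psi(u=10, T=100) ~ {ruin_probability(10.0, 1.25, 1.0, 1.0, 100.0):.3f}")
```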

  16. End Ordovician extinctions

    DEFF Research Database (Denmark)

    Harper, David A. T.; Hammarlund, Emma; Rasmussen, Christian M. Ø.

    2014-01-01

    -global anoxia associated with a marked transgression during the Late Hirnantian. Most recently, however, new drivers for the extinctions have been proposed, including widespread euxinia together with habitat destruction caused by plate tectonic movements, suggesting that the end Ordovician mass extinctions were...

  17. Selection, subdivision and extinction and recolonization.

    Science.gov (United States)

    Cherry, Joshua L

    2004-02-01

    In a subdivided population, the interaction between natural selection and stochastic change in allele frequency is affected by the occurrence of local extinction and subsequent recolonization. The relative importance of selection can be diminished by this additional source of stochastic change in allele frequency. Results are presented for subdivided populations with extinction and recolonization where there is more than one founding allele after extinction, where these may tend to come from the same source deme, where the number of founding alleles is variable or the founders make unequal contributions, and where there is dominance for fitness or local frequency dependence. The behavior of a selected allele in a subdivided population is in all these situations approximately the same as that of an allele with different selection parameters in an unstructured population with a different size. The magnitude of the quantity N_e s_e, which determines fixation probability in the case of genic selection, is always decreased by extinction and recolonization, so that deleterious alleles are more likely to fix and advantageous alleles less likely to do so. The importance of dominance or frequency dependence is also altered by extinction and recolonization. Computer simulations confirm that the theoretical predictions of both fixation probabilities and mean times to fixation are good approximations.
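
    The role of the product N_e s_e can be seen in Kimura's diffusion approximation for fixation probability, sketched below; treating extinction and recolonization simply as a reduction in the effective magnitude of N_e s_e is a simplification for illustration only.

```python
import math

def fixation_probability(ne: float, s: float, p0: float) -> float:
    """Kimura's diffusion approximation for a diploid population with genic
    selection: u(p0) = (1 - exp(-4*Ne*s*p0)) / (1 - exp(-4*Ne*s))."""
    if s == 0.0:
        return p0
    return (1.0 - math.exp(-4.0 * ne * s * p0)) / (1.0 - math.exp(-4.0 * ne * s))

# A new mutant among 2N = 20,000 allele copies; shrinking the effective
# magnitude of Ne*s (as extinction/recolonization does) pushes the
# fixation probability toward the neutral value p0.
p0 = 1.0 / 20_000.0
for ne in (10_000, 1_000, 100):
    u = fixation_probability(ne, 0.001, p0)
    print(f"Ne = {ne:6d}: u = {u:.2e} (neutral: {p0:.2e})")
```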

  18. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  19. Extinction of NGC 7027

    International Nuclear Information System (INIS)

    Seaton, M.J.

    1979-01-01

    Emission intensities of recombination lines in hydrogenic spectra are known accurately relative to intensities in the free-free radio continuum. For NGC 7027, intensities have been measured for the radio continuum and for H I and He II lines in the wavelength range from λ = 2.17 μm to λ = 1640 Å: comparison with the calculated emission intensities gives the extinction. Determinations of the standard interstellar extinction function are critically discussed. The extinction deduced for the total radiation from NGC 7027 has a dependence on wavelength for 6563 Å ≥ λ ≥ 1640 Å which is in excellent agreement with the adopted standard results, but there are some anomalies at longer wavelengths and in the ratio of total to selective extinction. These can be explained using a model which allows for a local contribution to the extinction that is variable over the surface of the nebula. (author)

  20. Interstellar extinction correlations

    International Nuclear Information System (INIS)

    Jones, A.P.; Williams, D.A.; Duley, W.W.

    1987-01-01

    A recently proposed model for interstellar grains, in which the extinction arises from small silicate cores with mantles of hydrogenated amorphous carbon (HAC or α-C:H) and large but thinly coated silicate grains, can successfully explain many of the observed properties of interstellar dust. The small silicate cores give rise to the 2200 Å extinction feature. The extinction in the visual is produced by the large silicates and the HAC mantles on the small cores, whilst the far-UV extinction arises in the HAC mantles with a small contribution from the silicate grains. The grain model requires that the silicate material is the more resilient component and that variations in the observed extinction from region to region are due to the nature and depletion of the carbon in the HAC mantles. (author)

  1. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  2. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  3. "Life history space": a multivariate analysis of life history variation in extant and extinct Malagasy lemurs.

    Science.gov (United States)

    Catlett, Kierstin K; Schwartz, Gary T; Godfrey, Laurie R; Jungers, William L

    2010-07-01

    Studies of primate life history variation are constrained by the fact that all large-bodied extant primates are haplorhines. However, large-bodied strepsirrhines recently existed. If we can extract life history information from their skeletons, these species can contribute to our understanding of primate life history variation. This is particularly important in light of new critiques of the classic "fast-slow continuum" as a descriptor of variation in life history profiles across mammals in general. We use established dental histological methods to estimate gestation length and age at weaning for five extinct lemur species. On the basis of these estimates, we reconstruct minimum interbirth intervals and maximum reproductive rates. We utilize principal components analysis to create a multivariate "life history space" that captures the relationships among reproductive parameters and brain and body size in extinct and extant lemurs. Our data show that, whereas large-bodied extinct lemurs can be described as "slow" in some fashion, they also varied greatly in their life history profiles. Those with relatively large brains also weaned their offspring late and had long interbirth intervals. These were not the largest of extinct lemurs. Thus, we distinguish size-related life history variation from variation that linked more strongly to ecological factors. Because all lemur species larger than 10 kg, regardless of life history profile, succumbed to extinction after humans arrived in Madagascar, we argue that large body size increased the probability of extinction independently of reproductive rate. We also provide some evidence that, among lemurs, brain size predicts reproductive rate better than body size. (c) 2010 Wiley-Liss, Inc.

  4. Extinction and the fossil record

    Science.gov (United States)

    Sepkoski, J. J., Jr. (Principal Investigator)

    1994-01-01

    The author examines evidence of mass extinctions in the fossil record and searches for the causes of such large extinctions. Five major mass extinctions eliminated at least 40 percent of animal genera in the oceans and from 65 to 95 percent of ocean species. Questions addressed include whether extinctions were gradual or catastrophic, their causes and environmental context, whether a given perturbation causes extinctions each time it occurs, and the possibility of identifying complex events leading to a mass extinction.

  5. Combining information from surveys of several species to estimate the probability of freedom from Echinococcus multilocularis in Sweden, Finland and mainland Norway

    Directory of Open Access Journals (Sweden)

    Hjertqvist Marika

    2011-02-01

    Full Text Available Background: The fox tapeworm Echinococcus multilocularis has foxes and other canids as definitive hosts and rodents as intermediate hosts. However, most mammals can be accidental intermediate hosts and the larval stage may cause serious disease in humans. The parasite has never been detected in Sweden, Finland and mainland Norway. All three countries currently require an anthelminthic treatment for dogs and cats prior to entry in order to prevent introduction of the parasite. Documentation of freedom from E. multilocularis is necessary to justify the present import requirements. Methods: The probability that Sweden, Finland and mainland Norway were free from E. multilocularis and the sensitivity of the surveillance systems were estimated using scenario trees. Surveillance data from five animal species were included in the study: red fox (Vulpes vulpes), raccoon dog (Nyctereutes procyonoides), domestic pig, wild boar (Sus scrofa) and voles and lemmings (Arvicolinae). Results: The cumulative probability of freedom from E. multilocularis in December 2009 was high in all three countries: 0.98 (95% CI 0.96-0.99) in Finland, 0.99 (0.97-0.995) in Sweden and 0.98 (0.95-0.99) in Norway. Conclusions: Results from the model confirm that there is a high probability that in 2009 the countries were free from E. multilocularis. The sensitivity analyses showed that the choice of the design prevalences in different infected populations was influential. Therefore, more knowledge on expected prevalences of E. multilocularis in infected populations of different species is desirable to reduce the residual uncertainty of the results.

  6. Barrier options model for estimating firms' probabilities of financial distress

    Directory of Open Access Journals (Sweden)

    Gastón S. Milanesi

    2016-11-01

    probabilities of financial distress. Exotic barrier options offer an alternative approach for predicting financial distress, and their structure better fits the firm value-volatility relationship. The paper proposes a "naive" barrier option model, because it simplifies the estimation of unobservable variables such as the firm's asset value and risk. First, simple call and barrier option models are developed in order to value the firm's capital and estimate the financial distress probability. Using a hypothetical case, a sensitivity exercise over horizon and volatility is proposed. A similar exercise is applied to estimate the capital value and financial distress probability for two Argentinian firms with different degrees of leverage, confirming the consistency of the relationship between volatility, value and financial distress probability in the proposed model. Finally, the main conclusions are presented.
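
    The distress event in such barrier models is a first passage of the firm's value below a barrier. Under geometric Brownian motion this probability has a closed form, sketched below with hypothetical firm parameters (not the paper's calibration).

```python
import math

def norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def distress_probability(v0: float, barrier: float, mu: float,
                         sigma: float, horizon: float) -> float:
    """P(firm value, a GBM with drift mu and volatility sigma, touches the
    barrier < v0 at some time before `horizon`): the first-passage
    probability used as the 'distress' event in barrier-option models."""
    nu = mu - 0.5 * sigma ** 2
    x = math.log(barrier / v0)
    st = sigma * math.sqrt(horizon)
    return (norm_cdf((x - nu * horizon) / st)
            + math.exp(2.0 * nu * x / sigma ** 2) * norm_cdf((x + nu * horizon) / st))

# Hypothetical firm: assets 100, debt barrier 60, 5% drift, 30% volatility.
for horizon in (1.0, 3.0, 5.0):
    p = distress_probability(100.0, 60.0, 0.05, 0.30, horizon)
    print(f"T = {horizon:.0f}y: P(distress) = {p:.3f}")
```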

  7. [Prevalence of osteoporosis, estimation of probability of fracture and bone metabolism study in patients with newly diagnosed prostate cancer in the health area of Lugo].

    Science.gov (United States)

    Miguel-Carrera, Jonatan; García-Porrua, Carlos; de Toro Santos, Francisco Javier; Picallo-Sánchez, Jose Antonio

    2018-03-01

    To study the prevalence of osteoporosis and fracture probability in patients diagnosed with prostate cancer. Observational, descriptive, cross-sectional study. SITE: Study performed in the Primary Care setting of Lugo in collaboration with the Rheumatology and Urology Services of our referral hospital. Patients diagnosed with prostate cancer without bone metastatic disease from January to December 2012. Epidemiologic, clinical, laboratory and densitometric variables involved in osteoporosis were collected. The likelihood of fracture was estimated with the FRAX® tool. Eighty-three patients met the inclusion criteria; none was excluded. The average age was 67 years and the average Body Mass Index was 28.28. Twenty-five patients (30.1%) had previous osteoporotic fractures. Other prevalent risk factors were alcohol consumption (26.5%) and smoking (22.9%). Eighty-two subjects (98.80%) had vitamin D levels below normal. Femoral neck densitometry showed that 8.9% had osteoporosis and 54% osteopenia. The average fracture risk in this population, estimated by FRAX®, was 2.63% for hip fracture and 5.28% for major fracture. Applying the FRAX® major-fracture cut-offs without DXA of >5% and ≥7.5% proposed by Azagra et al. identified 24 patients (28.92%) and 8 patients (9.64%), respectively. The prevalence of osteoporosis in this population was very high. The most frequent risk factors associated with osteoporosis were previous osteoporotic fracture, alcohol consumption, smoking and family history of previous fracture. The probability of fracture using the femoral neck FRAX® tool was low. Vitamin D deficiency was very common (98.8%). Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  8. Is IR going extinct?

    Science.gov (United States)

    Mitchell, Audra

    2016-01-01

    A global extinction crisis may threaten the survival of most existing life forms. Influential discourses of ‘existential risk’ suggest that human extinction is a real possibility, while several decades of evidence from conservation biology suggests that the Earth may be entering a ‘sixth mass extinction event’. These conditions threaten the possibilities of survival and security that are central to most branches of International Relations. However, this discipline lacks a framework for addressing (mass) extinction. From notions of ‘nuclear winter’ and ‘omnicide’ to contemporary discourses on catastrophe, International Relations thinking has treated extinction as a superlative of death. This is a profound category mistake: extinction needs to be understood not in the ontic terms of life and death, but rather in the ontological context of be(com)ing and negation. Drawing on the work of theorists of the ‘inhuman’ such as Quentin Meillassoux, Claire Colebrook, Ray Brassier, Jean-Francois Lyotard and Nigel Clark, this article provides a pathway for thinking beyond existing horizons of survival and imagines a profound transformation of International Relations. Specifically, it outlines a mode of cosmopolitics that responds to the element of the inhuman and the forces of extinction. Rather than capitulating to narratives of tragedy, this cosmopolitics would make it possible to think beyond the restrictions of existing norms of ‘humanity’ to embrace an ethics of gratitude and to welcome the possibility of new worlds, even in the face of finitude.

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  10. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles incorporate roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors already installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.

  11. Temporal Dynamics of Recovery from Extinction Shortly after Extinction Acquisition

    Science.gov (United States)

    Archbold, Georgina E.; Dobbek, Nick; Nader, Karim

    2013-01-01

    Evidence suggests that extinction is new learning. Memory acquisition involves both short-term memory (STM) and long-term memory (LTM) components; however, few studies have examined early phases of extinction retention. Retention of auditory fear extinction was examined at various time points. Shortly (1-4 h) after extinction acquisition…

  12. Biogeographic and bathymetric determinants of brachiopod extinction and survival during the Late Ordovician mass extinction

    DEFF Research Database (Denmark)

    Finnegan, Seth; Mac Ørum Rasmussen, Christian; Harper, David A. T.

    2016-01-01

    –Early Silurian genus extinctions and evaluate which extinction drivers are best supported by the data. The first (latest Katian) pulse of the LOME preferentially affected genera restricted to deeper waters or to relatively narrow (less than 35°) palaeolatitudinal ranges. This pattern is only observed in the latest Katian, suggesting that it reflects drivers unique to this interval. Extinction of exclusively deeper-water genera implies that changes in water mass properties such as dissolved oxygen content played an important role. Extinction of genera with narrow latitudinal ranges suggests that interactions between shifting climate zones and palaeobiogeography may also have been important. We test the latter hypothesis by estimating whether each genus would have been able to track habitats within its thermal tolerance range during the greenhouse–icehouse climate transition. Models including these estimates...

  13. Unbiased survival estimates and evidence for skipped breeding opportunities in females

    Science.gov (United States)

    Muths, Erin L.; Scherer, Rick D.; Lambert, Brad A.

    2010-01-01

    1. Estimates of demographic parameters for females, in many organisms, are sparse. This is particularly worrisome as more and more species are faced with high extinction probabilities and conservation increasingly depends on actions dictated by complex predictive models that require accurate estimates of demographic parameters for each sex and species.

  14. Re-Os depositional ages and seawater Os estimates for the Frasnian-Famennian boundary: Implications for weathering rates, land plant evolution, and extinction mechanisms

    Science.gov (United States)

    Turgeon, Steven C.; Creaser, Robert A.; Algeo, Thomas J.

    2007-09-01

    Four TOC-rich shale intervals spanning the Frasnian-Famennian (F-F) boundary were recovered in a drillcore (West Valley NX-1) from western New York (USA) and radiometrically dated using Re-Os. Two of the black shale intervals (WVC785 from ~2.9 m below and WVC754 from ~6.4 m above the F-F boundary, respectively) yielded statistically overlapping ages. Initial 187Os/188Os values (0.45 to 0.47), reflecting contemporaneous seawater Os, are low but similar to the value of 0.42 reported for the Exshaw Fm (Canada) at the Devonian-Mississippian boundary (ca. 361 Ma) [Selby D., Creaser R.A., 2005. Direct radiometric dating of the Devonian-Mississippian time-scale boundary using the Re-Os black shale geochronometer. Geology 33, 545-548]. This may suggest fairly constant and low global continental weathering rates during the Late Devonian, although in view of the short residence time of Os in seawater (~1-4 × 10^4 yr), further measurements are needed to assess potential short-term variation in seawater Os ratios. Owing to low Os and Re abundances at the F-F boundary, our data are inconsistent with long-term volcanism and bolide impact as potential Late Devonian mass extinction mechanisms. In addition, the Frasnian-Famennian ocean appears to have been depleted with respect to Re, possibly indicating an exhaustion of the Re seawater reservoir owing to high burial rates of redox-sensitive elements under dysoxic/anoxic conditions leading up to the F-F boundary.

  15. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  16. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  17. Does litter size variation affect models of terrestrial carnivore extinction risk and management?

    Directory of Open Access Journals (Sweden)

    Eleanor S Devenish-Nelson

    Full Text Available Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species - the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best-fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. These results provide reassurance for those using demographic modelling for the management of less-studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes.
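
    The distribution-fitting step can be sketched as follows: compare, via AIC, a normal distribution discretised onto integer litter sizes against a Poisson fit. The counts below are invented for illustration, and moment estimates are plugged in rather than full maximum likelihood.

```python
import numpy as np
from scipy import stats

# Hypothetical litter-size counts for one species (litters of size 1..8).
sizes = np.arange(1, 9)
counts = np.array([3, 12, 30, 41, 28, 11, 4, 1])
obs = np.repeat(sizes, counts)

def discretised_normal_loglik(mu: float, sd: float, data: np.ndarray) -> float:
    """Log-likelihood of a normal distribution discretised onto integers
    by integrating each unit-width bin."""
    probs = stats.norm.cdf(data + 0.5, mu, sd) - stats.norm.cdf(data - 0.5, mu, sd)
    return float(np.sum(np.log(probs)))

ll_dnorm = discretised_normal_loglik(obs.mean(), obs.std(), obs)
ll_poisson = float(np.sum(stats.poisson.logpmf(obs, obs.mean())))

# AIC: the discretised normal has 2 parameters, the Poisson has 1.
print(f"AIC, discretised normal: {2 * 2 - 2 * ll_dnorm:.1f}")
print(f"AIC, Poisson:            {2 * 1 - 2 * ll_poisson:.1f}")
```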

  18. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. Probability Estimates of Solar Particle Event Doses During a Period of Low Sunspot Number for Thinly-Shielded Spacecraft and Short Duration Missions

    Science.gov (United States)

    Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney

    2016-01-01

    In an earlier paper (Atwell et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs comprise Ground Level Events (GLEs), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009; Tylka and Dietrich, 2008; Atwell et al., 2008). GLEs are extremely energetic solar particle events with proton energies extending into the several-GeV range that produce secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several-hundred-MeV range, but do not produce secondary atmospheric particles. Sub-sub-GLEs are even less energetic, with an observable increase in protons at energies greater than 30 MeV but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to the designers of these smaller spacecraft.

  20. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67-0.79 and 5-7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information-theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark-recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  1. Pre- versus post-mass extinction divergence of Mesozoic marine reptiles dictated by time-scale dependence of evolutionary rates.

    Science.gov (United States)

    Motani, Ryosuke; Jiang, Da-Yong; Tintori, Andrea; Ji, Cheng; Huang, Jian-Dong

    2017-05-17

    The fossil record of a major clade often starts after a mass extinction even though evolutionary rates, molecular or morphological, suggest its pre-extinction emergence (e.g. squamates, placentals and teleosts). The discrepancy is larger for older clades, and the presence of a time-scale-dependent methodological bias has been suggested, yet it has been difficult to avoid the bias using Bayesian phylogenetic methods. This paradox raises the question of whether ecological vacancies, such as those after mass extinctions, prompt the radiations. We addressed this problem by using a unique temporal characteristic of the morphological data and a high-resolution stratigraphic record, for the oldest clade of Mesozoic marine reptiles, Ichthyosauromorpha. The evolutionary rate was fastest during the first few million years of ichthyosauromorph evolution and became progressively slower over time, eventually becoming six times slower. Using the later slower rates, estimates of divergence time become excessively older. The fast, initial rate suggests the emergence of ichthyosauromorphs after the end-Permian mass extinction, matching an independent result from high-resolution stratigraphic confidence intervals. These reptiles probably invaded the sea as a new ecosystem was formed after the end-Permian mass extinction. Lack of information on early evolution biased Bayesian clock rates. © 2017 The Author(s).

  2. Stress and Fear Extinction

    Science.gov (United States)

    Maren, Stephen; Holmes, Andrew

    2016-01-01

    Stress has a critical role in the development and expression of many psychiatric disorders, and is a defining feature of posttraumatic stress disorder (PTSD). Stress also limits the efficacy of behavioral therapies aimed at limiting pathological fear, such as exposure therapy. Here we examine emerging evidence that stress impairs recovery from trauma by impairing fear extinction, a form of learning thought to underlie the suppression of trauma-related fear memories. We describe the major structural and functional abnormalities in brain regions that are particularly vulnerable to stress, including the amygdala, prefrontal cortex, and hippocampus, which may underlie stress-induced impairments in extinction. We also discuss some of the stress-induced neurochemical and molecular alterations in these brain regions that are associated with extinction deficits, and the potential for targeting these changes to prevent or reverse impaired extinction. A better understanding of the neurobiological basis of stress effects on extinction promises to yield novel approaches to improving therapeutic outcomes for PTSD and other anxiety and trauma-related disorders. PMID:26105142

  3. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Chao, Pei-Ju; Wang, Hung-Yu; Hsu, Hsuan-Chih; Chang, PaoShu; Chen, Wen-Cheng

    2012-01-01

    With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Our study shows the agreement between the NTCP parameter modeling based on SEF and
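
    The probit form of the LKB model quoted above is compact enough to evaluate directly. Below is a minimal Python sketch using the study's SEF-based fit (TD50 = 43.6 Gy, m = 0.18, n = 1); the dose-volume histogram (DVH) values are invented for illustration, and this is not the authors' fitting code:

        from math import erf, sqrt

        def g_eud(doses, volumes, n):
            """Generalized EUD of a DVH (bin doses in Gy, fractional volumes
            summing to 1); n is the LKB volume parameter. With n = 1 the
            gEUD reduces to the mean dose."""
            return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

        def lkb_ntcp(eud, td50, m):
            """Lyman probit model: NTCP = Phi((EUD - TD50) / (m * TD50))."""
            t = (eud - td50) / (m * td50)
            return 0.5 * (1.0 + erf(t / sqrt(2.0)))

        doses, volumes = [10.0, 30.0, 50.0], [0.2, 0.5, 0.3]  # toy parotid DVH
        eud = g_eud(doses, volumes, n=1.0)                    # mean dose = 32 Gy
        print(lkb_ntcp(eud, td50=43.6, m=0.18))               # ~0.07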

  4. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Science.gov (United States)

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP

  5. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Directory of Open Access Journals (Sweden)

    Lee Tsair-Fwu

    2012-12-01

    Background: With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods: Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results: Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions: Our study shows the agreement

  6. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    International Nuclear Information System (INIS)

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-01-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear–quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18–30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8–30.9 Gy) and 22.0 Gy (range, 20.2–26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models traditionally used to estimate spinal cord NTCP
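
    The linear-quadratic step in such an analysis, re-expressing a hypofractionated course as an equieffective conventionally fractionated dose, is simple to reproduce. A minimal sketch assuming the standard EQD2 formula and an illustrative α/β; this is not the paper's biologically-equivalent-uniform-dose computation:

        def eqd(d_total, n_frac, alpha_beta, d_ref=2.0):
            """Linear-quadratic equieffective dose: a course of n_frac fractions
            totalling d_total Gy re-expressed as a total dose delivered in
            d_ref-Gy fractions (EQD2 when d_ref = 2)."""
            d = d_total / n_frac   # dose per fraction
            return d_total * (d + alpha_beta) / (d_ref + alpha_beta)

        # A single 20-Gy SRS fraction, assuming alpha/beta = 3 Gy for cord:
        print(eqd(20.0, 1, alpha_beta=3.0))   # 92 Gy EQD2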

  7. Biological Extinction in Earth History

    Science.gov (United States)

    Raup, David M.

    1986-03-01

    Virtually all plant and animal species that have ever lived on the earth are extinct. For this reason alone, extinction must play an important role in the evolution of life. The five largest mass extinctions of the past 600 million years are of greatest interest, but there is also a spectrum of smaller events, many of which indicate biological systems in profound stress. Extinction may be episodic at all scales, with relatively long periods of stability alternating with short-lived extinction events. Most extinction episodes are biologically selective, and further analysis of the victims and survivors offers the greatest chance of deducing the proximal causes of extinction. A drop in sea level and climatic change are most frequently invoked to explain mass extinctions, but new theories of collisions with extraterrestrial bodies are gaining favor. Extinction may be constructive in a Darwinian sense or it may only perturb the system by eliminating those organisms that happen to be susceptible to geologically rare stresses.

  9. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  10. Análisis de supervivencia en presencia de riesgos competitivos: estimadores de la probabilidad de suceso Survival analysis with competing risks: estimating failure probability

    Directory of Open Access Journals (Sweden)

    Javier Llorca

    2004-10-01

    Objective: To show the impact of competing risks of death on survival analysis. Method: We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. Results: The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss the Kaplan-Meier assumptions and why they fail in the presence of competing risks. Conclusions: Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced with the Kaplan-Meier method.
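
    The overestimation is easy to reproduce numerically. A hedged sketch (not the authors' simulation): draw independent exponential latent times for rejection and death, then compare the empirical cumulative incidence of rejection with the large-sample limit of the naive "1 - Kaplan-Meier" estimate that treats deaths as censoring:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        t_rej = rng.exponential(10.0, n)   # latent time to rejection
        t_die = rng.exponential(8.0, n)    # latent time to death (competing risk)
        t_obs = np.minimum(t_rej, t_die)
        is_rej = t_rej < t_die             # event type actually observed

        h = 5.0
        cif = np.mean((t_obs <= h) & is_rej)   # cumulative incidence of rejection
        # Large-sample limit of naive 1-KM with deaths censored: it estimates
        # the marginal P(T_rej <= h), i.e. it pretends death cannot occur first.
        km = np.mean(t_rej <= h)
        print(f"cumulative incidence: {cif:.3f}  naive 1-KM: {km:.3f}")  # ~0.30 vs ~0.39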

  11. Flightless birds: When did the dodo become extinct?

    Science.gov (United States)

    Roberts, David L.; Solow, Andrew R.

    2003-11-01

    The extinction of the dodo (Raphus cucullatus L.; Fig. 1) is commonly dated to the last confirmed sighting in 1662, reported by Volkert Evertsz on an islet off Mauritius. By this time, the dodo had become extremely rare - the previous sighting having been 24 years earlier - but the species probably persisted unseen beyond this date. Here we use a statistical method to establish the actual extinction time of the dodo as 1690, almost 30 years after its most recent sighting.
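
    Roberts and Solow obtained the 1690 date by optimal linear estimation on the most recent sightings, which we do not reproduce here. As a simpler illustration of inference from a sighting record, Solow's (1993) stationary-Poisson test can be sketched as follows; the sighting years are placeholders, not the paper's data:

        def solow_p(sightings, t_now):
            """Solow (1993): under a stationary Poisson sighting process, the
            probability of a terminal gap at least as long as the one observed,
            given that the species is still extant at t_now."""
            s = sorted(sightings)
            t1, tn, n = s[0], s[-1], len(s)
            return ((tn - t1) / (t_now - t1)) ** (n - 1)

        years = [1598, 1601, 1611, 1628, 1638, 1662]  # illustrative record only
        print(solow_p(years, t_now=1690))             # ~0.16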

  12. How humans drive speciation as well as extinction

    DEFF Research Database (Denmark)

    Bull, Joseph William; Maron, M.

    2016-01-01

    influence upon divergence in microorganisms. Even if human activities resulted in no net loss of species diversity by balancing speciation and extinction rates, this would probably be deemed unacceptable. We discuss why, based upon ‘no net loss’ conservation literature— considering phylogenetic diversity...... and other metrics, risk aversion, taboo trade-offs and spatial heterogeneity. We conclude that evaluating speciation alongside extinction could result in more nuanced understanding of biosphere trends, clarifying what it is we actually value about biodiversity....

  13. Specimen-based modeling, stopping rules, and the extinction of the Ivory-Billed Woodpecker

    DEFF Research Database (Denmark)

    Gotelli, Nicholas J.; Chao, Anne; Colwell, Robert K.

    2012-01-01

    Assessing species survival status is an essential component of conservation programs. We devised a new statistical method for estimating the probability of species persistence from the temporal sequence of collection dates of museum specimens. To complement this approach, we developed quantitative...... (Campephilus principalis), long assumed to have become extinct in the United States in the 1950s, but reportedly rediscovered in 2004. We analyzed the temporal pattern of the collection dates of 239 geo-referenced museum specimens collected throughout the southeastern United States from 1853 to 1932...

  14. Variation in extinction risk among birds: chance or evolutionary predisposition?

    Science.gov (United States)

    Bennett, P. M.; Owens, I. P. F.

    1997-01-01

    Collar et al. (1994) estimate that of the 9,672 extant species of bird, 1,111 are threatened by extinction. Here, we test whether these threatened species are simply a random sample of birds, or whether there is something about their biology that predisposes them to extinction. We ask three specific questions. First, is extinction risk randomly distributed among families? Second, which families, if any, contain more, or less, threatened species than would be expected by chance? Third, is variation between taxa in extinction risk associated with variation in either body size or fecundity? Extinction risk is not randomly distributed among families. The families which contain significantly more threatened species than expected are the parrots (Psittacidae), pheasants and allies (Phasianidae), albatrosses and allies (Procellariidae), rails (Rallidae), cranes (Gruidae), cracids (Cracidae), megapodes (Megapodidae) and pigeons (Columbidae). The only family which contains significantly fewer threatened species than expected is the woodpeckers (Picidae). Extinction risk is also not distributed randomly with respect to fecundity or body size. Once phylogeny has been controlled for, increases in extinction risk are independently associated with increases in body size and decreases in fecundity. We suggest that this is because low rates of fecundity, which evolved many tens of millions of years ago, predisposed certain lineages to extinction. Low-fecundity populations take longer to recover if they are reduced to small sizes and are, therefore, more likely to go extinct if an external force causes an increase in the rate of mortality, thereby perturbing the natural balance between fecundity and mortality.

  15. The design and analysis of salmonid tagging studies in the Columbia basin. Volume 8: A new model for estimating survival probabilities and residualization from a release-recapture study of fall chinook salmon (Oncorhynchus tschawytscha) smolts in the Snake River

    International Nuclear Information System (INIS)

    Lowther, A.B.; Skalski, J.

    1997-09-01

    Standard release-recapture analyses using Cormack-Jolly-Seber (CJS) models to estimate survival probabilities between hydroelectric facilities for Snake River fall chinook salmon (Oncorhynchus tschawytscha) ignore the possibility of individual fish residualizing and completing their migration in the year following tagging. These models do not utilize available capture history data from this second year and thus produce negatively biased estimates of survival probabilities. A new multinomial likelihood model was developed that results in biologically relevant, unbiased estimates of survival probabilities using the full two years of capture history data. This model was applied to 1995 Snake River fall chinook hatchery releases to estimate the true survival probability from one of three upstream release points (Asotin, Billy Creek, and Pittsburgh Landing) to Lower Granite Dam. In the data analyzed here, residualization is not a common physiological response, and thus the use of CJS models did not yield appreciably different results from the true survival probability obtained using the new multinomial likelihood model.
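
    The single-release building block underlying CJS models has closed-form estimates, which the paper's two-year multinomial likelihood generalizes. A minimal sketch for one release above two downstream detection sites; the detection counts are invented:

        def cjs_two_site(R, n11, n10, n01):
            """Closed-form CJS estimates for R tagged fish released above two
            detection sites; nXY are counts of capture histories (site 1, site 2).
            lam = P(surviving to and detected at site 2), confounded as usual."""
            lam = n11 / (n11 + n10)                   # P(seen at 2 | seen at 1)
            phi1 = (n11 + n10) / R + n01 / (R * lam)  # survival to site 1
            p1 = (n11 + n10) / (R * phi1)             # detection prob. at site 1
            return phi1, p1

        print(cjs_two_site(R=1000, n11=180, n10=300, n01=120))   # (0.8, 0.6)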

  16. Extinction of sodium fires

    International Nuclear Information System (INIS)

    Malet, J.C.; Spagna, F.

    1989-01-01

    This paper presents how, starting from a knowledge of sodium ignition and burning, principles for extinction (smothering catch trays, leak recuperation systems, powders) can be developed. These techniques, applied in the Superphenix 1 and PEC reactors, have been tested in the ESMERALDA experimental program, a joint French/Italian project. (author)

  17. A robust nonparametric method for quantifying undetected extinctions.

    Science.gov (United States)

    Chisholm, Ryan A; Giam, Xingli; Sadanandan, Keren R; Fung, Tak; Rheindt, Frank E

    2016-06-01

    How many species have gone extinct in modern times before being described by science? To answer this question, and thereby get a full assessment of humanity's impact on biodiversity, statistical methods that quantify undetected extinctions are required. Such methods have been developed recently, but they are limited by their reliance on parametric assumptions; specifically, they assume the pools of extant and undetected species decay exponentially, whereas real detection rates vary temporally with survey effort and real extinction rates vary with the waxing and waning of threatening processes. We devised a new, nonparametric method for estimating undetected extinctions. As inputs, the method requires only the first and last date at which each species in an ensemble was recorded. As outputs, the method provides estimates of the proportion of species that have gone extinct, detected, or undetected and, in the special case where the number of undetected extant species in the present day is assumed close to zero, of the absolute number of undetected extinct species. The main assumption of the method is that the per-species extinction rate is independent of whether a species has been detected or not. We applied the method to the resident native bird fauna of Singapore. Of 195 recorded species, 58 (29.7%) have gone extinct in the last 200 years. Our method projected that an additional 9.6 species (95% CI 3.4, 19.8) have gone extinct without first being recorded, implying a true extinction rate of 33.0% (95% CI 31.0%, 36.2%). We provide R code for implementing our method. Because our method does not depend on strong assumptions, we expect it to be broadly useful for quantifying undetected extinctions. © 2016 Society for Conservation Biology.

  18. VizieR Online Data Catalog: TGAS MS & giants reddening and extinction (Gontcharov+, 2018)

    Science.gov (United States)

    Gontcharov, G. A.; Mosenkov, A. V.

    2018-01-01

    These are the reddening, interstellar extinction and extinction-to-reddening ratio estimates for the Gaia DR1 TGAS and Hipparcos stars within 415 pc from the Sun based on the 3D reddening map of Gontcharov (J/PAZh/43/521) and 3D extinction-to-reddening (total-to-selective extinction) ratio Rv map of Gontcharov (J/PAZh/38/15). (2 data files).

  19. Decline and local extinction of Fucales in French Riviera: the harbinger of future extinctions?

    Directory of Open Access Journals (Sweden)

    T. THIBAUT

    2014-03-01

    The French Riviera is one of the Mediterranean areas that has been longest and most thoroughly impacted by human activities. Fucales are long-lived, large-sized brown algae that constitute a good model for studying human impact on species diversity. We gathered all historical data (literature and herbarium vouchers) since the early 19th century to reconstruct their distribution. The current distribution was established from a 7-year (2007-2013) survey of the 212-km shoreline (1:2,500 map), by means of boating, snorkelling and scuba diving. Overall, 18 taxa of Cystoseira and Sargassum have been reported. Upon comparison with historical data, 5 taxa were no longer observed (C. elegans, C. foeniculacea f. latiramosa, C. squarrosa, C. spinosa var. spinosa and S. hornschuchii), while C. jabukae, previously unrecorded, was observed. In addition to these taxa, probably extinct at a local scale, some taxa have suffered a dramatic decline (C. barbata f. barbata, C. crinita, C. spinosa var. compressa and S. acinarium) or become nearly extinct (C. foeniculacea f. tenuiramosa). Three of them, which in the past played significant functional roles in coastal communities, can be considered functionally extinct. Possible causes of decline and local extinction are discussed. A similar situation has already been reported, although at a much more local scale, in a variety of Mediterranean localities. The question therefore arises of the status of Fucales species in the Mediterranean: are some species on the brink of extinction? Is their extinction at the scale of the French Riviera the harbinger of their extinction Mediterranean-wide?

  20. Selection in a subdivided population with local extinction and recolonization.

    Science.gov (United States)

    Cherry, Joshua L

    2003-01-01

    In a subdivided population, local extinction and subsequent recolonization affect the fate of alleles. Of particular interest is the interaction of this force with natural selection. The effect of selection can be weakened by this additional source of stochastic change in allele frequency. The behavior of a selected allele in such a population is shown to be equivalent to that of an allele with a different selection coefficient in an unstructured population with a different size. This equivalence allows use of established results for panmictic populations to predict such quantities as fixation probabilities and mean times to fixation. The magnitude of the quantity N_e s_e, which determines fixation probability, is decreased by extinction and recolonization. Thus deleterious alleles are more likely to fix, and advantageous alleles less likely to do so, in the presence of extinction and recolonization. Computer simulations confirm that the theoretical predictions of both fixation probabilities and mean times to fixation are good approximations. PMID:12807797
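
    The role of N_e s_e can be made concrete with Kimura's diffusion approximation, which the equivalence described above lets one apply with the effective parameters. A sketch with illustrative numbers (not taken from the paper):

        import math

        def fixation_prob(Ne, s, p0):
            """Kimura's diffusion approximation for the fixation probability of
            an allele at initial frequency p0 with selection coefficient s:
            u(p0) = (1 - exp(-4 Ne s p0)) / (1 - exp(-4 Ne s))."""
            if s == 0:
                return p0
            return math.expm1(-4 * Ne * s * p0) / math.expm1(-4 * Ne * s)

        N = 10_000
        p0 = 1 / (2 * N)                          # a single new mutant
        print(fixation_prob(N, 1e-3, p0))         # ~2s = 0.002 without subdivision
        print(fixation_prob(N // 10, 5e-4, p0))   # smaller Ne*s_e: closer to neutral p0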

  1. Probability estimate of confirmability of the value of predicted oil and gas reserves of Chechen-Ingushetia. Veroyatnostnaya otsenka podtverzhdaemosti velichiny prognoznykh zapasov nefti i gaza Checheno-Ingushetii

    Energy Technology Data Exchange (ETDEWEB)

    Merkulov, N.E.; Lysenkov, P.P.

    1981-01-01

    The predicted oil and gas reserves of Chechen-Ingushetia are assessed for reliability using probability calculations. Calculations were made separately for each oil-bearing lithologic-stratigraphic horizon. The results are summarized in a table, and graphs are constructed.

  2. THE SECONDARY EXTINCTION CORRECTION

    Energy Technology Data Exchange (ETDEWEB)

    Zachariasen, W. H.

    1963-03-15

    It is shown that Darwin's formula for the secondary extinction correction, which has been universally accepted and extensively used, contains an appreciable error in the x-ray diffraction case. The correct formula is derived. As a first-order correction for secondary extinction, Darwin showed that one should use an effective absorption coefficient μ + gQ, where an unpolarized incident beam is presumed. The new derivation shows that the effective absorption coefficient is μ + 2gQ(1 + cos⁴2θ)/(1 + cos²2θ)², which gives μ + gQ at θ = 0° and θ = 90°, but μ + 2gQ at θ = 45°. Darwin's theory remains valid when applied to neutron diffraction. (auth)
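
    The corrected angular factor is easy to verify at the angles quoted in the abstract. A minimal sketch, with μ, g and Q set to 1 as placeholders:

        import math

        def mu_eff(mu, g, Q, theta_deg):
            """Zachariasen's corrected effective absorption coefficient for
            secondary extinction of unpolarized x-rays:
            mu + 2gQ (1 + cos^4(2theta)) / (1 + cos^2(2theta))^2."""
            c = math.cos(math.radians(2.0 * theta_deg))
            return mu + 2.0 * g * Q * (1.0 + c**4) / (1.0 + c**2) ** 2

        for th in (0.0, 45.0, 90.0):
            print(th, mu_eff(1.0, 1.0, 1.0, th))  # 2.0, 3.0, 2.0 = mu+gQ, mu+2gQ, mu+gQ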

  3. Galactic dust and extinction

    International Nuclear Information System (INIS)

    Lyngaa, G.

    1979-01-01

    The ratio R between visual extinction and colour excess is slightly larger than 3 and does not vary much throughout our part of the Galaxy. The distribution of dust in the galactic plane shows, on the large scale, a gradient with higher colour excesses towards l = 50° than towards l = 230°. On the smaller scale, much of the dust responsible for extinction is situated in clouds which tend to group together. The correlation between the positions of interstellar dust clouds and the positions of spiral tracers seems rather poor in our Galaxy. However, concentrated dark clouds as well as extended regions of dust show an inclined distribution similar to the Gould belt of bright stars. (Auth.)

  4. Stochastic fluctuation induced the competition between extinction and recurrence in a model of tumor growth

    International Nuclear Information System (INIS)

    Li, Dongxi; Xu, Wei; Sun, Chunyan; Wang, Liang

    2012-01-01

    We investigate how stochastic fluctuations induce competition between tumor extinction and recurrence in a model of tumor growth derived from the catalytic Michaelis–Menten reaction. We analyze the probability of transitions between the extinction state and the stable-tumor state via the mean first extinction time (MFET) and mean first return time (MFRT). It is found that positional fluctuations hinder the transition, while environmental fluctuations, up to a certain level, facilitate tumor extinction. The observed behavior could be used as prior information for the treatment of cancer. -- Highlights: ► Stochastic fluctuations induce competition between extinction and recurrence. ► The transition probabilities are investigated. ► Positional fluctuations hinder the transition. ► Environmental fluctuations, up to a certain level, facilitate tumor extinction. ► The observed behavior can be used as prior information for the treatment of cancer.
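
    Mean first passage times of this kind are routinely estimated by direct simulation. The following is a generic Euler-Maruyama sketch for logistic growth with multiplicative noise, an illustration of the MFET idea only; it is not the paper's Michaelis-Menten-derived model, and all parameters are invented:

        import numpy as np

        def mfet(r=1.0, K=100.0, sigma=2.0, x0=50.0, x_ext=1.0,
                 dt=1e-3, t_max=200.0, n_paths=200, seed=1):
            """Mean first extinction time of dx = r x (1 - x/K) dt + sigma x dW,
            estimated by Euler-Maruyama; a path is 'extinct' once x < x_ext."""
            rng = np.random.default_rng(seed)
            times = []
            for _ in range(n_paths):
                x, t = x0, 0.0
                while x >= x_ext and t < t_max:
                    x += r * x * (1 - x / K) * dt \
                         + sigma * x * rng.normal(0.0, np.sqrt(dt))
                    t += dt
                times.append(t)
            return float(np.mean(times))

        print(mfet())   # strong environmental-type noise drives rapid extinction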

  5. Dynamic N -occupancy models: estimating demographic rates and local abundance from detection-nondetection data

    Science.gov (United States)

    Sam Rossman; Charles B. Yackulic; Sarah P. Saunders; Janice Reid; Ray Davis; Elise F. Zipkin

    2016-01-01

    Occupancy modeling is a widely used analytical technique for assessing species distributions and range dynamics. However, occupancy analyses frequently ignore variation in abundance of occupied sites, even though site abundances affect many of the parameters being estimated (e.g., extinction, colonization, detection probability). We introduce a new model (“dynamic

  6. AN ANALYSIS OF THE SHAPES OF INTERSTELLAR EXTINCTION CURVES. VI. THE NEAR-IR EXTINCTION LAW

    International Nuclear Information System (INIS)

    Fitzpatrick, E. L.; Massa, D.

    2009-01-01

    We combine new observations from the Hubble Space Telescope's Advanced Camera for Surveys with existing data to investigate the wavelength dependence of near-IR (NIR) extinction. Previous studies suggest a power-law form for NIR extinction, with a 'universal' value of the exponent, although some recent observations indicate that significant sight line-to-sight line variability may exist. We show that a power-law model for the NIR extinction provides an excellent fit to most extinction curves, but that the value of the power, β, varies significantly from sight line to sight line. Therefore, it seems that a 'universal NIR extinction law' is not possible. Instead, we find that as β decreases, R(V) ≡ A(V)/E(B - V) tends to increase, suggesting that NIR extinction curves which have been considered 'peculiar' may, in fact, be typical for different R(V) values. We show that the power-law parameters can depend on the wavelength interval used to derive them, with β increasing as longer wavelengths are included. This result implies that extrapolating power-law fits to determine R(V) is unreliable. To avoid this problem, we adopt a different functional form for NIR extinction. This new form mimics a power law whose exponent increases with wavelength, has only two free parameters, can fit all of our curves over a longer wavelength baseline and to higher precision, and produces R(V) values which are consistent with independent estimates and commonly used methods for estimating R(V). Furthermore, unlike the power-law model, it gives R(V)s that are independent of the wavelength interval used to derive them. It also suggests that the relation R(V) = -1.36 E(K-V)/E(B-V) - 0.79 can estimate R(V) to ±0.12. Finally, we use model extinction curves to show that our extinction curves are in accord with theoretical expectations, and demonstrate how large samples of observational quantities can provide useful constraints on the grain properties.
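
    The closing relation can be applied directly to photometry. A minimal sketch, also showing the single power law the paper argues against extrapolating; the numerical inputs are assumptions, not values from the paper:

        def rv_from_excesses(e_kv, e_bv):
            """Fitzpatrick & Massa: R(V) = -1.36 E(K-V)/E(B-V) - 0.79,
            quoted as accurate to about +/-0.12."""
            return -1.36 * (e_kv / e_bv) - 0.79

        def power_law_extinction(wav_um, a_1um, beta):
            """Simple power-law NIR extinction A(lambda) = A(1um) * lambda**-beta;
            the paper shows beta itself varies between sight lines."""
            return a_1um * wav_um ** (-beta)

        print(rv_from_excesses(e_kv=-2.8, e_bv=1.0))           # ~3.0, typical diffuse ISM
        print(power_law_extinction(2.2, a_1um=1.0, beta=1.8))  # K-band extinction, mag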

  7. Epidemic extinction paths in complex networks

    Science.gov (United States)

    Hindes, Jason; Schwartz, Ira B.

    2017-05-01

    We study the extinction of long-lived epidemics on finite complex networks induced by intrinsic noise. Applying analytical techniques to the stochastic susceptible-infected-susceptible model, we predict the distribution of large fluctuations, the most probable or optimal path through a network that leads to a disease-free state from an endemic state, and the average extinction time in general configurations. Our predictions agree with Monte Carlo simulations on several networks, including synthetic weighted and degree-distributed networks with degree correlations, and an empirical high school contact network. In addition, our approach quantifies characteristic scaling patterns for the optimal path and distribution of large fluctuations, both near and away from the epidemic threshold, in networks with heterogeneous eigenvector centrality and degree distributions.
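
    In the well-mixed (complete-graph) limit, the extinction times the authors analyze can be measured directly with a Gillespie simulation. A hedged sketch with illustrative parameters; the paper's network-structured, large-fluctuation analysis is far more general:

        import random

        def sis_extinction_time(N=100, beta=1.2, gamma=1.0, seed=0):
            """Gillespie simulation of the well-mixed stochastic SIS model
            (infection rate beta*S*I/N, recovery rate gamma*I), started at the
            endemic state; returns the time at which the infection dies out."""
            rng = random.Random(seed)
            I = max(1, int(N * (1.0 - gamma / beta)))   # endemic level
            t = 0.0
            while I > 0:
                rate_inf = beta * (N - I) * I / N
                rate_rec = gamma * I
                total = rate_inf + rate_rec
                t += rng.expovariate(total)
                I += 1 if rng.random() * total < rate_inf else -1
            return t

        print(sis_extinction_time())   # average over many seeds for the mean time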

  8. Ultraviolet extinction in M-supergiant circumstellar envelopes

    International Nuclear Information System (INIS)

    Buss, R.H. Jr.; Snow, T.P. Jr.

    1986-01-01

    Using International Ultraviolet Explorer (IUE) archival low-dispersion spectra, ultraviolet spectral extinctions were derived for the circumstellar envelopes of two M supergiants: HD 60414 and HD 213310. The observed stellar systems belong to a class of widely separated spectroscopic binaries that are called VV Cephei stars. The total extinction was calculated by dividing the reddened fluxes by unreddened comparison fluxes of similar stars (g B2.5 for HD 213310 and a normalized s+B3 for HD 60414) from the reference atlas. After subtracting the interstellar extinctions, which were estimated from the E(B-V) reddening of nearby stars, the resultant circumstellar extinctions were normalized at about 3.5 inverse microns. Not only is the 2175 A extinction bump absent in the circumstellar extinctions, but the far-ultraviolet extinction rise is also absent. The rather flat ultraviolet extinction curves were interpreted as signatures of a population of noncarbonaceous, oxygen-rich grains with diameters larger than the longest observed wavelength

  9. Sexual selection affects local extinction and turnover in bird communities

    Science.gov (United States)

    Doherty, P.F.; Sorci, G.; Royle, J. Andrew; Hines, J.E.; Nichols, J.D.; Boulinier, T.

    2003-01-01

    Predicting extinction risks has become a central goal for conservation and evolutionary biologists interested in population and community dynamics. Several factors have been put forward to explain risks of extinction, including ecological and life history characteristics of individuals. For instance, factors that affect the balance between natality and mortality can have profound effects on population persistence. Sexual selection has been identified as one such factor. Populations under strong sexual selection experience a number of costs ranging from increased predation and parasitism to enhanced sensitivity to environmental and demographic stochasticity. These findings have led to the prediction that local extinction rates should be higher for species/populations with intense sexual selection. We tested this prediction by analyzing the dynamics of natural bird communities at a continental scale over a period of 21 years (1975-1996), using relevant statistical tools. In agreement with the theoretical prediction, we found that sexual selection increased risks of local extinction (dichromatic birds had on average a 23% higher local extinction rate than monochromatic species). However, despite higher local extinction probabilities, the number of dichromatic species did not decrease over the period considered in this study. This pattern was caused by higher local turnover rates of dichromatic species, resulting in relatively stable communities for both groups of species. Our results suggest that these communities function as metacommunities, with frequent local extinctions followed by colonization. Anthropogenic factors impeding dispersal might therefore have a significant impact on the global persistence of sexually selected species.

  10. The Stationary Distribution and Extinction of Generalized Multispecies Stochastic Lotka-Volterra Predator-Prey System

    OpenAIRE

    Yin, Fancheng; Yu, Xiaoyan

    2015-01-01

    This paper is concerned with the existence of stationary distribution and extinction for multispecies stochastic Lotka-Volterra predator-prey system. The contributions of this paper are as follows. (a) By using Lyapunov methods, the sufficient conditions on existence of stationary distribution and extinction are established. (b) By using the space decomposition technique and the continuity of probability, weaker conditions on extinction of the system are obtained. Finally, a numer...

  11. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  13. The galactic extinction towards Maffei 1

    International Nuclear Information System (INIS)

    Buta, R.J.; McCall, M.L.; McDonald Observatory, Austin, TX; Australian National Univ., Canberra. Mount Stromlo and Siding Spring Observatories)

    1983-01-01

    The extinction of Maffei 1 has been measured by two new techniques. First, BV aperture photometry has been performed to obtain the colour excess from standard aperture-colour relations for early-type galaxies. Secondly, millimetre and radio observations of galactic CO and HI have been used to calculate the total hydrogen column density along the line-of-sight, and thereby estimate the colour excess from the local dust-to-gas ratio. After consideration of all extinction measurements to date, it is concluded that A_V = 5.1 ± 0.2 mag. The isophotal diameter and the corrected apparent visual magnitude are estimated to be approx. 15 arcmin and approx. 6.3 respectively (assuming type E), making Maffei 1 one of the biggest and brightest galaxies in the sky. (author)

  14. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  15. My Lived Experiences Are More Important Than Your Probabilities: The Role of Individualized Risk Estimates for Decision Making about Participation in the Study of Tamoxifen and Raloxifene (STAR).

    Science.gov (United States)

    Holmberg, Christine; Waters, Erika A; Whitehouse, Katie; Daly, Mary; McCaskill-Stevens, Worta

    2015-11-01

    Decision-making experts emphasize that understanding and using probabilistic information are important for making informed decisions about medical treatments involving complex risk-benefit tradeoffs. Yet empirical research demonstrates that individuals may not use probabilities when making decisions. Our aim was to explore decision making and the use of probabilities for decision making from the perspective of women who were risk-eligible to enroll in the Study of Tamoxifen and Raloxifene (STAR). We conducted narrative interviews with 20 women who agreed to participate in STAR and 20 women who declined. The project was based on a narrative approach. Analysis included the development of summaries of each narrative and thematic analysis, with a coding scheme developed inductively to code all transcripts and identify emerging themes. Interviewees explained and embedded their STAR decisions within experiences encountered throughout their lives. Such lived experiences included, but were not limited to, breast cancer family history, a personal history of breast biopsies, and experiences or assumptions about taking tamoxifen or medicines more generally. Women's explanations of their decisions about participating in a breast cancer chemoprevention trial were more complex than decision strategies that rely solely on a quantitative risk-benefit analysis of probabilities derived from populations. In addition to precise risk information, clinicians and risk communicators should recognize the importance and legitimacy of lived experience in individual decision making. © The Author(s) 2015.

  16. Characterization of Diesel Soot Aggregates by Scattering and Extinction Methods

    Science.gov (United States)

    Kamimoto, Takeyuki

    2006-07-01

    Characteristics of diesel soot particles sampled from the exhaust of a common-rail turbo-charged diesel engine are quantified by scattering and extinction diagnostics using two newly built laser-based instruments. The radius of gyration, representing the aggregate size, is measured from the angular distribution of scattering intensity, while the soot mass concentration is measured by a two-wavelength extinction method. An approach to estimating the refractive index of diesel soot by analyzing the extinction and scattering data with an aggregate scattering theory is proposed.

  18. Diversification dynamics of rhynchostomatian ciliates: the impact of seven intrinsic traits on speciation and extinction in a microbial group.

    Science.gov (United States)

    Vďačný, Peter; Rajter, Ľubomír; Shazib, Shahed Uddin Ahmed; Jang, Seok Won; Shin, Mann Kyoon

    2017-08-30

    Ciliates are a suitable microbial model to investigate trait-dependent diversification because of their comparatively complex morphology and high diversity. We examined the impact of seven intrinsic traits on speciation, extinction, and net diversification of rhynchostomatians, a group of comparatively large, predatory ciliates with a proboscis carrying a dorsal brush (a sensory structure) and toxicysts (organelles used to kill prey). Bayesian estimates under the binary-state speciation and extinction model indicate that two types of extrusomes and a two-rowed dorsal brush raise diversification by decreasing extinction. On the other hand, a higher number of contractile vacuoles and their dorsal location likely increase diversification by elevating the speciation rate. Particular nuclear characteristics, however, do not significantly differ in their diversification rates, and hence lineages with various macronuclear patterns and numbers of micronuclei have similar probabilities of generating new species. Likelihood-based quantitative state diversification analyses suggest that rhynchostomatians conform to Cope's rule in that their diversity grows linearly with increasing body length and relative length of the proboscis. Comparison with other litostomatean ciliates indicates that rhynchostomatians are not among the cladogenically most successful lineages, and their survival over several hundred million years could be associated with their comparatively large and complex bodies, which reduce the risk of extinction.

  19. Quasi-extinction risk and population targets for the Eastern, migratory population of monarch butterflies (Danaus plexippus)

    Science.gov (United States)

    Semmens, Brice X.; Semmens, Darius J.; Thogmartin, Wayne E.; Wiederholt, Ruscena; Lopez-Hoffman, Laura; Diffendorfer, James E.; Pleasants, John M.; Oberhauser, Karen S.; Taylor, Orley R.

    2016-01-01

    The Eastern, migratory population of monarch butterflies (Danaus plexippus), an iconic North American insect, has declined by ~80% over the last decade. The monarch’s multi-generational migration between overwintering grounds in central Mexico and the summer breeding grounds in the northern U.S. and southern Canada is celebrated in all three countries and creates shared management responsibilities across North America. Here we present a novel Bayesian multivariate auto-regressive state-space model to assess quasi-extinction risk and aid in the establishment of a target population size for monarch conservation planning. We find that, given a range of plausible quasi-extinction thresholds, the population has a substantial probability of quasi-extinction, from 11–57% over 20 years, although uncertainty in these estimates is large. Exceptionally high population stochasticity, declining numbers, and a small current population size act in concert to drive this risk. An approximately 5-fold increase of the monarch population size (relative to the winter of 2014–15) is necessary to halve the current risk of quasi-extinction across all thresholds considered. Conserving the monarch migration thus requires active management to reverse population declines, and the establishment of an ambitious target population size goal to buffer against future environmentally driven variability.
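
    The skeleton of such a quasi-extinction calculation is simple. A hedged sketch using a random walk with drift on log abundance rather than the paper's Bayesian multivariate autoregressive state-space model; every parameter value below is invented:

        import numpy as np

        def quasi_extinction_prob(n0, mu, sigma, threshold, years=20,
                                  n_sims=100_000, seed=42):
            """P(log-abundance x_t = x_{t-1} + mu + N(0, sigma^2) falls below
            log(threshold) at any point within `years`)."""
            rng = np.random.default_rng(seed)
            x = np.full(n_sims, np.log(n0))
            hit = np.zeros(n_sims, dtype=bool)
            for _ in range(years):
                x += mu + rng.normal(0.0, sigma, n_sims)
                hit |= x < np.log(threshold)
            return float(hit.mean())

        # Declining trend plus very high process noise, as described for monarchs:
        print(quasi_extinction_prob(n0=100.0, mu=-0.05, sigma=0.9, threshold=5.0))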

  20. Modelling interstellar extinction: Pt. 1

    International Nuclear Information System (INIS)

    Jones, A.P.

    1988-01-01

    Several methods of calculating the extinction of porous silicate grains are discussed, including effective medium theories and hollow spherical shells. Porous silicate grains are shown to produce enhanced infrared, ultraviolet and far-ultraviolet extinction, and this effect can be used to reduce the abundance of carbon required to match the average interstellar extinction; matching the visual extinction, however, is rather more problematic. We have shown that the enhanced extinction at long and short wavelengths has different origins, and have explained why the visual extinction is little affected by porosity. The implications of porous grains in the interstellar medium are discussed with particular reference to surface chemistry, the polarization of starlight, and their dynamical evolution. (author)
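
    One standard effective-medium recipe for porous grains is the Bruggeman mixing rule. A minimal sketch solving it for a silicate/vacuum mixture; the dielectric constant is a placeholder value, and the paper compares several such schemes rather than endorsing this one:

        import numpy as np

        def bruggeman(eps1, eps2, f1):
            """Bruggeman effective dielectric constant of a two-component mix,
            f1*(e1-ee)/(e1+2ee) + (1-f1)*(e2-ee)/(e2+2ee) = 0,
            rearranged into the quadratic -2*ee**2 + b*ee + e1*e2 = 0."""
            b = f1 * (2 * eps1 - eps2) + (1.0 - f1) * (2 * eps2 - eps1)
            roots = np.roots([-2.0, b, eps1 * eps2])
            # keep the physical root: positive real part, non-negative absorption
            return next(r for r in roots if r.real > 0 and r.imag >= -1e-12)

        eps_silicate = 3.0 + 0.1j   # placeholder value at a single wavelength
        print(bruggeman(eps_silicate, 1.0 + 0.0j, f1=0.6))   # grain with 40% vacuum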

  1. Extinction Events Can Accelerate Evolution

    DEFF Research Database (Denmark)

    Lehman, Joel; Miikkulainen, Risto

    2015-01-01

    Extinction events impact the trajectory of biological evolution significantly. They are often viewed as upheavals to the evolutionary process. In contrast, this paper supports the hypothesis that although they are unpredictably destructive, extinction events may in the long term accelerate...... evolution by increasing evolvability. In particular, if extinction events extinguish indiscriminately many ways of life, indirectly they may select for the ability to expand rapidly through vacated niches. Lineages with such an ability are more likely to persist through multiple extinctions. Lending...... computational support for this hypothesis, this paper shows how increased evolvability will result from simulated extinction events in two computational models of evolved behavior. The conclusion is that although they are destructive in the short term, extinction events may make evolution more prolific...

  2. Rescuing Ecosystems from Extinction Cascades

    Science.gov (United States)

    Sahasrabudhe, Sagar; Motter, Adilson

    2010-03-01

    Food web perturbations stemming from climate change, overexploitation, invasive species, and natural disasters often cause an initial loss of species that results in a cascade of secondary extinctions. Using a predictive modeling framework, here we will present a systematic network-based approach to reduce the number of secondary extinctions. We will show that the extinction of one species can often be compensated by the concurrent removal of a second specific species, which is a counter-intuitive effect not previously tested in complex food webs. These compensatory perturbations frequently involve long-range interactions that are not a priori evident from local predator-prey relationships. Strikingly, in numerous cases even the early removal of a species that would eventually be extinct by the cascade is found to significantly reduce the number of cascading extinctions. Other nondestructive interventions based on partial removals and growth suppression and/or mortality increase are shown to sometimes prevent all secondary extinctions.

  3. Acoustic integrated extinction

    OpenAIRE

    Norris, Andrew N.

    2015-01-01

    The integrated extinction (IE) is defined as the integral of the scattering cross section as a function of wavelength. Sohl et al. (2007 J. Acoust. Soc. Am. 122, 3206–3210. (doi:10.1121/1.2801546)) derived an IE expression for acoustic scattering that is causal, i.e. the scattered wavefront in the forward direction arrives later than the incident plane wave in the background medium. The IE formula was based on electromagnetic results, for which scattering is causal by default. Here, we der...

  4. Mass extinctions of Earth

    International Nuclear Information System (INIS)

    Fernandez, B.; Fernandez, P.; Pereira, B.

    2015-01-01

    Throughout the history of our planet, global phenomena have led to the disappearance of large numbers of species: these events are known as mass extinctions. This article takes a tour of these large events, from the most remote antiquity to the present day. Today we find ourselves immersed in an unprecedented process, since we are eyewitnesses and, more importantly, active participants in the decisions taken to try to mitigate its effects. (Author)

  5. Mass extinctions and supernova explosions

    OpenAIRE

    Korschinek, Gunther

    2016-01-01

    A nearby supernova (SN) explosion could have negatively influenced life on Earth, and may even have been responsible for mass extinctions. A mass extinction is the significant extinction of numerous species on Earth, as recorded in the paleontologic, paleoclimatic, and geological record of our planet. Depending on the distance between the Sun and the SN, different types of threats have to be considered, such as ozone depletion on Earth, causing increased exposure to the Sun's ultraviolet radiation, o...

  6. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013.

    Science.gov (United States)

    2015-01-01

    Traditionally, the Iowa Department of Transportation has used the Iowa Runoff Chart and single-variable regional-regression equations (RREs) from a U.S. Geological Survey report (published in 1987) as the primary methods to estimate annual exce...

  7. Extinction of H II regions

    International Nuclear Information System (INIS)

    Israel, F.P.; Kennicutt, R.C.

    1980-01-01

    Visual extinction of H II regions in nine nearby galaxies as derived from the ratio of the radio continuum emission to H-alpha emission is systematically larger than the visual extinction deduced from the Balmer lines alone, if one assumes a value Av/E(B-V) = 3. An optically-limited sample of about 30 extragalactic H II regions has a mean extinction of 1.7 mag in the visual, while about 1.2 mag is not seen in the reddening of the Balmer lines. Both reddening and extinction decrease with increasing galactic radius, at least for M33 and M101
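
    The method rests on the radio continuum being essentially extinction-free: it predicts the intrinsic H-alpha flux, and the observed deficit yields the extinction. A schematic rendering; the conversion factor to Av assumes a standard reddening curve and is not a value quoted from the abstract:

    ```latex
    A(\mathrm{H}\alpha) \;=\; 2.5\,\log_{10}
      \frac{F_{\mathrm{H}\alpha,\ \mathrm{predicted\ from\ radio}}}
           {F_{\mathrm{H}\alpha,\ \mathrm{observed}}},
    \qquad
    A_V \;\approx\; 1.2\, A(\mathrm{H}\alpha)
    ```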

  8. A reconciliation of extinction theories

    International Nuclear Information System (INIS)

    Sabine, T.M.

    1988-01-01

    The differences between previous theoretical treatments of extinction based on the Darwin intensity equations arise because of the different functional form chosen for the coupling constant σ. When the same function is used these theories make closely similar predictions. It is shown that a limiting condition on integrated intensity as the crystal size increases puts restrictions on the functions which may be used. A Lorentzian or Fresnellian function can be used for primary extinction while secondary extinction requires a Gaussian, rectangular or triangular function. An analytical expression is given for the variation in the value of the extinction factor with scattering angle. (orig.)

  9. Estimating the short run effects of South Africa's Employment Tax Incentive on youth employment probabilities using a difference-in-differences approach

    OpenAIRE

    Vimal Ranchhod; Arden Finn

    2014-01-01

    What effect did the introduction of the Employment Tax Incentive (ETI) have on youth employment probabilities in South Africa in the short run? The ETI came into effect on the 1st of January 2014. Its purpose is to stimulate youth employment levels and ease the challenges that many youth experience in finding their first jobs. Under the ETI, firms that employ youth are eligible to claim a deduction from their taxes due, for the portion of their wage bill that is paid to certain groups of yout...

  10. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  11. Global extinction in spiral galaxies

    NARCIS (Netherlands)

    Tully, RB; Pierce, MJ; Saunders, W; Verheijen, MAW; Witchalls, PL

    Magnitude-limited samples of spiral galaxies drawn from the Ursa Major and Pisces Clusters are used to determine their extinction properties as a function of inclination. Imaging photometry is available for 87 spirals in the B, R, I, and K' bands. Extinction causes systematic scatter in

  12. Evolution: Postponing Extinction by Polyandry

    OpenAIRE

    Wade, Michael J.

    2010-01-01

    Sex-ratio meiotic drive occurs when males produce a predominance of X-chromosome bearing sperm and an inordinate number of daughters. A driving X causes highly female-biased sex ratios and the risk of extinction. Polyandry can rescue a population from extinction.

  13. Optimising Extinction of Conditioned Disgust

    NARCIS (Netherlands)

    Bosman, Renske C.; Borg, Charmaine; de Jong, Peter J.

    2016-01-01

    Maladaptive disgust responses are tenacious and resistant to exposure-based interventions. In a similar vein, laboratory studies have shown that conditioned disgust is relatively insensitive to Conditioned Stimulus (CS)-only extinction procedures. The relatively strong resistance to extinction might

  14. Linking indices for biodiversity monitoring to extinction risk theory.

    Science.gov (United States)

    McCarthy, Michael A; Moore, Alana L; Krauss, Jochen; Morgan, John W; Clements, Christopher F

    2014-12-01

    Biodiversity indices often combine data from different species when used in monitoring programs. Heuristic properties can suggest preferred indices, but we lack objective ways to discriminate between indices with similar heuristics. Biodiversity indices can be evaluated by determining how well they reflect management objectives that a monitoring program aims to support. For example, the Convention on Biological Diversity requires reporting about extinction rates, so simple indices that reflect extinction risk would be valuable. We developed 3 biodiversity indices that are based on simple models of population viability that relate extinction risk to abundance. We based the first index on the geometric mean abundance of species and the second on a more general power mean. In a third index, we integrated the geometric mean abundance and trend. These indices require the same data as previous indices, but they also relate directly to extinction risk. Field data for butterflies and woodland plants and experimental studies of protozoan communities show that the indices correlate with local extinction rates. Applying the index based on the geometric mean to global data on changes in avian abundance suggested that the average extinction probability of birds has increased approximately 1% from 1970 to 2009. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
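
    A minimal sketch of the first two indices described above (the estimator details follow the paper only loosely, and the abundance numbers are invented):

    ```python
    import numpy as np

    def geometric_mean_index(abundances):
        """First index: geometric mean abundance across species."""
        a = np.asarray(abundances, dtype=float)
        return np.exp(np.mean(np.log(a)))

    def power_mean_index(abundances, p):
        """Second index: a general power mean (p -> 0 recovers the geometric mean)."""
        a = np.asarray(abundances, dtype=float)
        return np.mean(a ** p) ** (1.0 / p)

    year1 = [120, 80, 30, 10]   # hypothetical counts for four species
    year2 = [100, 90, 12, 4]
    g1, g2 = geometric_mean_index(year1), geometric_mean_index(year2)
    print(g2 / g1)  # a decline in the index signals rising aggregate extinction risk
    ```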

  15. Rho Ophiuchi Cloud Core Extinction Map

    Science.gov (United States)

    Gibson, D. J.; Rudolph, A.; Barsony, M.

    1997-12-01

    We present an extinction map of a one square degree region (~2.2 pc square) of the core of the star-forming region rho Ophiuchi derived by the method of star counts. Photometry from the near-infrared J, H, and K band images of Barsony et al. (1997) provided the stellar catalog for this study. From this map an estimate of the mass of the region is made and compared with previous estimates from other methods. Reference Barsony, M., Kenyon, S.J., Lada, E.A., & Teuben, P.J. 1997, ApJS, 112, 109

  16. Behavioral tagging of extinction learning.

    Science.gov (United States)

    de Carvalho Myskiw, Jociane; Benetti, Fernando; Izquierdo, Iván

    2013-01-15

    Extinction of contextual fear in rats is enhanced by exposure to a novel environment at 1-2 h before or 1 h after extinction training. This effect is antagonized by administration of protein synthesis inhibitors anisomycin and rapamycin into the hippocampus, but not into the amygdala, immediately after either novelty or extinction training, as well as by the gene expression blocker 5,6-dichloro-1-beta-D-ribofuranosylbenzimidazole administered after novelty training, but not after extinction training. Thus, this effect can be attributed to a mechanism similar to synaptic tagging, through which long-term potentiation can be enhanced by other long-term potentiations or by exposure to a novel environment in a protein synthesis-dependent fashion. Extinction learning produces a tag at the appropriate synapses, whereas novelty learning causes the synthesis of plasticity-related proteins that are captured by the tag, strengthening the synapses that generated this tag.

  17. Extinction of Harrington's mountain goat

    International Nuclear Information System (INIS)

    Mead, J.I.; Martin, P.S.; Euler, R.C.; Long, A.; Jull, A.J.T.; Toolin, L.J.; Donahue, D.J.; Linick, T.W.

    1986-01-01

    Keratinous horn sheaths of the extinct Harrington's mountain goat, Oreamnos harringtoni, were recovered at or near the surface of dry caves of the Grand Canyon, Arizona. Twenty-three separate specimens from two caves were dated nondestructively by the tandem accelerator mass spectrometer (TAMS). Both the TAMS and the conventional dates indicate that Harrington's mountain goat occupied the Grand Canyon for at least 19,000 years prior to becoming extinct by 11,160 ± 125 radiocarbon years before present. The youngest average radiocarbon dates on Shasta ground sloths, Nothrotheriops shastensis, from the region are not significantly younger than those on extinct mountain goats. Rather than sequential extinction with Harrington's mountain goat disappearing from the Grand Canyon before the ground sloths, as one might predict in view of evidence of climatic warming at the time, the losses were concurrent. Both extinctions coincide with the regional arrival of Clovis hunters

  18. Extinction of metal fires

    International Nuclear Information System (INIS)

    Mellottee, H.

    1977-01-01

    The main points of a large bibliography on liquid and solid metal fires are set out. The various methods used to fight these fires are presented; covering by powders is specially emphasized. Since this method has promising results, the various possible techniques, extinction by cooling the metal, by blanketing, by formation of a continuous insulating layer (by fusion or pyrolysis of a powder) or by a surface reaction between powder and metal are studied. The conditions of conservation and use of powders are outlined, then the various powders are described: inert powders, powders undergoing a physical transformation (fusion or vitrification of an organic compound, fusion of eutectic inorganic mixtures), multiple effect powders. Precise examples are quoted [fr]

  19. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, including subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  20. Measuring Extinction with ALE

    Science.gov (United States)

    Zimmer, Peter C.; McGraw, J. T.; Gimmestad, G. G.; Roberts, D.; Stewart, J.; Smith, J.; Fitch, J.

    2007-12-01

    ALE (Astronomical LIDAR for Extinction) is deployed at the University of New Mexico's (UNM) Campus Observatory in Albuquerque, NM. It has begun a year-long testing phase prior to deployment at McDonald Observatory in support of the CCD/Transit Instrument II (CTI-II). ALE is designed to produce a high-precision measurement of atmospheric absorption and scattering above the observatory site every ten minutes of every moderately clear night. LIDAR (LIght Detection And Ranging) is the VIS/UV/IR analog of radar, using a laser, telescope and time-gated photodetector instead of a radio transmitter, dish and receiver. In the case of ALE -- an elastic backscatter LIDAR -- 20ns-long, eye-safe laser pulses are launched 2500 times per second from a 0.32m transmitting telescope co-mounted with a 50mm short-range receiver on an alt-az mounted 0.67m long-range receiver. Photons from the laser pulse are scattered and absorbed as the pulse propagates through the atmosphere, a portion of which are scattered into the field of view of the short- and long-range receiver telescopes and detected by a photomultiplier. The properties of a given volume of atmosphere along the LIDAR path are inferred from both the altitude-resolved backscatter signal as well as the attenuation of backscatter signal from altitudes above it. We present ALE profiles from the commissioning phase and demonstrate some of the astronomically interesting atmospheric information that can be gleaned from these data, including, but not limited to, total line-of-sight extinction. This project is funded by NSF Grant 0421087.
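
    The retrieval behind such an elastic backscatter system is conventionally summarized by the single-scattering lidar equation (standard textbook form, not quoted from the abstract): the range-resolved return power P(r) depends on the backscatter coefficient beta(r) and the two-way transmission through the extinction profile alpha(r), with C a system constant. Inverting it for alpha(r) yields the total line-of-sight extinction reported every ten minutes.

    ```latex
    P(r) \;=\; \frac{C\,\beta(r)}{r^{2}}
      \exp\!\left(-2\int_{0}^{r}\alpha(r')\,\mathrm{d}r'\right),
    \qquad
    \tau_{\mathrm{LOS}} \;=\; \int_{0}^{\infty}\alpha(r')\,\mathrm{d}r'
    ```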

  1. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issue of a revised rule for loss of coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost effective, auditable, rational and practical manner. 8 figs., 2 tabs

  2. Commentary on Holmes et al. (2007): resolving the debate on when extinction risk is predictable.

    Science.gov (United States)

    Ellner, Stephen P; Holmes, Elizabeth E

    2008-08-01

    We reconcile the findings of Holmes et al. (Ecology Letters, 10, 2007, 1182) that 95% confidence intervals for quasi-extinction risk were narrow for many vertebrates of conservation concern, with previous theory predicting wide confidence intervals. We extend previous theory, concerning the precision of quasi-extinction estimates as a function of population dynamic parameters, prediction intervals and quasi-extinction thresholds, and provide an approximation that specifies the prediction interval and threshold combinations where quasi-extinction estimates are precise (vs. imprecise). This allows PVA practitioners to define the prediction interval and threshold regions of safety (low risk with high confidence), danger (high risk with high confidence), and uncertainty.

  3. Arbuscular mycorrhizal propagules in soils from a tropical forest and an abandoned cornfield in Quintana Roo, Mexico: visual comparison of most-probable-number estimates.

    Science.gov (United States)

    Ramos-Zapata, José A; Guadarrama, Patricia; Navarro-Alberto, Jorge; Orellana, Roger

    2011-02-01

    The present study was aimed at comparing the number of arbuscular mycorrhizal fungi (AMF) propagules found in soil from a mature tropical forest and that found in an abandoned cornfield in Noh-Bec Quintana Roo, Mexico, during three seasons. Agricultural practices can dramatically reduce the availability and viability of AMF propagules, and in this way delay the regeneration of tropical forests in abandoned agricultural areas. In addition, rainfall seasonality, which characterizes deciduous tropical forests, may strongly influence AMF propagule density. To compare AMF propagule numbers between sites and seasons (summer rainy, winter rainy and dry season), a "most probable number" (MPN) bioassay was conducted under greenhouse conditions employing Sorghum vulgare L. as the host plant. Results showed an average value of 3.5 ± 0.41 propagules in 50 ml of soil for the mature forest while the abandoned cornfield had 15.4 ± 5.03 propagules in 50 ml of soil. Likelihood analysis showed no statistical differences in MPN of propagules between seasons within each site, or between sites, except for the summer rainy season, for which soil from the abandoned cornfield had eight times as many propagules compared to soil from the mature forest site. Propagules of arbuscular mycorrhizal fungi remained viable throughout the sampling seasons at both sites. Abandoned areas resulting from traditional slash and burn agriculture practices involving maize did not show a lower number of AMF propagules, which should allow the establishment of mycotrophic plants, thus maintaining the AMF inoculum potential in these soils.
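
    The MPN estimate itself is a maximum-likelihood calculation over the dilution series; a generic sketch follows (tube counts and volumes are invented, and the paper's exact protocol may differ):

    ```python
    import numpy as np

    # Most-probable-number estimate by maximum likelihood. For dilution i, each
    # of n[i] tubes receives soil volume v[i]; a tube scores positive with
    # probability 1 - exp(-lam * v[i]), where lam is propagules per unit volume.
    def mpn(v, n, pos, grid=np.logspace(-3, 3, 20001)):
        v, n, pos = (np.asarray(x, dtype=float) for x in (v, n, pos))
        loglik = [np.sum(pos * np.log1p(-np.exp(-lam * v)) - (n - pos) * lam * v)
                  for lam in grid]
        return grid[int(np.argmax(loglik))]

    v = [1.0, 0.1, 0.01]   # volumes, in units of 50 ml of soil
    n = [5, 5, 5]          # tubes per dilution
    pos = [5, 3, 1]        # positive (colonized) tubes
    print(mpn(v, n, pos))  # MPN of propagules per 50 ml
    ```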

  4. Taylor-series and Monte-Carlo-method uncertainty estimation of the width of a probability distribution based on varying bias and random error

    International Nuclear Information System (INIS)

    Wilson, Brandon M; Smith, Barton L

    2013-01-01

    Uncertainties are typically assumed to be constant or a linear function of the measured value; however, this is generally not true. Particle image velocimetry (PIV) is one example of a measurement technique that has highly nonlinear, time varying local uncertainties. Traditional uncertainty methods are not adequate for the estimation of the uncertainty of measurement statistics (mean and variance) in the presence of nonlinear, time varying errors. Propagation of instantaneous uncertainty estimates into measured statistics is performed allowing accurate uncertainty quantification of time-mean and statistics of measurements such as PIV. It is shown that random errors will always elevate the measured variance, and thus turbulent statistics such as u'u'-bar. Within this paper, nonlinear, time varying errors are propagated from instantaneous measurements into the measured mean and variance using the Taylor-series method. With these results and knowledge of the systematic and random uncertainty of each measurement, the uncertainty of the time-mean, the variance and covariance can be found. Applicability of the Taylor-series uncertainty equations to time varying systematic and random errors and asymmetric error distributions are demonstrated with Monte-Carlo simulations. The Taylor-series uncertainty estimates are always accurate for uncertainties on the mean quantity. The Taylor-series variance uncertainty is similar to the Monte-Carlo results for cases in which asymmetric random errors exist or the magnitude of the instantaneous variations in the random and systematic errors is near the ‘true’ variance. However, the Taylor-series method overpredicts the uncertainty in the variance as the instantaneous variations of systematic errors are large or are on the same order of magnitude as the ‘true’ variance. (paper)
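
    A compact Monte-Carlo illustration of the central point, that random error inflates the measured variance by roughly its own square (the signal and error magnitudes below are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    true = np.sin(np.linspace(0.0, 20.0, 2000))   # "true" instantaneous signal
    sigma_rand, bias = 0.20, 0.05                 # random error SD, systematic bias

    def one_experiment():
        # one simulated measurement record with bias and random error
        measured = true + bias + rng.normal(0.0, sigma_rand, true.size)
        return measured.mean(), measured.var(ddof=1)

    means, variances = zip(*(one_experiment() for _ in range(5000)))
    print("mean:", np.mean(means), "+/-", np.std(means))
    print("variance:", np.mean(variances), "vs true", true.var(ddof=1),
          "(elevated by about sigma_rand**2 =", sigma_rand**2, ")")
    ```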

  5. Fission-fragment mass distribution and estimation of the cluster emission probability in the γ + 232Th and 181Ta reactions

    International Nuclear Information System (INIS)

    Karamyan, S.A.; Adam, J.; Belov, A.G.; Chaloun, P.; Norseev, Yu.V.; Stegajlov, V.I.

    1997-01-01

    Fission-fragment mass distribution has been measured by the cumulative yields of radionuclides detected in the 232Th(γ,f) reaction at Bremsstrahlung endpoint energies of 12 and 24 MeV. Yield upper limits have been estimated for the light nuclei 24Na, 28Mg, 38S, etc., for the Th and Ta targets exposed to the 24 MeV Bremsstrahlung. The results are discussed in terms of multimodal fission phenomena and cluster emission from a deformed fissioning system or from a compound nucleus

  6. The silent mass extinction of insect herbivores in biodiversity hotspots.

    Science.gov (United States)

    Fonseca, Carlos Roberto

    2009-12-01

    Habitat loss is silently leading numerous insects to extinction. Conservation efforts, however, have not been designed specifically to protect these organisms, despite their ecological and evolutionary significance. On the basis of species-host area equations, parameterized with data from the literature and interviews with botanical experts, I estimated the number of specialized plant-feeding insects (i.e., monophages) that live in 34 biodiversity hotspots and the number committed to extinction because of habitat loss. I estimated that 795,971-1,602,423 monophagous insect species live in biodiversity hotspots on 150,371 endemic plant species, which is 5.3-10.6 monophages per plant species. I calculated that 213,830-547,500 monophagous species are committed to extinction in biodiversity hotspots because of reduction of the geographic range size of their endemic hosts. I provided rankings of biodiversity hotspots on the basis of estimated richness of monophagous insects and on estimated number of extinctions of monophagous species. Extinction rates were predicted to be higher in biodiversity hotspots located along strong environmental gradients and on archipelagos, where high spatial turnover of monophagous species along the geographic distribution of their endemic plants is likely. The results strongly support the overall strategy of selecting priority conservation areas worldwide primarily on the basis of richness of endemic plants. To face the global decline of insect herbivores, one must expand the coverage of the network of protected areas and improve the richness of native plants on private lands.

  7. Towards the prediction of pre-mining stresses in the European continent. [Estimates of vertical and probable maximum lateral stress in Europe]

    Energy Technology Data Exchange (ETDEWEB)

    Blackwood, R. L.

    1980-05-15

    There are now available sufficient data from in-situ, pre-mining stress measurements to allow a first attempt at predicting the maximum stress magnitudes likely to occur in a given mining context. The sub-horizontal (lateral) stress generally dominates the stress field, becoming critical to stope stability in many cases. For cut-and-fill mining in particular, where developed fill pressures are influenced by lateral displacement of pillars or stope backs, extraction maximization planning by mathematical modelling techniques demands the best available estimate of pre-mining stresses. While field measurements are still essential for this purpose, in the present paper it is suggested that the worst stress case can be predicted for preliminary design or feasibility study purposes. In the European continent the vertical component of pre-mining stress may be estimated by adding 2 MPa to the pressure due to overburden weight. The maximum lateral stress likely to be encountered is about 57 MPa at depths of some 800 m to 1000 m below the surface.

  8. Mass extinction efficiency and extinction hygroscopicity of ambient PM2.5 in urban China.

    Science.gov (United States)

    Cheng, Zhen; Ma, Xin; He, Yujie; Jiang, Jingkun; Wang, Xiaoliang; Wang, Yungang; Sheng, Li; Hu, Jiangkai; Yan, Naiqiang

    2017-07-01

    The ambient PM2.5 pollution problem in China has drawn substantial international attention. The mass extinction efficiency (MEE) and hygroscopicity factor (f(RH)) of PM2.5 can be readily applied to study impacts on atmospheric visibility and climate. The few previous investigations in China reported results only from pilot studies and lack spatial representativeness. In this study, hourly average ambient PM2.5 mass concentration, relative humidity, and atmospheric visibility data from China's national air quality and meteorological monitoring networks were retrieved and analyzed, covering 24 major Chinese cities from nine city-clusters over the period October 2013 to September 2014. The annual average extinction coefficient in urban China was 759.3±258.3 Mm^-1, caused mainly by dry PM2.5 (305.8±131.0 Mm^-1) and its hygroscopicity (414.6±188.1 Mm^-1). High extinction coefficient values resulted from both high ambient PM2.5 concentration (68.5±21.7 µg/m^3) and high relative humidity (69.7±8.6%). The PM2.5 mass extinction efficiency varied from 2.87 to 6.64 m^2/g with an average of 4.40±0.84 m^2/g. The average extinction hygroscopic factor f(RH=80%) was 2.63±0.45. The levels of PM2.5 mass extinction efficiency and hygroscopic factor in China were comparable with those found in developed countries, in spite of the significant diversity among the 24 cities. Our findings help to establish a quantitative relationship between the ambient extinction coefficient (visual range) and PM2.5 and relative humidity, and will reduce the uncertainty of extinction coefficient estimates of ambient PM2.5 in urban China, which is essential for research on haze pollution and climate radiative forcing. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Estimating the probability of polyreactive antibodies 4E10 and 2F5 disabling a gp41 trimer after T cell-HIV adhesion.

    Directory of Open Access Journals (Sweden)

    Bin Hu

    2014-01-01

    A few broadly neutralizing antibodies, isolated from HIV-1 infected individuals, recognize epitopes in the membrane proximal external region (MPER) of gp41 that are transiently exposed during viral entry. The best characterized, 4E10 and 2F5, are polyreactive, binding to the viral membrane and their epitopes in the MPER. We present a model to calculate, for any antibody concentration, the probability that during the pre-hairpin intermediate, the transient period when the epitopes are first exposed, a bound antibody will disable a trivalent gp41 before fusion is complete. When 4E10 or 2F5 bind to the MPER, a conformational change is induced that results in a stably bound complex. The model predicts that for these antibodies to be effective at neutralization, the time to disable an epitope must be shorter than the time the antibody remains bound in this conformation, about five minutes or less for 4E10 and 2F5. We investigate the role of avidity in neutralization and show that 2F5 IgG, but not 4E10, is much more effective at neutralization than its Fab fragment. We attribute this to 2F5 interacting more stably than 4E10 with the viral membrane. We use the model to elucidate the parameters that determine the ability of these antibodies to disable epitopes and propose an extension of the model to analyze neutralization data. The extended model predicts the dependencies of IC50 for neutralization on the rate constants that characterize antibody binding, the rate of fusion of gp41, and the number of gp41 bridging the virus and target cell at the start of the pre-hairpin intermediate. Analysis of neutralization experiments indicate that only a small number of gp41 bridges must be disabled to prevent fusion. However, the model cannot determine the exact number from neutralization experiments alone.

  10. Variation of normal tissue complication probability (NTCP) estimates of radiation-induced hypothyroidism in relation to changes in delineation of the thyroid gland

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Laugaard Lorenzen, Ebbe

    2015-01-01

    volume, Dmean and estimated risk of HT. Bland-Altman plots were used for assessment of the systematic (mean) and random [standard deviation (SD)] variability of the three parameters, and a method for displaying the spatial variation in delineation differences was developed. Results. Intra-observer variability resulted in a mean difference in thyroid volume and Dmean of 0.4 cm³ (SD ± 1.6) and -0.5 Gy (SD ± 1.0), respectively, and 0.3 cm³ (SD ± 1.8) and 0.0 Gy (SD ± 1.3) for inter-observer variability. The corresponding mean differences of NTCP values for radiation-induced HT due to intra- and inter...

  11. Accelerated modern human-induced species losses: Entering the sixth mass extinction.

    Science.gov (United States)

    Ceballos, Gerardo; Ehrlich, Paul R; Barnosky, Anthony D; García, Andrés; Pringle, Robert M; Palmer, Todd M

    2015-06-01

    The oft-repeated claim that Earth's biota is entering a sixth "mass extinction" depends on clearly demonstrating that current extinction rates are far above the "background" rates prevailing between the five previous mass extinctions. Earlier estimates of extinction rates have been criticized for using assumptions that might overestimate the severity of the extinction crisis. We assess, using extremely conservative assumptions, whether human activities are causing a mass extinction. First, we use a recent estimate of a background rate of 2 mammal extinctions per 10,000 species per 100 years (that is, 2 E/MSY), which is twice as high as widely used previous estimates. We then compare this rate with the current rate of mammal and vertebrate extinctions. The latter is conservatively low because listing a species as extinct requires meeting stringent criteria. Even under our assumptions, which would tend to minimize evidence of an incipient mass extinction, the average rate of vertebrate species loss over the last century is up to 100 times higher than the background rate. Under the 2 E/MSY background rate, the number of species that have gone extinct in the last century would have taken, depending on the vertebrate taxon, between 800 and 10,000 years to disappear. These estimates reveal an exceptionally rapid loss of biodiversity over the last few centuries, indicating that a sixth mass extinction is already under way. Averting a dramatic decay of biodiversity and the subsequent loss of ecosystem services is still possible through intensified conservation efforts, but that window of opportunity is rapidly closing.
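
    The rate comparison is simple arithmetic; the sketch below uses illustrative placeholder counts, not the paper's data:

    ```python
    # Back-of-envelope version of the background-rate comparison.
    background = 2.0   # E/MSY: extinctions per 10,000 species per 100 years
    species = 40000    # assumed number of vertebrate species assessed
    observed = 80      # assumed extinctions recorded over the last century

    expected_per_century = background * species / 10000
    print(expected_per_century)             # 8 expected under the background rate
    print(observed / expected_per_century)  # 10x in this toy; the paper reports up to 100x
    print(100 * observed / expected_per_century,
          "years for the background rate to produce the observed losses")
    ```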

  12. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  13. Enhancement of large fluctuations to extinction in adaptive networks

    Science.gov (United States)

    Hindes, Jason; Schwartz, Ira B.; Shaw, Leah B.

    2018-01-01

    During an epidemic, individual nodes in a network may adapt their connections to reduce the chance of infection. A common form of adaptation is avoidance rewiring, where a noninfected node breaks a connection to an infected neighbor and forms a new connection to another noninfected node. Here we explore the effects of such adaptivity on stochastic fluctuations in the susceptible-infected-susceptible model, focusing on the largest fluctuations that result in extinction of infection. Using techniques from large-deviation theory, combined with a measurement of heterogeneity in the susceptible degree distribution at the endemic state, we are able to predict and analyze large fluctuations and extinction in adaptive networks. We find that in the limit of small rewiring there is a sharp exponential reduction in mean extinction times compared to the case of zero adaptation. Furthermore, we find an exponential enhancement in the probability of large fluctuations with increased rewiring rate, even when holding the average number of infected nodes constant.
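
    A bare-bones discrete-time SIS simulation with avoidance rewiring illustrates the mechanism (this is not the paper's large-deviation analysis; all parameters are invented and deliberately near the epidemic threshold so that extinctions occur within the simulated horizon):

    ```python
    import random

    def extinction_time(n=100, k=4, beta=0.03, gamma=0.1, w=0.0, tmax=20000):
        """Steps until infection dies out on a random graph with rewiring rate w."""
        nbrs = {i: set() for i in range(n)}
        while sum(len(s) for s in nbrs.values()) < n * k:   # random graph, mean degree k
            a, b = random.sample(range(n), 2)
            nbrs[a].add(b); nbrs[b].add(a)
        infected = set(random.sample(range(n), n // 10))
        for t in range(tmax):
            if not infected:
                return t                                    # epidemic went extinct
            nxt = set(infected)
            for i in infected:
                for j in list(nbrs[i]):
                    if j in infected:
                        continue
                    if random.random() < beta:              # transmission
                        nxt.add(j)
                    elif random.random() < w:               # susceptible j rewires away
                        nbrs[i].discard(j); nbrs[j].discard(i)
                        s = random.randrange(n)
                        if s not in infected and s != j:
                            nbrs[j].add(s); nbrs[s].add(j)
                if random.random() < gamma:                 # recovery
                    nxt.discard(i)
            infected = nxt
        return tmax                                         # censored at tmax

    random.seed(0)
    for w in (0.0, 0.02, 0.1):
        print(w, sum(extinction_time(w=w) for _ in range(10)) / 10)
    ```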

  14. Kinetic Analysis of Isothermal Decomposition Process of Sodium Bicarbonate Using the Weibull Probability Function—Estimation of Density Distribution Functions of the Apparent Activation Energies

    Science.gov (United States)

    Janković, Bojan

    2009-10-01

    The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry in isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (Ea) value was approximately constant (Ea,int = 95.2 kJ mol⁻¹ and Ea,diff = 96.6 kJ mol⁻¹, respectively). The values of Ea calculated by both isoconversional methods are in good agreement with the value of Ea evaluated from the Arrhenius equation (94.3 kJ mol⁻¹), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used for estimation of the kinetic model of the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1-α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfEa's) are independent of the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with corresponding results for the nonisothermal decomposition process of NaHCO3.
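
    A short fit sketch using the Weibull conversion form named above, alpha(t) = 1 - exp(-(t/eta)^beta); the data are synthetic stand-ins for the thermogravimetric curves:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull(t, beta, eta):
        # isothermal conversion as a function of time
        return 1.0 - np.exp(-((t / eta) ** beta))

    t = np.linspace(1.0, 300.0, 60)   # time, arbitrary units
    alpha = weibull(t, 1.07, 80.0) + np.random.default_rng(1).normal(0.0, 0.01, t.size)

    (beta_hat, eta_hat), _ = curve_fit(weibull, t, alpha, p0=(1.0, 50.0))
    print(beta_hat, eta_hat)  # should recover beta close to the paper's 1.07

    # eta carries the temperature dependence: an Arrhenius plot of ln(1/eta)
    # against 1/T yields the apparent activation energy E_a.
    ```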

  15. Reconsidering the extinction of ichthyosaurs

    OpenAIRE

    Fischer, Valentin

    2010-01-01

    Despite their extreme adaptation to life in open sea, ichthyosaurs were one of the first major groups of post-Triassic marine reptiles to disappear, at the end of Cenomanian, whereas plesiosaurs, mosasaurs and numerous families of marine crocodiles and sea turtles disappeared during the Cretaceous/Paleocene Extinction Event. It has been proposed that unique biological factors drove ichthyosaurs to extinction, namely a break in the food chain at the level of belemnites or a progressive ecologi...

  16. The Sixth Great Mass Extinction

    Science.gov (United States)

    Wagler, Ron

    2012-01-01

    Five past great mass extinctions have occurred during Earth's history. Humanity is currently in the midst of a sixth, human-induced great mass extinction of plant and animal life (e.g., Alroy 2008; Jackson 2008; Lewis 2006; McDaniel and Borton 2002; Rockstrom et al. 2009; Rohr et al. 2008; Steffen, Crutzen, and McNeill 2007; Thomas et al. 2004;…

  17. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  18. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  19. Rates of movement of threatened bird species between IUCN red list categories and toward extinction.

    Science.gov (United States)

    Brooke, M de L; Butchart, S H M; Garnett, S T; Crowley, G M; Mantilla-Beniers, N B; Stattersfield, A J

    2008-04-01

    In recent centuries bird species have been deteriorating in status and becoming extinct at a rate that may be 2-3 orders of magnitude higher than in prehuman times. We examined extinction rates of bird species designated critically endangered in 1994 and the rate at which species have moved through the IUCN (World Conservation Union) Red List categories of extinction risk globally for the period 1988-2004 and regionally in Australia from 1750 to 2000. For Australia we drew on historical accounts of the extent and condition of species habitats, spread of invasive species, and changes in sighting frequencies. These data sets permitted comparison of observed rates of movement through the IUCN Red List categories with novel predictions based on the IUCN Red List criterion E, which relates to explicit extinction probabilities determined, for example, by population viability analysis. The comparison also tested whether species listed on the basis of other criteria face a similar probability of moving to a higher threat category as those listed under criterion E. For the rate at which species moved from vulnerable to endangered, there was a good match between observations and predictions, both worldwide and in Australia. Nevertheless, species have become extinct at a rate that, although historically high, is 2 (Australia) to 10 (globally) times lower than predicted. Although the extinction probability associated with the critically endangered category may be too high, the shortfall in realized extinctions can also be attributed to the beneficial impact of conservation intervention. These efforts may have reduced the number of global extinctions from 19 to 3 and substantially slowed the extinction trajectory of 33 additional critically endangered species. Our results suggest that current conservation action benefits species on the brink of extinction, but is less targeted at or has less effect on moderately threatened species.
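
    Movement through the categories can be framed as an absorbing Markov chain, which is essentially how criterion-E-style predictions are compared with observed transitions; the annual transition probabilities below are invented for illustration:

    ```python
    import numpy as np

    cats = ["VU", "EN", "CR", "EX"]
    P = np.array([
        [0.97, 0.03, 0.00, 0.00],   # Vulnerable
        [0.00, 0.95, 0.05, 0.00],   # Endangered
        [0.00, 0.00, 0.93, 0.07],   # Critically Endangered
        [0.00, 0.00, 0.00, 1.00],   # Extinct (absorbing)
    ])

    state = np.array([1.0, 0.0, 0.0, 0.0])   # a species listed as Vulnerable in 1988
    for _ in range(16):                      # 1988-2004, one step per year
        state = state @ P
    print(dict(zip(cats, np.round(state, 3))))   # category probabilities by 2004
    ```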

  20. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  1. Neutron time-of-flight techniques for investigation of the extinction effect

    International Nuclear Information System (INIS)

    Niimura, N.; Tomiyoshi, S.; Takahashi, J.; Harada, J.

    1975-01-01

    An application of the time-of-flight neutron diffraction technique to an investigation of the nature of the extinction effect in a single-crystal specimen is given. It is shown that the wavelength dependence of the extinction can be easily obtained by changing the scattering angle. An estimation of the extinction factor for a CuCl single crystal is given as an example and a comparison of the results with recent extinction theory [Becker and Coppens. Acta Cryst.(1974). A30, 129-147; 148-153] is made. (Auth.)

  2. Direct numerical simulations of non-premixed ethylene-air flames: Local flame extinction criterion

    KAUST Repository

    Lecoustre, Vivien R.

    2014-11-01

    Direct Numerical Simulations (DNS) of ethylene/air diffusion flame extinctions in decaying two-dimensional turbulence were performed. A Damköhler-number-based flame extinction criterion as provided by classical large activation energy asymptotic (AEA) theory is assessed for its validity in predicting flame extinction and compared to one based on Chemical Explosive Mode Analysis (CEMA) of the detailed chemistry. The DNS code solves compressible flow conservation equations using high order finite difference and explicit time integration schemes. The ethylene/air chemistry is simulated with a reduced mechanism that is generated based on the directed relation graph (DRG) based methods along with stiffness removal. The numerical configuration is an ethylene fuel strip embedded in ambient air and exposed to a prescribed decaying turbulent flow field. The emphasis of this study is on the several flame extinction events observed in contrived parametric simulations. A modified viscosity and changing pressure (MVCP) scheme was adopted in order to artificially manipulate the probability of flame extinction. Using MVCP, pressure was changed from the baseline case of 1 atm to 0.1 and 10 atm. In the high pressure MVCP case, the simulated flame is extinction-free, whereas in the low pressure MVCP case, the simulated flame features frequent extinction events and is close to global extinction. Results show that, despite its relative simplicity and provided that the global flame activation temperature is correctly calibrated, the AEA-based flame extinction criterion can accurately predict the simulated flame extinction events. It is also found that the AEA-based criterion provides predictions of flame extinction that are consistent with those provided by a CEMA-based criterion. This study supports the validity of a simple Damköhler-number-based criterion to predict flame extinction in engineering-level CFD models. © 2014 The Combustion Institute.
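
    Schematically, the AEA-style criterion reduces to a threshold on a Damköhler number comparing a flow (mixing) time to a chemical time, with the calibrated activation temperature Ta entering through the Arrhenius dependence of the chemical time (generic form, not the paper's exact expression):

    ```latex
    \mathrm{Da} \;=\; \frac{\tau_{\mathrm{flow}}}{\tau_{\mathrm{chem}}},
    \qquad
    \tau_{\mathrm{chem}} \;\propto\; \exp\!\left(\frac{T_a}{T}\right),
    \qquad
    \text{local extinction predicted when } \mathrm{Da} < \mathrm{Da}_{\mathrm{crit}}
    ```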

  3. The celestial factor and the formula to explain or predict all extinctions of the fossil record

    NARCIS (Netherlands)

    Elewa, A.M.T.

    2012-01-01

    In reality there are various kinds of explanations for each type of extinction. This paper introduces a new theory to explain and to estimate the size and frequency of all extinctions over the entire period of 600 my of the fossil record. The central point was the search for a common pattern and

  4. Climate change not to blame for late Quaternary megafauna extinctions in Australia.

    Science.gov (United States)

    Saltré, Frédérik; Rodríguez-Rey, Marta; Brook, Barry W; Johnson, Christopher N; Turney, Chris S M; Alroy, John; Cooper, Alan; Beeton, Nicholas; Bird, Michael I; Fordham, Damien A; Gillespie, Richard; Herrando-Pérez, Salvador; Jacobs, Zenobia; Miller, Gifford H; Nogués-Bravo, David; Prideaux, Gavin J; Roberts, Richard G; Bradshaw, Corey J A

    2016-01-29

    Late Quaternary megafauna extinctions impoverished mammalian diversity worldwide. The causes of these extinctions in Australia are most controversial but essential to resolve, because this continent-wide event presaged similar losses that occurred thousands of years later on other continents. Here we apply a rigorous metadata analysis and new ensemble-hindcasting approach to 659 Australian megafauna fossil ages. When coupled with analysis of several high-resolution climate records, we show that megafaunal extinctions were broadly synchronous among genera and independent of climate aridity and variability in Australia over the last 120,000 years. Our results reject climate change as the primary driver of megafauna extinctions in the world's most controversial context, and instead estimate that the megafauna disappeared Australia-wide ∼13,500 years after human arrival, with shorter periods of coexistence in some regions. This is the first comprehensive approach to incorporate uncertainty in fossil ages, extinction timing and climatology, to quantify mechanisms of prehistorical extinctions.

  5. Accelerated modern human–induced species losses: Entering the sixth mass extinction

    Science.gov (United States)

    Ceballos, Gerardo; Ehrlich, Paul R.; Barnosky, Anthony D.; García, Andrés; Pringle, Robert M.; Palmer, Todd M.

    2015-01-01

    The oft-repeated claim that Earth’s biota is entering a sixth “mass extinction” depends on clearly demonstrating that current extinction rates are far above the “background” rates prevailing between the five previous mass extinctions. Earlier estimates of extinction rates have been criticized for using assumptions that might overestimate the severity of the extinction crisis. We assess, using extremely conservative assumptions, whether human activities are causing a mass extinction. First, we use a recent estimate of a background rate of 2 mammal extinctions per 10,000 species per 100 years (that is, 2 E/MSY), which is twice as high as widely used previous estimates. We then compare this rate with the current rate of mammal and vertebrate extinctions. The latter is conservatively low because listing a species as extinct requires meeting stringent criteria. Even under our assumptions, which would tend to minimize evidence of an incipient mass extinction, the average rate of vertebrate species loss over the last century is up to 100 times higher than the background rate. Under the 2 E/MSY background rate, the number of species that have gone extinct in the last century would have taken, depending on the vertebrate taxon, between 800 and 10,000 years to disappear. These estimates reveal an exceptionally rapid loss of biodiversity over the last few centuries, indicating that a sixth mass extinction is already under way. Averting a dramatic decay of biodiversity and the subsequent loss of ecosystem services is still possible through intensified conservation efforts, but that window of opportunity is rapidly closing. PMID:26601195

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. A scenario for impacts of water availability loss due to climate change on riverine fish extinction rates

    OpenAIRE

    Tedesco, Pablo; Oberdorff, Thierry; Cornu, Jean-François; Beauchard, O.; Brosse, S.; Durr, H. H.; Grenouillet, G.; Leprieur, F.; Tisseuil, Clément; Zaiss, Rainer; Hugueny, Bernard

    2013-01-01

    1. Current models estimating the impact of habitat loss on biodiversity in the face of global climate change usually project only percentages of species 'committed to extinction' on an uncertain time-scale. Here, we show that this limitation can be overcome using an empirically derived 'background extinction rate-area' curve to estimate natural rates and project future rates of freshwater fish extinction following variations in river drainage area resulting from global climate change. 2. Based on fu...

  8. The extinction of the dinosaurs.

    Science.gov (United States)

    Brusatte, Stephen L; Butler, Richard J; Barrett, Paul M; Carrano, Matthew T; Evans, David C; Lloyd, Graeme T; Mannion, Philip D; Norell, Mark A; Peppe, Daniel J; Upchurch, Paul; Williamson, Thomas E

    2015-05-01

    Non-avian dinosaurs went extinct 66 million years ago, geologically coincident with the impact of a large bolide (comet or asteroid) during an interval of massive volcanic eruptions and changes in temperature and sea level. There has long been fervent debate about how these events affected dinosaurs. We review a wealth of new data accumulated over the past two decades, provide updated and novel analyses of long-term dinosaur diversity trends during the latest Cretaceous, and discuss an emerging consensus on the extinction's tempo and causes. Little support exists for a global, long-term decline across non-avian dinosaur diversity prior to their extinction at the end of the Cretaceous. However, restructuring of latest Cretaceous dinosaur faunas in North America led to reduced diversity of large-bodied herbivores, perhaps making communities more susceptible to cascading extinctions. The abruptness of the dinosaur extinction suggests a key role for the bolide impact, although the coarseness of the fossil record makes testing the effects of Deccan volcanism difficult. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  9. Promotion of cooperation by selective group extinction

    Science.gov (United States)

    Böttcher, Marvin A.; Nagler, Jan

    2016-06-01

    Multilevel selection is an important organizing principle that crucially underlies evolutionary processes from the emergence of cells to eusociality and the economics of nations. Previous studies on multilevel selection assumed that the effective higher-level selection emerges from lower-level reproduction. This leads to selection among groups, although only individuals reproduce. We introduce selective group extinction, where groups die with a probability inversely proportional to their group fitness. When accounting for this, the critical benefit-to-cost ratio is substantially lowered. Because in game theory and evolutionary dynamics the degree of cooperation crucially depends on this ratio above which cooperation emerges, previous studies may have substantially underestimated the establishment and maintenance of cooperation.
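
    One generation of the selective-group-extinction step can be sketched as follows (the fitness values and the overall extinction-pressure constant c are invented):

    ```python
    import random

    def group_extinction_step(fitness, c=0.4):
        """Each group dies with probability proportional to 1/fitness (capped at c);
        emptied slots are refounded by copies of surviving groups."""
        inv = [1.0 / f for f in fitness]
        probs = [c * x / max(inv) for x in inv]   # inversely proportional to fitness
        survivors = [f for f, p in zip(fitness, probs) if random.random() > p]
        if not survivors:                         # guard the (unlikely) total wipe-out
            survivors = [max(fitness)]
        while len(survivors) < len(fitness):      # refound empty group slots
            survivors.append(random.choice(survivors))
        return survivors

    random.seed(1)
    groups = [0.5, 0.9, 1.0, 1.4, 2.0]
    for _ in range(3):
        groups = group_extinction_step(groups)
        print(groups)  # low-fitness groups tend to be weeded out
    ```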

  10. Molar extinction coefficients of solutions of some organic compounds

    Indian Academy of Sciences (India)

    (C4H8O2), succinimide (C4H5NO2) as estimated from the measured absorbance of 7 radiations in their ... species in the solution and ε is called the molar absorptivity or extinction coefficient (l mol⁻¹ cm⁻¹ or ... Integration of eq. (4) leads to ...
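
    The elided integration is presumably the standard Beer-Lambert step; a hedged reconstruction (the snippet's own eq. (4) is not shown, so the notation below is assumed):

    ```latex
    -\frac{\mathrm{d}I}{I} \;=\; k\,c\,\mathrm{d}x
    \;\Longrightarrow\;
    \ln\frac{I_0}{I} \;=\; k\,c\,l
    \;\Longrightarrow\;
    A \;\equiv\; \log_{10}\frac{I_0}{I} \;=\; \varepsilon\,c\,l,
    \qquad \varepsilon = \frac{k}{\ln 10}
    ```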

  11. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  12. Local fish extinction in a small tropical lake in Brazil

    Directory of Open Access Journals (Sweden)

    Paulo dos Santos Pompeu

    Lagoa Santa is a shallow permanent lake, located in Belo Horizonte metropolitan region, Brazil. In this study, the loss in fish diversity of the lake over the past 150 years is evaluated. Local extinction of almost 70% of the original fish fauna is described. Probably, the main causes of this richness loss were: obstruction of natural communication with rio das Velhas, non-native species introduction, change in the water level, organic pollution, and elimination of littoral and submerged vegetation.

  13. The atmospheric extinction of light

    International Nuclear Information System (INIS)

    Hughes, Stephen W; Powell, Sean; Carroll, Joshua; Cowley, Michael

    2016-01-01

    An experiment is described that enables students to understand the properties of atmospheric extinction due to Rayleigh scattering. The experiment requires the use of red, green and blue lasers attached to a travelling microscope or similar device. The laser beams are passed through an artificial atmosphere, made from milky water, at varying depths, before impinging on either a light meter or a photodiode integral to a Picotech Dr. DAQ ADC. A plot of measured spectral intensity versus depth reveals the contribution Rayleigh scattering makes to the extinction coefficient. For the experiment with the light meter, the extinction coefficients for red, green and blue light in the milky sample of water were 0.27, 0.36 and 0.47 cm⁻¹ respectively, and 0.032, 0.037 and 0.092 cm⁻¹ for the Picotech Dr. DAQ ADC. (paper)
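
    Recovering the extinction coefficient from such readings is a log-linear fit to the Beer-Lambert decay; the synthetic data below mimic the red-laser channel (k near the 0.27 cm⁻¹ reported above):

    ```python
    import numpy as np

    # I(d) = I0 * exp(-k d); fit k from log-intensity versus depth.
    rng = np.random.default_rng(3)
    depth = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])   # cm
    intensity = 100.0 * np.exp(-0.27 * depth) * (1.0 + rng.normal(0.0, 0.02, depth.size))

    slope, intercept = np.polyfit(depth, np.log(intensity), 1)
    print("extinction coefficient k =", -slope, "per cm")   # ~0.27
    ```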

  14. Infectious Disease, Endangerment, and Extinction

    Science.gov (United States)

    MacPhee, Ross D. E.; Greenwood, Alex D.

    2013-01-01

    Infectious disease, especially virulent infectious disease, is commonly regarded as a cause of fluctuation or decline in biological populations. However, it is not generally considered as a primary factor in causing the actual endangerment or extinction of species. We review here the known historical examples in which disease has, or has been assumed to have had, a major deleterious impact on animal species, including extinction, and highlight some recent cases in which disease is the chief suspect in causing the outright endangerment of particular species. We conclude that the role of disease in historical extinctions at the population or species level may have been underestimated. Recent methodological breakthroughs may lead to a better understanding of the past and present roles of infectious disease in influencing population fitness and other parameters. PMID:23401844

  15. Both population size and patch quality affect local extinctions and colonizations.

    Science.gov (United States)

    Franzén, Markus; Nilsson, Sven G

    2010-01-07

    Currently, the habitat of many species is fragmented, resulting in small local populations with individuals occasionally dispersing between the remaining habitat patches. In a solitary bee metapopulation, extinction probability was related to both local bee population sizes and pollen resources measured as host plant population size. Patch size, on the other hand, had no additional predictive power. The turnover rate of local bee populations in 63 habitat patches over 4 years was high, with 72 extinction events and 31 colonization events, but the pollen plant population was stable with no extinctions or colonizations. Both pollen resources and bee populations had strong and independent effects on extinction probability, but connectivity was not of importance. Colonizations occurred more frequently within larger host plant populations. For metapopulation survival of the bee, large pollen plant populations are essential, independent of current bee population size.

  16. Chapter 4. Susceptibility of sharks, rays and chimaeras to global extinction.

    Science.gov (United States)

    Field, Iain C; Meekan, Mark G; Buckworth, Rik C; Bradshaw, Corey J A

    2009-01-01

    marine teleosts to test explicitly whether the former group is intrinsically more susceptible to extinction than fishes in general. Around 52% of chondrichthyans have been Red-Listed compared to only 8% of all marine teleosts; however, listed teleosts were in general placed more frequently into the higher-risk categories relative to chondrichthyans. IUCN threat risk in both taxa was positively correlated with body size and negatively correlated albeit weakly, with geographic range size. Even after accounting for the positive influence of size, Red-Listed teleosts were still more likely than chondrichthyans to be classified as threatened. We suggest that while sharks might not have necessarily experienced the same magnitude of deterministic decline as Red-Listed teleosts, their larger size and lower fecundity (not included in the analysis) predispose chondrichthyans to a higher risk of extinction overall. Removal of these large predators can elicit trophic cascades and destabilise the relative abundance of smaller species. Predator depletions can lead to permanent shifts in marine communities and alternate equilibrium states. Climate change might influence the phenology and physiology of some species, with the most probable response being changes in the timing of migrations and shifts in distribution. The synergistic effects among harvesting, habitat changes and climate-induced forcings are greatest for coastal chondrichthyans with specific habitat requirements and these are currently the most likely candidates for extinction. Management of shark populations must take into account the rate at which drivers of decline affect specific species. Only through the detailed collection of data describing demographic rates, habitat affinities, trophic linkages and geographic ranges, and how environmental stressors modify these, can extinction risk be more precisely estimated and reduced. The estimation of minimum viable population sizes, below which rapid extinction is more likely

  17. Extinction of planetary nebulae and the turbulent structure of the galaxy

    International Nuclear Information System (INIS)

    Lerche, I.; Milne, D.K.

    1980-01-01

    Fluctuations in the extinction of planetary nebulae provide strong support for the concept of a turbulent interstellar medium. We have analyzed theoretically the mean extinction and its variance as a function of height, z, above the galactic plane. The mean increases monotonically, and exponentially, to a saturation level. The variance increases as z² for small z and has damped oscillations for intermediate z, before levelling off at large z. The observed mean extinction and the observed variance are found to be in excellent agreement with these theoretical deductions. The spatial scale of the mean extinction is estimated to be 100 pc; the oscillation scale of the variance and the damping scale of the oscillations are estimated to be about 200 ± 100 pc. The rms level of density fluctuations in the absorbing material causing the extinction is about equal to the mean value

  18. Does red noise increase or decrease extinction risk? Single extreme events versus series of unfavorable conditions.

    Science.gov (United States)

    Schwager, Monika; Johst, Karin; Jeltsch, Florian

    2006-06-01

    Recent theoretical studies have shown contrasting effects of temporal correlation of environmental fluctuations (red noise) on the risk of population extinction. It is still debated whether and under which conditions red noise increases or decreases extinction risk compared with uncorrelated (white) noise. Here, we explain the opposing effects by introducing two features of red noise time series. On the one hand, positive autocorrelation increases the probability of series of poor environmental conditions, implying increasing extinction risk. On the other hand, for a given time period, the probability of at least one extremely bad year ("catastrophe") is reduced compared with white noise, implying decreasing extinction risk. Which of these two features determines extinction risk depends on the strength of environmental fluctuations and the sensitivity of population dynamics to these fluctuations. If extreme (catastrophic) events can occur (strong noise) or sensitivity is high (overcompensatory density dependence), then temporal correlation decreases extinction risk; otherwise, it increases it. Thus, our results provide a simple explanation for the contrasting previous findings and are a crucial step toward a general understanding of the effect of noise color on extinction risk.
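
    A minimal simulation sketch of the trade-off described above, assuming Ricker-type density dependence forced by AR(1) ("red") environmental noise; this illustrates the mechanism, not the authors' model, and all parameter values are arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)

        def extinction_freq(rho, sigma=0.6, r=1.5, K=50.0, t_max=200, reps=2000):
            """Fraction of replicate populations falling below one individual.

            Noise follows an AR(1) process with autocorrelation rho and
            stationary standard deviation sigma; rho = 0 is white noise.
            """
            extinct = 0
            for _ in range(reps):
                n, eps = 10.0, 0.0
                for _ in range(t_max):
                    eps = rho * eps + sigma * np.sqrt(1 - rho**2) * rng.normal()
                    n = n * np.exp(r * (1 - n / K) + eps)  # forced Ricker map
                    if n < 1.0:
                        extinct += 1
                        break
            return extinct / reps

        for rho in (0.0, 0.5, 0.9):
            print(f"rho = {rho:.1f}: extinction frequency = {extinction_freq(rho):.3f}")

    Varying sigma (noise strength) and r (degree of overcompensation) in this sketch should flip which of the two features dominates, mirroring the dichotomy the paper describes.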

  19. Extinction risk is most acute for the world's largest and smallest vertebrates.

    Science.gov (United States)

    Ripple, William J; Wolf, Christopher; Newsome, Thomas M; Hoffmann, Michael; Wirsing, Aaron J; McCauley, Douglas J

    2017-10-03

    Extinction risk in vertebrates has been linked to large body size, but this putative relationship has only been explored for select taxa, with variable results. Using a newly assembled and taxonomically expansive database, we analyzed the relationships between extinction risk and body mass (27,647 species) and between extinction risk and range size (21,294 species) for vertebrates across six main classes. We found that the probability of being threatened was positively and significantly related to body mass for birds, cartilaginous fishes, and mammals. Bimodal relationships were evident for amphibians, reptiles, and bony fishes. Most importantly, a bimodal relationship was found across all vertebrates such that extinction risk changes around a body mass breakpoint of 0.035 kg, indicating that the lightest and heaviest vertebrates have elevated extinction risk. We also found range size to be an important predictor of the probability of being threatened, with strong negative relationships across nearly all taxa. A review of the drivers of extinction risk revealed that the heaviest vertebrates are most threatened by direct killing by humans. By contrast, the lightest vertebrates are most threatened by habitat loss and modification stemming especially from pollution, agricultural cropping, and logging. Our results offer insight into halting the ongoing wave of vertebrate extinctions by revealing the vulnerability of large and small taxa, and identifying size-specific threats. Moreover, they indicate that, without intervention, anthropogenic activities will soon precipitate a double truncation of the size distribution of the world's vertebrates, fundamentally reordering the structure of life on our planet.

  20. Relating life-history traits, environmental constraints and local extinctions in river fish

    OpenAIRE

    Bergerot, B.; Hugueny, Bernard; Belliard, J.

    2015-01-01

    The life histories of freshwater fish are widely studied because they represent fundamental determinants of population performances. However, a gap remains in our understanding of how species traits may predispose species to extinction in a changing environment. In this study, based on a large data set provided by the French National Agency for Water and Aquatic Environment (325 sites), we analysed factors that explain the probability of local extinction in 40 freshwater species across French...

  1. Andean Condor (Vultur gryphus) in Ecuador: Geographic Distribution, Population Size and Extinction Risk.

    Science.gov (United States)

    Naveda-Rodríguez, Adrián; Vargas, Félix Hernán; Kohn, Sebastián; Zapata-Ríos, Galo

    2016-01-01

    The Andean Condor (Vultur gryphus) in Ecuador is classified as Critically Endangered. Before 2015, standardized and systematic estimates of geographic distribution, population size and structure were not available for this species, hampering the assessment of its current status and hindering the design and implementation of effective conservation actions. In this study, we performed the first quantitative assessment of geographic distribution, population size and population viability of the Andean Condor in Ecuador. We used a methodological approach that included an ecological niche model to study geographic distribution, a simultaneous survey of 70 roosting sites to estimate population size, and a population viability analysis (PVA) for the next 100 years. Geographic distribution in the form of extent of occurrence was 49 725 km². During a two-day census, 93 Andean Condors were recorded and a population of 94 to 102 individuals was estimated. In this population, the adult-to-immature ratio was 1:0.5. In the modeled PVA scenarios, the probability of extinction, mean time to extinction and minimum population size varied from zero to 100%, 63 years and 193 individuals, respectively. Habitat loss is the greatest threat to the conservation of Andean Condor populations in Ecuador. Population size reduction in scenarios that included habitat loss began within the first 15 years of this threat. Population reinforcement had no effect on the recovery of Andean Condor populations given the current status of the species in Ecuador. The population size estimate presented in this study is lower than those reported previously in other countries where the species occurs. The inferences derived from the population viability analysis have implications for Condor management in Ecuador. This study highlights the need to redirect efforts from captive breeding and population reinforcement to habitat conservation.
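
    A heavily simplified sketch of a simulation-based PVA of the sort described, with lognormal environmental stochasticity and an optional habitat-loss term; all rates and thresholds are illustrative placeholders, not condor estimates.

        import numpy as np

        rng = np.random.default_rng(0)

        def pva(n0=100, years=100, growth=0.0, sd=0.2, habitat_loss=0.0, reps=5000):
            """Return (extinction probability, mean minimum population size)."""
            extinctions, minima = 0, []
            for _ in range(reps):
                n, k = float(n0), 500.0       # population and carrying capacity
                n_min = n
                for _ in range(years):
                    k *= 1.0 - habitat_loss            # habitat loss shrinks K
                    n = min(n * np.exp(rng.normal(growth, sd)), k)
                    n_min = min(n_min, n)
                    if n < 2.0:               # quasi-extinction: no breeding pair
                        extinctions += 1
                        break
                minima.append(n_min)
            return extinctions / reps, float(np.mean(minima))

        p_ext, min_n = pva(growth=-0.02, habitat_loss=0.01)
        print(f"P(extinction in 100 yr) = {p_ext:.2f}, mean minimum N = {min_n:.0f}")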

  2. Verifying reddening and extinction for Gaia DR1 TGAS giants

    Science.gov (United States)

    Gontcharov, George A.; Mosenkov, Aleksandr V.

    2018-03-01

    Gaia DR1 Tycho-Gaia Astrometric Solution parallaxes, Tycho-2 photometry, and reddening/extinction estimates from nine data sources for 38 074 giants within 415 pc from the Sun are used to compare their position in the Hertzsprung-Russell diagram with theoretical estimates, which are based on the PARSEC and MIST isochrones and the TRILEGAL model of the Galaxy with its parameters widely varied. We conclude that (1) some systematic errors of the reddening/extinction estimates are the main uncertainty in this study; (2) no emission-based 2D reddening map can give reliable estimates of reddening within 415 pc owing to the complex distribution of dust; (3) if TRILEGAL's set of Galaxy parameters is reliable and if the solar metallicity is Z 50°, give the best fit of the empirical and theoretical data with each other.

  3. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  4. Involvement of Dopamine D1/D5 and D2 Receptors in Context-Dependent Extinction Learning and Memory Reinstatement.

    Science.gov (United States)

    André, Marion Agnès Emma; Manahan-Vaughan, Denise

    2015-01-01

    Dopamine contributes to the regulation of higher order information processing and executive control. It is important for memory consolidation processes, and for the adaptation of learned responses based on experience. In line with this, under aversive learning conditions, application of dopamine receptor antagonists prior to extinction results in enhanced memory reinstatement. Here, we investigated the contribution of the dopaminergic system to extinction and memory reinstatement (renewal) of an appetitive spatial learning task in rodents. Rats were trained for 3 days in a T-maze (context "A") to associate a goal arm with a food reward, despite low reward probability (acquisition phase). On day 4, extinction learning (unrewarded) occurred, which was reinforced by a context change ("B"). On day 5, re-exposure to the (unrewarded) "A" context took place (renewal of context "A", followed by extinction of context "A"). In control animals, significant extinction occurred on day 4, which was followed by an initial memory reinstatement (renewal) on day 5 that was, in turn, succeeded by extinction of renewal. Intracerebral treatment with a D1/D5-receptor antagonist prior to the extinction trials elicited a potent enhancement of extinction in context "B". By contrast, a D1/D5-agonist impaired renewal in context "A". Extinction in the "A" context on day 5 was unaffected by the D1/D5-ligands. Treatment with a D2-receptor antagonist prior to extinction had no overall effect on extinction in context "B" or renewal in context "A", although extinction of the renewal effect was impaired on day 5, compared to controls. Taken together, these data suggest that dopamine acting on the D1/D5-receptor modulates both acquisition and consolidation of context-dependent extinction. By contrast, the D2-receptor may contribute to context-independent aspects of this kind of extinction learning.

  5. Involvement of dopamine D1/D5 and D2 receptors in context-dependent extinction learning and memory reinstatement

    Directory of Open Access Journals (Sweden)

    Marion Agnes Emma Andre

    2016-01-01

    Full Text Available Dopamine contributes to the regulation of higher order information processing and executive control. It is important for memory consolidation processes, and for the adaptation of learned responses based on experience. In line with this, under aversive learning conditions, application of dopamine receptor antagonists prior to extinction results in enhanced memory reinstatement. Here, we investigated the contribution of the dopaminergic system to extinction and memory reinstatement (renewal) of an appetitive spatial learning task in rodents. Rats were trained for 3 days in a T-maze (context ‘A’) to associate a goal arm with a food reward, despite low reward probability (acquisition phase). On day 4, extinction learning (unrewarded) occurred, which was reinforced by a context change (‘B’). On day 5, re-exposure to the (unrewarded) ‘A’ context took place (renewal of context ‘A’, followed by extinction of context ‘A’). In control animals, significant extinction occurred on day 4, which was followed by an initial memory reinstatement (renewal) on day 5 that was, in turn, succeeded by extinction of renewal. Intracerebral treatment with a D1/D5-receptor antagonist prior to the extinction trials elicited a potent enhancement of extinction in context ‘B’. By contrast, a D1/D5-agonist impaired renewal in context ‘A’. Extinction in the ‘A’ context on day 5 was unaffected by the D1/D5-ligands. Treatment with a D2-receptor antagonist prior to extinction had no overall effect on extinction in context ‘B’ or renewal in context ‘A’, although extinction of the renewal effect was impaired on day 5, compared to controls. Taken together, these data suggest that dopamine acting on the D1/D5-receptor modulates both acquisition and consolidation of context-dependent extinction. By contrast, the D2-receptor may contribute to context-independent aspects of this kind of extinction learning.

  6. Size distribution of interstellar particles. III. Peculiar extinctions and normal infrared extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.; Wallenhorst, S.G.

    1981-01-01

    The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^-q is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a_-, of roughly 0.005 or 0.01 μm, the large-size cutoff, a_+, of about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars σ Sco, ρ Oph, and θ¹ Ori C, with values of R_V [≡ A_V/E(B-V)] of 3.4, 4.4, and 5.5, compared to the normal 3.1. Two (σ Sco, ρ Oph) are in a neutral dense cloud; θ¹ Ori C is in the Orion Nebula. We find that σ Sco has a normal graphite distribution but has had its small silicate particles removed, so that a_-(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In ρ Oph, the graphite is still normal, but both a_-(sil) and a_+(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In θ¹ Ori, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in ρ Oph. The calculated λ2175 bump is broader than the observed one, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for ρ Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to "hide" the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists.
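
    A quick numerical check of why raising the lower cutoff changes the extinction so strongly under a power-law size distribution: with q = 3.5, geometric cross-section is dominated by the smallest grains while mass is dominated by the largest. Purely illustrative (geometric optics only, no Mie efficiencies).

        import numpy as np
        from scipy.integrate import quad

        q = 3.5  # MRN exponent, n(a) ~ a**-q

        def area_per_mass(a_min, a_max=0.25):
            """Geometric cross-section per unit grain mass for n(a) ~ a**-q."""
            area, _ = quad(lambda a: a**2 * a**-q, a_min, a_max)
            mass, _ = quad(lambda a: a**3 * a**-q, a_min, a_max)
            return area / mass

        # Raising the silicate lower cutoff from 0.005 to 0.04 micron, as
        # inferred for sigma Sco, cuts extinction per unit dust mass sharply.
        for a_min in (0.005, 0.04):
            print(f"a_min = {a_min:5.3f} um: area/mass = {area_per_mass(a_min):6.1f}")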

  7. Global warming and extinctions of endemic species from biodiversity hotspots.

    Science.gov (United States)

    Malcolm, Jay R; Liu, Canran; Neilson, Ronald P; Hansen, Lara; Hannah, Lee

    2006-04-01

    Global warming is a key threat to biodiversity, but few researchers have assessed the magnitude of this threat at the global scale. We used major vegetation types (biomes) as proxies for natural habitats and, based on projected future biome distributions under doubled-CO2 climates, calculated changes in habitat areas and associated extinctions of endemic plant and vertebrate species in biodiversity hotspots. Because of numerous uncertainties in this approach, we undertook a sensitivity analysis of multiple factors that included (1) two global vegetation models, (2) different numbers of biome classes in our biome classification schemes, (3) different assumptions about whether species distributions were biome specific or not, and (4) different migration capabilities. Extinctions were calculated using both species-area and endemic-area relationships. In addition, average required migration rates were calculated for each hotspot assuming a doubled-CO2 climate in 100 years. Projected percent extinctions varied widely across scenarios; the most vulnerable hotspots were the Cape Floristic Region, Caribbean, Indo-Burma, Mediterranean Basin, Southwest Australia, and Tropical Andes, where plant extinctions per hotspot sometimes exceeded 2000 species. Under the assumption that projected habitat changes were attained in 100 years, estimated global-warming-induced rates of species extinctions in tropical hotspots in some cases exceeded those due to deforestation, supporting suggestions that global warming is one of the most serious threats to the planet's biodiversity.
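
    The species-area step has a standard closed form: if S = c·A^z, the fraction of species eventually lost when habitat shrinks from area A to A' is 1 - (A'/A)^z. A sketch with an illustrative exponent z = 0.25; the paper's own exponents vary with vegetation model and assumptions.

        def fraction_extinct(area_ratio: float, z: float = 0.25) -> float:
            """Fraction of species committed to extinction when habitat
            area shrinks to area_ratio of its original extent (S = c*A**z)."""
            return 1.0 - area_ratio ** z

        # e.g. a hotspot projected to lose half of its suitable habitat
        print(f"{fraction_extinct(0.5):.1%} of endemic species committed to extinction")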

  8. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  9. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  10. An exact solution of the extinction problem in supercritical multiplying systems

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    1979-01-01

    Using the point model approximation and one-speed theory with no delayed neutrons, a probability balance equation for neutrons has been constructed by the backward method. This probability gives the distribution of neutrons in a multiplying medium at a given time, and also the probability that a chain will have generated a specified number of neutrons before extinction. We consider the limit of this probability for super- and subcritical systems for long times after the initial triggering neutron. This leads to the extinction probability and to the individual probabilities of neutron population. To obtain specific results we have used a variety of models for the neutron multiplicity in the fission process, i.e. Poisson, birth and death, geometric and binomial. Exact solutions for the extinction probability have been obtained and its sensitivity to various parameters examined. Finally, we use the 'quadratic approximation' and assess its accuracy; it is found to overestimate the extinction probability and to be useful only for multiplication factors near unity. (author)
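
    The standard route to such quantities: for a Galton-Watson chain whose offspring (fission multiplicity) distribution has probability generating function G(s), the extinction probability is the smallest root of s = G(s) on [0, 1], found by iterating from s = 0. A sketch for the Poisson multiplicity model named above; the paper's exact formulation may differ.

        import math

        def extinction_probability(G, tol=1e-10, max_iter=200_000):
            """Smallest fixed point of s = G(s) on [0, 1], iterated from 0."""
            s = 0.0
            for _ in range(max_iter):
                s_next = G(s)
                if abs(s_next - s) < tol:
                    break
                s = s_next
            return s

        def poisson_pgf(k):
            # PGF of a Poisson number of secondaries with mean k
            return lambda s: math.exp(k * (s - 1.0))

        for k in (0.9, 1.0, 1.2, 1.5):  # mean neutrons produced per neutron
            q = extinction_probability(poisson_pgf(k))
            print(f"k = {k:.1f}: extinction probability = {q:.4f}")

    For k <= 1 the only root is 1 (subcritical chains always die out); above 1 a chain survives with probability 1 - q.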

  11. Modeling Population Growth and Extinction

    Science.gov (United States)

    Gordon, Sheldon P.

    2009-01-01

    The exponential growth model and the logistic model typically introduced in the mathematics curriculum presume that a population grows exclusively. In reality, species can also die out and more sophisticated models that take the possibility of extinction into account are needed. In this article, two extensions of the logistic model are considered,…
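
    The article's own two extensions are not reproduced here; as one standard illustration of how a logistic-type model acquires an extinction outcome, add a constant harvesting term, dP/dt = rP(1 - P/K) - h. Below the unstable equilibrium the population collapses to zero. Parameters are arbitrary.

        def logistic_harvest(p0, r=0.5, K=1000.0, h=70.0, dt=0.1, t_max=60.0):
            """Euler simulation of dP/dt = r*P*(1 - P/K) - h."""
            p, t = p0, 0.0
            while t < t_max:
                p += dt * (r * p * (1.0 - p / K) - h)
                t += dt
                if p <= 0.0:
                    return 0.0, t          # extinction time reached
            return p, t

        # With these values the equilibria sit near P = 168 (unstable) and
        # P = 832 (stable): start below 168 and the population dies out.
        for p0 in (150.0, 300.0):
            p_final, t = logistic_harvest(p0)
            outcome = "extinct" if p_final == 0.0 else f"persists near {p_final:.0f}"
            print(f"P(0) = {p0:.0f}: {outcome} (t = {t:.1f})")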

  12. New theories about ancient extinctions

    Science.gov (United States)

    Spall, H.

    1986-01-01

    The abrupt disappearance of all the dinosaurs about 65 million years ago, along with perhaps half the plant species and other animals, has been one of the great geological mysteries. Clues to the cause of these extinctions have been scarce and open to many interpretations.

  13. Extinction debt on oceanic islands

    DEFF Research Database (Denmark)

    Triantis, Kostas A.; Borges, Paulo A. V.; Ladle, Richard J.

    2010-01-01

    the magnitude of such future extinction events has been hampered by potentially inaccurate assumptions about the slope of species-area relationships, which are habitat- and taxon-specific. We overcome this challenge by applying a method that uses the historical sequence of deforestation in the Azorean Islands...

  14. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where -ln(1 - P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 - c1·e^(-ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability that a newly founded population reaches the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is -ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic, and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
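
    A sketch of the fitting step, assuming Monte Carlo estimates of P0(t) are already available from a stochastic population model; here they are generated synthetically from the paper's own equation, so the fit simply recovers the inputs.

        import numpy as np

        # Synthetic extinction-probability curve P0(t) = 1 - c1 * exp(-w1 * t)
        c1_true, w1_true = 1.2, 0.02
        t = np.arange(5, 101)
        p0 = 1.0 - c1_true * np.exp(-w1_true * t)

        # Wissel plot: -ln(1 - P0(t)) against t is linear for large t,
        # with slope omega_1 and intercept -ln(c1)
        y = -np.log(1.0 - p0)
        w1_hat, intercept = np.polyfit(t, y, 1)

        print(f"omega_1 = {w1_hat:.3f}, -ln(c1) = {intercept:.3f}")
        print("established phase reachable" if intercept < 0 else "establishment unlikely")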

  15. EARTH SCIENCE: Did Volcanoes Drive Ancient Extinctions?

    Science.gov (United States)

    Kerr, R A

    2000-08-18

    With the publication in recent weeks of two papers on a mass extinction 183 million years ago, researchers can add five suggestive cases to the list of extinctions with known causes. These extinctions coincide with massive outpourings of lava, accompanied by signs that global warming threw the ocean-atmosphere system out of whack. Although no one can yet pin any of these mass extinctions with certainty on the volcanic eruptions, scientists say it's unlikely that they're all coincidences.

  16. An investigation of the interstellar extinction

    International Nuclear Information System (INIS)

    Roche, P.F.; Aitken, D.K.; Melbourne Univ., Point Cook

    1984-01-01

    The 10 μm extinction towards six WC8 or WC9 Wolf-Rayet stars is investigated. All objects show smooth dust emission suffering silicate absorption with depths well correlated with the extinction in the visible. The de-reddened spectra are well represented by emission from featureless grain components, possibly from iron or carbon grains. The extinction to the stars is found to be dominantly interstellar in origin with little extinction from the circumstellar shell. (author)

  17. Inverse probability weighting and doubly robust methods in correcting the effects of non-response in the reimbursed medication and self-reported turnout estimates in the ATH survey.

    Science.gov (United States)

    Härkänen, Tommi; Kaikkonen, Risto; Virtala, Esa; Koskinen, Seppo

    2014-11-06

    To assess the nonresponse rates in a questionnaire survey with respect to administrative register data, and to correct the resulting bias statistically. The Finnish Regional Health and Well-being Study (ATH) in 2010 was based on a national sample and several regional samples. Missing-data analysis was based on socio-demographic register data covering the whole sample. Inverse probability weighting (IPW) and doubly robust (DR) estimators were computed using a logistic regression model, which was selected using the Bayesian information criterion. The crude, weighted and true self-reported turnout in the 2008 municipal election and prevalences of entitlements to specially reimbursed medication, and the crude and weighted body mass index (BMI) means, were compared. The IPW method appeared to remove a relatively large proportion of the bias compared to the crude prevalence estimates of the turnout and of the entitlements to specially reimbursed medication. Several demographic factors were shown to be associated with missing data, but few interactions were found. Our results suggest that the IPW method can improve the accuracy of results of a population survey, and that the model selection provides insight into the structure of missing data. However, health-related missing-data mechanisms are beyond the scope of statistical methods that rely mainly on socio-demographic information to correct the results.
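
    A minimal IPW sketch on synthetic data (not the ATH data or models): fit a response-propensity model on a register covariate known for the whole sample, then weight respondents by their inverse estimated response probabilities.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 20_000

        age = rng.normal(size=n)          # register covariate, known for all
        # Outcome and response propensity both increase with age, so the
        # respondents' crude mean overstates the population mean.
        outcome = (rng.random(n) < 0.4 + 0.15 * np.tanh(age)).astype(float)
        responded = rng.random(n) < 1.0 / (1.0 + np.exp(-(0.2 + 0.8 * age)))

        # Propensity model fitted on the full sample (response seen for all)
        p_resp = (LogisticRegression()
                  .fit(age.reshape(-1, 1), responded)
                  .predict_proba(age.reshape(-1, 1))[:, 1])

        crude = outcome[responded].mean()
        ipw = np.average(outcome[responded], weights=1.0 / p_resp[responded])
        print(f"true {outcome.mean():.3f}  crude {crude:.3f}  IPW {ipw:.3f}")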

  18. Further Evidence of Auditory Extinction in Aphasia

    Science.gov (United States)

    Marshall, Rebecca Shisler; Basilakos, Alexandra; Love-Myers, Kim

    2013-01-01

    Purpose: Preliminary research (Shisler, 2005) suggests that auditory extinction in individuals with aphasia (IWA) may be connected to binding and attention. In this study, the authors expanded on previous findings on auditory extinction to determine the source of extinction deficits in IWA. Method: Seventeen IWA (M_age = 53.19 years)…

  19. Habitat fragmentation and extinction rates within freshwater fish communities : a faunal relaxation approach

    OpenAIRE

    Hugueny, Bernard; Movellan, A.; Belliard, J.

    2011-01-01

    Aim: To estimate population extinction rates within freshwater fish communities since the fragmentation of palaeo-rivers due to sea level rise at the end of the Pleistocene; to combine this information with rates estimated by other approaches (population surveys, fossil records); and to build an empirical extinction-area relationship. Location: Temperate rivers from the Northern Hemisphere, with a special focus on rivers discharging into the English Channel, in north-western France. Methods: (1)...

  20. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters (damping and natural frequency) are derived such that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
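
    A sketch of the closing probability check under the Gaussian assumption: given the RMS relative displacement predicted for a candidate (damping, natural frequency) pair, the chance of exceeding a displacement criterion follows directly from the normal distribution. The numbers are illustrative, not the article's.

        from math import erf, sqrt

        def prob_exceed(sigma_um: float, criterion_um: float) -> float:
            """P(|displacement| > criterion) for a zero-mean Gaussian response."""
            return 1.0 - erf(criterion_um / (sigma_um * sqrt(2.0)))

        # Two candidate designs, characterized by their RMS responses
        for sigma in (0.10, 0.15):
            p = prob_exceed(sigma, criterion_um=0.3)  # hypothetical 0.3 um criterion
            print(f"sigma = {sigma:.2f} um: P(exceed) = {p:.4f}")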