WorldWideScience

Sample records for range dependent probability

  1. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities, for estimating the probability of joint events and event sequences is described. The applicability of this model is demonstrated with various examples. It is concluded that the described dependency model is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common-cause and time-dependent failure mechanisms are involved. (Auth.)
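    The conditional-probability view of dependent joint events described in this record can be sketched numerically; the component probabilities and the coupled conditional probability below are illustrative values, not taken from the report:

```python
# Probability of the joint event A-and-B via conditional probabilities.
# For independent events P(A and B) = P(A) * P(B); a common-cause coupling
# raises P(B | A) above P(B), increasing the joint probability.

def joint_probability(p_a, p_b_given_a):
    """P(A and B) = P(A) * P(B | A)."""
    return p_a * p_b_given_a

p_a = 1e-3          # failure probability of component A (illustrative)
p_b = 1e-3          # unconditional failure probability of B (illustrative)
p_b_given_a = 0.1   # conditional probability of B under a shared stress (illustrative)

independent = joint_probability(p_a, p_b)          # about 1e-6
dependent = joint_probability(p_a, p_b_given_a)    # about 1e-4

# Dependence makes the joint event roughly 100x more probable here.
```

    The same factorization extends to event sequences by chaining conditionals: P(A, B, C) = P(A)·P(B|A)·P(C|A,B).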

  2. Impact parameter dependence of inner-shell ionization probabilities

    International Nuclear Information System (INIS)

    Cocke, C.L.

    1974-01-01

    The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed.

  3. Propagation in a waveguide with range-dependent seabed properties.

    Science.gov (United States)

    Holland, Charles W

    2010-11-01

    The ocean environment contains features affecting acoustic propagation that vary on a wide range of time and space scales. A significant body of work over recent decades has aimed at understanding the effects of water column spatial and temporal variability on acoustic propagation. Much less is understood about the impact of spatial variability of seabed properties on propagation, which is the focus of this study. Here, a simple, intuitive expression for propagation with range-dependent boundary properties and uniform water depth is derived. It is shown that incoherent range-dependent propagation depends upon the geometric mean of the seabed plane-wave reflection coefficient and the arithmetic mean of the cycle distance. Thus, only the spatial probability distributions (pdfs) of the sediment properties are required. Also, it is shown that the propagation over a range-dependent seabed tends to be controlled by the lossiest, not the hardest, sediments. Thus, range-dependence generally leads to higher propagation loss than would be expected, due for example to lossy sediment patches and/or nulls in the reflection coefficient. In a few instances, propagation over a range-dependent seabed can be calculated using range-independent sediment properties. The theory may be useful for other (non-oceanic) waveguides.
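    The central result of this record — incoherent range-dependent propagation governed by the geometric mean of the seabed reflection coefficient over the spatial pdf of sediment types — can be sketched as follows. The two sediment types, their reflection coefficients, and the bounce count are invented for illustration:

```python
import math

# Incoherent bottom-interaction loss over a range-dependent seabed: per the
# abstract, the effective per-bounce reflection coefficient is the geometric
# mean of |R| weighted by the spatial pdf of the sediment properties.

def geometric_mean_R(pdf, R):
    """exp( sum_i pdf_i * ln R_i ) over sediment types."""
    return math.exp(sum(p * math.log(r) for p, r in zip(pdf, R)))

pdf = [0.7, 0.3]   # 70% sand patches, 30% lossy mud (illustrative)
R   = [0.9, 0.4]   # plane-wave |R| at the relevant grazing angle (illustrative)

R_geo = geometric_mean_R(pdf, R)
n_bounces = 20     # range divided by the (arithmetic) mean cycle distance
loss_db = n_bounces * (-20.0 * math.log10(R_geo))

# R_geo (about 0.71) sits below the arithmetic mean of |R| (0.75): the lossy
# 30% of the seabed controls the loss, as the abstract states.
```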

  4. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state-dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known.

  5. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors.

  6. Antitrust Enforcement Under Endogenous Fines and Price-Dependent Detection Probabilities

    NARCIS (Netherlands)

    Houba, H.E.D.; Motchenkova, E.; Wen, Q.

    2010-01-01

    We analyze the effectiveness of antitrust regulation in a repeated oligopoly model in which both fines and detection probabilities depend on the cartel price. Such fines are closer to actual guidelines than the commonly assumed fixed fines. Under a constant detection probability, we confirm the

  7. An analytical evaluation for spatial-dependent intra-pebble Dancoff factor and escape probability

    International Nuclear Information System (INIS)

    Kim, Songhyun; Kim, Hong-Chul; Kim, Jong Kyung; Kim, Soon Young; Noh, Jae Man

    2009-01-01

    The analytical evaluation of spatial-dependent intra-pebble Dancoff factors and their escape probabilities is pursued with the model developed in this study. Intra-pebble Dancoff factors and escape probabilities are calculated as a function of fuel kernel radius, number of fuel kernels, and fuel region radius. The method in this study can be easily utilized to analyze the tendency of the spatial-dependent intra-pebble Dancoff factor and the spatial-dependent fuel region escape probability for various geometries, because it is faster than the MCNP method while retaining good accuracy. (author)

  8. Time dependent non-extinction probability for prompt critical systems

    International Nuclear Information System (INIS)

    Gregson, M. W.; Prinja, A. K.

    2009-01-01

    The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)

  9. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) for judging the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
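    The five NUREG/CR-1278 dependence levels map to conditional error probabilities through the handbook's standard formulas, which can be tabulated directly:

```python
# Conditional human error probabilities for the five THERP dependence levels
# of NUREG/CR-1278, given the basic HEP of the dependent (second) task.

def conditional_hep(basic_hep, level):
    formulas = {
        "ZD": lambda n: n,                  # zero dependence: unchanged
        "LD": lambda n: (1 + 19 * n) / 20,  # low dependence
        "MD": lambda n: (1 + 6 * n) / 7,    # moderate dependence
        "HD": lambda n: (1 + n) / 2,        # high dependence
        "CD": lambda n: 1.0,                # complete dependence: certain failure
    }
    return formulas[level](basic_hep)

# A basic HEP of 1e-3 rises steeply with the judged dependence level:
for lvl in ("ZD", "LD", "MD", "HD", "CD"):
    print(lvl, round(conditional_hep(1e-3, lvl), 4))
```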

  10. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
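    The DFP idea — a failure probability that varies across 'environments' induces dependent failure of identical components — can be illustrated with a two-environment example (numbers invented): two components sharing an environment fail together with probability E[p²], which exceeds the independence prediction (E[p])² by Var(p).

```python
# Distributed Failure Probability sketch: component failure probability p
# varies across environments. Two identical components in the SAME
# environment fail together with probability
#   E[p^2] = (E[p])^2 + Var(p)  >  (E[p])^2.

envs = [0.4, 0.6]     # probabilities of the two environments (illustrative)
p    = [1e-4, 1e-2]   # component failure probability in each environment

mean_p  = sum(w * x for w, x in zip(envs, p))
mean_p2 = sum(w * x * x for w, x in zip(envs, p))

independent_joint = mean_p ** 2   # what naive independence would predict
dfp_joint = mean_p2               # DFP prediction with a shared environment

# dfp_joint exceeds independent_joint; their ratio quantifies the dependence
# introduced by the environmental variability.
```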

  11. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
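    For the exponential (Poisson) model used here, the probability of at least one eruption in the next t years follows directly from the mean recurrence interval. The ~7,100-yr interval below is back-calculated from the quoted 1.4×10⁻⁴ annual probability purely for illustration:

```python
import math

# Probability of at least one eruption in the next t years for a Poisson
# process (exponentially distributed interevent times) with mean recurrence
# interval mu_years.

def eruption_probability(mu_years, t_years=1.0):
    return 1.0 - math.exp(-t_years / mu_years)

# A mean recurrence interval of ~7,100 yr reproduces the ~1.4e-4 annual
# probability quoted for the Lassen Volcanic Center (illustrative back-calc):
p_annual = eruption_probability(7100.0)
p_30yr = eruption_probability(7100.0, 30.0)   # longer window, higher probability
```

    Note that for a Poisson process this probability is the same regardless of the time since the last event; the mixed-exponential and time-dependent models exist precisely because real recurrence is not memoryless.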

  12. Off-diagonal long-range order, cycle probabilities, and condensate fraction in the ideal Bose gas.

    Science.gov (United States)

    Chevallier, Maguelonne; Krauth, Werner

    2007-11-01

    We discuss the relationship between the cycle probabilities in the path-integral representation of the ideal Bose gas, off-diagonal long-range order, and Bose-Einstein condensation. Starting from the Landsberg recursion relation for the canonical partition function, we use elementary considerations to show that in a box of size L³ the sum of the cycle probabilities of length k > L² equals the off-diagonal long-range order parameter in the thermodynamic limit. For arbitrary systems of ideal bosons, the integer derivative of the cycle probabilities is related to the probability of condensing k bosons. We use this relation to derive the precise form of the π_k in the thermodynamic limit. We also determine the function π_k for arbitrary systems. Furthermore, we use the cycle probabilities to compute the probability distribution of the maximum-length cycles both at T=0, where the ideal Bose gas reduces to the study of random permutations, and at finite temperature. We close with comments on the cycle probabilities in interacting Bose gases.
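    The Landsberg recursion and the cycle probabilities π_k it yields are short enough to compute directly. The single-particle spectrum below (a truncated 1D harmonic oscillator) is illustrative; the recursion itself is the standard one, Z_N = (1/N) Σ_k z(kβ) Z_{N−k}, which also guarantees Σ_k π_k = 1:

```python
import math

# Landsberg recursion for the canonical partition function of N ideal
# bosons, and the resulting cycle probabilities pi_k.

def z1(beta, levels):
    """Single-particle partition function over a discrete spectrum."""
    return sum(math.exp(-beta * e) for e in levels)

def cycle_probabilities(N, beta, levels):
    Z = [1.0]  # Z_0 = 1
    for n in range(1, N + 1):
        Z.append(sum(z1(k * beta, levels) * Z[n - k]
                     for k in range(1, n + 1)) / n)
    # pi_k = z(k*beta) * Z_{N-k} / (N * Z_N), for k = 1..N
    return [z1(k * beta, levels) * Z[N - k] / (N * Z[N])
            for k in range(1, N + 1)]

levels = [n + 0.5 for n in range(50)]   # truncated oscillator spectrum (illustrative)
pi = cycle_probabilities(10, 1.0, levels)
assert abs(sum(pi) - 1.0) < 1e-9        # cycle probabilities normalize
```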

  13. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, Scott [Applied Biomathematics, Setauket, NY (United States); Nelsen, Roger B. [Lewis & Clark College, Portland OR (United States); Hajagos, Janos [Applied Biomathematics, Setauket, NY (United States); Berleant, Daniel J. [Iowa State Univ., Ames, IA (United States); Zhang, Jianzhong [Iowa State Univ., Ames, IA (United States); Tucker, W. Troy [Applied Biomathematics, Setauket, NY (United States); Ginzburg, Lev R. [Applied Biomathematics, Setauket, NY (United States); Oberkampf, William L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
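    Among the bounding approaches the report reviews, the simplest are the classical Fréchet-Hoeffding limits: with no information at all about dependence, the probability of a conjunction is only known to lie in an interval. A minimal sketch:

```python
# Frechet-Hoeffding bounds on P(A and B) when the dependence between A and B
# is completely unknown; independence (p_a * p_b) always lies inside them.

def frechet_bounds(p_a, p_b):
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

lo, hi = frechet_bounds(0.6, 0.7)
# lo = 0.3 (perfect negative dependence), hi = 0.6 (perfect positive);
# independence would give 0.42, inside the interval.
```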

  14. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    Energy Technology Data Exchange (ETDEWEB)

    Korhonen, Marko [Department of Mathematics and Statistics, University of Helsinki, FIN-00014 (Finland); Lee, Eunghyun [Centre de Recherches Mathématiques (CRM), Université de Montréal, Quebec H3C 3J7 (Canada)

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required in order to use the Bethe ansatz, and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula for the transition probability of the q-TAZRP via the Bethe ansatz. Using the transition probability, we find the probability distribution of the left-most particle's position at time t. To find this probability we derive a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  15. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix, sheet, and beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  16. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on an ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on an ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making.
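    In such procedures the reward probability is conventionally expressed as odds against, and subjective value is commonly fit with a hyperbolic discounting function. A sketch with an illustrative discounting parameter h (not a value from this study):

```python
# Probability discounting: odds against = (1 - p) / p, and the subjective
# value of a probabilistic reward is commonly fit as V = A / (1 + h * odds),
# where a larger h means steeper discounting (greater risk aversion).

def odds_against(p):
    return (1.0 - p) / p

def subjective_value(amount, p, h):
    return amount / (1.0 + h * odds_against(p))

# Blocks of an ascending odds-against schedule (a descending schedule simply
# reverses the order of the same probabilities):
probs = [1.0, 0.5, 0.25, 0.125, 0.0625]
values = [subjective_value(100.0, p, h=1.0) for p in probs]
# values = [100.0, 50.0, 25.0, 12.5, 6.25] for this illustrative h
```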

  17. On Z-dependence of probability of atomic capture of mesons in matter

    International Nuclear Information System (INIS)

    Vasil'ev, V.A.; Petrukhin, V.I.; Suvorov, V.M.; Khorvat, D.

    1976-01-01

    All experimental data available on the atomic capture of negative muons and pions are systematically studied to find a more appropriate empirical expression for the capture probability as a function of the atomic number. It is shown that a linear Z-dependence, as a rule, does not hold; a Z^(1/3) dependence gives more satisfactory results. A modified Z^(1/3) dependence is proposed which is more appropriate for hydrogen-containing compounds.
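    The Z^(1/3) law predicts per-species capture probabilities in a compound in proportion to n_i·Z_i^(1/3). A sketch for water (the composition is standard chemistry; the weighting law is the subject of the record):

```python
# Z^(1/3) law for atomic capture of stopped negative mesons: in a compound
# with n_i atoms of atomic number Z_i per molecule, the capture probability
# on species i is taken proportional to n_i * Z_i**(1/3).

def capture_fractions(composition):
    """composition: list of (n_atoms, Z) pairs per molecule."""
    weights = [n * z ** (1.0 / 3.0) for n, z in composition]
    total = sum(weights)
    return [w / total for w in weights]

# Water, H2O: two hydrogens (Z=1) and one oxygen (Z=8).
# Since 8**(1/3) is 2, oxygen's single atom exactly balances two hydrogens,
# giving a 50/50 split under this law.
f_h, f_o = capture_fractions([(2, 1), (1, 8)])
```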

  18. Voltage dependency of transmission probability of aperiodic DNA molecule

    Science.gov (United States)

    Wiliyanti, V.; Yudiarsah, E.

    2017-07-01

    The characteristics of electron transport in aperiodic DNA molecules have been studied. A double-stranded DNA model with the base sequence GCTAGTACGTGACGTAGCTAGGATATGCCTGA in one chain and its complement on the other chain has been used. A tight-binding Hamiltonian is used to model the DNA molecule. In the model, we consider the on-site energy of each base to depend linearly on the applied electric field. The Slater-Koster scheme is used to model the electron hopping constant between bases. The transmission probability of an electron from one electrode to the other is calculated using a transfer matrix technique together with the scattering matrix method. The results show that, generally, a higher voltage gives a slightly larger transmission probability. The applied voltage appears to shift extended states to lower energy. Meanwhile, the transmission value increases as the twisting motion frequency increases.

  19. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    Science.gov (United States)

    Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-yr probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-yr Poisson probability of M >7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (ratio time-dependent to time-independent) on the southern strands of the North Anatolian Fault Zone.
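    The time-dependent BPT probabilities used here come from the inverse Gaussian (Brownian Passage Time) distribution, conditioned on the elapsed quiet period. A sketch with illustrative parameters (not the paper's fault-specific values):

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution with
    mean recurrence mu and aperiodicity (coefficient of variation) alpha."""
    if t <= 0.0:
        return 0.0
    u1 = (math.sqrt(t / mu) - math.sqrt(mu / t)) / alpha
    u2 = (math.sqrt(t / mu) + math.sqrt(mu / t)) / alpha
    return normal_cdf(u1) + math.exp(2.0 / alpha ** 2) * normal_cdf(-u2)

def conditional_probability(t_elapsed, dt, mu, alpha):
    """P(event in the next dt years | no event in the last t_elapsed years)."""
    F0 = bpt_cdf(t_elapsed, mu, alpha)
    return (bpt_cdf(t_elapsed + dt, mu, alpha) - F0) / (1.0 - F0)

# Illustrative numbers: mean recurrence 250 yr, aperiodicity 0.5,
# 200 yr since the last event, 30-yr forecast window.
p30 = conditional_probability(200.0, 30.0, 250.0, 0.5)
```

    Unlike the Poisson model, this conditional probability grows as the elapsed time approaches the mean recurrence interval, which is why the time-dependent estimates in the abstract exceed the Poisson ones.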

  20. Class dependency of fuzzy relational database using relational calculus and conditional probability

    Science.gov (United States)

    Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya

    2018-03-01

    In this paper, we propose a design of a fuzzy relational database that handles a conditional probability relation using fuzzy relational calculus. Several previous studies have treated equivalence classes in fuzzy databases using similarity or approximate relations, and it is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as ’projection’, ’selection’, ’injection’ and ’natural join’. Using the fuzzy relational calculus and conditional probabilities, we introduce the notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.

  1. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible.

  2. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Directory of Open Access Journals (Sweden)

    Michael R W Dawson

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.

  3. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability

    Science.gov (United States)

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent’s environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned. PMID:28212422

  4. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Science.gov (United States)

    Dawson, Michael R W; Gupta, Maya

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
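    The single-cue probability matching described in these records can be reproduced in a few lines: a one-weight perceptron trained with the delta rule on Bernoulli reward converges to the reward probability. The reward probability and learning rate below are illustrative:

```python
import random

# Minimal probability-matching sketch: one cue feeding one output unit,
# trained with the delta rule on a Bernoulli reward signal. The weight
# converges to the reward probability -- the output matches probability.

random.seed(0)
p_reward = 0.7   # past probability of reward given the cue (illustrative)
w = 0.0          # output weight; cue input is fixed at 1, identity activation
lr = 0.005       # learning rate

for _ in range(40000):
    reward = 1.0 if random.random() < p_reward else 0.0
    w += lr * (reward - w)   # delta rule: nudge the output toward the outcome

# w now hovers near p_reward = 0.7
```

    With multiple simultaneous cues, the same rule applied to a weight vector yields the independence assumption the papers describe: each weight tracks its cue's marginal contribution to reward.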

  5. Effect of field-dependent mobility on the escape probability. I. Electrons photoinjected in neopentane

    International Nuclear Information System (INIS)

    Mozumder, A.; Carmichael, I.

    1978-01-01

    A general procedure is described for calculating the escape probability of an electron against neutralization in the presence of an external field after it has been ejected into a dielectric liquid from a planar surface. The present paper utilizes the field-dependent electron mobility measurement in neopentane by Bakale and Schmidt. The calculated escape probability, upon averaging over the initial distribution, is compared with the current efficiency measurement of Holroyd et al. The median thermalization length, inferred from this comparison, depends in general upon the assumed form of the initial distribution. It is less than the value obtained when the field dependence of the mobility is ignored, but greater than that applicable to the high-energy irradiation case. A plausible explanation is offered.

  6. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
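
The threshold-independent probability estimates discussed above can be illustrated with a toy bagged ensemble. The synthetic depth/aridity data and the stump learner below are stand-ins for the paper's Nevada data and random forest, chosen so the sketch needs no external libraries.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: water-table depth (m) and an aridity index.
# Shallow water tables make groundwater dependence more likely; only depth
# drives the label here, to keep the sketch short.
n = 400
depth = rng.uniform(0.0, 50.0, n)
aridity = rng.uniform(0.05, 1.0, n)                  # unused by the stumps
p_true = 1.0 / (1.0 + np.exp(0.3 * (depth - 15.0)))  # deeper -> less likely GDE
y = (rng.random(n) < p_true).astype(int)

# A bagged ensemble of depth-threshold stumps stands in for a random forest;
# averaging the votes yields a probability rather than a hard label.
def fit_stump(d, labels):
    best_t, best_err = 0.0, np.inf
    for t in np.quantile(d, np.linspace(0.1, 0.9, 17)):
        err = np.mean((d < t).astype(int) != labels)  # shallow side -> GDE
        if err < best_err:
            best_t, best_err = t, err
    return best_t

thresholds = []
for _ in range(50):                                  # bootstrap replicates
    idx = rng.integers(0, n, n)
    thresholds.append(fit_stump(depth[idx], y[idx]))

def predict_proba(d):
    votes = np.array([(d < t).astype(int) for t in thresholds])
    return votes.mean(axis=0)                        # fraction voting "GDE"

proba = predict_proba(np.array([2.0, 40.0]))         # shallow vs deep site
```

Applying any cutoff to `proba` reproduces the paper's point that the chosen threshold, not only the model, drives the final misclassification rate.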

  7. Evolution of density-dependent movement during experimental range expansions.

    Science.gov (United States)

    Fronhofer, E A; Gut, S; Altermatt, F

    2017-12-01

    Range expansions and biological invasions are prime examples of transient processes that are likely impacted by rapid evolutionary changes. As a spatial process, range expansions are driven by dispersal and movement behaviour. Although it is widely accepted that dispersal and movement may be context-dependent, for instance density-dependent, and best represented by reaction norms, the evolution of density-dependent movement during range expansions has received little experimental attention. We therefore tested current theory predicting the evolution of increased movement at low densities at range margins using highly replicated and controlled range expansion experiments across multiple genotypes of the protist model system Tetrahymena thermophila. Although rare, we found evolutionary changes during range expansions even in the absence of initial standing genetic variation. Range expansions led to the evolution of negatively density-dependent movement at range margins. In addition, we report the evolution of increased intrastrain competitive ability and concurrently decreased population growth rates in range cores. Our findings highlight the importance of understanding movement and dispersal as evolving reaction norms and plastic life-history traits of central relevance for range expansions, biological invasions and the dynamics of spatially structured systems in general. © 2017 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2017 European Society For Evolutionary Biology.

  8. Unresolved resonance range cross section, probability tables and self shielding factor

    International Nuclear Information System (INIS)

    Sublet, J.Ch.; Blomquist, R.N.; Goluoglu, S.; Mac Farlane, R.E.

    2009-07-01

    The performance and methodology of 4 processing codes have been compared in the unresolved resonance range of a selected set of isotopes. Those isotopes have been chosen to encompass most cases encountered in the unresolved energy range contained in major libraries like Endf/B-7 or Jeff-3.1.1. The code results comparison is accompanied by data format and formalism examinations and a fine-interpretation study of the processing codes. After some improvements, the results showed generally good, although not perfect, agreement for infinitely dilute cross-sections. However, much larger differences occur when self-shielded effective cross-sections are compared. The infinitely dilute cross-sections are often plot-checked, but it is the probability-table-derived and self-shielded cross sections that are used and interpreted in criticality and transport calculations. This suggests that the current evaluation data format and formalism in the unresolved resonance range should be tightened up and ambiguities removed. In addition, production of the self-shielded cross-sections should be converged to a much greater accuracy. (author)

  9. Using the probability method for multigroup calculations of reactor cells in a thermal energy range

    International Nuclear Information System (INIS)

    Rubin, I.E.; Pustoshilova, V.S.

    1984-01-01

    The possibility of using the transmission probability method with performance interpolation for determining the spatial-energy neutron flux distribution in cells of thermal heterogeneous reactors is considered. The results of multigroup calculations of several uranium-water plane and cylindrical cells with different fuel enrichment in a thermal energy range are given. A high accuracy of results is obtained with low computer time consumption. The use of the transmission probability method is particularly reasonable in algorithms of programs for computers with a significant reserve of internal memory.

  10. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c 2 approximation is used and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
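
The inversion step described above (recovering a probability distribution from its generating function by discrete Fourier transform) can be sketched as follows. A Poisson generating function is assumed purely because its pmf is known in closed form for checking, unlike the reactor-model distributions of the paper.

```python
import numpy as np
from math import exp, factorial

# The pgf G(z) = sum_n P(n) z^n evaluated at z_k = e^{-2*pi*i*k/N} is exactly
# the DFT of the pmf, so an inverse FFT recovers P(n).  A Poisson pgf is used
# here only because its pmf is known in closed form.
lam, N = 3.0, 64
z = np.exp(-2j * np.pi * np.arange(N) / N)   # DFT sample points on unit circle
G = np.exp(lam * (z - 1.0))                  # Poisson pgf: G(z) = e^{lam(z-1)}
pmf = np.fft.ifft(G).real                    # P(0..N-1); aliasing negligible

exact = [exp(-lam) * lam**n / factorial(n) for n in range(10)]
```

The only error sources are floating point and aliasing of probability mass beyond N−1, which is negligible here since the Poisson tail above 63 is vanishingly small.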

  11. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  12. Probability tables and gauss quadrature: application to neutron cross-sections in the unresolved energy range

    International Nuclear Information System (INIS)

    Ribon, P.; Maillard, J.M.

    1986-09-01

    The idea of describing neutron cross-section fluctuations by sets of discrete values, called "probability tables", was formulated some 15 years ago. We propose to define the probability tables from moments by equating the moments of the actual cross-section distribution in a given energy range to the moments of the table. This definition introduces Padé approximants, orthogonal polynomials and Gauss quadrature. This mathematical basis applies very well to the total cross-section. Some difficulties appear when partial cross-sections are taken into account, linked to the ambiguity of the definition of multivariate Padé approximants. Nevertheless we propose solutions and choices which appear to be satisfactory. Comparisons are made with other definitions of probability tables and an example of the calculation of a mixture of nuclei is given. 18 refs
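
The moment-based construction can be sketched directly: the table values are the roots of the orthogonal polynomial determined by a Hankel system in the moments (the Padé/Gauss connection mentioned above), and the weights follow from a Vandermonde solve. Uniform moments are assumed here only as a check, since the answer is then the known two-point Gauss-Legendre rule on [0, 1].

```python
import numpy as np

# Sketch of a moment-based probability table: find n discrete cross-section
# values x_i and weights w_i whose moments match m_0..m_{2n-1} (a Gauss
# quadrature of the underlying distribution).
def probability_table(m, n):
    # orthogonal-polynomial coefficients from a Hankel system in the moments
    A = np.array([[m[i + j] for j in range(n)] for i in range(n)])
    b = -np.asarray(m[n:2 * n])
    c = np.linalg.solve(A, b)               # x^n + c[n-1] x^{n-1} + ... + c[0]
    x = np.sort(np.roots(np.concatenate(([1.0], c[::-1]))).real)  # table values
    V = np.vander(x, n, increasing=True).T  # match the first n moments
    w = np.linalg.solve(V, m[:n])           # table weights
    return x, w

m = np.array([1.0 / (k + 1) for k in range(4)])   # moments of U(0, 1)
x, w = probability_table(m, 2)
```

An n-point table built this way reproduces the first 2n moments of the distribution, which is exactly the Gauss-quadrature property exploited in the paper.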

  13. Probability tables and gauss quadrature: application to neutron cross-sections in the unresolved energy range

    International Nuclear Information System (INIS)

    Ribon, P.; Maillard, J.M.

    1986-01-01

    The idea of describing neutron cross-section fluctuations by sets of discrete values, called probability tables, was formulated some 15 years ago. The authors propose to define the probability tables from moments by equating the moments of the actual cross-section distribution in a given energy range to the moments of the table. This definition introduces Padé approximants, orthogonal polynomials and Gauss quadrature. This mathematical basis applies very well to the total cross-section. Some difficulties appear when partial cross-sections are taken into account, linked to the ambiguity of the definition of multivariate Padé approximants. Nevertheless the authors propose solutions and choices which appear to be satisfactory. Comparisons are made with other definitions of probability tables and an example of the calculation of a mixture of nuclei is given.

  14. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
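
The elastic-rebound (renewal) probability calculation at the heart of the forecast has this basic shape. A lognormal recurrence distribution and the parameter values below are simplifying assumptions for illustration; UCERF3 itself uses a Brownian Passage Time model within a much larger framework.

```python
from math import erf, exp, log, sqrt

# Renewal-model sketch: probability of a rupture in the next dt years given
# an open interval of t years since the last event.
def lognorm_cdf(t, mean_ri, aperiodicity):
    sigma = aperiodicity
    mu = log(mean_ri) - 0.5 * sigma**2      # so the mean recurrence is mean_ri
    return 0.5 * (1.0 + erf((log(t) - mu) / (sigma * sqrt(2.0))))

def conditional_prob(t_open, dt, mean_ri, aper):
    F = lambda s: lognorm_cdf(s, mean_ri, aper)
    return (F(t_open + dt) - F(t_open)) / (1.0 - F(t_open))

# 30-year probabilities on a fault with a 150-year mean recurrence interval:
p_early = conditional_prob(50.0, 30.0, 150.0, 0.5)   # soon after last event
p_late = conditional_prob(200.0, 30.0, 150.0, 0.5)   # long open interval
```

The dependence on the open interval is what makes the forecast time-dependent: a long open interval raises the conditional probability above the Poisson value, the "implied gain" discussed in the abstract.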

  15. Exploring flavor-dependent long-range forces in long-baseline neutrino oscillation experiments

    Science.gov (United States)

    Chatterjee, Sabya Sachi; Dasgupta, Arnab; Agarwalla, Sanjib Kumar

    2015-12-01

    The Standard Model gauge group can be extended with minimal matter content by introducing an anomaly-free U(1) symmetry, such as Le−Lμ or Le−Lτ. If the neutral gauge boson corresponding to this abelian symmetry is ultra-light, then it will give rise to a flavor-dependent long-range leptonic force, which can have a significant impact on neutrino oscillations. For instance, the electrons inside the Sun can generate a flavor-dependent long-range potential at the Earth's surface, which can suppress the νμ → νe appearance probability in terrestrial experiments. The sign of this potential is opposite for anti-neutrinos, and affects the oscillations of (anti-)neutrinos in different fashion. This feature invokes a fake CP-asymmetry like the SM matter effect and can severely affect leptonic CP-violation searches in long-baseline experiments. In this paper, we study in detail the possible impacts of these long-range flavor-diagonal neutral current interactions due to the Le−Lμ symmetry, when (anti-)neutrinos travel from Fermilab to Homestake (1300 km) and CERN to Pyhäsalmi (2290 km) in the context of the future high-precision superbeam facilities DUNE and LBNO, respectively. If there is no signal of a long-range force, DUNE (LBNO) can place a stringent constraint on the effective gauge coupling αeμ < 1.9 × 10⁻⁵³ (7.8 × 10⁻⁵⁴) at 90% C.L., which is almost 30 (70) times better than the existing bound from the Super-Kamiokande experiment. We also observe that if αeμ ≥ 2 × 10⁻⁵², the CP-violation discovery reach of these future facilities vanishes completely. The mass hierarchy measurement remains robust in DUNE (LBNO) if αeμ < 5 × 10⁻⁵² (10⁻⁵²).

  16. The temperature dependence of the BK channel activity - kinetics, thermodynamics, and long-range correlations.

    Science.gov (United States)

    Wawrzkiewicz-Jałowiecka, Agata; Dworakowska, Beata; Grzywna, Zbigniew J

    2017-10-01

    Large-conductance, voltage-dependent, Ca²⁺-activated potassium channels (BK) are transmembrane proteins that regulate many biological processes by controlling potassium flow across cell membranes. Here, we investigate to what extent temperature (in the range of 17-37°C, in ΔT=5°C steps) is a regulating parameter of the kinetic properties of channel gating and of the memory effect in the series of dwell times of the channel's successive states, at membrane depolarization and hyperpolarization. The obtained results indicate that temperature strongly affects the BK channels' gating but, counterintuitively, exerts no effect on the long-range correlations, as measured by the Hurst coefficient. Quantitative differences between the dependencies of the appropriate channel characteristics on temperature are evident for different voltage regimes. Examining the characteristics of BK channel activity as a function of temperature allows estimation of the net activation energy (E_act) and the changes of thermodynamic parameters (ΔH, ΔS, ΔG) upon channel opening. A larger E_act corresponds to channel activity at membrane hyperpolarization. The analysis of the entropy and enthalpy changes of the closed-to-open transition suggests an entropy-driven nature of the increase of open-state probability during voltage activation and supports the hypothesis of a voltage-dependent geometry of the channel vestibule. Copyright © 2017 Elsevier B.V. All rights reserved.
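
Estimating a net activation energy from rates measured at several temperatures amounts to an Arrhenius fit, which can be sketched as follows. The rates below are synthetic, and the prefactor and E_act are assumed values, not measurements from the paper.

```python
import numpy as np

# Arrhenius sketch: ln(rate) against 1/T is linear with slope -E_act / R,
# so a straight-line fit recovers the activation energy.
R = 8.314                                    # gas constant, J mol^-1 K^-1
E_act_true = 50e3                            # assumed value, J mol^-1
T = np.array([290.0, 295.0, 300.0, 305.0, 310.0])   # roughly 17-37 C, in K
rate = 1e9 * np.exp(-E_act_true / (R * T))   # ideal Arrhenius rates

slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
E_act_est = -slope * R                       # recovered activation energy
```

With real gating data the scatter of the points around the fitted line indicates how well a single-barrier Arrhenius picture describes the temperature dependence.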

  17. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with cluster-size-dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and the derivation of the distributions is discussed.

  18. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
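
A common parametric family for the (inverted) S-shaped weighting functions mentioned above is the Tversky-Kahneman form sketched below; the specific parameter value 0.61 is their published median estimate and is used here only for illustration, not as a result of this study.

```python
import numpy as np

# Tversky-Kahneman (1992) probability weighting function:
#   w(p) = p^g / (p^g + (1-p)^g)^(1/g)
# For g < 1 the curve is inverted-S-shaped: small probabilities are
# overweighted and large probabilities underweighted.  Individual
# differences correspond to different values of g.
def weight(p, g):
    p = np.asarray(p, dtype=float)
    return p**g / (p**g + (1.0 - p)**g) ** (1.0 / g)

p = np.linspace(0.01, 0.99, 99)
w = weight(p, 0.61)
```

In a hierarchical Bayesian treatment, each participant's g would be sampled from a weakly informative group-level prior, exactly the modelling choice the abstract advocates.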

  19. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  20. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
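
A finite-horizon ruin probability in the classical compound Poisson model can be estimated by simple Monte Carlo, as sketched below. The exponential claim sizes, premium rate, horizon and path counts are illustrative assumptions.

```python
import random

random.seed(42)

# Cramer-Lundberg sketch: premiums flow in at rate c, claims arrive as a
# Poisson process with exponential sizes (mean 1), and ruin occurs if the
# reserve goes negative before the horizon.
def ruin_probability(u0, c=1.2, lam=1.0, horizon=100.0, n_paths=3000):
    ruined = 0
    for _ in range(n_paths):
        u, t = u0, 0.0
        while t < horizon:
            wait = random.expovariate(lam)           # time to next claim
            t += wait
            if t >= horizon:
                break
            u += c * wait - random.expovariate(1.0)  # premium income - claim
            if u < 0:
                ruined += 1
                break
    return ruined / n_paths

p0 = ruin_probability(0.0)     # no initial reserve
p10 = ruin_probability(10.0)   # comfortable initial reserve
```

With a 20% premium loading, the classical closed-form result predicts roughly 0.83 ruin probability at zero reserve and an exponential decay in the initial reserve, which the simulation reproduces within Monte Carlo error.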

  1. On discriminating between long-range dependence and changes in mean

    OpenAIRE

    Berkes, István; Horváth, Lajos; Kokoszka, Piotr; Shao, Qi-Man

    2006-01-01

    We develop a testing procedure for distinguishing between a long-range dependent time series and a weakly dependent time series with change-points in the mean. In the simplest case, under the null hypothesis the time series is weakly dependent with one change in mean at an unknown point, and under the alternative it is long-range dependent. We compute the CUSUM statistic Tn, which allows us to construct an estimator k̂ of a change-point. We then compute the statistic Tn,1 based on the observa...
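
The CUSUM statistic and the change-point estimator k̂ described above can be sketched on a synthetic mean-shift series; the shift size, noise level and sample size below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# CUSUM sketch: T_n(k) compares the partial sum up to k with its expectation
# under a constant mean; the maximizer estimates the change-point location.
def cusum_changepoint(x):
    n = len(x)
    k = np.arange(1, n)                          # candidate split points
    stat = np.abs(np.cumsum(x)[:-1] - k * x.sum() / n)
    return int(np.argmax(stat)) + 1              # estimated change-point

x = np.concatenate([rng.normal(0.0, 1.0, 500),   # mean 0 before the change
                    rng.normal(1.0, 1.0, 500)])  # mean 1 after the change
k_hat = cusum_changepoint(x)                     # close to the true value 500
```

The same statistic, applied to a genuinely long-range dependent series with no change in mean, tends to stay large over wide regions, which is the ambiguity the testing procedure above is designed to resolve.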

  2. Testing for long-range dependence in world stock markets

    OpenAIRE

    Cajueiro, Daniel Oliveira; Tabak, Benjamin Miranda

    2008-01-01

    In this paper, we show a novel approach to rank stock market indices in terms of weak-form efficiency using state-of-the-art methodology in statistical physics. We employ the R/S and V/S methodologies to test for long-range dependence in equity returns and volatility. Empirical results suggest that although emerging markets possess stronger long-range dependence in equity returns than developed economies, this is not true for volatility. In the case of volatility, Hurst exponents
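
A minimal version of the R/S estimate of the Hurst exponent used in such studies is sketched below (the V/S variant and the empirical return data are omitted; white noise is assumed as input so the expected exponent is known).

```python
import numpy as np

rng = np.random.default_rng(3)

# Classical rescaled-range (R/S) analysis: for windows of size n, the range
# of cumulative deviations divided by the standard deviation grows like n^H;
# a log-log fit of the average R/S against n estimates H.
def rs_hurst(x, sizes=(16, 32, 64, 128, 256)):
    avg_rs = []
    for n in sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())
            r = dev.max() - dev.min()       # range of cumulative deviations
            s = w.std()
            if s > 0:
                rs.append(r / s)
        avg_rs.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(sizes), np.log(avg_rs), 1)
    return slope

h = rs_hurst(rng.normal(size=8192))   # roughly 0.5 for an uncorrelated series
```

Classical R/S is biased upward for short windows, which is one reason studies like this one also apply the modified V/S statistic before ranking markets.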

  3. Trait mindfulness, reasons for living and general symptom severity as predictors of suicide probability in males with substance abuse or dependence.

    Directory of Open Access Journals (Sweden)

    Parvaneh Mohammadkhani

    2015-03-01

    Full Text Available The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate the predictors of suicide probability based on trait mindfulness, reasons for living and severity of general psychiatric symptoms. Participants were 324 individuals with substance abuse or dependence in an outpatient setting and prison. The Reasons for Living Questionnaire, Mindfulness Attention Awareness Scale and Suicide Probability Scale were used as instruments. The sample was selected based on a convenience sampling method. Data were analyzed using SPSS and AMOS. The life-time prevalence of suicide attempt in the outpatient setting was 35% and it was 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, not reasons-for-living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs and child-related concerns significantly predicted suicide probability (p<0.001). It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability.

  4. Experimental impact-parameter--dependent probabilities for K-shell vacancy production by fast heavy-ion projectiles

    International Nuclear Information System (INIS)

    Randall, R.R.; Bednar, J.A.; Curnutte, B.; Cocke, C.L.

    1976-01-01

    The impact-parameter dependence of the probability for production of target K x rays has been measured for oxygen projectiles on copper and for carbon and fluorine projectiles on argon at scaled velocities near 0.5. The O-on-Cu data were taken for 1.56-, 1.88-, and 2.69-MeV/amu O beams incident upon thin Cu foils. A thin Ar-gas target was used for 1.56-MeV/amu C and F beams, permitting measurements to be made for charge-pure C⁴⁺, C⁶⁺, F⁹⁺ and F⁵⁺ projectiles. Ar and Cu K x rays were observed with a Si(Li) detector and scattered projectiles with a collimated surface-barrier detector. Comparison of the shapes of the measured K-vacancy-production probability curves with predictions of the semiclassical Coulomb approximation (SCA) shows adequate agreement for the O-on-Cu system. For the higher ratio of projectile-to-target nuclear charge (Z₁/Z₂) characterizing the C-on-Ar and F-on-Ar systems, the SCA predictions are entirely inadequate in describing the observed impact-parameter dependence. In particular, they cannot account for the large probabilities found at large impact parameters. Furthermore, the dependence of the shapes on the projectile charge state is found to become pronounced at larger Z₁/Z₂. Attempts to account for this behavior in terms of alternative vacancy-production processes are discussed.

  5. Prediction suppression in monkey inferotemporal cortex depends on the conditional probability between images.

    Science.gov (United States)

    Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R

    2016-01-01

    When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.
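
The distinction between contiguity and contingency can be made concrete with counts from two hypothetical training regimens that share the same joint probability P(A,B) but differ in the conditional probability P(B|A):

```python
# Contiguity is co-occurrence, P(A,B); contingency is predictiveness, P(B|A).
# Two regimens can match on the former while differing on the latter simply
# by varying how often A appears without B.  All counts are hypothetical.
def stats(n_ab, n_a_only, n_b_only, n_neither):
    n = n_ab + n_a_only + n_b_only + n_neither
    p_ab = n_ab / n                          # contiguity: joint probability
    p_b_given_a = n_ab / (n_ab + n_a_only)   # contingency: conditional
    return p_ab, p_b_given_a

# Regimen 1: A is always followed by B.
p_ab1, cond1 = stats(n_ab=25, n_a_only=0, n_b_only=0, n_neither=75)
# Regimen 2: same co-occurrence count, but A often appears alone.
p_ab2, cond2 = stats(n_ab=25, n_a_only=25, n_b_only=0, n_neither=50)
```

Designs of this shape are what allow the experiment to attribute prediction suppression to the conditional probabilities rather than to mere co-occurrence.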

  6. Measuring sensitivity in pharmacoeconomic studies. Refining point sensitivity and range sensitivity by incorporating probability distributions.

    Science.gov (United States)

    Nuijten, M J

    1999-07-01

    The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, to deal with uncertainty in economic studies. The original method was refined by the incorporation of probability distributions, which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables and that the overall sensitivity measure was based on a subjectively chosen range which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method. The response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes which better reflect the real uncertainty in economic studies.

  7. OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success

    Directory of Open Access Journals (Sweden)

    Cheng Zhu

    2014-01-01

    Full Text Available Focusing on the on-line multiagent scheduling problem, this paper considers a time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity-loss decentralized Markov decision process) model to include opportunity loss in scheduling decisions and improve overall performance. The success probability of job processing, as well as the processing duration, depends on the time at which the processing is started. The probability of completing the assigned job by an agent would be higher when the process is started earlier, but the opportunity loss could also be high due to the longer engaging duration. As a result, the OL-DEC-MDP model introduces a reward function considering the opportunity loss, which is estimated based on a prediction of the upcoming jobs by a sampling method on the job arrival. Heuristic strategies are introduced in computing the best starting time for an incoming job by each agent, and an incoming job will always be scheduled to the agent with the highest reward among all agents with their best starting policies. The simulation experiments show that the OL-DEC-MDP model improves the overall scheduling performance compared with models not considering opportunity loss in a heavily loaded environment.
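
The opportunity-loss trade-off can be sketched with hypothetical functional forms (all shapes and rates below are assumptions, not the paper's model): starting earlier raises the success probability but engages the agent for longer, forgoing expected reward from upcoming jobs.

```python
import math

# Hypothetical shapes: an earlier start time t gives a higher success
# probability but a longer engaging duration, hence a larger opportunity
# loss (expected reward forgone from upcoming jobs).
def success_prob(t):
    return 0.95 * math.exp(-0.3 * t)

def engaged_duration(t):
    return 12.0 * math.exp(-0.4 * t)

def expected_value(t, reward=10.0, loss_rate=0.8):
    opportunity_loss = loss_rate * engaged_duration(t)
    return success_prob(t) * reward - opportunity_loss

# The best start time balances the two effects: an interior optimum,
# not simply "start immediately".
best_t = max(range(9), key=expected_value)
```

In the full model this one-dimensional optimization is embedded in a decentralized MDP, with the loss rate estimated by sampling the predicted job-arrival stream.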

  8. Probability of primordial black hole formation and its dependence on the radial profile of initial configurations

    International Nuclear Information System (INIS)

    Hidalgo, J. C.; Polnarev, A. G.

    2009-01-01

    In this paper we derive the probability of the radial profiles of spherically symmetric inhomogeneities in order to provide an improved estimation of the number density of primordial black holes (PBHs). We demonstrate that the probability of PBH formation depends sensitively on the radial profile of the initial configuration. We do this by characterizing this profile with two parameters chosen heuristically: the amplitude of the inhomogeneity and the second radial derivative, both evaluated at the center of the configuration. We calculate the joint probability of initial cosmological inhomogeneities as a function of these two parameters and then find a correspondence between these parameters and those used in numerical computations of PBH formation. Finally, we extend our heuristic study to evaluate the probability of PBH formation taking into account for the first time the radial profile of curvature inhomogeneities.

  9. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  10. A better understanding of long-range temporal dependence of traffic flow time series

    Science.gov (United States)

    Feng, Shuo; Wang, Xingmin; Sun, Haowei; Zhang, Yi; Li, Li

    2018-02-01

    Long-range temporal dependence is an important perspective for modelling traffic flow time series. Various methods have been proposed to characterize it, including autocorrelation function analysis, spectral analysis and fractal analysis. However, few studies have examined daily temporal dependence (i.e. the similarity between different daily traffic flow time series), which can help us better understand long-range temporal dependence, such as the origin of the crossover phenomenon. Moreover, considering both types of dependence contributes to establishing more accurate models and depicting the properties of traffic flow time series. In this paper, we study the properties of daily temporal dependence with a simple averaging method and a Principal Component Analysis (PCA) based method. We also study long-range temporal dependence with Detrended Fluctuation Analysis (DFA) and Multifractal Detrended Fluctuation Analysis (MFDFA). The results show that both daily and long-range temporal dependence exert considerable influence on traffic flow series. The DFA results reveal that daily temporal dependence creates a crossover phenomenon when estimating the Hurst exponent, which depicts the long-range temporal dependence. Furthermore, the comparison of DFA tests shows that the PCA-based method is the better way to extract the daily temporal dependence, especially when the difference between days is significant.
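The DFA procedure used above is easy to sketch. The following is a minimal, generic DFA-1 implementation (not the paper's code): integrate the series, detrend it linearly within windows of increasing size, and read the scaling exponent off the log-log slope of the fluctuation function. For uncorrelated noise the exponent should be close to 0.5; long-range dependent series give values above 0.5.

```python
# Minimal DFA-1 sketch (standard algorithm, illustrative parameters).
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))                # integrated profile
    flucts = []
    for s in scales:
        t = np.arange(s)
        f2 = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)         # linear detrending per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # slope of log F(s) vs log s is the DFA (Hurst-like) exponent
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(0)
alpha = dfa(rng.standard_normal(8192), [16, 32, 64, 128, 256])
# for white noise, alpha is close to 0.5
```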

  11. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P_c) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P_c” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P_c” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R_c. For close-proximity satellites, such as those orbiting in formations or clusters, R_c variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R_c analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P_c” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P_c < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P_c ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P_c screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
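For context, the traditional 2D P_c reduces to integrating the relative-position error distribution over a disk of combined hard-body radius in the encounter plane. The following is a simplified numerical sketch (axis-aligned Gaussian covariance assumed; this is not CARA's software):

```python
# Simplified "2D Pc" sketch: integrate an axis-aligned bivariate Gaussian
# position error over the hard-body disk centered at the miss vector.
import numpy as np

def pc_2d(mx, my, sx, sy, hbr, n=801):
    """Riemann-sum integral of the Gaussian pdf over the disk of radius hbr at (mx, my)."""
    xs = np.linspace(mx - hbr, mx + hbr, n)
    ys = np.linspace(my - hbr, my + hbr, n)
    X, Y = np.meshgrid(xs, ys)
    inside = (X - mx) ** 2 + (Y - my) ** 2 <= hbr ** 2
    pdf = np.exp(-0.5 * ((X / sx) ** 2 + (Y / sy) ** 2)) / (2 * np.pi * sx * sy)
    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    return float(np.sum(pdf[inside]) * cell)

# Sanity check against the analytic isotropic, zero-miss case,
# where the probability is 1 - exp(-hbr^2 / (2 * sigma^2)).
pc_head_on = pc_2d(0.0, 0.0, 1.0, 1.0, 1.0)   # analytically 1 - e^(-1/2) ≈ 0.3935
```

The 3D method described in the abstract replaces this single encounter-plane integral with a time-resolved 3-dimensional integral, which is what makes the probability rate R_c accessible.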

  12. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes.

  13. Registration-Based Range-Dependence Compensation for Bistatic STAP Radars

    Directory of Open Access Journals (Sweden)

    Lapierre Fabian D

    2005-01-01

    We address the problem of detecting slow-moving targets using space-time adaptive processing (STAP) radar. Determining the optimum weights at each range requires data snapshots at neighboring ranges. However, in virtually all configurations, snapshot statistics are range dependent, meaning that snapshots are nonstationary with respect to range. This results in poor performance. In this paper, we propose a new compensation method based on registration of clutter ridges, designed to work on a single realization of the stochastic snapshot at each range. The method has been successfully tested on simulated stochastic snapshots. An evaluation of performance is presented.

  14. Long-range dependence in returns and volatility of Central European Stock Indices

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2010-01-01

    Roč. 2010, č. 3 (2010), s. 1-19 R&D Projects: GA ČR GD402/09/H045 Institutional research plan: CEZ:AV0Z10750506 Keywords : long-range dependence * rescaled range * modified rescaled range * bootstrapping Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-long-range dependence in returns and volatility of central european stock indices.pdf

  15. Range-separated time-dependent density-functional theory with a frequency-dependent second-order Bethe-Salpeter correlation kernel

    Energy Technology Data Exchange (ETDEWEB)

    Rebolini, Elisa, E-mail: elisa.rebolini@kjemi.uio.no; Toulouse, Julien, E-mail: julien.toulouse@upmc.fr [Laboratoire de Chimie Théorique, Sorbonne Universités, UPMC Univ Paris 06, CNRS, 4 place Jussieu, F-75005 Paris (France)

    2016-03-07

    We present a range-separated linear-response time-dependent density-functional theory (TDDFT) which combines a density-functional approximation for the short-range response kernel and a frequency-dependent second-order Bethe-Salpeter approximation for the long-range response kernel. This approach goes beyond the adiabatic approximation usually used in linear-response TDDFT and aims at improving the accuracy of calculations of electronic excitation energies of molecular systems. A detailed derivation of the frequency-dependent second-order Bethe-Salpeter correlation kernel is given using many-body Green-function theory. Preliminary tests of this range-separated TDDFT method are presented for the calculation of excitation energies of the He and Be atoms and small molecules (H{sub 2}, N{sub 2}, CO{sub 2}, H{sub 2}CO, and C{sub 2}H{sub 4}). The results suggest that the addition of the long-range second-order Bethe-Salpeter correlation kernel overall slightly improves the excitation energies.

  16. A novel nuclear dependence of nucleon–nucleon short-range correlations

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Hongkai [College of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070 (China); Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Wang, Rong, E-mail: rwang@impcas.ac.cn [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Lanzhou University, Lanzhou 730000 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Huang, Yin [Lanzhou University, Lanzhou 730000 (China); Chen, Xurong [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2017-06-10

    A linear correlation is found between the magnitude of nucleon–nucleon short-range correlations and the nuclear binding energy per nucleon with the pairing energy removed. Using this relation, the strengths of nucleon–nucleon short-range correlations of some unmeasured nuclei are predicted. Discussions of nucleon–nucleon pairing energy and nucleon–nucleon short-range correlations are presented. The observed nuclear dependence of nucleon–nucleon short-range correlations may shed some light on the short-range structure of nuclei.

  17. Attempts to reduce alcohol intake and treatment needs among people with probable alcohol dependence in England: a general population survey.

    Science.gov (United States)

    Dunne, Jacklyn; Kimergård, Andreas; Brown, Jamie; Beard, Emma; Buykx, Penny; Michie, Susan; Drummond, Colin

    2018-03-25

    To compare the proportion of people in England with probable alcohol dependence [Alcohol Use Disorders Identification Test (AUDIT) score ≥ 20] with those with other drinking patterns (categorized by AUDIT scores) in terms of motivation to reduce drinking and use of alcohol support resources. A combination of random probability and simple quota sampling was used to conduct monthly cross-sectional household computer-assisted interviews between March 2014 and August 2017 in the general population of all nine regions of England. Participants were drawn from the Alcohol Toolkit Study (ATS), a monthly household survey of alcohol consumption among people aged 16 years and over in England (n = 69 826). The mean age was 47 years [standard deviation (SD) = 18.78; 95% confidence interval (CI) = 46.8-47] and 51% (n = 35 560) were female. χ² tests were used to investigate associations with demographic variables, motivation to quit drinking, attempts to quit drinking, general practitioner (GP) engagement and types of support accessed in the last 12 months across AUDIT risk zones. A total of 0.6% were classified as people with probable alcohol dependence (95% CI = 0.5-0.7). Motivation to quit (χ² = 1692.27) differed significantly across AUDIT risk zones. People with probable dependence were more likely than other ATS participants to have made a past-year attempt to cut down or quit (51.8%) and to have received a specialist referral from their GP about drinking (13.7%), and less likely to report no motivation to reduce their drinking (26.2%). Those with probable dependence made greater use of self-help books and mobile applications (apps) than other ATS participants; however, 27.7% did not access any resources during their most recent attempt to cut down. Adults in England with probable alcohol dependence, measured through the Alcohol Use Disorders Identification Test, demonstrate higher motivation to quit drinking and greater use of both specialist treatment and self-driven support compared with those in other

  18. Testing for long-range dependence in the Brazilian term structure of interest rates

    International Nuclear Information System (INIS)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2009-01-01

    This paper presents empirical evidence of fractional dynamics in Brazilian interest rates for different maturities. A variation of a newly developed test for long-range dependence, the V/S statistic, with a post-blackening bootstrap is employed. Results suggest that Brazilian interest rates possess strong long-range dependence in volatility, even when the structural break in 1999 is taken into account. These findings imply that developing policy models that give rise to long-range dependence in interest-rate volatility could be very useful. The spread between long- and short-term interest rates exhibits strong long-range dependence, which suggests that traditional tests of the expectation hypothesis of the term structure of interest rates may be misspecified.

  19. HELIOS: transformation laws for multiple-collision probabilities with angular dependence

    International Nuclear Information System (INIS)

    Villarino, E.A.; Stamm'ler, R.J.J.

    1996-01-01

    In the lattice code HELIOS, neutron and gamma transport in a given system is treated by the CCCP (current-coupling collision-probability) method. The system is partitioned into space elements which are coupled by currents. Inside the space elements, first-flight probabilities are used to obtain the coefficients of the coupling equation and of the equations for the fluxes. The calculation of these coefficients is expensive in CPU time on two counts: the evaluation of the first-flight probabilities, and the matrix inversion that converts these probabilities into the desired coefficients. If the cross sections of two geometrically equal space elements, or of the same element at an earlier burnup level, differ by less than a small fraction, considerable CPU time can be saved by using transformation laws. Previously, such laws were derived for first-flight probabilities; here, they are derived for the multiple-collision coefficients of the CCCP equations. They avoid not only the expensive calculation of the first-flight probabilities, but also the subsequent matrix inversion. Various examples illustrate the savings achieved by using these new transformation laws, or by directly reusing earlier calculated coefficients when the cross section differences are negligible. (author)

  20. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Sissay, Adonay [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J. [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Lopata, Kenneth, E-mail: klopata@lsu.edu [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  1. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    International Nuclear Information System (INIS)

    Sissay, Adonay; Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J.; Lopata, Kenneth

    2016-01-01

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  2. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and in this article some influencing factors are identified using statistical and econometric models. The main approach applies probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is shown that the month of signing the contract, the year of signing, the gender and the age of the loan owner do not affect the probability of returning a loan. The probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth: it increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
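As a sketch of the kind of binary model the paper applies, here is a minimal logistic regression fitted by gradient ascent on synthetic data. The data-generating coefficients and the single predictor (loan sum) are assumptions for illustration, not the paper's data or estimates:

```python
# Minimal logistic-regression sketch: "loan returned" (1/0) modelled as a
# function of the loan sum. Synthetic data; coefficients are illustrative.
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
# Synthetic data in which larger sums are repaid more often,
# echoing the direction of the paper's finding (assumed, not real data).
sums = [random.uniform(100.0, 2000.0) for _ in range(2000)]
mu = sum(sums) / len(sums)
sd = (sum((s - mu) ** 2 for s in sums) / len(sums)) ** 0.5
xs = [(s - mu) / sd for s in sums]                       # standardize predictor
ys = [1 if random.random() < sigmoid(1.5 * x + 0.5) else 0 for x in xs]

b0 = b1 = 0.0
for _ in range(500):                                     # gradient ascent on the log-likelihood
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys))
    b0 += 0.001 * g0
    b1 += 0.001 * g1

# b1 > 0: the estimated probability of repayment rises with the loan sum.
```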

  3. Search for an intermediate-range composition-dependent force

    International Nuclear Information System (INIS)

    Boynton, P.E.; Crosby, D.; Ekstrom, P.; Szumilo, A.

    1987-01-01

    We have conducted an experiment to detect a composition-dependent force with range λ between 10 m and 1 km, and find a statistically significant effect. If interpreted as arising from a new force, this result and other recent measurements would be consistent in strength only if the coupling were predominantly to nuclear isospin.

  4. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Directory of Open Access Journals (Sweden)

    Wenhua Song

    2018-02-01

    This paper presents theoretical investigations of the probability density distribution (PDD) of ranging data for an imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on a physical model of the laser pulses reflected from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed for different fog concentrations, offering improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models to within a margin of error of less than 1%.

  5. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Science.gov (United States)

    Song, Wenhua; Lai, JianCheng; Ghassemlooy, Zabih; Gu, Zhiyong; Yan, Wei; Wang, Chunyong; Li, Zhenhua

    2018-02-01

    This paper presents theoretical investigations of the probability density distribution (PDD) of ranging data for an imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on a physical model of the laser pulses reflected from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed for different fog concentrations, offering improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models to within a margin of error of less than 1%.

  6. Maximizing probable oil field profit: uncertainties on well spacing

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1997-01-01

    The influence of uncertainties in field development costs, well costs, lifting costs, selling price, discount factor, and oil field reserves is evaluated for its impact on assessing probable ranges of uncertainty on present-day worth (PDW), oil field lifetime τ_2/3, optimum number of wells (OWI), and the minimum (n-) and maximum (n+) number of wells that produce a PDW ≥ 0. The relative importance of different factors in contributing to the uncertainties in PDW, τ_2/3, OWI, n- and n+ is also analyzed. Numerical illustrations indicate how the maximum PDW depends on the ranges of parameter values, drawn from probability distributions using Monte Carlo simulations. In addition, the procedure illustrates the relative importance of the contributions of individual factors to the total uncertainty, so that one can assess where to place effort to improve ranges of uncertainty, while the volatility of each estimate allows one to determine when such effort is needed. (author)
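A toy version of such a Monte Carlo assessment looks as follows. The one-period cash-flow model and every distribution below are illustrative assumptions, not the paper's model or numbers; the point is only that propagating parameter uncertainty yields a probable range for PDW rather than a single value:

```python
# Toy Monte Carlo: draw uncertain inputs from assumed distributions,
# propagate to a (grossly simplified) present-day worth, read off percentiles.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
reserves = rng.lognormal(mean=np.log(50e6), sigma=0.3, size=n)   # barrels (assumed)
price = rng.normal(20.0, 3.0, size=n)                            # $/barrel (assumed)
costs = rng.normal(400e6, 60e6, size=n)                          # total costs, $ (assumed)

pdw = reserves * price - costs                                   # single-period toy PDW

p10, p50, p90 = np.percentile(pdw, [10, 50, 90])                 # probable range of PDW
prob_profit = float(np.mean(pdw > 0))                            # estimate of P(PDW >= 0)
```

Repeating the calculation while freezing one input at a time at its mean shows how much each factor contributes to the spread, which is the "where to place effort" question the abstract raises.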

  7. Long-range dependence and sea level forecasting

    CERN Document Server

    Ercan, Ali; Abbasov, Rovshan K

    2013-01-01

    This study shows that the Caspian Sea level time series possesses long-range dependence even after removing linear trends, based on analyses of the Hurst statistic, the sample autocorrelation functions, and the periodogram of the series. The forecasting performance of ARMA, ARIMA, ARFIMA and Trend Line-ARFIMA (TL-ARFIMA) combination models is investigated. The forecast confidence bands and the forecast updating methodology, provided for ARIMA models in the literature, are modified for the ARFIMA models. Sample autocorrelation functions are utilized to estimate the differencing lengths of the ARFIMA models.

  8. Long-range dependence in returns and volatility of Central European Stock Indices

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2010-01-01

    Roč. 17, č. 27 (2010), s. 50-67 ISSN 1212-074X R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965 Grant - others:GA UK(CZ) 5183/2010 Institutional research plan: CEZ:AV0Z10750506 Keywords : long-range dependence * bootstrapping * rescaled range analysis * rescaled variance analysis Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-long-range dependence in returns and volatility of central european stock indices bces.pdf

  9. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  10. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor

    2015-01-01

    SPADnet-I is a prototype fully digital silicon photon counter with high spatial and temporal resolution, based on standard CMOS imaging technology and developed by the SPADnet consortium. As it is a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I on the angle of incidence, wavelength and polarization of the incident light was not known. Our targeted application area for this sensor is next-generation PET detector modules, where it will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of the PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm wide wavelength interval around the characteristic emission peak (λ = 420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of the PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced, appearing only from 50° onwards.

  11. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    For probability distributions whose coefficient of variation is not equal to unity, mathematical relations for approximating the distribution from its first two moments are derived using multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by a hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which admits only discrete values of the coefficient of variation.
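The two-moment fit can be made concrete for the simplest case. For a target coefficient of variation c with 1/2 < c² < 1, a two-phase hypoexponential distribution (the sum of two independent exponentials) matches a given mean m and c exactly; covering the rest of the range (0; 1) requires more phases. A sketch of the two-phase case (generic moment matching, not the paper's derivation):

```python
# Two-phase hypoexponential matched to a given mean m and coefficient of
# variation c, valid for 1/2 < c^2 < 1. The phase means are
#   m1,2 = (m/2) * (1 ± sqrt(2c^2 - 1)),
# so that m1 + m2 = m and m1^2 + m2^2 = (c*m)^2 (the variance).
import math

def hypoexp_rates(m, c):
    """Rates (lam1, lam2) of a two-phase hypoexponential with mean m and CoV c."""
    assert 0.5 < c * c < 1.0, "two phases only cover CoV^2 in (1/2, 1)"
    s = math.sqrt(2.0 * c * c - 1.0)
    m1 = 0.5 * m * (1.0 + s)          # mean of phase 1
    m2 = 0.5 * m * (1.0 - s)          # mean of phase 2
    return 1.0 / m1, 1.0 / m2

lam1, lam2 = hypoexp_rates(1.0, 0.8)
mean = 1 / lam1 + 1 / lam2            # recovers 1.0
var = 1 / lam1 ** 2 + 1 / lam2 ** 2   # recovers 0.8^2 = 0.64
```

Drawing one exponential sample per phase and summing them then generates random variables with exactly the requested first two moments.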

  12. Ruin Probabilities in a Dependent Discrete-Time Risk Model With Gamma-Like Tailed Insurance Risks

    Directory of Open Access Journals (Sweden)

    Xing-Fang Huang

    2017-03-01

    This paper considers a dependent discrete-time risk model, in which the insurance risks are represented by a sequence of independent and identically distributed real-valued random variables with a common Gamma-like tailed distribution, and the financial risks are denoted by another sequence of independent and identically distributed positive random variables with a finite upper endpoint, but a general dependence structure exists between each pair of insurance and financial risks. Following the work of Yang and Yuen in 2016, we derive some asymptotic relations for the finite-time and infinite-time ruin probabilities. As a complement, we demonstrate the obtained result through a Crude Monte Carlo (CMC) simulation with asymptotics.

  13. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  14. Linker-dependent Junction Formation Probability in Single-Molecule Junctions

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Pil Sun; Kim, Taekyeong [Hankuk University of Foreign Studies, Yongin (Korea, Republic of)]

    2015-01-15

    We compare the junction formation probabilities of single-molecule junctions with different linker molecules by using a scanning tunneling microscope-based break-junction technique. We found that the junction formation probability varies as SH > SMe > NH2 for the benzene backbone molecule with different types of anchoring groups, through quantitative statistical analysis. These results are attributed to different bonding forces according to the linker groups formed with Au atoms in the electrodes, which is consistent with previous works. Our work allows a better understanding of the contact chemistry in the metal-molecule junction for future molecular electronic devices.

  15. Impact-parameter dependence of the total probability for electromagnetic electron-positron pair production in relativistic heavy-ion collisions

    International Nuclear Information System (INIS)

    Hencken, K.; Trautmann, D.; Baur, G.

    1995-01-01

    We calculate the impact-parameter-dependent total probability P_total(b) for the electromagnetic production of electron-positron pairs in relativistic heavy-ion collisions in lowest order. We study especially impact parameters smaller than the Compton wavelength of the electron, where the equivalent-photon approximation cannot be used. Calculations with and without a form factor for the heavy ions are done; the influence is found to be small. The lowest-order results are found to violate unitarity and are used for the calculation of multiple-pair production probabilities with the help of the approximate Poisson distribution already found in earlier publications.
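
    The unitarity-restoring Poisson ansatz mentioned in this record can be sketched numerically. Here the mean multiplicity m is an arbitrary illustrative value, not a computed P_total(b):

    ```python
    import math

    def multiple_pair_probs(m, n_max=20):
        """Poisson probabilities P(N) = m^N e^{-m} / N! of producing N pairs,
        where m is the lowest-order mean pair multiplicity (possibly > 1)."""
        return [m**n * math.exp(-m) / math.factorial(n) for n in range(n_max + 1)]

    # A lowest-order "probability" m > 1 would violate unitarity if read
    # directly; the Poisson form keeps every P(N) in [0, 1] and sums to 1.
    probs = multiple_pair_probs(1.7)
    print(sum(probs))  # close to 1
    ```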

  16. Fractality Evidence and Long-Range Dependence on Capital Markets: a Hurst Exponent Evaluation

    Science.gov (United States)

    Oprean, Camelia; Tănăsescu, Cristina

    2014-07-01

    Since the existence of market memory could implicate the rejection of the efficient market hypothesis, the aim of this paper is to find any evidence that selected emergent capital markets (eight European and BRIC markets, namely Hungary, Romania, Estonia, Czech Republic, Brazil, Russia, India and China) exhibit long-range dependence or instead follow a random walk. In this paper, the Hurst exponent as calculated by R/S fractal analysis and Detrended Fluctuation Analysis is our measure of long-range dependence in the series. The results reinforce our previous findings and suggest that if stock returns present long-range dependence, the random walk hypothesis is not valid anymore and neither is the market efficiency hypothesis.
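
    A minimal R/S estimate of the Hurst exponent, as used in this record, can be sketched as follows; the window sizes and the white-noise test series are illustrative choices, not the authors' data:

    ```python
    import numpy as np

    def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
        """Estimate the Hurst exponent by classical rescaled-range (R/S) analysis."""
        x = np.asarray(x, dtype=float)
        rs_means = []
        for n in window_sizes:
            rs = []
            for start in range(0, len(x) - n + 1, n):
                chunk = x[start:start + n]
                z = np.cumsum(chunk - chunk.mean())  # cumulative deviations
                r = z.max() - z.min()                # range of deviations
                s = chunk.std()                      # standard deviation
                if s > 0:
                    rs.append(r / s)
            rs_means.append(np.mean(rs))
        # Slope of log(R/S) against log(n) estimates H
        slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
        return slope

    rng = np.random.default_rng(0)
    h = hurst_rs(rng.standard_normal(4096))  # i.i.d. returns: H near 0.5
    print(h)
    ```

    For an uncorrelated series H sits near 0.5 (the small-sample R/S estimator is slightly biased upward), while persistent long-memory series give H > 0.5.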

  17. Misclassification probability as obese or lean in hypercaloric and normocaloric diet

    Directory of Open Access Journals (Sweden)

    ANDRÉ F NASCIMENTO

    2008-01-01

    Full Text Available The aim of the present study was to determine the classification error probabilities, as lean or obese, in hypercaloric diet-induced obesity, which depend on the variable used to characterize animal obesity. In addition, the misclassification probabilities in animals submitted to a normocaloric diet were also evaluated. Male Wistar rats were randomly distributed into two groups: normal diet (ND; n=31; 3.5 kcal/g) and hypercaloric diet (HD; n=31; 4.6 kcal/g). The ND group received commercial Labina rat feed and HD animals a cycle of five hypercaloric diets for a 14-week period. The variables analysed were body weight, body composition, body weight to length ratio, Lee index, body mass index and misclassification probability. A 5% significance level was used. The hypercaloric pellet-diet cycle promoted an increase of body weight, carcass fat, body weight to length ratio and Lee index. The total misclassification probabilities ranged from 19.21% to 40.91%. In conclusion, the results of this experiment show that misclassification probabilities occur when dietary manipulation is used to promote obesity in animals. This misjudgement ranges from 19.49% to 40.52% in the hypercaloric diet and 18.94% to 41.30% in the normocaloric diet.
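
    The kind of misclassification probability studied here can be illustrated under a simplifying assumption of normally distributed group values and a single cutoff; the means, SDs and cutoff below are hypothetical, not the study's data:

    ```python
    import math

    def normal_cdf(x, mu, sigma):
        """Cumulative distribution function of a normal distribution."""
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def misclassification_probs(mu_lean, sd_lean, mu_obese, sd_obese, cutoff):
        """P(lean animal classified obese) and P(obese animal classified lean)
        for a single-cutoff rule on a normally distributed adiposity index."""
        p_lean_as_obese = 1.0 - normal_cdf(cutoff, mu_lean, sd_lean)
        p_obese_as_lean = normal_cdf(cutoff, mu_obese, sd_obese)
        return p_lean_as_obese, p_obese_as_lean

    # Hypothetical Lee-index-like means/SDs: heavy overlap between the groups
    # drives total misclassification into the tens of percent.
    p1, p2 = misclassification_probs(0.300, 0.012, 0.318, 0.014, 0.309)
    print(p1 + p2)
    ```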

  18. Persistent current and transmission probability in the Aharonov-Bohm ring with an embedded quantum dot

    International Nuclear Information System (INIS)

    Wu Suzhi; Li Ning; Jin Guojun; Ma Yuqiang

    2008-01-01

    Persistent current and transmission probability in the Aharonov-Bohm (AB) ring with an embedded quantum dot (QD) are studied using the technique of the scattering matrix. For the first time, we find that the persistent current can arise in the absence of magnetic flux in the ring with an embedded QD. The persistent current and the transmission probability are sensitive to the lead-ring coupling and the short-range potential barrier. It is shown that increasing the lead-ring coupling or the short-range potential barrier causes suppression of the persistent current and an increasing resonance width of the transmission probability. The effect of the potential barrier on the number of the transmission peaks is also investigated. The dependence of the persistent current and the transmission probability on the magnetic flux exhibits a periodic property with a period of the flux quantum.

  19. AUDIT-C scores as a scaled marker of mean daily drinking, alcohol use disorder severity, and probability of alcohol dependence in a U.S. general population sample of drinkers.

    Science.gov (United States)

    Rubinsky, Anna D; Dawson, Deborah A; Williams, Emily C; Kivlahan, Daniel R; Bradley, Katharine A

    2013-08-01

    Brief alcohol screening questionnaires are increasingly used to identify alcohol misuse in routine care, but clinicians also need to assess the level of consumption and the severity of misuse so that appropriate intervention can be offered. Information provided by a patient's alcohol screening score might provide a practical tool for assessing the level of consumption and severity of misuse. This post hoc analysis of data from the 2001 to 2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) included 26,546 U.S. adults who reported drinking in the past year and answered additional questions about their consumption, including Alcohol Use Disorders Identification Test-Consumption questionnaire (AUDIT-C) alcohol screening. Linear or logistic regression models and postestimation methods were used to estimate mean daily drinking, the number of endorsed alcohol use disorder (AUD) criteria ("AUD severity"), and the probability of alcohol dependence associated with each individual AUDIT-C score (1 to 12), after testing for effect modification by gender and age. Among eligible past-year drinkers, mean daily drinking, AUD severity, and the probability of alcohol dependence increased exponentially across increasing AUDIT-C scores. The fitted models can be used to estimate patient-specific consumption and severity based on age, gender, and alcohol screening score. This information could be integrated into electronic decision support systems to help providers estimate and provide feedback about patient-specific risks and identify those patients most likely to benefit from further diagnostic assessment. Copyright © 2013 by the Research Society on Alcoholism.
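
    The logistic postestimation described in this record can be sketched as follows; the intercept and slope are hypothetical placeholders, not the NESARC-fitted coefficients:

    ```python
    import math

    def p_dependence(score, b0=-6.0, b1=0.55):
        """Predicted probability of alcohol dependence from a logistic model
        on AUDIT-C score (coefficients are hypothetical, for illustration)."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * score)))

    # Probability rises steeply across the score range 1-12, mirroring the
    # "increased exponentially" pattern reported in the abstract.
    for s in (1, 4, 8, 12):
        print(s, round(p_dependence(s), 3))
    ```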

  20. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  1. Measurements of transition probabilities in the range from vacuum ultraviolet to infrared

    International Nuclear Information System (INIS)

    Peraza Fernandez, M.C.

    1992-01-01

    In this report we describe the design, testing and calibration of different spectrometers to measure transition probabilities from the vacuum ultraviolet to the infrared spectral region. For the infrared measurements we have designed and built a phase-sensitive detection system, using an InGaAs photodiode as the detector. With this system we have determined the transition probabilities of infrared lines of KrI and XeI. For these lines we have not found previous measurements. In the vacuum ultraviolet spectral region we have designed a 3 m normal-incidence monochromator on which we have installed an optical multichannel analyzer. We have verified its correct operation by obtaining the absorption spectrum of KrI. In the visible region we have obtained the emission spectrum of Al using different spectral sources: a hollow-cathode lamp and an Nd:YAG laser-produced Al plasma. With these spectra we have determined different atomic parameters such as transition probabilities and electron temperatures. (author). 83 refs

  2. Improving Delay-Range-Dependent Stability Condition for Systems with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Wei Qian

    2013-01-01

    Full Text Available This paper discusses the delay-range-dependent stability of systems with interval time-varying delay. By defining a new Lyapunov-Krasovskii functional (LKF) and estimating its derivative through the introduction of new vectors, free-weighting matrices and the reciprocally convex approach, new delay-range-dependent stability conditions are obtained. Two well-known examples are given to illustrate the reduced conservatism of the proposed theoretical results.

  3. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which
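
    The sampling-effort calculation implied by this record (spatially replicated hauls with a constant per-haul detection probability) can be sketched as follows; the p values are hypothetical:

    ```python
    import math

    def cumulative_detection(p, k):
        """Probability of detecting a species at least once in k independent
        seine hauls, given per-haul detection probability p."""
        return 1.0 - (1.0 - p) ** k

    def hauls_needed(p, confidence=0.95):
        """Smallest number of hauls giving at least the stated confidence of
        detecting the species when it is present."""
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

    # A moderately detectable species vs. a low-detectability one such as the
    # Arkansas River Shiner example in the abstract (p values hypothetical).
    print(hauls_needed(0.45))  # few hauls suffice
    print(hauls_needed(0.20))  # many more are required
    ```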

  4. Energy dependence of polymer gels in the orthovoltage energy range

    Directory of Open Access Journals (Sweden)

    Yvonne Roed

    2014-03-01

    Full Text Available Purpose: Orthovoltage energies are often used for treatment of patients' superficial lesions, and also for small-animal irradiations. Polymer-gel dosimeters such as MAGAT (methacrylic acid gel and THPC) are finding increasing use for 3-dimensional verification of radiation doses in a given treatment geometry. For mega-voltage beams, MAGAT has been quoted as nearly energy-independent. In the kilo-voltage range, there is hardly any literature to shed light on its energy dependence. Methods: MAGAT was used to measure depth-dose for a 250 kVp beam. Comparison with ion-chamber data showed a discrepancy increasing significantly with depth. An over-response of as much as 25% was observed at a depth of 6 cm. Results and Conclusion: The investigation concluded that 6 cm of water in the beam resulted in a half-value-layer (HVL) change from 1.05 to 1.32 mm Cu. This amounts to an effective-energy change from 81.3 to 89.5 keV. Response measurements of MAGAT at these two energies explained the observed discrepancy in depth-dose measurements. Dose-calibration curves of MAGAT for (i) a 250 kVp beam, and (ii) a 250 kVp beam through 6 cm of water column are presented, showing significant energy dependence. ------------------- Cite this article as: Roed Y, Tailor R, Pinksy L, Ibbott G. Energy dependence of polymer gels in the orthovoltage energy range. Int J Cancer Ther Oncol 2014; 2(2):020232. DOI: 10.14319/ijcto.0202.32

  5. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  6. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probabilistic calculations, often used for aftershock hazard assessment, and on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
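
    The two statistical laws combined in this record can be sketched as a probability calculation. All parameter values below are illustrative stand-ins shaped like the reported fits (b ≈ 1, p ≈ 0.5), not the study's estimates:

    ```python
    import math

    def expected_events(k, c, p, t1, t2, b, m_ref, m_target):
        """Expected number of events with magnitude >= m_target in (t1, t2) days,
        combining an Omori-Utsu rate for M >= m_ref with Gutenberg-Richter
        magnitude scaling. Valid for p != 1."""
        # Integral of the Omori-Utsu rate k / (t + c)^p over (t1, t2)
        n_ref = k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
        # GR law: fraction of events at or above the target magnitude
        return n_ref * 10.0 ** (-b * (m_target - m_ref))

    def prob_at_least_one(n):
        """Poisson probability of one or more events given expected count n."""
        return 1.0 - math.exp(-n)

    # 3-yr window starting ~446 days after the mainshock (parameter values
    # are hypothetical, chosen only to resemble the fitted b ~ 1, p ~ 0.5)
    n = expected_events(k=5.0, c=0.1, p=0.5, t1=446, t2=446 + 3 * 365,
                        b=1.0, m_ref=4.0, m_target=6.0)
    print(prob_at_least_one(n))
    ```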

  7. Stochastic processes and long range dependence

    CERN Document Server

    Samorodnitsky, Gennady

    2016-01-01

    This monograph is a gateway for researchers and graduate students to explore the profound, yet subtle, world of long-range dependence (also known as long memory). The text is organized around the probabilistic properties of stationary processes that are important for determining the presence or absence of long memory. The first few chapters serve as an overview of the general theory of stochastic processes which gives the reader sufficient background, language, and models for the subsequent discussion of long memory. The later chapters devoted to long memory begin with an introduction to the subject along with a brief history of its development, followed by a presentation of what is currently the best known approach, applicable to stationary processes with a finite second moment. The book concludes with a chapter devoted to the author’s own, less standard, point of view of long memory as a phase transition, and even includes some novel results. Most of the material in the book has not previously been publis...

  8. Multi-configuration time-dependent density-functional theory based on range separation

    DEFF Research Database (Denmark)

    Fromager, E.; Knecht, S.; Jensen, Hans Jørgen Aagaard

    2013-01-01

    Multi-configuration range-separated density-functional theory is extended to the time-dependent regime. An exact variational formulation is derived. The approximation, which consists in combining a long-range Multi-Configuration- Self-Consistent Field (MCSCF) treatment with an adiabatic short...... (srGGA) approximations. As expected, when modeling long-range interactions with the MCSCF model instead of the adiabatic Buijse-Baerends density-matrix functional as recently proposed by Pernal [J. Chem. Phys. 136, 184105 (2012)10.1063/1.4712019], the description of both the 1D doubly-excited state...

  9. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  10. The rate of beneficial mutations surfing on the wave of a range expansion.

    Directory of Open Access Journals (Sweden)

    Rémi Lehe

    Full Text Available Many theoretical and experimental studies suggest that range expansions can have severe consequences for the gene pool of the expanding population. Due to strongly enhanced genetic drift at the advancing frontier, neutral and weakly deleterious mutations can reach large frequencies in the newly colonized regions, as if they were surfing the front of the range expansion. These findings raise the question of how frequently beneficial mutations successfully surf at shifting range margins, thereby promoting adaptation towards a range-expansion phenotype. Here, we use individual-based simulations to study the surfing statistics of recurrent beneficial mutations on wave-like range expansions in linear habitats. We show that the rate of surfing depends on two strongly antagonistic factors, the probability of surfing given the spatial location of a novel mutation and the rate of occurrence of mutations at that location. The surfing probability strongly increases towards the tip of the wave. Novel mutations are unlikely to surf unless they enjoy a spatial head start compared to the bulk of the population. The needed head start is shown to be proportional to the inverse fitness of the mutant type, and only weakly dependent on the carrying capacity. The precise location dependence of surfing probabilities is derived from the non-extinction probability of a branching process within a moving field of growth rates. The second factor is the mutation occurrence which strongly decreases towards the tip of the wave. Thus, most successful mutations arise at an intermediate position in the front of the wave. We present an analytic theory for the tradeoff between these factors that allows to predict how frequently substitutions by beneficial mutations occur at invasion fronts. We find that small amounts of genetic drift increase the fixation rate of beneficial mutations at the advancing front, and thus could be important for adaptation during species invasions.
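
    The branching-process view of establishment used in this record can be sketched with a Galton-Watson process; Poisson offspring and the spatially homogeneous setting are modeling assumptions for illustration, not the authors' exact moving-field calculation:

    ```python
    import math

    def survival_probability(s, iters=2000):
        """Non-extinction probability of a Galton-Watson branching process with
        Poisson(1 + s) offspring -- a standard proxy for the establishment
        probability of a beneficial mutant with selective advantage s."""
        q = 0.0  # q_n = P(extinct by generation n), iterated via the pgf
        for _ in range(iters):
            q = math.exp((1.0 + s) * (q - 1.0))  # Poisson pgf evaluated at q
        return 1.0 - q

    # Haldane's classic small-s approximation: survival probability ~ 2s
    for s in (0.01, 0.05, 0.1):
        print(s, survival_probability(s))
    ```

    In the surfing setting the offspring mean (and hence the effective s) depends on the mutant's position in the wave, which is why the needed head start scales with the inverse fitness of the mutant type.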

  11. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1993-01-01

    This paper will describe a methodology which has been developed to allow accident probabilities associated with one severity category scheme to be transferred to another severity category scheme, permitting some comparisons of different studies at the category level. In this methodology, the severity category schemes to be compared are mapped onto a common set of axes. The axes represent critical accident environments (e.g., impact, thermal, crush, puncture) and indicate the range of accident parameters from zero (no accident) to the most severe credible forces. The choice of critical accident environments for the axes depends on the package being transported and the mode of transportation. The accident probabilities associated with one scheme are then transferred to the other scheme. This transfer of category probabilities is based on the relationships of the critical accident parameters to probability of occurrence. The methodology can be employed to transfer any quantity between category schemes if the appropriate supporting information is available. (J.P.N.)
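
    The transfer step described here can be sketched under a crude additional assumption (probability uniform within each source category) that the paper does not necessarily make; the category boundaries and probabilities below are hypothetical:

    ```python
    def transfer_probabilities(bounds_a, probs_a, bounds_b):
        """Redistribute per-category probabilities from scheme A to scheme B on
        a shared accident-parameter axis, assuming (as a crude sketch) that
        probability is uniform within each source category."""
        probs_b = [0.0] * (len(bounds_b) - 1)
        for i, p in enumerate(probs_a):
            lo_a, hi_a = bounds_a[i], bounds_a[i + 1]
            for j in range(len(probs_b)):
                lo_b, hi_b = bounds_b[j], bounds_b[j + 1]
                overlap = max(0.0, min(hi_a, hi_b) - max(lo_a, lo_b))
                probs_b[j] += p * overlap / (hi_a - lo_a)
        return probs_b

    # Hypothetical impact-velocity categories (m/s) from two category schemes
    b = transfer_probabilities([0, 10, 30, 60], [0.9, 0.09, 0.01], [0, 20, 60])
    print(b, sum(b))  # total probability is conserved
    ```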

  12. Time-dependent fracture probability of bilayer, lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation

    Science.gov (United States)

    Anusavice, Kenneth J.; Jadaan, Osama M.; Esquivel–Upshaw, Josephine

    2013-01-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. Objective The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Materials and methods Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Results Predicted fracture probabilities (Pf) for centrally-loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). Conclusion CARES/Life results support the proposed crown design and load orientation hypotheses. Significance The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represent an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. PMID:24060349

  13. Time-dependent fracture probability of bilayer, lithium-disilicate-based, glass-ceramic, molar crowns as a function of core/veneer thickness ratio and load orientation.

    Science.gov (United States)

    Anusavice, Kenneth J; Jadaan, Osama M; Esquivel-Upshaw, Josephine F

    2013-11-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8mm/0.8mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4mm/1.2mm). CARES/Life results support the proposed crown design and load orientation hypotheses. The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represent an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. Copyright © 2013 Academy of Dental Materials. All rights reserved.

  14. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  16. Normal tissue complication probabilities: dependence on choice of biological model and dose-volume histogram reduction scheme

    International Nuclear Information System (INIS)

    Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake

    2000-01-01

    Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on the ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high-dose region so that the resulting cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values depending on the DVH reduction method, the radiobiological model, and its input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use.
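The Lyman NTCP model combined with the Kutcher-Burman 'effective volume' DVH reduction can be sketched as follows. The parameter values (td50_1, m, n) are illustrative liver-like placeholders, not the fitted values used in the study, and the two plans below stand in for the paper's intersecting DVHs.

```python
import math

def ntcp_lyman(dvh, td50_1=45.0, m=0.18, n=0.32):
    """Lyman NTCP with 'effective volume' DVH reduction (sketch).
    dvh: list of (dose_Gy, fractional_volume) bins."""
    d_max = max(d for d, _ in dvh)
    # effective volume: fraction of the organ that, irradiated uniformly
    # at d_max, would be equivalent to the non-uniform DVH
    v_eff = sum(v * (d / d_max) ** (1.0 / n) for d, v in dvh)
    td50 = td50_1 * v_eff ** (-n)        # volume-adjusted tolerance dose
    t = (d_max - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# two competing plans with the same mean dose (40 Gy) but different uniformity
plan_uniform = [(40.0, 1.0)]
plan_hot_spot = [(30.0, 0.5), (50.0, 0.5)]
print(ntcp_lyman(plan_uniform), ntcp_lyman(plan_hot_spot))
```

With these placeholder parameters the non-uniform plan gets the higher NTCP, illustrating how the effective volume scheme (unlike a mean-dose-only reduction) can separate plans that share a mean dose.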

  17. Coevolution of patch-type dependent emigration and patch-type dependent immigration.

    Science.gov (United States)

    Weigang, Helene C

    2017-08-07

    The three phases of dispersal - emigration, transfer and immigration - affect one another, and the emigration and immigration decisions may depend on patch type. Despite the complexity of the dispersal process, patch-type dependencies of dispersal decisions, modelled as emigration and immigration, are usually missing from theoretical dispersal models. Here, I investigate the coevolution of patch-type dependent emigration and patch-type dependent immigration in an extended Hamilton-May model. The dispersing population inhabits a landscape structured into many patches of two types and disperses during a continuous-time season. The trait under consideration is a four-dimensional vector consisting of two values for emigration probability from the patches and two values for immigration probability into the patches of each type. Using the adaptive dynamics approach I show that four qualitatively different dispersal strategies may evolve in different parameter regions, including a counterintuitive strategy where patches of one type are fully dispersed from (emigration probability is one) but individuals nevertheless always immigrate into them during the dispersal season (immigration probability is one). I present examples of evolutionary branching in a wide parameter range, when the patches with high local death rate during the dispersal season guarantee a high expected disperser output. I find that two dispersal strategies can coexist after evolutionary branching: a strategy with full immigration only into the patches with high expected disperser output coexists with a strategy that immigrates into any patch. Stochastic simulations agree with the numerical predictions. Since evolutionary branching is also found when immigration evolves alone, the present study adds coevolutionary constraints on the emigration traits and hence finds that the coevolution of a higher-dimensional trait sometimes hinders evolutionary diversification. Copyright © 2017

  18. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  19. Probability of spin flipping of proton with energy 6.9 MeV at inelastic scattering with sup(54,56)Fe nuclei

    International Nuclear Information System (INIS)

    Prokopenko, V.S.; Sklyarenko, V.; Chernievskij, V.K.; Shustov, A.V.

    1980-01-01

    Spin-orbit effects in the inelastic scattering of protons by medium-weight nuclei, together with the reaction mechanisms, are investigated by measuring proton spin flip. The experiment measures proton-gamma coincidences in mutually perpendicular planes using a fast-slow coincidence technique. The excitation function of the ⁵⁶Fe(p,p′) reaction is measured in the 3.5-6.2 MeV energy range. Angular dependences of the proton spin-flip probability (the 2⁺ level at 0.847 MeV) are measured at incident proton energies of 4.96, 5.58 and 5.88 MeV. Spin-flip probabilities for inelastic scattering from ⁵⁴,⁵⁶Fe nuclei were measured as part of this study of spin-orbit effects and reaction mechanisms. It is concluded that inelastic scattering in the investigated energy range proceeds mainly via two comparable mechanisms: direct interaction and compound-nucleus formation. The angular dependences for ⁵⁴Fe and ⁵⁶Fe differ noticeably in the spin-flip probability over the angular range 50-150 deg.

  20. System Estimation of Panel Data Models under Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    A general dynamic panel data model is considered that incorporates individual and interactive fixed effects allowing for contemporaneous correlation in model innovations. The model accommodates general stationary or nonstationary long-range dependence through interactive fixed effects...... and innovations, removing the necessity to perform a priori unit-root or stationarity testing. Moreover, persistence in innovations and interactive fixed effects allows for cointegration; innovations can also have vector-autoregressive dynamics; deterministic trends can be featured. Estimations are performed...

  1. Common long-range dependence in a panel of hourly Nord Pool electricity prices and loads

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre; Haldrup, Niels; Rodríguez-Caballero, Carlos Vladimir

    to strong seasonal periodicity, and along the cross-sectional dimension, i.e. the hours of the day, there is a strong dependence which necessarily has to be accounted for in order to avoid spurious inference when focusing on the time series dependence alone. The long-range dependence is modelled in terms...... of a fractionally integrated panel data model and it is shown that both prices and loads consist of common factors with long memory and with loadings that vary considerably during the day. Due to the competitiveness of the Nordic power market the aggregate supply curve approximates well the marginal costs...... data approaches to analyse the time series and the cross-sectional dependence of hourly Nord Pool electricity spot prices and loads for the period 2000-2013. Hourly electricity prices and loads data are characterized by strong serial long-range dependence in the time series dimension in addition...

  2. Measurement of the Mis-identification Probability of τ Leptons from Hadronic Jets and from Electrons

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    Measurements of the mis-identification probability of QCD jets and electrons as hadronically decaying τ leptons using tag-and-probe methods are described. The analyses are based on 35 pb⁻¹ of proton-proton collision data, taken by the ATLAS experiment at a center-of-mass energy of √s = 7 TeV. The mis-identification probabilities range between 10% and 0.1% for QCD jets, and about (1-2)% for electrons. They depend on the identification algorithm chosen, the pT and the number of prongs of the τ candidate, and on the amount of pile-up present in the event.

  3. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distribution are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models; different types of failure dependency between software components must be modelled differently. Identifying the different types of failure dependency is therefore an important precondition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)

  4. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  5. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  6. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability, and that the effect differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that can calculate core damage probability in a short time, while varying the failure probabilities of various components between 0 and 1 and using either Japanese or American initiating event frequency data. The analysis showed that: (1) the frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high-pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the core damage probability increases substantially when the base failure probability increases; (2) core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little even when their failure probabilities vary by about an order of magnitude; (3) when Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value, whereas when American failure probability data are applied, the increase in core damage probability is large. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  7. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ ℝ with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ ℝ twice, and we give a sufficient condition for this event to have positive probability.

  8. Generalized Efficient Inference on Factor Models with Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    . Short-memory dynamics are allowed in the common factor structure and possibly heteroskedastic error term. In the estimation, a generalized version of the principal components (PC) approach is proposed to achieve efficiency. Asymptotics for efficient common factor and factor loading as well as long......A dynamic factor model is considered that contains stochastic time trends allowing for stationary and nonstationary long-range dependence. The model nests standard I(0) and I(1) behaviour smoothly in common factors and residuals, removing the necessity of a priori unit-root and stationarity testing...

  9. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich

    1995-01-01

    A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena, including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. The rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.

  10. Short-range correlations in an extended time-dependent mean-field theory

    International Nuclear Information System (INIS)

    Madler, P.

    1982-01-01

    A generalization of the time-dependent mean-field theory is performed by explicitly including strong short-range correlations at a level of microscopic reversibility, relating them to realistic nucleon-nucleon forces. Invoking a least-action principle for correlated trial wave functions, equations of motion for the correlation functions and the single-particle model wave function are derived in lowest order of the FAHT cluster expansion. Higher-order effects as well as long-range correlations are considered only to the extent to which they contribute to the mean field via a readjusted phenomenological effective two-body interaction. The corresponding correlated stationary problem is investigated and appropriate initial conditions to describe a heavy-ion reaction are proposed. The single-particle density matrix is evaluated.

  11. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  12. Should I stay or should I go? A habitat-dependent dispersal kernel improves prediction of movement.

    Directory of Open Access Journals (Sweden)

    Fabrice Vinatier

    The analysis of animal movement within different landscapes may increase our understanding of how landscape features affect the perceptual range of animals. Perceptual range is linked to the movement probability of an animal via a dispersal kernel, the latter generally being considered spatially invariant although it may in fact vary in space. We hypothesize that spatial plasticity of an animal's dispersal kernel could greatly modify its distribution in time and space. After radio tracking the movements of walking insects (Cosmopolites sordidus) in banana plantations, we considered the movements of individuals as states of a Markov chain whose transition probabilities depended on the habitat characteristics of current and target locations. Combining a likelihood procedure and pattern-oriented modelling, we tested the hypothesis that the dispersal kernel depended on habitat features. Our results were consistent with the concept that an animal's dispersal kernel depends on habitat features. Recognizing the plasticity of animal movement probabilities will provide insight into landscape-level ecological processes.

  13. Should I stay or should I go? A habitat-dependent dispersal kernel improves prediction of movement.

    Science.gov (United States)

    Vinatier, Fabrice; Lescourret, Françoise; Duyck, Pierre-François; Martin, Olivier; Senoussi, Rachid; Tixier, Philippe

    2011-01-01

    The analysis of animal movement within different landscapes may increase our understanding of how landscape features affect the perceptual range of animals. Perceptual range is linked to the movement probability of an animal via a dispersal kernel, the latter generally being considered spatially invariant although it may in fact vary in space. We hypothesize that spatial plasticity of an animal's dispersal kernel could greatly modify its distribution in time and space. After radio tracking the movements of walking insects (Cosmopolites sordidus) in banana plantations, we considered the movements of individuals as states of a Markov chain whose transition probabilities depended on the habitat characteristics of current and target locations. Combining a likelihood procedure and pattern-oriented modelling, we tested the hypothesis that the dispersal kernel depended on habitat features. Our results were consistent with the concept that an animal's dispersal kernel depends on habitat features. Recognizing the plasticity of animal movement probabilities will provide insight into landscape-level ecological processes.
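A toy version of the habitat-dependent Markov chain described in the two records above can be sketched as follows. The habitat types, emigration probabilities, and attraction weights are hypothetical placeholders, not the fitted values for Cosmopolites sordidus; the point is only that making both the leave probability and the target weight depend on habitat concentrates occupancy in the preferred patches.

```python
import random

# Hypothetical habitat-dependent movement chain on a 1-D row of cells:
# the probability of leaving the current cell and the weight of each
# neighbouring target cell both depend on habitat type.
LEAVE = {"banana": 0.2, "bare": 0.8}      # emigration probability by habitat
ATTRACT = {"banana": 3.0, "bare": 1.0}    # target-cell weight by habitat

def step(pos, habitat, rng):
    if rng.random() >= LEAVE[habitat[pos]]:
        return pos                          # stay put
    targets = [p for p in (pos - 1, pos + 1) if 0 <= p < len(habitat)]
    weights = [ATTRACT[habitat[p]] for p in targets]
    return rng.choices(targets, weights=weights)[0]

rng = random.Random(42)
habitat = ["banana", "bare", "bare", "banana", "bare"]
counts = [0] * len(habitat)
pos = 2
for _ in range(10000):
    pos = step(pos, habitat, rng)
    counts[pos] += 1
print(counts)  # occupancy concentrates in the 'banana' cells
```

A spatially invariant kernel would spread occupancy far more evenly; the habitat-dependent transition probabilities are what produce the skewed long-run distribution.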

  14. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
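The central claim, that plug-in thresholds under parameter uncertainty yield a realized failure frequency above the nominal level, is easy to demonstrate by Monte Carlo. This sketch uses a normal sample (the log of a log-normal risk factor) with a plug-in 95% quantile threshold; the sample size and trial count are illustrative, not the article's calculation.

```python
import random
import statistics

def realized_failure_prob(n=20, trials=20000, rng=None):
    """Monte Carlo sketch: estimate (mu, sigma) of a normal risk factor
    from n observations, set the control threshold at the estimated 95%
    quantile, and record how often a fresh draw breaches it. Under
    parameter uncertainty the realized frequency exceeds the nominal 5%."""
    rng = rng or random.Random(1)
    z = 1.6448536269514722            # standard normal 95% quantile
    failures = 0
    for _ in range(trials):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
        mu_hat = statistics.fmean(sample)
        sigma_hat = statistics.stdev(sample)
        threshold = mu_hat + z * sigma_hat   # plug-in quantile threshold
        if rng.gauss(0.0, 1.0) > threshold:  # new draw breaches the control
            failures += 1
    return failures / trials

p = realized_failure_prob()
print(p)  # noticeably above the nominal 0.05
```

This also illustrates the exact-calculation point: since the risk factor is location-scale, the breach event can be rewritten in terms of a pivotal t-statistic, so the realized frequency does not depend on the true (mu, sigma).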

  15. Rolling estimations of long range dependence volatility for high frequency S&P500 index

    Science.gov (United States)

    Cheong, Chin Wen; Pei, Tan Pei

    2015-10-01

    This study evaluates the time-varying long-range dependence behavior of the S&P500 volatility index using the modified rescaled adjusted range (R/S) statistic. To obtain better computational results, high-frequency rolling realized-volatility estimates based on bipower variation are used to avoid possible abrupt jumps. The empirical findings allow us to better understand the informational efficiency of the market before and after the subprime mortgage crisis.
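The rescaled-range idea behind the study can be sketched with the classic (unmodified) R/S statistic and a Hurst-exponent slope estimate; the paper itself uses the modified R/S statistic applied to bipower-variation realized volatility, so this is only the underlying textbook construction.

```python
import math
import random

def rescaled_range(series):
    """Classic R/S statistic of one block: range of the cumulative
    mean-adjusted sums divided by the block's standard deviation."""
    n = len(series)
    mean = sum(series) / n
    devs = [x - mean for x in series]
    cum, cums = 0.0, []
    for d in devs:
        cum += d
        cums.append(cum)
    r = max(cums) - min(cums)
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s

def hurst(series, min_k=8):
    """Estimate the Hurst exponent as the slope of log(mean R/S) vs
    log(k) over dyadic block sizes; H near 0.5 for i.i.d. noise,
    H > 0.5 under long-range dependence."""
    xs, ys = [], []
    k = min_k
    while k <= len(series) // 2:
        rs = [rescaled_range(series[i:i + k])
              for i in range(0, len(series) - k + 1, k)]
        xs.append(math.log(k))
        ys.append(math.log(sum(rs) / len(rs)))
        k *= 2
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

rng = random.Random(7)
white = [rng.gauss(0, 1) for _ in range(4096)]
print(round(hurst(white), 2))  # close to 0.5 (with a known upward small-sample bias)
```

Lo's modification, used for inference in studies like this one, replaces s with a long-run variance estimate so that short-memory autocorrelation is not mistaken for long-range dependence.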

  16. Range dependent characteristics in the head-related transfer functions of a bat-head cast: part 2. Binaural characteristics

    International Nuclear Information System (INIS)

    Kim, S; Allen, R; Rowan, D

    2012-01-01

    Further innovations in bio-inspired engineering based on biosonar systems, such as those of bats, may arise from a more detailed understanding of the underlying acoustic processes. This includes the range-dependent properties of bat heads and ears, particularly at the higher frequencies of bat vocalizations. In a companion paper (Kim et al 2012 Bioinspir. Biomim.), range-dependent head-related transfer functions of a bat-head cast were investigated up to 100 kHz at either ear (i.e. monaural features). The current paper extends this to consider range-dependent spectral and temporal disparities between the two ears (i.e. binaural features), using experimental data and a spherical model of a bat head to provide insights into the physical basis for these features. It was found that binaural temporal and high-frequency binaural spectral features are approximately independent of distance, which has the effect of decreasing their angular resolution at close range. In contrast, low-frequency binaural spectral features are strongly distance-dependent, such that angular sensitivity can be maintained by lowering the frequency of the echolocation emission at close range. Together with the companion paper, we speculate that distance-dependent low-frequency monaural and binaural features at short range might help explain why some species of bats drop the frequency of their calls while approaching a target. This also provides an impetus for the design of effective emissions in sonar engineering applied to similar tasks. (paper)

  17. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  18. Long Range Dependence Prognostics for Bearing Vibration Intensity Chaotic Time Series

    Directory of Open Access Journals (Sweden)

    Qing Li

    2016-01-01

    According to the chaotic features and typical fractional-order characteristics of bearing vibration intensity time series, a forecasting approach based on long-range dependence (LRD) is proposed. In order to reveal the internal chaotic properties, the vibration intensity time series are reconstructed in phase space based on chaos theory: the delay time is computed with the C-C method, and the optimal embedding dimension and saturated correlation dimension are calculated via the Grassberger–Procaccia (G-P) method, so that the chaotic characteristics of the vibration intensity time series can be jointly determined by the largest Lyapunov exponent and the phase-plane trajectory, where the largest Lyapunov exponent is calculated by the Wolf method and the phase-plane trajectory is illustrated using a Duffing-Holmes oscillator (DHO). The Hurst exponent and a long-range dependence prediction method are used to verify the typical fractional-order features and to improve the prediction accuracy of the bearing vibration intensity time series, respectively. Experiments show that the vibration intensity time series have chaotic properties and that the LRD prediction method outperforms the other prediction methods (largest Lyapunov exponent, autoregressive moving average (ARMA), and BP neural network (BPNN) models) in prediction accuracy and prediction performance, which provides a new approach to running-trend prediction for rotating machinery and offers guidance for engineering practice.

  19. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  20. Gravity and count probabilities in an expanding universe

    Science.gov (United States)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  1. NEUTRON-PROTON EFFECTIVE RANGE PARAMETERS AND ZERO-ENERGY SHAPE DEPENDENCE.

    Energy Technology Data Exchange (ETDEWEB)

    HACKENBURG, R.W.

    2005-06-01

    A completely model-independent effective range theory fit to available, unpolarized, np scattering data below 3 MeV determines the zero-energy free proton cross section {sigma}{sub 0} = 20.4287 {+-} 0.0078 b, the singlet apparent effective range r{sub s} = 2.754 {+-} 0.018{sub stat} {+-} 0.056{sub syst} fm, and slightly improves the error on the parahydrogen coherent scattering length, a{sub c} = -3.7406 {+-} 0.0010 fm. The triplet and singlet scattering lengths and the triplet mixed effective range are calculated to be a{sub t} = 5.4114 {+-} 0.0015 fm, a{sub s} = -23.7153 {+-} 0.0043 fm, and {rho}{sub t}(0,-{epsilon}{sub t}) = 1.7468 {+-} 0.0019 fm. The model-independent analysis also determines the zero-energy effective ranges by treating them as separate fit parameters without the constraint from the deuteron binding energy {epsilon}{sub t}. These are determined to be {rho}{sub t}(0,0) = 1.705 {+-} 0.023 fm and {rho}{sub s}(0,0) = 2.665 {+-} 0.056 fm. This determination of {rho}{sub t}(0,0) and {rho}{sub s}(0,0) is most sensitive to the sparse data between about 20 and 600 keV, where the correlation between the determined values of {rho}{sub t}(0,0) and {rho}{sub s}(0,0) is at a minimum. This correlation is responsible for the large systematic error in r{sub s}. More precise data in this range are needed. The present data do not even determine (with confidence) that {rho}{sub t}(0,0) {ne} {rho}{sub t}(0, -{epsilon}{sub t}), referred to here as "zero-energy shape dependence". The widely used measurement of {sigma}{sub 0} = 20.491 {+-} 0.014 b from W. Dilg, Phys. Rev. C 11, 103 (1975), is argued to be in error.
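
    The headline numbers above are internally consistent: the zero-energy cross section and the coherent scattering length both follow arithmetically from the fitted scattering lengths, via the standard spin-weighted formulas sigma0 = pi(3 a_t^2 + a_s^2) for the free cross section and, for hydrogen, the bound-proton coherent length a_c = (A+1)/A x (3 a_t + a_s)/4 with A = 1. A quick check (the formulas are textbook relations, not taken from this abstract):

    ```python
    import math

    # Scattering lengths from the fit (fm)
    a_t = 5.4114     # triplet
    a_s = -23.7153   # singlet

    # Zero-energy free-proton cross section: spin-weighted sum of triplet
    # (weight 3/4) and singlet (weight 1/4); 1 barn = 100 fm^2
    sigma0_barn = math.pi * (3 * a_t**2 + a_s**2) / 100.0

    # Parahydrogen coherent scattering length: free-proton combination
    # (3*a_t + a_s)/4 times the bound-proton factor (A+1)/A = 2
    a_c = (3 * a_t + a_s) / 2.0

    print(f"sigma0 = {sigma0_barn:.4f} b")   # ~20.43 b, cf. 20.4287(78) b
    print(f"a_c    = {a_c:.4f} fm")          # ~ -3.74 fm, cf. -3.7406(10) fm
    ```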

  2. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star graphs is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
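
    As a sketch of the kind of Monte Carlo calculation described, the code below estimates the fixation probability of a single mutant on a complete graph (a single clique, the simplest member of this family of graphs); by the isothermal theorem this should reproduce the well-mixed Moran result rho = (1 - 1/r)/(1 - 1/r^N). The specific graph families and parameters of the paper are not reproduced here; the population size, fitness, and trial count are assumptions.

    ```python
    import random

    def fixation_probability(n, r, trials, rng):
        """Monte Carlo fixation probability of a single mutant (fitness r)
        in a Moran process on the complete graph with n vertices."""
        fixed = 0
        for _ in range(trials):
            i = 1  # current number of mutants
            while 0 < i < n:
                # Reproduction: pick the parent with probability proportional to fitness
                birth_mutant = rng.random() < r * i / (r * i + (n - i))
                if birth_mutant:
                    # Offspring replaces a uniform neighbor: a resident
                    # with probability (n - i)/(n - 1), increasing i
                    if rng.random() >= (i - 1) / (n - 1):
                        i += 1
                else:
                    # Resident parent; replaced neighbor is a mutant
                    # with probability i/(n - 1), decreasing i
                    if rng.random() < i / (n - 1):
                        i -= 1
            fixed += (i == n)
        return fixed / trials

    rng = random.Random(42)
    n, r = 6, 2.0
    est = fixation_probability(n, r, 20000, rng)
    exact = (1 - 1 / r) / (1 - 1 / r**n)   # well-mixed Moran prediction
    print(est, exact)                       # est should be close to exact (~0.508)
    ```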

  3. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  4. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
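
    For context, a minimal sketch of a classical one-sample P-P plot, which the generalized plots extend (the paper's indexing by the class of closed intervals is not reproduced here; the uniform test data are an assumption):

    ```python
    import random

    def pp_plot_points(sample, cdf):
        """Classical one-sample P-P plot: the points (F(x_(i)), i/n) for the
        order statistics x_(i). Under a correctly specified distribution F,
        the points lie near the diagonal."""
        xs = sorted(sample)
        n = len(xs)
        return [(cdf(x), (i + 1) / n) for i, x in enumerate(xs)]

    # Example: test standard-uniform data against the U(0,1) CDF
    rng = random.Random(0)
    data = [rng.random() for _ in range(500)]
    points = pp_plot_points(data, lambda x: min(max(x, 0.0), 1.0))
    max_dev = max(abs(u - v) for u, v in points)
    print(max_dev)  # small for a correctly specified model
    ```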

  5. Calculation of probabilities of rotational transitions of two-atom molecules in the collision with heavy particles

    International Nuclear Information System (INIS)

    Vargin, A.N.; Ganina, N.A.; Konyukhov, V.K.; Selyakov, V.I.

    1975-01-01

    The problem of calculating collisional probabilities of rotational transitions (CPRT) in molecule-molecule and molecule-atom interactions in three-dimensional space is solved in this paper using a quasiclassical approach. The calculation proceeded as follows: the particle motion trajectory was computed by a classical method, yielding the time dependence of the perturbation operator; averaging this operator over the wave functions of the initial and final states produced the CPRT. The classical calculation of the molecular trajectory is justified by the smallness of the de Broglie wavelength compared with characteristic atomic distances, and by the smallness of the transferred rotational quantum compared with the energy of translational motion of the particles. The results of the calculation depend on the chosen interaction potential of the colliding particles. It follows from the Massey criterion that the region of nonadiabaticity of the interaction is comparable with the internuclear distances of a molecule; therefore, a short-range potential is required to describe the interaction. Analytical expressions suitable for practical calculations were obtained for one- and two-quantum rotational transitions of diatomic molecules. The CPRT was averaged over the Maxwell velocity distribution, and analytical dependences on gas temperature were obtained. Numerical calculations of the probabilities for the HCl-HCl, HCl-He, and CO-CO interactions are presented to illustrate the method

  6. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  7. Long-range spatial dependence in fractured rock. Empirical evidence and implications for tracer transport

    International Nuclear Information System (INIS)

    Painter, S.

    1999-02-01

    Nonclassical stochastic continuum models incorporating long-range spatial dependence are evaluated as models for fractured crystalline rock. Open fractures and fracture zones are not modeled explicitly in this approach. The fracture zones and intact rock are modeled as a single stochastic continuum. The large contrasts between the fracture zones and unfractured rock are accounted for by making use of random field models specifically designed for highly variable systems. Hydraulic conductivity data derived from packer tests in the vicinity of the Aespoe Hard Rock Laboratory form the basis for the evaluation. The Aespoe log K data were found to be consistent with a fractal scaling model based on bounded fractional Levy motion (bfLm), a model that has been used previously to model highly variable sedimentary formations. However, the data are not sufficient to choose between this model, a fractional Brownian motion model for the normal-score transform of log K, and a conventional geostatistical model. Stochastic simulations conditioned by the Aespoe data coupled with flow and tracer transport calculations demonstrate that the models with long-range dependence predict earlier arrival times for contaminants. This demonstrates the need to evaluate this class of models when assessing the performance of proposed waste repositories. The relationship between intermediate-scale and large-scale transport properties in media with long-range dependence is also addressed. A new Monte Carlo method for stochastic upscaling of intermediate-scale field data is proposed

  8. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming language that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  9. Survival probability in a one-dimensional quantum walk on a trapped lattice

    International Nuclear Information System (INIS)

    Goenuelol, Meltem; Aydiner, Ekrem; Shikano, Yutaka; Muestecaplioglu, Oezguer E

    2011-01-01

    The dynamics of the survival probability of quantum walkers on a one-dimensional lattice with random distribution of absorbing immobile traps is investigated. The survival probability of quantum walkers is compared with that of classical walkers. It is shown that the time dependence of the survival probability of quantum walkers has a piecewise stretched exponential character depending on the density of traps in numerical and analytical observations. The crossover between the quantum analogues of the Rosenstock and Donsker-Varadhan behavior is identified.

  10. Prediction ranges. Annual review

    Energy Technology Data Exchange (ETDEWEB)

    Parker, J.C.; Tharp, W.H.; Spiro, P.S.; Keng, K.; Angastiniotis, M.; Hachey, L.T.

    1988-01-01

    Prediction ranges equip the planner with one more tool for improved assessment of the outcome of a course of action. One of their major uses is in financial evaluations, where corporate policy requires the performance of uncertainty analysis for large projects. This report gives an overview of the uses of prediction ranges, with examples; and risks and uncertainties in growth, inflation, and interest and exchange rates. Prediction ranges and standard deviations of 80% and 50% probability are given for various economic indicators in Ontario, Canada, and the USA, as well as for foreign exchange rates and Ontario Hydro interest rates. An explanatory note on probability is also included. 23 tabs.
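
    Assuming normally distributed forecast errors — one common way to turn a standard deviation into 80% and 50% prediction ranges; the report's own method may differ — a range of a given probability is the point forecast plus or minus a z-multiple of the standard deviation. A sketch (the 6.0% rate forecast and sigma = 1.5 points are hypothetical, not the report's figures):

    ```python
    from statistics import NormalDist

    def prediction_range(point_forecast, std_dev, probability):
        """Symmetric prediction range covering the stated probability
        under a normal error assumption."""
        z = NormalDist().inv_cdf(0.5 + probability / 2)
        return point_forecast - z * std_dev, point_forecast + z * std_dev

    # Hypothetical example: a 6.0% interest-rate forecast with sigma = 1.5 points
    lo80, hi80 = prediction_range(6.0, 1.5, 0.80)
    lo50, hi50 = prediction_range(6.0, 1.5, 0.50)
    print(f"80% range: {lo80:.2f}-{hi80:.2f}")  # 4.08-7.92
    print(f"50% range: {lo50:.2f}-{hi50:.2f}")  # 4.99-7.01
    ```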

  11. Climate driven range divergence among host species affects range-wide patterns of parasitism

    Directory of Open Access Journals (Sweden)

    Richard E. Feldman

    2017-01-01

    Full Text Available Species interactions like parasitism influence the outcome of climate-driven shifts in species ranges. For some host species, parasitism can only occur in that part of its range that overlaps with a second host species. Thus, predicting future parasitism may depend on how the ranges of the two hosts change in relation to each other. In this study, we tested whether the climate-driven species range shift of Odocoileus virginianus (white-tailed deer) accounts for predicted changes in parasitism of two other species from the family Cervidae, Alces alces (moose) and Rangifer tarandus (caribou), in North America. We used MaxEnt models to predict the recent (2000) and future (2050) ranges (probabilities of occurrence) of the cervids and a parasite, Parelaphostrongylus tenuis (brainworm), taking into account range shifts of the parasite's intermediate gastropod hosts. Our models predicted that range overlap between A. alces/R. tarandus and P. tenuis will decrease between 2000 and 2050, an outcome that reflects decreased overlap between A. alces/R. tarandus and O. virginianus and not the parasites themselves. Geographically, our models predicted increasing potential occurrence of P. tenuis where A. alces/R. tarandus are likely to decline, but minimal spatial overlap where A. alces/R. tarandus are likely to increase. Thus, parasitism may exacerbate climate-mediated southern contraction of A. alces and R. tarandus ranges but will have limited influence on northward range expansion. Our results suggest that the spatial dynamics of one host species may be the driving force behind future rates of parasitism for another host species.

  12. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    Science.gov (United States)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the necessary experimental data to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction - a surrogate for 239Pu(n, f) - at an alpha energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to

  13. Hospital of Diagnosis Influences the Probability of Receiving Curative Treatment for Esophageal Cancer.

    Science.gov (United States)

    van Putten, Margreet; Koëter, Marijn; van Laarhoven, Hanneke W M; Lemmens, Valery E P P; Siersema, Peter D; Hulshof, Maarten C C M; Verhoeven, Rob H A; Nieuwenhuijzen, Grard A P

    2018-02-01

    The aim of this article was to study the influence of hospital of diagnosis on the probability of receiving curative treatment and its impact on survival among patients with esophageal cancer (EC). Although EC surgery is centralized in the Netherlands, the disease is often diagnosed in hospitals that do not perform this procedure. Patients with potentially curable (cT1-3,X, any N, M0,X) esophageal or gastroesophageal junction tumors diagnosed between 2005 and 2013 were selected from the Netherlands Cancer Registry. Multilevel logistic regression was performed to examine the probability of undergoing curative treatment (resection with or without neoadjuvant treatment, definitive chemoradiotherapy, or local tumor excision) according to hospital of diagnosis. Effects of variation in probability of undergoing curative treatment among these hospitals on survival were investigated by Cox regression. All 13,017 patients with potentially curable EC, diagnosed in 91 hospitals, were included. The proportion of patients receiving curative treatment ranged from 37% to 83% and from 45% to 86% in the periods 2005-2009 and 2010-2013, respectively, depending on hospital of diagnosis. After adjustment for patient- and hospital-related characteristics these proportions ranged from 41% to 77% and from 50% to 82%, respectively (both P < 0.001). Multivariable survival analyses showed that patients diagnosed in hospitals with a low probability of undergoing curative treatment had a worse overall survival (hazard ratio = 1.13, 95% confidence interval 1.06-1.20; hazard ratio = 1.15, 95% confidence interval 1.07-1.24). The variation in probability of undergoing potentially curative treatment for EC between hospitals of diagnosis and its impact on survival indicates that treatment decision making in EC may be improved.

  14. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  15. A new variable interval schedule with constant hazard rate and finite time range.

    Science.gov (United States)

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
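
    One reading of the sampling scheme above can be sketched as follows. The trial duration is drawn from U(0, 2T) as stated; the concrete linear rule p = d/(2T) and the value of T are assumptions for illustration, so this is a sketch of the idea rather than the authors' exact procedure.

    ```python
    import random

    def vi_trial(T, rng):
        """One trial of a bounded VI schedule: duration d sampled from
        U(0, 2T); the trial ends reinforced with probability d / (2T),
        a reinforcement rule linear in trial duration."""
        d = rng.uniform(0, 2 * T)
        reinforced = rng.random() < d / (2 * T)
        return d, reinforced

    rng = random.Random(7)
    T = 30.0
    trials = [vi_trial(T, rng) for _ in range(100000)]
    mean_duration = sum(d for d, _ in trials) / len(trials)
    reinforced_rate = sum(r for _, r in trials) / len(trials)
    print(mean_duration)    # ~T: durations average T seconds
    print(reinforced_rate)  # ~0.5: half the trials end in reinforcement
    ```

    Every interval stays below the 2T bound, so the experimenter never waits arbitrarily long, which is the practical advantage over the classical exponential VI schedule.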

  16. Approximation for the Finite-Time Ruin Probability of a General Risk Model with Constant Interest Rate and Extended Negatively Dependent Heavy-Tailed Claims

    Directory of Open Access Journals (Sweden)

    Yang Yang

    2011-01-01

    Full Text Available We propose a general continuous-time risk model with a constant interest rate. In this model, claims arrive according to an arbitrary counting process, while their sizes have dominantly varying tails and fulfill an extended negative dependence structure. We obtain an asymptotic formula for the finite-time ruin probability, which extends a corresponding result of Wang (2008).
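
    A toy Monte Carlo version of such a model can be sketched as follows, with a Poisson counting process and Pareto (heavy-tailed) claim sizes standing in for the paper's more general assumptions; all parameter values are hypothetical.

    ```python
    import math
    import random

    def finite_time_ruin_prob(x0, premium, delta, lam, pareto_alpha,
                              horizon, trials, rng):
        """Monte Carlo finite-time ruin probability for a toy risk model:
        the surplus earns constant interest delta, premiums accrue
        continuously at rate `premium`, and claims arrive as a Poisson
        process (rate lam) with Pareto(alpha) sizes."""
        ruined = 0
        for _ in range(trials):
            u, t = float(x0), 0.0
            while True:
                w = rng.expovariate(lam)        # wait until the next claim
                if t + w > horizon:
                    break
                # accrue interest on the surplus and on premiums paid in (t, t+w)
                u = u * math.exp(delta * w) + premium * (math.exp(delta * w) - 1.0) / delta
                t += w
                u -= rng.paretovariate(pareto_alpha)   # heavy-tailed claim
                if u < 0:
                    ruined += 1
                    break
        return ruined / trials

    rng = random.Random(11)
    p_small = finite_time_ruin_prob(1.0, 3.0, 0.05, 1.0, 2.0, 10.0, 2000, rng)
    p_large = finite_time_ruin_prob(50.0, 3.0, 0.05, 1.0, 2.0, 10.0, 2000, rng)
    print(p_small, p_large)  # more initial capital -> smaller ruin probability
    ```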

  17. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  18. Attributes of seasonal home range influence choice of migratory strategy in white-tailed deer

    Science.gov (United States)

    Henderson, Charles R.; Mitchell, Michael S.; Myers, Woodrow L.; Lukacs, Paul M.; Nelson, Gerald P.

    2018-01-01

    Partial migration is a common life-history strategy among ungulates living in seasonal environments. The decision to migrate or remain on a seasonal range may be influenced strongly by access to high-quality habitat. We evaluated the influence of access to winter habitat of high quality on the probability of a female white-tailed deer (Odocoileus virginianus) migrating to a separate summer range and the effects of this decision on survival. We hypothesized that deer with home ranges of low quality in winter would have a high probability of migrating, and that survival of an individual in winter would be influenced by the quality of their home range in winter. We radiocollared 67 female white-tailed deer in 2012 and 2013 in eastern Washington, United States. We estimated home range size in winter using a kernel density estimator; we assumed the size of the home range was inversely proportional to its quality and the proportion of crop land within the home range was proportional to its quality. Odds of migrating from winter ranges increased by 3.1 per unit increase in home range size and decreased by 0.29 per unit increase in the proportion of crop land within a home range. Annual survival rate for migrants was 0.85 (SD = 0.05) and 0.84 (SD = 0.09) for residents. Our finding that an individual with a low-quality home range in winter is likely to migrate to a separate summer range accords with the hypothesis that competition for a limited amount of home ranges of high quality should result in residents having home ranges of higher quality than migrants in populations experiencing density dependence. We hypothesize that density-dependent competition for high-quality home ranges in winter may play a leading role in the selection of migration strategy by female white-tailed deer.

  19. Simulation of statistical systems with not necessarily real and positive probabilities

    International Nuclear Information System (INIS)

    Kalkreuter, T.

    1991-01-01

    A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities, which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)
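
    The generic way to form expectation values when some weights are negative — not the paper's specific polymer algorithm, just the standard estimator — is to divide the weighted sum of the observable by the (possibly partially cancelling) weight sum, which stays well defined as long as that sum does not vanish:

    ```python
    def expectation(observables, weights):
        """Weighted expectation value that remains valid when some weights
        are negative, provided the weight sum does not vanish:
        <O> = sum(O_i * w_i) / sum(w_i).
        (When configurations are instead sampled proportional to |w|, the
        equivalent sign-reweighted form is <O*sgn(w)> / <sgn(w)>.)"""
        total = sum(weights)
        return sum(o * w for o, w in zip(observables, weights)) / total

    # Toy configurations with one indefinite (negative) activity
    obs = [1.0, 2.0, 3.0]
    wts = [2.0, -1.0, 1.0]
    print(expectation(obs, wts))  # (2 - 2 + 3) / 2 = 1.5
    ```

    The practical difficulty alluded to in the abstract is the sign problem: when positive and negative weights nearly cancel, the denominator becomes small and the statistical error of this estimator blows up.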

  20. Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation

    International Nuclear Information System (INIS)

    Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto

    2015-01-01

    It is very important to grasp the failure probability and failure mode appropriately in order to carry out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed by changing the values of the stress range, stress ratio, stress components, and threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, the stress ratio (mean stress condition), and the threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance might be applicable from the viewpoint of risk to the plant. (author)

  1. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  2. Functional framework and hardware platform for dependability study in short range wireless embedded systems

    NARCIS (Netherlands)

    Senouci, B.; Annema, Anne J.; Bentum, Marinus Jan; Kerkhoff, Hans G.

    2011-01-01

    A new direction in short-range wireless applications has appeared in the form of high-speed data communication devices for distances of a few meters. Behind these embedded applications, a complex Hardware/Software architecture is built. Dependability is one of the major challenges in these systems.

  3. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey in order to assure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
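
    A toy version of the collective-probabilities idea can be sketched as follows: every trajectory uses the same hopping probability, chosen so that the ensemble fraction in each state tracks a prescribed target quantum population with a minimal number of hops. The coupling to actual quantum amplitudes is omitted, and the target population curve is an assumption, so this illustrates only the bookkeeping, not the full algorithm.

    ```python
    import random

    def collective_hop(states, p1_new, rng):
        """Collective-probabilities step for a two-state ensemble (states are
        0 or 1): apply the same hopping probability to every trajectory in the
        donor state so the ensemble fraction in state 1 moves to p1_new in
        expectation, with hops in one direction only (minimal switching)."""
        n = len(states)
        f1 = sum(states) / n
        delta = p1_new - f1
        if delta > 0 and f1 < 1:
            p_hop = delta / (1 - f1)        # only 0 -> 1 hops needed
            return [1 if s == 0 and rng.random() < p_hop else s for s in states]
        if delta < 0 and f1 > 0:
            p_hop = -delta / f1             # only 1 -> 0 hops needed
            return [0 if s == 1 and rng.random() < p_hop else s for s in states]
        return states

    rng = random.Random(3)
    states = [0] * 5000
    for step in range(1, 51):
        target = 0.5 * step / 50            # ramp the target population to 0.5
        states = collective_hop(states, target, rng)
    frac = sum(states) / len(states)
    print(frac)                              # ~0.5, tracking the target
    ```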

  4. Landau parameters for finite range density dependent nuclear interactions

    International Nuclear Information System (INIS)

    Farine, M.

    1997-01-01

    The Landau parameters represent the effective particle-hole interaction at the Fermi level. Since there is a direct relation between the physical observables and the Landau parameters, their derivation from an effective interaction is of great interest. The parameter F0 determines the incompressibility K of the system. The parameter F1 determines the effective mass (which controls the level density at the Fermi level). In addition, F0' determines the symmetry energy, G0 the magnetic susceptibility, and G0' the pion condensation threshold in nuclear matter. This paper is devoted to a general derivation of the Landau parameters for an interaction with density-dependent finite-range terms. Particular care is devoted to the inclusion of rearrangement terms. This report is part of a larger project which aims at defining a new nuclear interaction, improving on the well-known D1 force of Gogny et al. for describing average nuclear properties and exotic nuclei and satisfying, in addition, the sum rules

  5. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  6. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  7. Ionization induced by strong electromagnetic field in low dimensional systems bound by short range forces

    Energy Technology Data Exchange (ETDEWEB)

    Eminov, P.A., E-mail: peminov@mail.ru [Moscow State University of Instrument Engineering and Computer Sciences, 20 Stromynka Street, Moscow 2107996 (Russian Federation); National Research University Higher School of Economics, 3/12 Bolshoy Trekhsvyatskiy pereulok, Moscow 109028 (Russian Federation)

    2013-10-01

    Ionization processes for a two-dimensional quantum dot subjected to combined electrostatic and alternating electric fields of the same direction are studied using quantum mechanical methods. We derive analytical equations for the ionization probability as a function of the characteristic parameters of the system for both extreme cases of a constant electric field and of a linearly polarized electromagnetic wave. The ionization probabilities for a superposition of dc and low-frequency ac electric fields of the same direction are calculated. The momentum distribution of the ionization probability for a system bound by short-range forces is found for a superposition of constant and alternating fields. The total probability for this process per unit time is derived to exponential accuracy. For the first time, the influence of an alternating electric field on the electron tunneling probability induced by an electrostatic field is studied taking into account the pre-exponential term.

  8. Scattering from extended targets in range-dependent fluctuating ocean-waveguides with clutter from theory and experiments.

    Science.gov (United States)

    Jagannathan, Srinivasan; Küsel, Elizabeth T; Ratilal, Purnima; Makris, Nicholas C

    2012-08-01

    Bistatic, long-range measurements of acoustic scattered returns from vertically extended, air-filled tubular targets were made during three distinct field experiments in fluctuating continental shelf waveguides. It is shown that Sonar Equation estimates of mean target-scattered intensity lead to large errors, differing by an order of magnitude from both the measurements and waveguide scattering theory. The use of the Ingenito scattering model is also shown to lead to significant errors in estimating mean target-scattered intensity in the field experiments because they were conducted in range-dependent ocean environments with large variations in sound speed structure over the depth of the targets, scenarios that violate basic assumptions of the Ingenito model. Green's theorem based full-field modeling that describes scattering from vertically extended tubular targets in range-dependent ocean waveguides by taking into account nonuniform sound speed structure over the target's depth extent is shown to accurately describe the statistics of the targets' scattered field in all three field experiments. Returns from the man-made targets are also shown to have a very different spectral dependence from the natural target-like clutter of the dominant fish schools observed, suggesting that judicious multi-frequency sensing may often provide a useful means of distinguishing fish from man-made targets.

  9. Temperature-dependent dielectric function of germanium in the UV–vis spectral range: A first-principles study

    International Nuclear Information System (INIS)

    Yang, J.Y.; Liu, L.H.; Tan, J.Y.

    2014-01-01

    The temperature dependence of the dielectric function, a key thermophysical parameter, is central to understanding thermal radiative transfer in high-temperature environments. Limited by self-radiation and thermal oxidation, however, it is difficult to directly measure the high-temperature dielectric function of solids with present experimental technologies. In this work, we implement two first-principles methods, ab initio molecular dynamics (AIMD) and density functional perturbation theory (DFPT), to study the temperature dependence of the dielectric function of germanium (Ge) in the UV–vis spectral range in order to provide high-temperature dielectric function data for radiative transfer studies in high-temperature environments. Both methods successfully predict the temperature dependence of the dielectric function of Ge. Moreover, the good agreement between the calculated results of the AIMD approach and experimental data at 825 K enables us to predict the high-temperature dielectric function of Ge with the AIMD method in the UV–vis spectral range. - Highlights: • The temperature dependence of the dielectric function of germanium (Ge) is investigated with two first-principles methods. • The temperature effect on the dielectric function of Ge is discussed. • The high-temperature dielectric function of Ge is predicted.

  10. Effect of energy level sequences and neutron–proton interaction on α-particle preformation probability

    International Nuclear Information System (INIS)

    Ismail, M.; Adel, A.

    2013-01-01

    A realistic density-dependent nucleon–nucleon (NN) interaction with a finite-range exchange part which reproduces the nuclear matter saturation curve and the energy dependence of the nucleon–nucleus optical model potential is used to calculate the preformation probability, S_α, of α-decay from different isotones with neutron numbers N=124, 126, 128, 130 and 132. We studied the variation of S_α with the proton number, Z, for each isotone and found the effect of the neutron and proton energy levels of the parent nuclei on the behavior of the α-particle preformation probability. We found that S_α increases regularly with the proton number when the proton pair in the α-particle is emitted from the same level and the neutron level sequence is not changed during the Z-variation. In this case the neutron–proton (n–p) interaction of the two levels contributing to the emission process is too small. On the contrary, if the proton or neutron level sequence is changed during the emission process, S_α behaves irregularly; the irregular behavior increases if both proton and neutron levels are changed. This behavior is accompanied by a change or rapid increase in the strength of the n–p interaction.

  11. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges as well as the reliability and the plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  12. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation.
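    The delta-function representation referred to can be written schematically as follows (a standard form, in generic notation that is not necessarily the author's own):

```latex
P_Y(y) \;=\; \bigl\langle \delta\bigl(y - f(X)\bigr) \bigr\rangle
       \;=\; \int \delta\bigl(y - f(x)\bigr)\, P_X(x)\, \mathrm{d}x .
```

    Carrying out the average for an invertible map f recovers the familiar transformation rule \(P_Y(y) = P_X\bigl(f^{-1}(y)\bigr)\,\lvert \mathrm{d}f^{-1}/\mathrm{d}y \rvert\).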

  13. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We have studied an issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spread. - Abstract: Given an intensity-based credit risk model, this paper studies dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
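    For context, the untruncated Farlie–Gumbel–Morgenstern copula with exponential marginals takes the standard textbook form below; the paper's truncated invariant variant modifies this construction, so this is only the starting point:

```latex
C_\theta(u, v) \;=\; u\,v\,\bigl[\,1 + \theta\,(1 - u)(1 - v)\,\bigr],
\qquad -1 \le \theta \le 1 ,
```

    so that with survival marginals \(\bar F_i(t) = e^{-\lambda_i t}\) a joint survival probability is obtained as \(\bar F(x, y) = C_\theta\bigl(e^{-\lambda_1 x},\, e^{-\lambda_2 y}\bigr)\).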

  14. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
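    The exact failure probability the article compares against can be illustrated with a minimal crude Monte Carlo sketch (not the full AS extrapolation, and with a hypothetical linear limit-state function rather than the article's test functions):

```python
import math
import random

def limit_state(x1, x2, beta=3.0):
    # Linear limit state: g <= 0 means failure; exact P_f = Phi(-beta)
    return beta - (x1 + x2) / math.sqrt(2.0)

def crude_mc_failure_probability(n, seed=1):
    """Crude Monte Carlo estimate of P[g(X) <= 0] for standard Gaussian X."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if limit_state(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) <= 0.0
    )
    return failures / n

p_exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))  # Phi(-3), about 1.35e-3
p_mc = crude_mc_failure_probability(200_000)
print(p_exact, p_mc)
```

    AS-style methods exist precisely because such small probabilities need many crude samples; the sketch only fixes the reference value the schemes are judged against.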

  15. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    Science.gov (United States)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters or the size of the identified group, and it is different for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation; therefore, it also depends on the method used for synthetic orbit generation. We have tested the impact of some of these factors. For a given size of the orbital sample, we have assessed the probability of random pairing among several orbital populations of different sizes. We have found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. For the user's convenience, we have also obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small value of the probability of coincidental similarity between two orbits.

  16. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from − 51 to + 240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may reverse (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  17. Data systematics and semidirect decay probability of the giant dipole resonance

    International Nuclear Information System (INIS)

    Ishkhanov, B.S.; Kapitonov, I.M.; Tutyn', I.A.

    1998-01-01

    Information on the probability of semidirect decay of the giant dipole resonance in nuclei of the sd- and fp-shells (A = 16-58) is elaborated on the basis of recent (γ, xγ′) experimental results. A shell effect in the A-dependence of this probability is discovered.

  18. Human error recovery failure probability when using soft controls in computerized control rooms

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Wondea

    2014-01-01

    selection and delayed operation were mutually dependent. The other recovery failure probabilities ranged from 0.227 to 0.546 when soft controls were used. Since there is no recovery failure probability database for soft controls in advanced MCRs, and recovery failure probabilities in other HRA methods were obtained by expert judgment, the results of this study should help HRA experts decide on recovery failure probabilities in advanced MCR environments.

  19. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
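    As a rough illustration of how such analytic expressions are evaluated, the sketch below computes the exponential integral E1 by Simpson's rule and uses the standard radiative-transfer result that a photon emitted isotropically at optical depth τ below a plane surface escapes through it with probability E2(τ)/2. The specific formula and numbers are illustrative, not the note's detector-specific expressions:

```python
import math

def expint_e1(x, upper=60.0, n=20000):
    """Exponential integral E1(x) = ∫_x^∞ e^{-t}/t dt via composite Simpson."""
    a, b = x, x + upper          # the tail beyond x + upper is negligible
    h = (b - a) / n
    f = lambda t: math.exp(-t) / t
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

def expint_e2(x):
    # Standard recurrence: E2(x) = e^{-x} - x * E1(x)
    return math.exp(-x) - x * expint_e1(x)

def escape_probability(tau):
    """Probability that an isotropic emission at optical depth tau escapes
    through the nearer plane surface (half of E2(tau))."""
    return 0.5 * expint_e2(tau)

print(escape_probability(0.1))
```

    Rational approximations of E1, as the note mentions, replace the quadrature step with a few arithmetic operations.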

  20. Altered Long- and Short-Range Functional Connectivity in Patients with Betel Quid Dependence: A Resting-State Functional MRI Study

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2016-12-01

    Objective: Addiction is a chronic relapsing brain disease. Brain structural abnormalities may constitute an abnormal neural network that underlies the risk of drug dependence. We hypothesized that individuals with Betel Quid Dependence (BQD) have functional connectivity alterations that can be described by long- and short-range functional connectivity density (FCD) maps. Methods: We tested this hypothesis using functional magnetic resonance imaging (fMRI) data from subjects of the Han ethnic group in Hainan, China. Here, we examined BQD individuals (n = 33) and age-, sex-, and education-matched healthy controls (HCs) (n = 32) in a rs-fMRI study to observe FCD alterations associated with the severity of BQD. Results: Compared with HCs, long-range FCD was decreased in the right anterior cingulate cortex (ACC) and increased in the left cerebellum posterior lobe (CPL) and bilateral inferior parietal lobule (IPL) in the BQD group. Short-range FCD was reduced in the right ACC and left dorsolateral prefrontal cortex (dlPFC), and increased in the left CPL. The short-range FCD alteration in the right ACC displayed a negative correlation with the Betel Quid Dependence Scale (BQDS) (r = -0.432, P = 0.012), and the long-range FCD alteration of the left IPL showed a positive correlation with the duration of BQD (r = 0.519, P = 0.002) in BQD individuals. Conclusions: fMRI revealed differences in long- and short-range FCD in BQD individuals, and these alterations might be due to BQ chewing, BQ dependency, or risk factors for developing BQD.

  1. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
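    The regressive effect of the noise can be sketched numerically. Under the model's read-out assumption that each remembered outcome flips with probability d, the expected frequency estimate is p + d(1 − 2p), pulled toward 0.5; the parameter values below are illustrative only:

```python
import random

def noisy_estimate(p_true, d, n_events=10_000, seed=7):
    """Frequency estimate where each recorded outcome flips with probability d
    (the 'probability theory plus noise' read-out assumption)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(n_events):
        outcome = rng.random() < p_true
        if rng.random() < d:          # noise flips the recorded outcome
            outcome = not outcome
        count += outcome
    return count / n_events

p_true, d = 0.9, 0.2
expected = p_true + d * (1 - 2 * p_true)   # 0.74, regressed toward 0.5
print(noisy_estimate(p_true, d), expected)
```

    For p below 0.5 the same formula pushes estimates upward, which is the regressive pattern the abstract describes.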

  2. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range of 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations for small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  3. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  4. Determination of Age-Dependent Reference Ranges for Coagulation Tests Performed Using Destiny Plus.

    Science.gov (United States)

    Arslan, Fatma Demet; Serdar, Muhittin; Merve Ari, Elif; Onur Oztan, Mustafa; Hikmet Kozcu, Sureyya; Tarhan, Huseyin; Cakmak, Ozgur; Zeytinli, Merve; Yasar Ellidag, Hamit

    2016-06-01

    In order to apply the right treatment for hemostatic disorders in pediatric patients, laboratory data should be interpreted with age-appropriate reference ranges. The purpose of this study was to determining age-dependent reference range values for prothrombin time (PT), activated partial thromboplastin time (aPTT), fibrinogen tests, and D-dimer tests. A total of 320 volunteers were included in the study with the following ages: 1 month - 1 year (n = 52), 2 - 5 years (n = 50), 6 - 10 years (n = 48), 11 - 17 years (n = 38), and 18 - 65 years (n = 132). Each volunteer completed a survey to exclude hemostatic system disorder. Using a nonparametric method, the lower and upper limits, including 95% distribution and 90% confidence intervals, were calculated. No statistically significant differences were found between PT and aPTT values in the groups consisting of children. Thus, the reference ranges were separated into child and adult age groups. PT and aPTT values were significantly higher in the children than in the adults. Fibrinogen values in the 6 - 10 age group and the adult age group were significantly higher than in the other groups. D-dimer levels were significantly lower in those aged 2 - 17; thus, a separate reference range was established. These results support other findings related to developmental hemostasis, confirming that adult and pediatric age groups should be evaluated using different reference ranges.
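    The nonparametric limits mentioned above are simply empirical percentiles bounding the central 95% of each group's values. A generic sketch, with simulated values standing in for real measurements and one common rank convention that is not necessarily the paper's exact procedure:

```python
import random

def nonparametric_reference_range(values, central=0.95):
    """Lower/upper reference limits as the order statistics at ranks
    p*(n+1) for p = 0.025 and 0.975 (one common nonparametric convention)."""
    xs = sorted(values)
    n = len(xs)
    tail = (1.0 - central) / 2.0
    lo = xs[max(0, int(round(tail * (n + 1))) - 1)]
    hi = xs[min(n - 1, int(round((1.0 - tail) * (n + 1))) - 1)]
    return lo, hi

rng = random.Random(0)
# Hypothetical aPTT-like values (seconds), for illustration only
sample = [rng.gauss(30.0, 3.0) for _ in range(132)]
lo, hi = nonparametric_reference_range(sample)
print(lo, hi)
```

    Confidence intervals around the limits (the 90% CIs in the study) would come from additional order statistics or bootstrapping.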

  5. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    The intended aim of this article is to present approximation results of exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a distance function, makes it possible to determine the range of the EMERGENCY signal for a pre-set confidence level.
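    A logistic model of this kind can be inverted in closed form to get the range at a given confidence level; the sketch below uses hypothetical parameter values, not the article's fitted ones:

```python
import math

def reception_probability(d, d0, s):
    """Logistic model of reception probability vs distance d
    (d0: distance of 50% reception, s: steepness scale)."""
    return 1.0 / (1.0 + math.exp((d - d0) / s))

def range_at_confidence(conf, d0, s):
    """Distance at which reception probability equals `conf` (inverted logistic)."""
    return d0 + s * math.log((1.0 - conf) / conf)

d0, s = 120.0, 8.0            # hypothetical: 50% reception at 120 km
r90 = range_at_confidence(0.9, d0, s)
print(r90, reception_probability(r90, d0, s))
```

    Fitting d0 and s to measured reception frequencies (e.g. by least squares) is the approximation step the article describes.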

  6. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  7. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities against an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
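    NOvA's calculation is a full three-flavor computation with matter effects; as a much simpler point of reference, the familiar two-flavor vacuum formula can be sketched as follows (the mixing parameters are illustrative values, not NOvA's fit results):

```python
import math

def survival_probability(L_km, E_GeV, sin2_2theta=0.95, dm2_eV2=2.44e-3):
    """Two-flavor vacuum muon-neutrino survival probability:
    P = 1 - sin^2(2θ) * sin^2(1.267 Δm² L / E), with L in km, E in GeV,
    Δm² in eV² (standard unit convention for the 1.267 factor)."""
    phase = 1.267 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# NOvA-like baseline (~810 km) and beam energy (~2 GeV)
print(survival_probability(810.0, 2.0))
```

    Cross-checking two independent implementations of the full three-flavor matter formula against each other, as the note does, catches coding errors that such a simplified formula cannot.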

  8. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
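    The coin-tossing claim can be checked with a tiny conjugate-prior sketch: take a Beta distribution over the "definitive number" (the long-run heads probability), use its mean as the probability of heads, and update on seeing one head. Unless the prior is a point mass, the predictive probability of heads increases:

```python
def beta_mean(a, b):
    # Mean of a Beta(a, b) distribution
    return a / (a + b)

def posterior_after_head(a, b):
    """Beta(a, b) prior over the 'definitive' heads probability; observing
    one head gives the Beta(a + 1, b) posterior."""
    return a + 1.0, b

a, b = 2.0, 2.0                                      # uncertain about fairness
prior_pred = beta_mean(a, b)                         # 0.5 before tossing
post_pred = beta_mean(*posterior_after_head(a, b))   # larger after one head
print(prior_pred, post_pred)
```

    The Beta prior is an assumed illustration, not the paper's construction; the qualitative conclusion (seeing a head raises the probability of heads) holds for any non-degenerate prior over the definitive number.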

  9. Context-dependent JPEG backward-compatible high-dynamic range image compression

    Science.gov (United States)

    Korshunov, Pavel; Ebrahimi, Touradj

    2013-10-01

    High-dynamic range (HDR) imaging is expected, together with ultrahigh definition and high-frame-rate video, to become a technology that may change the photo, TV, and film industries. Many cameras and displays capable of capturing and rendering both HDR images and video are already available in the market. The popularity and full public adoption of HDR content is, however, hindered by the lack of standards for the evaluation of quality, file formats, and compression, as well as by the large legacy base of low-dynamic range (LDR) displays that are unable to render HDR. To facilitate the widespread use of HDR, backward compatibility of HDR with commonly used legacy technologies for storage, rendering, and compression of video and images is necessary. Although many tone-mapping algorithms have been developed for generating viewable LDR content from HDR, there is no consensus on which algorithm to use and under which conditions. Via a series of subjective evaluations, we demonstrate the dependency of the perceptual quality of the tone-mapped LDR images on the context: environmental factors, display parameters, and the image content itself. Based on the results of the subjective tests, this paper proposes to extend the JPEG file format, the most popular image format, in a backward-compatible manner to also deal with HDR images. An architecture to achieve such backward compatibility with JPEG is proposed. A simple implementation of lossy compression demonstrates the efficiency of the proposed architecture compared with the state-of-the-art HDR image compression.

  10. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The above asymptotic form determines the relative probability of a random polygon of length n having prime knot type K over prime knot type L. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  11. Absolute transition probabilities for 559 strong lines of neutral cerium

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J, E-mail: jjcurry@nist.go [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2009-07-07

    Absolute radiative transition probabilities are reported for 559 strong lines of neutral cerium covering the wavelength range 340-880 nm. These transition probabilities are obtained by scaling published relative line intensities (Meggers et al 1975 Tables of Spectral Line Intensities (National Bureau of Standards Monograph 145)) with a smaller set of published absolute transition probabilities (Bisson et al 1991 J. Opt. Soc. Am. B 8 1545). All 559 new values are for lines for which transition probabilities have not previously been available. The estimated relative random uncertainty of the new data is ±35% for nearly all lines.

  12. The temperature dependence of intermediate range oxygen-oxygen correlations in liquid water

    International Nuclear Information System (INIS)

    Schlesinger, Daniel; Pettersson, Lars G. M.; Wikfeldt, K. Thor; Skinner, Lawrie B.; Benmore, Chris J.; Nilsson, Anders

    2016-01-01

    We analyze the recent temperature dependent oxygen-oxygen pair-distribution functions from experimental high-precision x-ray diffraction data of bulk water by Skinner et al. [J. Chem. Phys. 141, 214507 (2014)] with particular focus on the intermediate range where small, but significant, correlations are found out to 17 Å. The second peak in the pair-distribution function at 4.5 Å is connected to tetrahedral coordination and was shown by Skinner et al. to change behavior with temperature below the temperature of minimum isothermal compressibility. Here we show that this is associated also with a peak growing at 11 Å which strongly indicates a collective character of fluctuations leading to the enhanced compressibility at lower temperatures. We note that the peak at ∼13.2 Å exhibits a temperature dependence similar to that of the density with a maximum close to 277 K or 4 °C. We analyze simulations of the TIP4P/2005 water model in the same manner and find excellent agreement between simulations and experiment albeit with a temperature shift of ∼20 K.

  13. The temperature dependence of intermediate range oxygen-oxygen correlations in liquid water

    Energy Technology Data Exchange (ETDEWEB)

    Schlesinger, Daniel; Pettersson, Lars G. M., E-mail: Lars.Pettersson@fysik.su.se [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Wikfeldt, K. Thor [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Science Institute, University of Iceland, VR-III, 107 Reykjavik (Iceland); Skinner, Lawrie B.; Benmore, Chris J. [X-ray Science Division, Advanced Photon Source, Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nilsson, Anders [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Stanford Synchrotron Radiation Lightsource, SLAC National Accelerator Laboratory, Menlo Park, California 94025 (United States)

    2016-08-28

    We analyze the recent temperature dependent oxygen-oxygen pair-distribution functions from experimental high-precision x-ray diffraction data of bulk water by Skinner et al. [J. Chem. Phys. 141, 214507 (2014)] with particular focus on the intermediate range where small, but significant, correlations are found out to 17 Å. The second peak in the pair-distribution function at 4.5 Å is connected to tetrahedral coordination and was shown by Skinner et al. to change behavior with temperature below the temperature of minimum isothermal compressibility. Here we show that this is associated also with a peak growing at 11 Å which strongly indicates a collective character of fluctuations leading to the enhanced compressibility at lower temperatures. We note that the peak at ∼13.2 Å exhibits a temperature dependence similar to that of the density with a maximum close to 277 K or 4 °C. We analyze simulations of the TIP4P/2005 water model in the same manner and find excellent agreement between simulations and experiment albeit with a temperature shift of ∼20 K.

  14. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that the memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  15. The use of normal tissue complication probability to predict radiation hepatitis

    International Nuclear Information System (INIS)

    Keum, Ki Chang; Seong, Jin Sil; Suh, Chang Ok; Lee, Sang Wook; Chung, Eun Ji; Shin, Hyun Soo; Kim, Gwi Eon

    2000-01-01

    Although it has been known that the tolerance of the liver to external beam irradiation depends on the irradiated volume and dose, few data exist which quantify this dependence. Recently, however, with the development of three-dimensional (3-D) treatment planning, the tools to quantify the relationships between dose, volume, and normal tissue complications have become available. The objective of this study is to investigate the relationship between normal tissue complication probability (NTCP) and the risk of radiation hepatitis for patients who received partial liver irradiation at varying doses. From March 1992 to December 1994, 10 patients with hepatoma and 10 patients with bile duct cancer were included in this study. Eighteen patients had normal hepatic function, but 2 patients (prothrombin time 73%, 68%) had mild liver cirrhosis before irradiation. Radiation therapy was delivered with a 10 MV linear accelerator at 180-200 cGy per fraction per day. The total dose ranged from 3,960 cGy to 6,000 cGy (median dose 5,040 cGy). The normal tissue complication probability was calculated by using Lyman's model. Radiation hepatitis was defined as the development of an anicteric elevation of alkaline phosphatase of at least twofold and non-malignant ascites in the absence of documented progressive disease. The calculated NTCP ranged from 0.001 to 0.840 (median 0.05). Three of the 20 patients developed radiation hepatitis. The NTCP values of the patients with radiation hepatitis were 0.390, 0.528, and 0.844 (median: 0.58±0.23), whereas those of the patients without radiation hepatitis ranged from 0.001 to 0.308 (median: 0.09±0.09). When the NTCP was calculated by using a volume factor of 0.32, radiation hepatitis was observed only in patients with an NTCP value of more than 0.39. By contrast, the clinical occurrence of radiation hepatitis did not correlate well with the NTCP value calculated when a volume factor of 0.69 was applied. On the basis of these observations, volume factor of 0.32 was more
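
A minimal sketch of the Lyman NTCP calculation referenced above, assuming the usual power-law volume dependence; the TD50 and m parameter values are illustrative placeholders rather than the study's fitted values, though n = 0.32 matches the volume factor the authors found predictive:

```python
from math import erf, sqrt

def lyman_ntcp(dose_gy, v, td50_1=45.0, m=0.15, n=0.32):
    """Lyman NTCP for a partial volume v (0 < v <= 1) irradiated uniformly.

    td50_1 (whole-liver tolerance dose) and m (slope) are illustrative;
    n = 0.32 is the volume factor found predictive in the study above.
    """
    td50_v = td50_1 / (v ** n)              # tolerance dose grows as volume shrinks
    t = (dose_gy - td50_v) / (m * td50_v)   # standardized dose excess
    return 0.5 * (1.0 + erf(t / sqrt(2.0))) # standard normal CDF

# NTCP rises steeply with dose once the volume-scaled tolerance dose is approached
for d in (30, 45, 60):
    print(d, round(lyman_ntcp(d, v=0.5), 3))
```

With this form, smaller irradiated volumes tolerate markedly higher doses, which is the threshold-like behaviour the abstract reports around NTCP ≈ 0.39.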

  16. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, and burst waiting time and bursting time on pulse reactors, is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolvement process of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference between the DSNP and Partisn results is less than 2%. On the Baker model, over the range of about 1 μs after the first criticality, the largest difference between the dynamic and static calculations is about 300%. As for a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases; the dynamic evolvement curve of the initiation probability is close to the static curve, within a difference of 5%, when K_eff is more than 1.2. The cumulative probability curve also indicates that the difference of integral results between the dynamic and static calculations decreases from 35% to 5% as K_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important difference of the dynamic results near the first criticality from the static ones. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of

  17. K-shell ionization probability in energetic nearly symmetric heavy-ion collisions

    International Nuclear Information System (INIS)

    Tserruya, I.; Schmidt-Boecking, H.; Schuch, R.

    1977-01-01

    Impact parameter dependent K x-ray emission probabilities for the projectile and target atoms have been measured in 35 MeV Cl on Cl, Cl on Ti and Cl on Ni collisions. The sum of the projectile and target K-shell ionization probabilities is taken as a measure of the total 2pσ ionization probability. The 2pπ-2pσ rotational coupling model is in clear disagreement with the present results. On the other hand, the sum of probabilities is reproduced both in shape and absolute magnitude by the statistical model for inner-shell ionization. The K-shell ionization probability of the higher-Z collision partner is well described by this model, including the 2pσ-1sσ vacancy sharing probability calculated as a function of the impact parameter. (author)

  18. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  19. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  20. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  1. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant specific parameter (number of potential missiles N/sub p/). The expression for the joint probability of simultaneous impact of muitiple targets is also developed. This espression is applicable to calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets

  2. Breakdown of long-range temporal dependence in default mode and attention networks during deep sleep.

    Science.gov (United States)

    Tagliazucchi, Enzo; von Wegner, Frederic; Morzelewski, Astrid; Brodbeck, Verena; Jahnke, Kolja; Laufs, Helmut

    2013-09-17

    The integration of segregated brain functional modules is a prerequisite for conscious awareness during wakeful rest. Here, we test the hypothesis that temporal integration, measured as long-term memory in the history of neural activity, is another important quality underlying conscious awareness. For this aim, we study the temporal memory of blood oxygen level-dependent signals across the human nonrapid eye movement sleep cycle. Results reveal that this property gradually decreases from wakefulness to deep nonrapid eye movement sleep and that such decreases affect areas identified with default mode and attention networks. Although blood oxygen level-dependent spontaneous fluctuations exhibit nontrivial spatial organization, even during deep sleep, they also display a decreased temporal complexity in specific brain regions. Conversely, this result suggests that long-range temporal dependence might be an attribute of the spontaneous conscious mentation performed during wakeful rest.
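
The abstract does not specify how the temporal memory of the BOLD signals was quantified; one common estimator of long-range temporal dependence in such series is the Hurst exponent obtained by detrended fluctuation analysis (DFA), sketched here under that assumption:

```python
import numpy as np

def dfa_hurst(x, scales=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent of series x via detrended fluctuation
    analysis; values above 0.5 indicate long-range temporal dependence.
    (A common estimator -- not necessarily the study's exact pipeline.)"""
    y = np.cumsum(x - np.mean(x))               # integrated profile
    fluctuations = []
    for s in scales:
        n = len(y) // s
        segments = y[: n * s].reshape(n, s)
        t = np.arange(s)
        # linearly detrend each window and collect the RMS fluctuation
        f2 = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # slope of log F(s) versus log s is the Hurst estimate
    h, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return h

rng = np.random.default_rng(0)
print(dfa_hurst(rng.standard_normal(4096)))  # white noise: H near 0.5
```

A loss of long-range temporal dependence, as reported for deep sleep above, would appear as H drifting down toward the white-noise value of 0.5.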

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr

    2016-08-01

    In this paper, we propose and derive a slotted-time model for analyzing the burst blocking probability in Optical Burst Switched (OBS) networks. We evaluated the immediate and delayed signaling reservation schemes. The proposed model compares the performance of both just-in-time (JIT) and just-enough-time (JET) signaling protocols associated with void/non-void filling link scheduling schemes. It also considers scenarios with no wavelength conversion and with limited-range wavelength conversion. Our model is distinguished by being adaptable to different offset-time and burst length distributions. We observed that by applying limited-range wavelength conversion, the burst blocking probability is reduced by several orders of magnitude and a better burst delivery ratio is achieved compared with full wavelength conversion.
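
The record's slotted-time model is not reproduced here, but the textbook baseline for blocking on a wavelength link, the Erlang B formula, already illustrates why pooling wavelengths through conversion cuts blocking by orders of magnitude; a sketch with illustrative load figures:

```python
def erlang_b(load_erlangs, servers):
    """Erlang B blocking probability via the numerically stable recurrence
    B(0) = 1,  B(k) = a*B(k-1) / (k + a*B(k-1))."""
    b = 1.0
    for k in range(1, servers + 1):
        b = load_erlangs * b / (k + load_erlangs * b)
    return b

# Full wavelength conversion pools all W wavelengths into one Erlang-B group;
# with no conversion each wavelength acts as a separate single-server group,
# so blocking at the same total load is far higher.
load, wavelengths = 8.0, 16
print(erlang_b(load, wavelengths))      # shared pool (full conversion)
print(erlang_b(load / wavelengths, 1))  # per-wavelength group (no conversion)
```

The gap between the two printed values is the conversion gain that the abstract's limited-range scheme largely recovers.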

  5. Dispersal Kernel Determines Symmetry of Spread and Geographical Range for an Insect

    International Nuclear Information System (INIS)

    Holland, J.D.

    2009-01-01

    The distance from a source patch that dispersing insects reach depends on the number of dispersers, i.e. random draws from a probability density function called a dispersal kernel, and on the shape of that kernel. This can cause asymmetrical dispersal between habitat patches that produce different numbers of dispersers. Spatial distributions based on these dynamics can explain several ecological patterns, including megapopulations and geographic range boundaries. I hypothesized that a locally extirpated longhorned beetle, the sugar maple borer, has a new geographical range shaped primarily by probabilistic dispersal distances. I used occurrence data from Ontario, Canada to construct a model of geographical range in Indiana, USA based on maximum dispersal distance scaled by habitat area. This model predicted the new range boundary very accurately, to within 500 m. This beetle may be an ideal organism for exploring spatial dynamics driven by dispersal.
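
The asymmetry described above can be sketched by treating the farthest disperser as the maximum of n draws from a kernel; the exponential kernel and the disperser counts below are illustrative assumptions, not the study's fitted values:

```python
import random

def farthest_disperser(n_dispersers, mean_dist=1.0, rng=None):
    """Distance reached by the farthest of n dispersers, each drawing its
    dispersal distance from an exponential kernel (kernel choice is
    illustrative, not the beetle study's fitted kernel)."""
    rng = rng or random.Random()
    return max(rng.expovariate(1.0 / mean_dist) for _ in range(n_dispersers))

# Averaged over replicates, a patch producing 100x more dispersers pushes the
# occupied boundary outward even though every individual shares one kernel.
reps = 200
small = sum(farthest_disperser(10, rng=random.Random(i)) for i in range(reps)) / reps
large = sum(farthest_disperser(1000, rng=random.Random(i)) for i in range(reps)) / reps
print(round(small, 2), round(large, 2))
```

Because the expected maximum grows only logarithmically with disperser number for this kernel, range boundaries set this way are sharp, matching the accurate boundary prediction reported above.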

  6. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection.
However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
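
The two-pass logic can be made concrete: an observer that matches its response probability to a posterior p repeats its own response on an identical trial with probability p² + (1−p)², which falls to 0.5 at chance performance, whereas a deterministic max-posterior observer is always self-consistent. A minimal sketch:

```python
def two_pass_consistency(p):
    """Probability that a posterior-probability-matching observer gives the
    same response on two independent presentations of the same trial, when
    it responds with probability p for one alternative each time."""
    return p ** 2 + (1.0 - p) ** 2

# Near chance (p = 0.5) the matching observer agrees with itself only half
# the time -- the contrast with consistent human observers exploited above.
for p in (0.5, 0.75, 0.95):
    print(p, two_pass_consistency(p))
```

This is why low performance levels are the diagnostic regime: there the matching model's predicted consistency is far below what practised observers show.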

  7. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
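
COVAL's numerical transformation is not reproduced here, but the same function-of-distributions problem it was applied to, the reliability of a structure under random loads, can be sketched by Monte Carlo; the load and resistance distributions below are illustrative assumptions:

```python
import random

def failure_probability(n=100_000, seed=0):
    """Monte Carlo estimate of P(load > resistance), the kind of
    function-of-distributions problem COVAL solves by numerical
    transformation (the distributions here are illustrative)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(50.0, 10.0)        # random applied load
        resistance = rng.gauss(80.0, 12.0)  # random structural strength
        if load > resistance:
            failures += 1
    return failures / n

print(failure_probability())
```

For these normal inputs the answer is available in closed form (the difference is itself normal), which is the kind of check a numerical-transformation code can be validated against.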

  8. Evaluation of DNA match probability in criminal case.

    Science.gov (United States)

    Lee, J W; Lee, H S; Park, M; Hwang, J J

    2001-02-15

    The new emphasis on quantification of evidence has led to perplexing courtroom decisions, and it has been difficult for forensic scientists to pursue logical arguments. In particular, although the evaluation of DNA evidence should consider both the genetic relationship between the two compared persons and the examined locus system, this point has not yet drawn much attention. In this paper, we suggest calculating the match probability by using the coancestry coefficient when a family relationship is considered, and we compare the performance of the resulting identification values under various situations.
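
A coancestry-corrected match probability of the kind suggested above can be sketched with the widely used Balding–Nichols formulas (NRC II recommendation 4.2); the allele frequencies and θ values below are illustrative:

```python
def match_prob_het(p_a, p_b, theta):
    """Conditional match probability for a heterozygous profile A/B under
    the Balding-Nichols coancestry correction (NRC II rec. 4.2)."""
    return (2.0 * (theta + (1 - theta) * p_a)
                * (theta + (1 - theta) * p_b)
            / ((1 + theta) * (1 + 2 * theta)))

def match_prob_hom(p_a, theta):
    """Conditional match probability for a homozygous profile A/A."""
    return ((2 * theta + (1 - theta) * p_a)
            * (3 * theta + (1 - theta) * p_a)
            / ((1 + theta) * (1 + 2 * theta)))

# Relatedness (larger theta) makes a coincidental match more likely, so
# ignoring it overstates the evidence against a related alternative suspect.
for th in (0.0, 0.01, 0.03):
    print(th, round(match_prob_het(0.1, 0.2, th), 5))
```

At θ = 0 the formulas reduce to the familiar product-rule genotype frequencies (2p_A p_B and p_A²).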

  9. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1992-01-01

    Evaluation of the radiological risks of accidents involving vehicles transporting radioactive materials requires consideration of both accident probability and consequences. The probability that an accident will occur may be estimated from historical accident data for the given mode of transport. In addition to an overall accident rate, information regarding accident severity and the resulting package environments across the range of all credible accidents is needed to determine the potential for a release of radioactive material from the package or for an increase in direct radiation from the package caused by damage to packaging shielding. This information is usually obtained from a variety of sources such as historical data, experimental data, analyses of accident and package environments, and expert opinion. The consequences of an accident depend on a number of factors including the type, quantity, and physical form of radioactive material being transported; the response of the package to accident environments; the fraction of material released from the package; and the dispersion of any released material. One approach for the classification and treatment of transportation accidents in risk analysis divides the complete range of critical accident environments resulting from all credible accidents into some number of accident-severity categories. The types of accident environments that a package may be subjected to in transportation are often classified into the following five groups: impact, fire, crush, puncture, and immersion. A "critical" accident environment is one of a type that could present a plausible threat to a package. Each severity category represents a portion of all credible accidents, and the total of all severity categories covers the complete range of critical accident environments. This approach is used in the risk assessment codes RADTRAN (Neuhauser and Kanipe 1992) and INTERTRAN (Ericsson and Elert 1983)

  10. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-01-01

    of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs

  11. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  12. Measurement of angularly dependent spectra of betatron gamma-rays from a laser plasma accelerator with quadrant-sectored range filters

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Jong Ho, E-mail: jhjeon07@ibs.re.kr; Nakajima, Kazuhisa, E-mail: naka115@dia-net.ne.jp; Rhee, Yong Joo; Pathak, Vishwa Bandhu; Cho, Myung Hoon; Shin, Jung Hun; Yoo, Byung Ju; Jo, Sung Ha; Shin, Kang Woo [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Kim, Hyung Taek; Sung, Jae Hee; Lee, Seong Ku; Choi, Il Woo [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Advanced Photonics Research Institute, GIST, Gwangju 61005 (Korea, Republic of); Hojbota, Calin; Bae, Lee Jin; Jung, Jaehyung; Cho, Min Sang; Cho, Byoung Ick; Nam, Chang Hee [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Department of Physics and Photon Science, GIST, Gwangju 61005 (Korea, Republic of)

    2016-07-15

    Measurements of the angularly dependent spectra of betatron gamma-rays radiated by GeV electron beams from laser wakefield accelerators (LWFAs) are presented. The angle-resolved spectrum of betatron radiation was deconvolved from the position dependent data measured for a single laser shot with a broadband gamma-ray spectrometer comprising four-quadrant sectored range filters and an unfolding algorithm, based on the Monte Carlo code GEANT4. The unfolded gamma-ray spectra in the photon energy range of 0.1–10 MeV revealed an approximately isotropic angular dependence of the peak photon energy and photon energy-integrated fluence. As expected from the analysis of betatron radiation from LWFAs, the results indicate that unpolarized gamma-rays are emitted by electrons undergoing betatron motion in isotropically distributed orbit planes.

  13. Communication: Orbital instabilities and triplet states from time-dependent density functional theory and long-range corrected functionals

    Science.gov (United States)

    Sears, John S.; Koerzdoerfer, Thomas; Zhang, Cai-Rong; Brédas, Jean-Luc

    2011-10-01

    Long-range corrected hybrids represent an increasingly popular class of functionals for density functional theory (DFT) that have proven to be very successful for a wide range of chemical applications. In this Communication, we examine the performance of these functionals for time-dependent (TD)DFT descriptions of triplet excited states. Our results reveal that the triplet energies are particularly sensitive to the range-separation parameter; this sensitivity can be traced back to triplet instabilities in the ground state coming from the large effective amounts of Hartree-Fock exchange included in these functionals. As such, the use of standard long-range corrected functionals for the description of triplet states at the TDDFT level is not recommended.

  14. Effect of spin-orbit coupling on the wave vector and spin dependent transmission probability for the GaN/AlGaN/GaN heterostructure

    International Nuclear Information System (INIS)

    Li, M; Zhao, Z B; Fan, L B

    2015-01-01

    The effect of the Rashba and Dresselhaus spin–orbit coupling (SOC) on the transmission of electrons through the GaN/AlGaN/GaN heterostructure is studied. It is found that the Dresselhaus SOC causes an evident dependence of the transmission probability on the spin polarization and the in-plane wave vector of electrons, and also induces evident spin splitting of the resonant peaks in the (E_z-k) plane. Because the magnitude of the Rashba SOC is relatively small, its effect on the transmission of electrons is much smaller. As k increases, the peaks of the transmission probability for spin-up electrons (T+) shift to a higher energy region and increase in magnitude, while the peaks of the transmission probability for spin-down electrons (T−) shift to a lower energy region and decrease in magnitude. The polarization efficiency (P) is found to peak at the resonant energies and increases with the in-plane wave vector. Moreover, the built-in electric field caused by the spontaneous and piezoelectric polarization can increase the amplitude of P. Results obtained here are helpful for efficient spin injection into III-nitride heterostructures by nonmagnetic means from the device point of view. (paper)

  15. Probability of Interference-Optimal and Energy-Efficient Analysis for Topology Control in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-11-01

    Full Text Available Because wireless sensor networks (WSNs) have been widely used in recent years, how to reduce their energy consumption and interference has become a major issue. Topology control is a common and effective approach to improving network performance, such as reducing the energy consumption and network interference, improving the network connectivity, etc. Many topology control algorithms reduce network interference by dynamically adjusting the node transmission range. However, reducing the network interference by adjusting the transmission range is probabilistic. Therefore, in this paper, we analyze the probability of interference-optimality for WSNs and prove that this probability increases as the original transmission range increases. Under a specific transmission range, the probability reaches its maximum value when the transmission range is 0.85r in homogeneous networks and 0.84r in heterogeneous networks. In addition, we also prove that when the network is energy-efficient, the network is also interference-optimal with probability 1 in both homogeneous and heterogeneous networks.

  16. Range dependent characteristics in the head-related transfer functions of a bat-head cast: part 1. Monaural characteristics

    International Nuclear Information System (INIS)

    Kim, S; Allen, R; Rowan, D

    2012-01-01

    Knowledge of biological sonar systems has revolutionized many aspects of sonar engineering, and further advances will benefit from a more detailed understanding of the underlying acoustical processes. The anatomically diverse, complex and dynamic heads and ears of bats are known to be important for echolocation, although their range-dependent properties are not well understood, particularly across the wide frequency range of some bats' vocalizations. The aim of this and a companion paper (Kim et al 2012 Bioinspir. Biomim.) is to investigate bat-head acoustics as a function of bat-target distance, based on measurements up to 100 kHz and a more robust examination of the measurement hardware's characteristics than previously reported, using a cast of a bat head. In this first paper, we consider the spectral features at either ear (i.e. the monaural head-related transfer functions). The results show, for example, that at relatively low frequencies there is both a higher magnitude and a stronger effect of distance at close range. This might explain, at least in part, why bats adopt a strategy of changing the frequency range of their vocalizations while approaching a target. There is also a potential advantage, in the design of bio-inspired receivers, in using range-dependent HRTFs and exploiting their distinctive frequency characteristics over distance. (paper)

  17. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
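The model structure described — a lifetime partner-count distribution combined with a per-partnership acquisition probability — can be sketched in a few lines. The partner distribution and the value of q below are illustrative placeholders, not the paper's inputs.

```python
# Hedged sketch: lifetime acquisition probability from a partner-count
# distribution and a per-partnership acquisition probability q.
# All numbers are illustrative, not those used by Chesson et al.

partner_dist = {1: 0.25, 2: 0.15, 4: 0.20, 8: 0.20, 15: 0.20}  # P(N lifetime partners)
q = 0.40  # assumed probability of acquiring HPV per partnership

# P(acquire) = sum over N of P(N) * (1 - (1 - q)^N)
lifetime_p = sum(w * (1 - (1 - q) ** n) for n, w in partner_dist.items())
print(f"lifetime acquisition probability: {lifetime_p:.3f}")
```

Averaging the per-count acquisition probabilities over the partner distribution is what drives the high lifetime estimates: even moderate per-partnership risk compounds quickly with the number of partners.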

  18. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  19. Modeling spatial processes with unknown extremal dependence class

    KAUST Repository

    Huser, Raphaël G.

    2017-03-17

    Many environmental processes exhibit weakening spatial dependence as events become more extreme. Well-known limiting models, such as max-stable or generalized Pareto processes, cannot capture this, which can lead to a preference for models that exhibit a property known as asymptotic independence. However, weakening dependence does not automatically imply asymptotic independence, and whether the process is truly asymptotically (in)dependent is usually far from clear. The distinction is key as it can have a large impact upon extrapolation, i.e., the estimated probabilities of events more extreme than those observed. In this work, we present a single spatial model that is able to capture both dependence classes in a parsimonious manner, and with a smooth transition between the two cases. The model covers a wide range of possibilities from asymptotic independence through to complete dependence, and permits weakening dependence of extremes even under asymptotic dependence. Censored likelihood-based inference for the implied copula is feasible in moderate dimensions due to closed-form margins. The model is applied to oceanographic datasets with ambiguous true limiting dependence structure.

  20. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined a species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In demographic terms, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species' geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information on shade tolerance as an indicator of light-competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  1. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. Simulation of Daily Weather Data Using Theoretical Probability Distributions.

    Science.gov (United States)

    Bruhn, J. A.; Fry, W. E.; Fick, G. W.

    1980-09-01

    A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model, Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution, but the values of the parameters describing each distribution depend on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution depend on the occurrence of rain on the previous day. Both minimum relative humidity and total solar radiation are assumed to be normally distributed. The parameters describing the distribution of minimum relative humidity depend on rainfall occurrence on the previous and current day, and those for total solar radiation depend on the occurrence of rain on the current day. The assumptions made during model construction were found to be appropriate for actual weather data from Geneva, New York. The performance of the weather model was evaluated by comparing the cumulative frequency distributions of simulated weather data with the distributions of actual weather data from Geneva, New York and Fort Collins, Colorado. For each location, simulated weather data were similar to actual weather data in terms of mean response, variability and autocorrelation. The possible applications of this model when used with models of other components of the agro-ecosystem are discussed.
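The precipitation component described above (a first-order Markov chain for wet/dry occurrence, then a gamma draw for wet-day amounts) can be sketched as follows; the transition probabilities and gamma parameters are illustrative placeholders, not the fitted Geneva values.

```python
# Minimal sketch of a two-stage daily precipitation generator:
# wet/dry occurrence from a first-order Markov chain, then a gamma
# draw for the amount on wet days. All parameter values are made up.
import random

P_WET = {True: 0.60, False: 0.25}  # P(wet today | yesterday wet / dry)
SHAPE, SCALE = 0.8, 6.0            # assumed gamma parameters for amount (mm)

def simulate_precip(days, seed=0):
    rng = random.Random(seed)
    wet, series = False, []
    for _ in range(days):
        wet = rng.random() < P_WET[wet]  # Markov occurrence step
        series.append(rng.gammavariate(SHAPE, SCALE) if wet else 0.0)
    return series

rain = simulate_precip(365)
print(f"wet days: {sum(r > 0 for r in rain)}, total: {sum(rain):.0f} mm")
```

Conditioning the other variables' parameters on the wet/dry state, as the paper does for temperature, humidity, and radiation, would hang additional draws off the same `wet` flag.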

  3. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  4. The continual reassessment method: comparison of Bayesian stopping rules for dose-ranging studies.

    Science.gov (United States)

    Zohar, S; Chevret, S

    2001-10-15

    The continual reassessment method (CRM) provides a Bayesian estimation of the maximum tolerated dose (MTD) in phase I clinical trials and is also used to estimate the minimal efficacy dose (MED) in phase II clinical trials. In this paper we propose Bayesian stopping rules for the CRM, based on either posterior or predictive probability distributions, that can be applied sequentially during the trial. These rules aim at early detection of either a mis-choice of dose range or a prefixed gain in the point estimate or accuracy of the estimated probability of response associated with the MTD (or MED). They were compared through a simulation study under six situations that could represent the underlying unknown dose-response (either toxicity or failure) relationship, in terms of sample size, probability of correct selection and bias of the response probability associated with the MTD (or MED). Our results show that the stopping rules act correctly: the first two rules, based on the posterior distribution, stop the trial early when the actual underlying dose-response relationship is far from that initially supposed, while the rules based on predictive gain functions discontinue inclusions after 20 patients on average whatever the actual dose-response curve, that is, depending mostly on the accumulated data. The stopping rules were then applied to a data set from a dose-ranging phase II clinical trial aiming at estimating the MED of midazolam in the sedation of infants during cardiac catheterization. All these findings suggest the early use of the first two rules to detect a mis-choice of dose range, while they confirm the requirement of including at least 20 patients at the same dose to reach an accurate estimate of the MTD (or MED). A two-stage design is under study. Copyright 2001 John Wiley & Sons, Ltd.

  5. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model is obtained as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  6. Left passage probability of Schramm-Loewner Evolution

    Science.gov (United States)

    Najafi, M. N.

    2013-06-01

    SLE(κ,ρ⃗) is a variant of Schramm-Loewner Evolution (SLE) which describes curves that are not conformally invariant but are self-similar due to the presence of some other preferred points on the boundary. In this paper we study the left passage probability (LPP) of SLE(κ,ρ⃗) in a field-theoretical framework and find the differential equation governing this probability. This equation is solved numerically for the special case κ=2 and h_ρ=0, in which h_ρ is the conformal weight of the boundary condition changing (bcc) operator. This case may be related to the loop-erased random walk (LERW) and the Abelian sandpile model (ASM) with a sink on its boundary. For a curve which starts from ξ_0 and is conditioned by a change of boundary conditions at x_0, we find that this probability depends significantly on the factor x_0 − ξ_0. We also present the perturbative general solution for large x_0. As a prototype, we apply this formalism to SLE(κ,κ−6), which governs the curves that start from and end on the real axis.
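For orientation, the analogous quantity for ordinary chordal SLE_κ (the ρ⃗ = 0 case) has a well-known closed form, Schramm's left-passage formula, which the kind of equation studied in the paper generalizes; we recall it here for context:

```latex
P\big[\gamma \text{ passes to the left of } x + iy\big]
  = \frac{1}{2}
  + \frac{\Gamma\!\left(\tfrac{4}{\kappa}\right)}
         {\sqrt{\pi}\;\Gamma\!\left(\tfrac{8-\kappa}{2\kappa}\right)}
    \,\frac{x}{y}\,
    {}_{2}F_{1}\!\left(\tfrac{1}{2},\,\tfrac{4}{\kappa};\,\tfrac{3}{2};\,-\tfrac{x^{2}}{y^{2}}\right)
```

Here γ is the chordal SLE_κ trace in the upper half-plane and ₂F₁ is the Gauss hypergeometric function.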

  7. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  8. Memory effects, two color percolation, and the temperature dependence of Mott variable-range hopping

    Science.gov (United States)

    Agam, Oded; Aleiner, Igor L.

    2014-06-01

    There are three basic processes that determine hopping transport: (a) hopping between normally empty sites (i.e., having exponentially small occupation numbers at equilibrium), (b) hopping between normally occupied sites, and (c) transitions between normally occupied and unoccupied sites. In conventional theories all these processes are considered Markovian and the correlations of occupation numbers of different sites are believed to be small (i.e., not exponential in temperature). We show that, contrary to this belief, memory effects suppress the processes of type (c) and manifest themselves in a subleading exponential temperature dependence of the variable-range hopping conductivity. This temperature dependence originates from the property that sites of type (a) and (b) form two independent resistor networks that are weakly coupled to each other by processes of type (c). This leads to a two-color percolation problem which we solve in the critical region.

  9. The transmission probability method in one-dimensional cylindrical geometry

    International Nuclear Information System (INIS)

    Rubin, I.E.

    1983-01-01

    The collision probability method widely used in solving problems of neutron transport in a reactor cell is reliable for simple cells with a small number of zones. Increasing the number of zones, and also taking the anisotropy of scattering into account, greatly increases the scope of the calculations. In order to reduce the computation time, the transmission probability method is suggested for flux calculations in one-dimensional cylindrical geometry taking scattering anisotropy into account. The efficiency of the suggested method is verified using one-group calculations for cylindrical cells. The transmission probability method allows the angular and spatial dependences of the neutron distributions to be represented completely, without increasing the scope of the calculations. The method is especially effective in solving multigroup problems.

  10. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  11. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  12. Estimation of failure probability of the end induced current depending on uncertain parameters of a transmission line

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper addresses the risk analysis of an EMC failure using a statistical approach based on reliability methods. A probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties in the input parameters influencing extreme interference levels in the context of transmission lines. Results are compared to Monte Carlo simulation (MCS). (authors)

  13. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenged in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have been studied less. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  14. The watercolor effect: quantitative evidence for luminance-dependent mechanisms of long-range color assimilation.

    Science.gov (United States)

    Devinck, Frédéric; Delahunt, Peter B; Hardy, Joseph L; Spillmann, Lothar; Werner, John S

    2005-05-01

    When a dark chromatic contour delineating a figure is flanked on the inside by a brighter chromatic contour, the brighter color will spread into the entire enclosed area. This is known as the watercolor effect (WCE). Here we quantified the effect of color spreading using both color-matching and hue-cancellation tasks. Over a wide range of stimulus chromaticities, there was a reliable shift in color appearance that closely followed the direction of the inducing contour. When the contours were equated in luminance, the WCE was still present, but weak. The magnitude of the color spreading increased with increases in luminance contrast between the two contours. Additionally, as the luminance contrast between the contours increased, the chromaticity of the induced color more closely resembled that of the inside contour. The results support the hypothesis that the WCE is mediated by luminance-dependent mechanisms of long-range color assimilation.

  15. Unitarity corrections to short-range order long-range rapidity correlations

    CERN Document Server

    Capella, A

    1978-01-01

    Although the effective hadronic forces have short range in rapidity space, one nevertheless expects long-range dynamical correlations induced by unitarity constraints. This paper contains a thorough discussion of long-range rapidity correlations in high-multiplicity events. In particular, the authors analyze in detail the forward-backward multiplicity correlations measured recently over the whole CERN ISR energy range. They find from these data that the normalized variance of the number n of exchanged cut Pomerons, ⟨(n/⟨n⟩ − 1)²⟩, is most probably in the range 0.32 to 0.36. They show that such a number is obtained from Reggeon theory in the eikonal approximation. The authors also predict a very specific violation of local compensation of charge in multiparticle events: the violation should appear in the fourth-order zone correlation function and is absent in the second-order correlation function, the only one measured until now. (48 refs).

  16. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation: the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q) = P(q|p). Using the probabilistic truth-table task, in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement, endorsed only by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
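A tiny numerical example makes the contrast concrete. With illustrative (made-up) frequencies for the four truth-table cells, the Equation's reading P(if p then q) = P(q|p) and the material-conditional reading give different numbers:

```python
# Illustrative numbers only: frequencies for the four truth-table cells.
cells = {("p", "q"): 30, ("p", "not-q"): 10, ("not-p", "q"): 20, ("not-p", "not-q"): 40}
total = sum(cells.values())

# The Equation: P(if p then q) = P(q | p)
p_cond = cells[("p", "q")] / (cells[("p", "q")] + cells[("p", "not-q")])

# Material conditional: true in every cell except p & not-q
p_material = 1 - cells[("p", "not-q")] / total

print(f"P(q|p) = {p_cond:.2f}, material conditional = {p_material:.2f}")
```

Endorsing the Equation means reporting the first value in the probabilistic truth-table task; the two readings only coincide when the not-p cells are empty.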

  17. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  18. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Research on the framing effect in risky choice has mostly used tasks that examine the effect of a single probability or risk level on the choice between a risky and a non-risky option. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decisions about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decisions about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing-effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame affects the change in preference order only when the possibility of gain (expressed as a probability) is estimated as sufficiently high.

  19. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with the probability measures P(μ, Q) for states μ, hence of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  20. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062
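The kind of transformation bias described can be reproduced with a short Monte Carlo sketch. The beta-binomial setup and all parameter values below are illustrative assumptions, not the authors' simulation design.

```python
# Hedged simulation of the bias discussed above: cluster-level probabilities
# drawn from a beta distribution (intraclass correlation rho), binomial counts
# within clusters, and the resulting bias of the log-odds transformation.
import math, random

def logit_phat(p, rho, n, rng):
    # Beta layer: p_i ~ Beta(a, b) with mean p and Var = rho * p * (1 - p)
    a = p * (1 - rho) / rho
    b = (1 - p) * (1 - rho) / rho
    pi = rng.betavariate(a, b)
    k = sum(rng.random() < pi for _ in range(n))  # binomial count
    phat = (k + 0.5) / (n + 1)                    # continuity-corrected estimate
    return math.log(phat / (1 - phat))

rng = random.Random(1)
p, rho, n, reps = 0.2, 0.05, 50, 20000
mean_logit = sum(logit_phat(p, rho, n, rng) for _ in range(reps)) / reps
bias = mean_logit - math.log(p / (1 - p))
print(f"log-odds bias at rho={rho}: {bias:+.3f}")  # typically negative for these settings
```

Repeating the run at several values of rho would exhibit the approximately linear dependence of the bias on the intracluster correlation reported in the abstract.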

  1. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr; Rabia, Sherif; Mahmoud, Mohamed; Aly, Moustafa H.; Shihada, Basem

    2016-01-01

    by being adaptable to different offset-time and burst length distributions. We observed that applying a limited range of wavelength conversion, burst blocking probability is reduced by several orders of magnitudes and yields a better burst delivery ratio

  2. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  3. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
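The closing approximation is easy to state in code. A hedged sketch (the function name is ours; the 1/(2.5n) form is the approximation quoted in the abstract):

```python
# Sketch of the approximation above: for n trials with zero observed events,
# the proposed minimax-based estimate is roughly 1/(2.5 n), whereas the
# MLE k/n degenerates to 0.
def zero_event_estimate(n):
    """Approximate minimax estimate of the event probability after
    observing zero events in n trials (the 1/(2.5 n) rule)."""
    return 1.0 / (2.5 * n)

for n in (10, 100, 1000):
    print(f"n = {n:4d}: MLE = 0.0, minimax approx = {zero_event_estimate(n):.4f}")
```

Unlike the MLE, the estimate stays strictly positive and shrinks with the number of trials, which is the risk-averse behaviour the abstract argues for.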

  4. Methods for estimating drought streamflow probabilities for Virginia streams

    Science.gov (United States)

    Austin, Samuel H.

    2014-01-01

Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million daily streamflow values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded 46,704 equations with statistically significant fit statistics and parameter ranges, published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
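The kind of logistic model described here can be sketched as follows. Note this is illustrative only: the report's equations have basin-specific fitted coefficients, and the intercept `b0` and slope `b1` below are made-up values.

```python
import math

# Hypothetical coefficients; the real models are fitted per basin.
def drought_flow_probability(winter_flow_cfs, b0=4.0, b1=-0.8):
    """Logistic model: estimated probability that summer streamflow
    falls below a drought threshold, given mean winter streamflow."""
    x = b0 + b1 * math.log(winter_flow_cfs)
    return 1.0 / (1.0 + math.exp(-x))

# Wetter winters should map to lower estimated drought probabilities.
print(drought_flow_probability(50.0))   # higher risk
print(drought_flow_probability(500.0))  # lower risk
```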

  5. Linking probabilities of off-lattice self-avoiding polygons and the effects of excluded volume

    International Nuclear Information System (INIS)

    Hirayama, Naomi; Deguchi, Tetsuo; Tsurusaki, Kyoichi

    2009-01-01

We evaluate numerically the probability of linking, i.e. the probability of a given pair of self-avoiding polygons (SAPs) being entangled and forming a nontrivial link type L. In the simulation we generate pairs of SAPs of N spherical segments of radius r_d such that they have no overlaps among the segments and each of the SAPs has the trivial knot type. We evaluate the probability of a self-avoiding pair of SAPs forming a given link type L for various link types with fixed distance R between the centers of mass of the two SAPs. We define the normalized distance r by r = R/R_g(0_1), where R_g(0_1) denotes the square root of the mean square radius of gyration of a SAP of the trivial knot type 0_1. We introduce formulae expressing the linking probability as a function of the normalized distance r, which give good fitting curves with respect to χ² values. We also investigate the dependence of the linking probabilities on the excluded-volume parameter r_d and the number of segments, N. Quite interestingly, the graph of linking probability versus normalized distance r shows no N-dependence at a particular value of the excluded-volume parameter, r_d = 0.2

  6. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  7. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  8. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  9. Development of a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence after curative radiotherapy/chemo-radiotherapy in head and neck cancer

    International Nuclear Information System (INIS)

    Wopken, Kim; Bijl, Hendrik P.; Schaaf, Arjen van der; Laan, Hans Paul van der; Chouvalova, Olga; Steenbakkers, Roel J.H.M.; Doornaert, Patricia; Slotman, Ben J.; Oosting, Sjoukje F.; Christianen, Miranda E.M.C.; Laan, Bernard F.A.M. van der; Roodenburg, Jan L.N.; René Leemans, C.; Verdonck-de Leeuw, Irma M.; Langendijk, Johannes A.

    2014-01-01

Background and purpose: Curative radiotherapy/chemo-radiotherapy for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence 6 months (TUBE_M6) after definitive radiotherapy, radiotherapy plus cetuximab or concurrent chemoradiation based on pre-treatment and treatment characteristics. Materials and methods: The study included 355 patients with HNC. TUBE_M6 was scored prospectively in a standard follow-up program. To design the prediction model, the penalized learning method LASSO was used, with TUBE_M6 as the endpoint. Results: The prevalence of TUBE_M6 was 10.7%. The multivariable model with the best performance consisted of the variables: advanced T-stage, moderate to severe weight loss at baseline, accelerated radiotherapy, chemoradiation, radiotherapy plus cetuximab, and the mean dose to the superior and inferior pharyngeal constrictor muscles, the contralateral parotid gland and the cricopharyngeal muscle. Conclusions: We developed a multivariable NTCP model for TUBE_M6 to identify patients at risk for tube feeding dependence. The dosimetric variables can be used to optimize radiotherapy treatment planning aiming at prevention of tube feeding dependence and to estimate the benefit of new radiation technologies

  10. Delay-Range-Dependent H∞ Control for Automatic Mooring Positioning System with Time-Varying Input Delay

    Directory of Open Access Journals (Sweden)

    Xiaoyu Su

    2014-01-01

Aiming at the economy and security of the positioning system in semi-submersible platforms, this paper presents a new scheme based on a mooring line switching strategy. Considering the input delay in the switching process, H∞ control with time-varying input delay is designed to calculate the control forces needed to resist disturbance forces. In order to reduce conservativeness, the information on the lower bound of the delay is taken into account, and a Lyapunov function which contains the range of the delay is constructed. Besides, the input constraint is considered to avoid breakage of the mooring lines. The sufficient conditions for delay-range-dependent stabilization are derived in terms of LMIs, and the controller is also obtained. The effectiveness of the proposed approach is illustrated by a realistic design example.

  11. Withdrawal of corticosteroids in inflammatory bowel disease patients after dependency periods ranging from 2 to 45 years: a proposed method.

    LENUS (Irish Health Repository)

    Murphy, S J

    2012-02-01

BACKGROUND: Even in the biologic era, corticosteroid dependency in IBD patients is common and causes a lot of morbidity, but methods of withdrawal are not well described. AIM: To assess the effectiveness of a corticosteroid withdrawal method. METHODS: Twelve patients (10 men, 2 women; 6 ulcerative colitis, 6 Crohn's disease), median age 53.5 years (range 29-75) were included. IBD patients with quiescent disease refractory to conventional weaning were transitioned to oral dexamethasone, educated about symptoms of the corticosteroid withdrawal syndrome (CWS) and weaned under the supervision of an endocrinologist. When patients failed to wean despite a slow weaning pace and their IBD remaining quiescent, low dose synthetic ACTH stimulation testing was performed to assess for adrenal insufficiency. Multivariate analysis was performed to assess predictors of a slow wean. RESULTS: Median durations for disease and corticosteroid dependency were 21 (range 3-45) and 14 (range 2-45) years respectively. Ten patients (83%) were successfully weaned after a median follow-up from final wean of 38 months (range 5-73). Disease flares occurred in two patients, CWS in five and ACTH testing was performed in 10. Multivariate analysis showed that longer duration of corticosteroid use appeared to be associated with a slower wean (P = 0.056). CONCLUSIONS: Corticosteroid withdrawal using this protocol had a high success rate and durable effect and was effective in patients with long-standing (up to 45 years) dependency. As symptoms of CWS mimic symptoms of IBD disease flares, gastroenterologists may have difficulty distinguishing them, which may be a contributory factor to the frequency of corticosteroid dependency in IBD patients.

  12. Time-Dependent Risk Estimation and Cost-Benefit Analysis for Mitigation Actions

    Science.gov (United States)

    van Stiphout, T.; Wiemer, S.; Marzocchi, W.

    2009-04-01

Earthquakes strongly cluster in space and time. Consequently, the most dangerous time is right after a moderate earthquake has happened, because there is a 'high' (i.e., 2-5 percent) probability that this event will be followed by a subsequent aftershock as large as or larger than the initiating event. The seismic hazard during this time period exceeds the background probability significantly, by several orders of magnitude. Scientists have developed increasingly accurate forecast models of this time-dependent hazard, and such models are currently being validated in prospective testing. However, this probabilistic information in the hazard space is difficult to digest for decision makers, the media and the general public. Here, we introduce a possible bridge between seismology and decision makers (authorities, civil defense) by proposing a more objective way to perform time-dependent risk assessment. Short Term Earthquake Risk assessment (STEER) combines aftershock hazard and loss assessments. We use site-specific information on site effects and building class distribution and combine this with existing loss models to compute site-specific time-dependent risk curves (probability of exceedance for fatalities, injuries, damages, etc.). We show the effect of uncertainties in the different components using Monte Carlo simulations of the input parameters. These time-dependent risk curves can act as decision support. We extend the STEER approach by introducing a cost-benefit approach for certain mitigation actions after a medium-sized earthquake. Such cost-benefit approaches have recently been developed for volcanic risk assessment to rationalize precautionary evacuations in densely inhabited areas threatened by volcanoes. Here we extend the concept to time-dependent probabilistic seismic risk assessment. For the cost-benefit analysis of mitigation actions we calculate the ratio between the cost for the mitigation actions and the cost of the
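The cost-benefit rule described in this abstract reduces to a simple ratio test; the sketch below uses assumed numbers (the 3% aftershock probability is taken from the abstract's range, the loss and cost figures are invented for illustration):

```python
# A mitigation action is rational when its cost is below the
# expected loss it avoids, i.e. when the ratio below is < 1.
def mitigation_ratio(action_cost, event_probability, avoided_loss):
    """Ratio of mitigation cost to expected avoided loss."""
    return action_cost / (event_probability * avoided_loss)

# Assumed numbers: a 3% aftershock probability, a potential loss of
# 10 million, and an evacuation costing 200,000.
r = mitigation_ratio(200_000, 0.03, 10_000_000)
print(r)        # ~0.67
print(r < 1.0)  # True -> mitigation justified under these assumptions
```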

  13. Levy's zero-one law in game-theoretic probability

    OpenAIRE

    Shafer, Glenn; Vovk, Vladimir; Takemura, Akimichi

    2009-01-01

    We prove a game-theoretic version of Levy's zero-one law, and deduce several corollaries from it, including non-stochastic versions of Kolmogorov's zero-one law, the ergodicity of Bernoulli shifts, and a zero-one law for dependent trials. Our secondary goal is to explore the basic definitions of game-theoretic probability theory, with Levy's zero-one law serving a useful role.

  14. Method to Calculate Accurate Top Event Probability in a Seismic PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Woo Sik [Sejong Univ., Seoul (Korea, Republic of)

    2014-05-15

ACUBE (Advanced Cutset Upper Bound Estimator) calculates the top event probability and importance measures from cutsets by dividing the cutsets into major and minor groups according to their probability: the cutsets with higher probability are included in the major group and the others in the minor group, and the major cutsets are converted into a Binary Decision Diagram (BDD). ACUBE works by dividing the cutsets into the two groups (higher and lower cutset probability), calculating the top event probability and importance measures in each group, and combining the two results. ACUBE calculates the top event probability and importance measures of the higher cutset probability group exactly, and those of the lower cutset probability group with an approximation such as MCUB. The ACUBE algorithm is useful for decreasing the conservatism that is caused by approximating the top event probability and importance measure calculations with given cutsets. By applying the ACUBE algorithm to the seismic PSA cutsets, the accuracy of the top event probability and importance measures can be significantly improved. This study shows that careful attention should be paid, and an appropriate method provided, in order to avoid significant overestimation of the top event probability. Due to these strengths, ACUBE has become a vital tool for calculating a more accurate CDF from seismic PSA cutsets than the conventional probability calculation method.
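The two-group idea behind ACUBE can be sketched as follows. This is an illustrative reconstruction, not the tool's actual code: exact inclusion-exclusion stands in for the BDD step, cutsets are assumed statistically independent (i.e. no shared basic events), and the threshold value is made up.

```python
from itertools import combinations

def exact_union(probs):
    """Exact P(union) of independent cutsets via inclusion-exclusion
    (a stand-in for the exact BDD evaluation of the major group)."""
    total = 0.0
    for r in range(1, len(probs) + 1):
        sign = (-1) ** (r + 1)
        for combo in combinations(probs, r):
            term = 1.0
            for p in combo:
                term *= p
            total += sign * term
    return total

def mcub(probs):
    """Min-cut upper bound approximation: 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def top_event_probability(cutset_probs, threshold=1e-3):
    """Split cutsets by probability, treat the major group exactly and
    the minor group approximately, then combine the two results."""
    major = [p for p in cutset_probs if p >= threshold]
    minor = [p for p in cutset_probs if p < threshold]
    p_major = exact_union(major)  # exact part
    p_minor = mcub(minor)         # approximated part
    return 1.0 - (1.0 - p_major) * (1.0 - p_minor)

print(top_event_probability([0.05, 0.02, 1e-4, 5e-5]))
```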

  15. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  16. Transition probabilities of Ce I obtained from Boltzmann analysis of visible and near-infrared emission spectra

    Science.gov (United States)

    Nitz, D. E.; Curry, J. J.; Buuck, M.; DeMann, A.; Mitchell, N.; Shull, W.

    2018-02-01

    We report radiative transition probabilities for 5029 emission lines of neutral cerium within the wavelength range 417-1110 nm. Transition probabilities for only 4% of these lines have been previously measured. These results are obtained from a Boltzmann analysis of two high resolution Fourier transform emission spectra used in previous studies of cerium, obtained from the digital archives of the National Solar Observatory at Kitt Peak. The set of transition probabilities used for the Boltzmann analysis are those published by Lawler et al (2010 J. Phys. B: At. Mol. Opt. Phys. 43 085701). Comparisons of branching ratios and transition probabilities for lines common to the two spectra provide important self-consistency checks and test for the presence of self-absorption effects. Estimated 1σ uncertainties for our transition probability results range from 10% to 18%.
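The Boltzmann analysis mentioned here can be illustrated with a toy fit. The sketch below is not the paper's procedure in detail; it only shows the standard idea that for optically thin emission in thermal equilibrium, the quantity ln(Iλ/(gA)) falls linearly with upper-level energy E_u with slope -1/(kT), so a line fit through reference lines of known A recovers the temperature (and, with it, unknown A-values from measured intensities).

```python
K_B = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_fit(lines):
    """Least-squares fit of y = a + b*E_u, where y = ln(I*lambda/(g*A)).
    `lines` is a list of (E_u, y) pairs; returns (a, b)."""
    xs = [e for (e, _) in lines]
    ys = [y for (_, y) in lines]
    n = len(lines)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic reference lines generated at T = 5000 K (assumed value):
T = 5000.0
pts = [(e, 2.0 - e / (K_B * T)) for e in (1.0, 1.5, 2.0, 2.5)]
a, b = boltzmann_fit(pts)
print(-1.0 / (K_B * b))  # recovered temperature, ~5000 K
```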

  17. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse-gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in the existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to give an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration: one road of 1 x 4000 cells (5 m x 20 km) joined in a 'T' formation by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area over the 500 iterations (ĀBL) is about 3000 m2
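A heavily simplified version of one iteration of this Monte Carlo experiment can be sketched as follows. All parameters here are illustrative, and a crude heavy-tailed area sampler stands in for the three-parameter inverse-gamma distribution used in the study:

```python
import random

def simulate_blockages(n_landslides, region_km=20.0, seed=1):
    """Drop circular landslides of random area onto a square region
    and count how many intersect a straight road across its middle."""
    rng = random.Random(seed)
    road_y = region_km / 2.0  # horizontal road through the region
    blockages = 0
    for _ in range(n_landslides):
        # Heavy-tailed area sample in km^2 with a 400 m^2 minimum,
        # standing in for the inverse-gamma landslide-area pdf.
        area = min(0.0004 / (1.0 - rng.random()), 0.1)
        radius = (area / 3.14159) ** 0.5
        y = rng.uniform(0.0, region_km)
        if abs(y - road_y) <= radius:  # landslide reaches the road
            blockages += 1
    return blockages

print(simulate_blockages(400))  # road blockages in one triggered event
```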

  18. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  19. Temperature dependence of muonium spin exchange with O2 in the range 88 K to 478 K

    International Nuclear Information System (INIS)

    Senba, M.; Garner, D.M.; Arseneau, D.J.; Fleming, D.G.

    1984-01-01

The authors have extended an earlier study of the spin exchange reactions of Mu with O2 in the range 295 K to 478 K to a low-temperature region down to 88 K. From 135 K to 296 K, the spin depolarization rate constant was found to vary according to the relative velocity of the colliding species, which indicates that the spin exchange cross section of Mu-O2 is temperature independent in this range. However, it was found that below 105 K and above 400 K the spin depolarization rate constant tends to have a stronger temperature dependence. (Auth.)

  20. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  1. Stochastic Stability for Time-Delay Markovian Jump Systems with Sector-Bounded Nonlinearities and More General Transition Probabilities

    Directory of Open Access Journals (Sweden)

    Dan Ye

    2013-01-01

This paper is concerned with delay-dependent stochastic stability for time-delay Markovian jump systems (MJSs) with sector-bounded nonlinearities and more general transition probabilities. Different from previous results where the transition probability matrix is completely known, a more general transition probability matrix is considered, which includes completely known elements, boundary-known elements, and completely unknown ones. In order to obtain a less conservative criterion, the state and transition probability information is used as much as possible to construct the Lyapunov-Krasovskii functional and carry out the stability analysis. Delay-dependent sufficient conditions are derived in terms of linear matrix inequalities to guarantee the stability of the systems. Finally, numerical examples are exploited to demonstrate the effectiveness of the proposed method.

  2. Oxygen-dependent sensitization of irradiated cells

    International Nuclear Information System (INIS)

    Ewing, D.; Powers, E.L.

    1979-01-01

Attention is focused primarily on O2 effects in three biological systems, all tested in suspension: bacterial spores, vegetative bacterial cells, and mammalian cells. Information from these systems shows that O2 has more than one process through which it can act. Studies with bacterial spore suspensions provide clear evidence that multiple components of oxygen-dependent radiation sensitization exist. Studies with mammalian cell suspensions also show that at least two oxygen-dependent sensitization processes can be distinguished. Similar studies with vegetative bacteria in suspension have not resolved oxic sensitization into components. The roles of water-derived radicals in radiation sensitivity and, specifically, in sensitization by O2 were examined. OH radicals are clearly implicated in damage in all three biological test systems. However, the specific roles proposed for OH radicals are different in these organisms. In bacterial spores, OH radical removal in itself does not protect in anoxia or in high concentrations of O2. OH radical removal over a limited intermediate range of O2 concentrations will, however, protect. OH radical scavenging probably results in the formation of the actual protector. In bacteria, the supposition is that OH radical removal will protect both in anoxia and in the presence of O2. OH radicals probably react with a cellular target molecule and leave a radical site; this is the site which can then react with O2 to cause damage; DNA is the likely cellular target. In mammalian cells, a reaction scheme, similar to that proposed for bacteria, has been suggested for O2-dependent sensitization

  3. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but more rapidly as the time of the earthquake approaches.
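The basic probability-gain rule of Aki (1981), which this paper generalizes to mutually dependent precursors, can be sketched as follows. In the simple independent-precursor case, each observed precursor multiplies the background occurrence rate by its gain factor; the background rate and the gain values below are assumed numbers for illustration.

```python
def conditional_rate(background_rate, gains):
    """Probability of occurrence per unit time given the observed
    precursors, assuming independent precursors: P = P0 * prod(g_i)."""
    rate = background_rate
    for g in gains:
        rate *= g
    return rate

# Assumed numbers: a background rate of 1e-4 per day and three
# precursors with gains 30, 10 and 3 give roughly the ~0.09/day
# magnitude the abstract reports.
print(conditional_rate(1e-4, [30.0, 10.0, 3.0]))  # ~0.09
```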

  4. Delay-range-dependent exponential H∞ synchronization of a class of delayed neural networks

    International Nuclear Information System (INIS)

    Karimi, Hamid Reza; Maass, Peter

    2009-01-01

This article presents a multiple delayed state-feedback control design for the exponential H∞ synchronization problem of a class of delayed neural networks with multiple time-varying discrete delays. On the basis of the drive-response concept, and by introducing a descriptor technique and using a Lyapunov-Krasovskii functional, new delay-range-dependent sufficient conditions for exponential H∞ synchronization of the drive-response structure of the neural networks are derived in terms of linear matrix inequalities (LMIs). The explicit expressions of the controller gain matrices are parameterized based on the solvability conditions such that the drive system and the response system can be exponentially synchronized. A numerical example is included to illustrate the applicability of the proposed design method.

  5. Variation of Time Domain Failure Probabilities of Jack-up with Wave Return Periods

    Science.gov (United States)

    Idris, Ahmad; Harahap, Indra S. H.; Ali, Montassir Osman Ahmed

    2018-04-01

This study evaluated the failure probabilities of jack-up units in the framework of time-dependent reliability analysis, using uncertainty from different sea states representing different return periods of the design wave. The surface elevation for each sea state was represented by the Karhunen-Loeve expansion method, using the eigenfunctions of prolate spheroidal wave functions, in order to obtain the wave load. The stochastic wave load was propagated through a simplified jack-up model developed in commercial software to obtain the structural response due to the wave loading. The stochastic response was analyzed to determine the probability of failure by excessive deck displacement, in the framework of time-dependent reliability analysis, using Matlab codes developed on a personal computer. Results from the study indicate that the failure probability increases with the severity of the sea state, i.e., with a longer return period. Although the results agree with those of a study of a similar jack-up model using a time-independent method at higher values of the maximum allowable deck displacement, they contrast at lower values of the criterion, where that study reported that the failure probability decreases with increasing severity of the sea state.

  6. Dependence of wavelength of Xe ion-induced rippled structures on the fluence in the medium ion energy range

    Energy Technology Data Exchange (ETDEWEB)

    Hanisch, Antje; Grenzer, Joerg [Institute of Ion Beam Physics and Materials Research, Dresden (Germany); Biermanns, Andreas; Pietsch, Ullrich [Institute of Physics, University of Siegen (Germany)

    2010-07-01

Ion-beam-eroded self-organized nanostructures on semiconductors offer new ways for the fabrication of high-density memory and optoelectronic devices. It is known that the wavelength and amplitude of noble-gas-ion-induced rippled structures tune with the ion energy and the fluence, depending on the energy range, ion type and substrate. The linear theory by Makeev predicts a linear dependence of the wavelength on the ion energy for low temperatures. For Ar+ and O2+ it was observed by different groups that the wavelength grows with increasing fluence after being constant up to an onset fluence and before saturation. In this coarsening regime, power-law or exponential behavior of the wavelength with the fluence was monitored. So far, investigations of Xe ions on silicon surfaces have mainly concentrated on energies below 1 keV. We found a linear dependence of the wavelength and amplitude of the rippled structures on both the ion energy and the fluence over a wide range of Xe+ ion energies between 5 and 70 keV. Moreover, we found the ratio of wavelength to amplitude to be constant, implying shape stability, once a threshold fluence of 2×10^17 cm^-2 was exceeded.

  7. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks
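The four coupling levels described above can be sketched as conditional probability formulas. Note an assumption: the specific expressions below for the intermediate levels are the ones later tabulated in Swain and Guttmann's THERP handbook (NUREG/CR-1278) and are used here only as illustrative stand-ins, not as the exact WASH-1400 values.

```python
def conditional_hep(p, coupling):
    """Conditional human error probability of the second action given
    that the first action failed; p is the basic failure probability."""
    if coupling == "zero":      # fully independent actions
        return p
    if coupling == "loose":     # moderate dependence
        return (1.0 + 6.0 * p) / 7.0
    if coupling == "tight":     # high dependence
        return (1.0 + p) / 2.0
    if coupling == "complete":  # second action fails whenever first does
        return 1.0
    raise ValueError(coupling)

# The conditional probability rises monotonically with the coupling.
for level in ("zero", "loose", "tight", "complete"):
    print(level, round(conditional_hep(0.01, level), 3))
```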

  8. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First grade College, Department of Physics, Kolar, Karnataka (India)

    2017-05-15

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of {sup 289-297}Ts, we have calculated the transmission probability (T{sub l}), compound nucleus formation probabilities (P{sub CN}) and survival probability (P{sub sur}) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of {sup 289-297}Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction system. The most probable reactions to synthesize the super heavy nuclei {sup 289-297}Ts are worked out and listed explicitly. We have also studied the variation of P{sub CN} and P{sub sur} with the mass numbers of the projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  9. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  10. The relationship between operating cash flow per share and portfolio default probability

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2014-03-01

    Full Text Available One of the primary duties of depositary banks is to protect themselves against any possibility of bankruptcy. This requires the identification and measurement of risks, including default risk, which is important given the nature of banking activities. This paper presents an empirical investigation of the relationship between default probability and some financial figures including operating cash flow, liabilities and return on equity. The study uses historical data of twenty-two firms listed on the Tehran Stock Exchange over the period 2008-2012. Default probability, the dependent variable, is measured by the method developed by Moody's KMV Company. The study uses a linear regression model to examine the relationship between default probability and several independent variables. The results suggest an inverse relationship between operating cash flow per share and default probability, and between return on equity and default probability. In addition, there was a direct relationship between log facilities and default probability. However, there was no relationship between net sales and default probability.
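The kind of regression described above can be sketched with ordinary least squares. All variable names and data below are synthetic placeholders, chosen only to reproduce the reported signs (negative for operating cash flow per share and return on equity, positive for log facilities), not the study's actual data:

```python
import numpy as np

# Synthetic panel: 22 firms x 5 years, mimicking the study's layout
rng = np.random.default_rng(0)
n = 110
ocf_per_share = rng.normal(1.0, 0.3, n)      # operating cash flow per share
roe = rng.normal(0.12, 0.05, n)              # return on equity
log_facilities = rng.normal(10.0, 1.0, n)    # log of granted facilities
# Synthetic dependent variable with the signs reported in the study
default_prob = (0.5 - 0.1 * ocf_per_share - 0.8 * roe
                + 0.03 * log_facilities + rng.normal(0, 0.01, n))

# OLS: default probability on an intercept and the three regressors
X = np.column_stack([np.ones(n), ocf_per_share, roe, log_facilities])
beta, *_ = np.linalg.lstsq(X, default_prob, rcond=None)
print(dict(zip(["intercept", "ocf", "roe", "log_fac"], beta.round(3))))
```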

  11. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
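The bias mechanism described above, E[g(p̂)] differing from g(p) for a nonlinear transformation g under overdispersion, can be illustrated with a beta-binomial simulation. A sketch assuming a beta-binomial cluster model parameterized by the intraclass correlation ρ; the particular p, n, and ρ values are arbitrary:

```python
import math
import random

def beta_binomial_phat(p, rho, n, rng):
    """One overdispersed estimate p-hat: the cluster probability is drawn
    from a Beta with mean p and intraclass correlation rho, then n
    Bernoulli trials are performed."""
    a = p * (1 - rho) / rho
    b = (1 - p) * (1 - rho) / rho
    p_cluster = rng.betavariate(a, b)
    return sum(rng.random() < p_cluster for _ in range(n)) / n

def arcsine_bias(p, rho, n=50, reps=10000, seed=1):
    """Monte-Carlo estimate of E[asin(sqrt(p-hat))] - asin(sqrt(p))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        total += math.asin(math.sqrt(beta_binomial_phat(p, rho, n, rng)))
    return total / reps - math.asin(math.sqrt(p))

# The bias of the arcsine transformation (negative here) grows with rho
for rho in (0.01, 0.05, 0.1):
    print(rho, round(arcsine_bias(0.2, rho), 4))
```

The bias does not shrink as n grows once ρ is fixed, which is why it accumulates rather than averages out when many groups are combined in a meta-analysis.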

  12. Temperature Dependence of Short-Range Order in β-Brass

    DEFF Research Database (Denmark)

    Dietrich, O.W.; Als-Nielsen, Jens Aage

    1967-01-01

    Critical scattering of neutrons around the superlattice reflections (1, 0, 0) and (1, 1, 1) from a single crystal of beta-brass has been measured at temperatures from 2 to 25 °C above the transition temperature. The temperature dependence of the critical peak intensity, proportional to the susceptibility, ...

  13. Knotting probability of self-avoiding polygons under a topological constraint

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-01

    We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r{sub ex}. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r{sub ex}. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r{sub ex} corresponds to the screening length.

  14. Knotting probability of self-avoiding polygons under a topological constraint.

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-07

    We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r{sub ex}. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r{sub ex}. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r{sub ex} corresponds to the screening length.

  15. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. The fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to the WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status-dependent corrective actions.
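The Monte-Carlo procedure described above, simulating error-prone measurements and counting how often the resulting class differs from the "true" one, can be sketched as follows. The class boundaries, Gaussian error model, and index values below are hypothetical stand-ins, not the WFD metrics themselves:

```python
import random

def misclassification_prob(true_value, boundaries, sd, n_measurements=3,
                           reps=20000, seed=42):
    """Monte-Carlo estimate of the probability that the mean of error-prone
    measurements of an index falls outside the 'true' status class.
    Classes are the intervals between consecutive boundary values."""
    def classify(x):
        return sum(b <= x for b in boundaries)

    rng = random.Random(seed)
    true_class = classify(true_value)
    misses = 0
    for _ in range(reps):
        mean = sum(rng.gauss(true_value, sd)
                   for _ in range(n_measurements)) / n_measurements
        if classify(mean) != true_class:
            misses += 1
    return misses / reps

boundaries = [0.2, 0.4, 0.6, 0.8]  # five status classes, WFD-style
p_near = misclassification_prob(0.59, boundaries, sd=0.05)  # near a boundary
p_mid = misclassification_prob(0.50, boundaries, sd=0.05)   # mid-class
print(p_near, p_mid)
```

The contrast between the two printed values illustrates the paper's central point: the same measurement error yields a far higher misclassification probability when the true index value sits close to a class boundary.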

  16. Fluctuations and pseudo long range dependence in network flows: A non-stationary Poisson process model

    International Nuclear Information System (INIS)

    Yu-Dong, Chen; Li, Li; Yi, Zhang; Jian-Ming, Hu

    2009-01-01

    In the study of complex networks (systems), the scaling phenomenon of flow fluctuations refers to a certain power law between the mean flux (activity) ⟨F{sub i}⟩ of the i-th node and its standard deviation σ{sub i}, of the form σ{sub i} ∝ ⟨F{sub i}⟩{sup α}. Such scaling laws are found to be prevalent both in natural and man-made network systems, but the understanding of their origins still remains limited. This paper proposes a non-stationary Poisson process model to give an analytical explanation of the non-universal scaling phenomenon: the exponent α varies between 1/2 and 1 depending on the size of the sampling time window and the relative strength of the external/internal driving forces of the systems. The crossover behaviour and the relation of fluctuation scaling with pseudo long range dependence are also accounted for by the model. Numerical experiments show that the proposed model can recover the multi-scaling phenomenon. (general)
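The scaling relation above can be reproduced in a toy simulation: node counts drawn from a Poisson process whose rate is modulated by an external factor interpolate between α = 1/2 (pure internal, Poissonian fluctuations) and α approaching 1 (externally driven). This is only a loose sketch of the mechanism, not the paper's model:

```python
import math
import random

def fluctuation_exponent(external_strength, n_nodes=150, n_windows=300, seed=3):
    """Estimate alpha in sigma_i ∝ <F_i>^alpha for node fluxes sampled
    from a Poisson process whose rate is perturbed by an external factor."""
    rng = random.Random(seed)
    rates = [10 ** rng.uniform(0, 2) for _ in range(n_nodes)]  # mean rates 1..100
    log_means, log_sds = [], []
    for lam in rates:
        samples = []
        for _ in range(n_windows):
            # the rate fluctuates between sampling windows (external driving)
            rate = lam * (1 + external_strength * rng.uniform(-0.5, 0.5))
            # draw one Poisson sample via Knuth's multiplication algorithm
            limit, k, p = math.exp(-rate), 0, 1.0
            while p > limit:
                k += 1
                p *= rng.random()
            samples.append(k - 1)
        m = sum(samples) / n_windows
        v = sum((s - m) ** 2 for s in samples) / n_windows
        log_means.append(math.log(m))
        log_sds.append(0.5 * math.log(v))
    # alpha = least-squares slope of log(sigma) against log(mean flux)
    mx = sum(log_means) / n_nodes
    my = sum(log_sds) / n_nodes
    return (sum((x - mx) * (y - my) for x, y in zip(log_means, log_sds))
            / sum((x - mx) ** 2 for x in log_means))

a_internal = fluctuation_exponent(0.0)  # Poisson only: alpha near 1/2
a_driven = fluctuation_exponent(2.0)    # strong external driving: alpha rises
print(a_internal, a_driven)
```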

  17. Modified Feynman ratchet with velocity-dependent fluctuations

    Directory of Open Access Journals (Sweden)

    Jack Denur

    2004-03-01

    Full Text Available The randomness of Brownian motion at thermodynamic equilibrium can be spontaneously broken by velocity-dependence of fluctuations, i.e., by dependence of the values or probability distributions of fluctuating properties on Brownian-motional velocity. Such randomness-breaking can arise spontaneously via the interaction between Brownian-motional Doppler effects --- which manifest the required velocity-dependence --- and system geometrical asymmetry. A nonrandom walk is thereby spontaneously superposed on Brownian motion, resulting in a systematic net drift velocity despite thermodynamic equilibrium. The time evolution of this systematic net drift velocity --- and of the velocity probability density, force, and power output --- is derived for a velocity-dependent modification of Feynman's ratchet. We show that said spontaneous randomness-breaking, and the consequent systematic net drift velocity, imply: bias from the Maxwellian of the system's velocity probability density, the force that tends to accelerate it, and its power output. Maximization, especially of power output, is discussed. Uncompensated decreases in total entropy, challenging the second law of thermodynamics, are thereby implied.

  18. Cytologic diagnosis: expression of probability by clinical pathologists.

    Science.gov (United States)

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

    Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  19. A long range dependent model with nonlinear innovations for simulating daily river flows

    Directory of Open Access Journals (Sweden)

    P. Elek

    2004-01-01

    Full Text Available We present the analysis aimed at the estimation of flood risks of the Tisza River in Hungary on the basis of daily river discharge data registered in the last 100 years. The deseasonalised series has a skewed and leptokurtic distribution, and various methods suggest that it possesses substantial long memory. This motivates the attempt to fit a fractional ARIMA model with non-Gaussian innovations as a first step. Synthetic streamflow series can then be generated from the bootstrapped innovations. However, there remains a significant difference between the empirical and the synthetic density functions as well as the quantiles. This brings attention to the fact that the innovations are not independent: both their squares and absolute values are autocorrelated. Furthermore, the innovations display non-seasonal periods of high and low variances. This behaviour is characteristic of generalised autoregressive conditional heteroscedastic (GARCH) models. However, when innovations are simulated as GARCH processes, the quantiles and extremes of the discharge series are heavily overestimated. Therefore we suggest fitting a smooth-transition GARCH process to the innovations. In a standard GARCH model the dependence of the variance on the lagged innovation is quadratic, whereas in our proposed model it is a bounded function. While preserving long memory and eliminating the correlation from both the generating noise and from its square, the new model is superior to the previously mentioned ones in approximating the probability density, the high quantiles and the extremal behaviour of the empirical river flows.
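The proposed modification, replacing the quadratic dependence of the conditional variance on the lagged innovation with a bounded function, can be sketched as follows. The particular bounded form used here is an assumption for illustration, not the authors' specification:

```python
import math
import random

def simulate_innovations(n, omega=0.05, alpha=0.2, beta=0.7,
                         bound=2.0, smooth=True, seed=7):
    """Simulate heteroscedastic innovations. smooth=False gives a standard
    GARCH(1,1), where the conditional variance depends quadratically on
    the lagged innovation. smooth=True replaces the quadratic term eps^2
    with the bounded function bound*(1 - exp(-eps^2/bound)), which matches
    eps^2 for small shocks but saturates at 'bound' for large ones."""
    rng = random.Random(seed)
    var, eps = 1.0, 0.0
    out = []
    for _ in range(n):
        shock = eps * eps
        if smooth:
            shock = bound * (1.0 - math.exp(-shock / bound))
        var = omega + alpha * shock + beta * var
        eps = math.sqrt(var) * rng.gauss(0.0, 1.0)
        out.append(eps)
    return out

bounded = simulate_innovations(20000)
plain = simulate_innovations(20000, smooth=False)
# The bounded variant damps the extremes that plain GARCH overestimates
print(max(abs(v) for v in bounded), max(abs(v) for v in plain))
```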

  20. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Science.gov (United States)

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to

  1. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    Full Text Available In recent years, more and more wireless communications systems are also required to provide a positioning measurement. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet, MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research had shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system, in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous, CDMA system.

  2. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the magnitude of x; the outermost region is |x| .GE. 4.0. In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x)=1.0-erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x)=1.0-erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function by using the identity erfc(x)=1.0-erf(x). This subtraction may cause partial or total loss of significance for certain values of x.
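A rational approximation of the kind the routine uses can be illustrated with the Abramowitz & Stegun formula 7.1.26 (maximum absolute error about 1.5e-7); the actual region boundaries and coefficients of the ERF/ERFC routine may differ:

```python
import math

def erf_approx(x):
    """Error function via the Abramowitz & Stegun 7.1.26 rational
    approximation; odd symmetry extends it to negative arguments."""
    sign = 1 if x >= 0 else -1
    x = abs(x)
    t = 1.0 / (1.0 + 0.3275911 * x)
    # Horner evaluation of the degree-5 polynomial in t
    poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741
               + t * (-1.453152027 + t * 1.061405429))))
    return sign * (1.0 - poly * math.exp(-x * x))

def erfc_approx(x):
    # Illustrates the cautioned-against identity: for large x this
    # subtraction loses significance; a real ERFC computes it directly.
    return 1.0 - erf_approx(x)

for x in (-1.5, 0.0, 0.5, 2.0):
    print(x, erf_approx(x), math.erf(x))
```

The `erfc_approx` helper deliberately mirrors the restriction in the abstract: subtracting a value near 1 from 1 cancels leading digits, which is exactly why the routine computes erfc directly in the outer regions.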

  3. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  4. Probabilities for profitable fungicide use against gray leaf spot in hybrid maize.

    Science.gov (United States)

    Munkvold, G P; Martinson, C A; Shriver, J M; Dixon, P M

    2001-05-01

    Gray leaf spot, caused by the fungus Cercospora zeae-maydis, causes considerable yield losses in hybrid maize grown in the north-central United States and elsewhere. Nonchemical management tactics have not adequately prevented these losses. The probability of profitably using fungicide application as a management tool for gray leaf spot was evaluated in 10 field experiments under conditions of natural inoculum in Iowa. Gray leaf spot severity in untreated control plots ranged from 2.6 to 72.8% for the ear leaf and from 3.0 to 7.7 (1 to 9 scale) for whole-plot ratings. In each experiment, fungicide applications with propiconazole or mancozeb significantly reduced gray leaf spot severity. We calculated the probability of achieving a positive net return with one or two propiconazole applications, based on the mean yields and standard deviations for treated and untreated plots, the price of grain, and the costs of the fungicide applications. For one application, the probability ranged from approximately 0.06 to more than 0.99, and exceeded 0.50 in six of nine scenarios (specific experiment/hybrid). The highest probabilities occurred in the 1995 experiments with the most susceptible hybrid. Probabilities were almost always higher for a single application of propiconazole than for two applications. These results indicate that a single application of propiconazole frequently can be profitable for gray leaf spot management in Iowa, but the probability of a profitable application is strongly influenced by hybrid susceptibility. The calculation of probabilities for positive net returns was more informative than mean separation in terms of assessing the economic success of the fungicide applications.
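The net-return probability described above can be sketched under a normality assumption: if treated and untreated yields are approximately independent normal variables, the probability of a positive net return follows from the normal CDF. The numbers below are hypothetical, not from the experiments, and the paper's exact calculation may differ:

```python
import math

def prob_positive_net_return(mean_treated, sd_treated, mean_untreated,
                             sd_untreated, grain_price, application_cost):
    """P(net return > 0), assuming independent normal yields.
    Net return = grain_price * (Y_treated - Y_untreated) - application_cost."""
    mean_gain = mean_treated - mean_untreated        # expected yield gain
    sd_gain = math.hypot(sd_treated, sd_untreated)   # sd of the yield difference
    breakeven = application_cost / grain_price       # gain needed to break even
    z = (mean_gain - breakeven) / sd_gain
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical numbers: 0.4 t/ha mean yield gain, $100/t grain, $25/ha cost
print(round(prob_positive_net_return(9.4, 0.5, 9.0, 0.5, 100.0, 25.0), 3))
```

Framing the result as a probability rather than a mean comparison makes the economic trade-off explicit: a treatment can raise mean yield significantly yet still have a middling chance of paying for itself when yield variability is large relative to the gain.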

  5. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ{sub k} = 40ρ{sub m} are disfavored by more than 99.99%, with a peak value at ρ{sub Λ} = 7.9 × 10{sup -123} and ρ{sub k} = 4.3ρ{sub m} for open universes. For universes that allow only positive curvature, or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability.
We

  6. A sequence-dependent rigid-base model of DNA

    Science.gov (United States)

    Gonzalez, O.; Petkevičiutė, D.; Maddocks, J. H.

    2013-02-01

    A novel hierarchy of coarse-grain, sequence-dependent, rigid-base models of B-form DNA in solution is introduced. The hierarchy depends on both the assumed range of energetic couplings, and the extent of sequence dependence of the model parameters. A significant feature of the models is that they exhibit the phenomenon of frustration: each base cannot simultaneously minimize the energy of all of its interactions. As a consequence, an arbitrary DNA oligomer has an intrinsic or pre-existing stress, with the level of this frustration dependent on the particular sequence of the oligomer. Attention is focussed on the particular model in the hierarchy that has nearest-neighbor interactions and dimer sequence dependence of the model parameters. For a Gaussian version of this model, a complete coarse-grain parameter set is estimated. The parameterized model allows, for an oligomer of arbitrary length and sequence, a simple and explicit construction of an approximation to the configuration-space equilibrium probability density function for the oligomer in solution. The training set leading to the coarse-grain parameter set is itself extracted from a recent and extensive database of a large number of independent, atomic-resolution molecular dynamics (MD) simulations of short DNA oligomers immersed in explicit solvent. The Kullback-Leibler divergence between probability density functions is used to make several quantitative assessments of our nearest-neighbor, dimer-dependent model, which is compared against others in the hierarchy to assess various assumptions pertaining both to the locality of the energetic couplings and to the level of sequence dependence of its parameters. It is also compared directly against all-atom MD simulation to assess its predictive capabilities. The results show that the nearest-neighbor, dimer-dependent model can successfully resolve sequence effects both within and between oligomers. 
For example, due to the presence of frustration, the model can
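For the Gaussian version of the model, the Kullback-Leibler divergence used in the assessments above has a closed form between two multivariate normals. A self-contained sketch; the means and covariances here are arbitrary placeholders, not model parameters:

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL divergence D(N0 || N1) between multivariate Gaussians:
    0.5 * (tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1/det S0))."""
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = np.asarray(mu1) - np.asarray(mu0)
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

mu = np.zeros(2)
cov = np.eye(2)
print(kl_gaussian(mu, cov, mu, cov))        # identical distributions: 0.0
print(kl_gaussian(mu, cov, mu + 1.0, cov))  # unit-shifted mean: 1.0
```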

  7. A sequence-dependent rigid-base model of DNA.

    Science.gov (United States)

    Gonzalez, O; Petkevičiūtė, D; Maddocks, J H

    2013-02-07

    A novel hierarchy of coarse-grain, sequence-dependent, rigid-base models of B-form DNA in solution is introduced. The hierarchy depends on both the assumed range of energetic couplings, and the extent of sequence dependence of the model parameters. A significant feature of the models is that they exhibit the phenomenon of frustration: each base cannot simultaneously minimize the energy of all of its interactions. As a consequence, an arbitrary DNA oligomer has an intrinsic or pre-existing stress, with the level of this frustration dependent on the particular sequence of the oligomer. Attention is focussed on the particular model in the hierarchy that has nearest-neighbor interactions and dimer sequence dependence of the model parameters. For a Gaussian version of this model, a complete coarse-grain parameter set is estimated. The parameterized model allows, for an oligomer of arbitrary length and sequence, a simple and explicit construction of an approximation to the configuration-space equilibrium probability density function for the oligomer in solution. The training set leading to the coarse-grain parameter set is itself extracted from a recent and extensive database of a large number of independent, atomic-resolution molecular dynamics (MD) simulations of short DNA oligomers immersed in explicit solvent. The Kullback-Leibler divergence between probability density functions is used to make several quantitative assessments of our nearest-neighbor, dimer-dependent model, which is compared against others in the hierarchy to assess various assumptions pertaining both to the locality of the energetic couplings and to the level of sequence dependence of its parameters. It is also compared directly against all-atom MD simulation to assess its predictive capabilities. The results show that the nearest-neighbor, dimer-dependent model can successfully resolve sequence effects both within and between oligomers. 
For example, due to the presence of frustration, the model can
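The Kullback-Leibler divergence between Gaussian probability density functions, used in this record to compare models in the hierarchy, has a closed form. A minimal sketch of that formula (the function name and toy inputs below are illustrative, not taken from the paper):

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """KL divergence D(N0 || N1) between multivariate Gaussians N(mu0, S0)
    and N(mu1, S1), via the standard closed-form expression."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + logdet1 - logdet0)
```

For identical distributions the divergence is zero; shifting a unit-variance 1D Gaussian by one standard deviation gives exactly 0.5.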

  8. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  9. Theoretical analysis on the probability of initiating persistent fission chain

    International Nuclear Information System (INIS)

    Liu Jianjun; Wang Zhe; Zhang Ben'ai

    2005-01-01

    For a finite multiplying system of fissile material in the presence of a weak neutron source, the authors analyze the probability of initiating a persistent fission chain using the stochastic theory of neutron multiplication. In the theoretical treatment, the conventional point-reactor model is extended to an improved form with position x and velocity v dependence. The estimated results, including an approximate value of the probability mentioned above and its distribution, are given by means of the diffusion approximation and compared with those of the previous point-reactor model. The two are basically consistent; however, the present model can provide details of the distribution. (authors)

  10. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737

  11. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determine the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
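A truncated exponential law of the kind fitted here can be sampled directly by inverting its CDF. A brief sketch, assuming a mean-slip scale `lam` and a maximum slip `s_max` (illustrative values, not fitted to the SRCMOD data):

```python
import numpy as np

def sample_trunc_exp(lam, s_max, n, rng):
    """Inverse-CDF sampling from an exponential law truncated at s_max:
    F(x) = (1 - exp(-x/lam)) / (1 - exp(-s_max/lam)) for 0 <= x <= s_max."""
    u = rng.random(n)
    return -lam * np.log1p(-u * (1.0 - np.exp(-s_max / lam)))

rng = np.random.default_rng(0)
slip = sample_trunc_exp(lam=1.0, s_max=3.0, n=100_000, rng=rng)
# Unlike a plain exponential, the truncation caps the largest slip at s_max.
```

The sample mean should match the theoretical mean of the truncated law, lam - s_max * exp(-s_max/lam) / (1 - exp(-s_max/lam)).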

  13. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  14. Survival and compound nucleus probability of super heavy element Z = 117

    International Nuclear Information System (INIS)

    Manjunatha, H.C.; Sridhar, K.N.

    2017-01-01

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of 289-297Ts, we have calculated the transmission probability (T_l), compound nucleus formation probability (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of 289-297Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction system. The most probable reactions to synthesize the super heavy nuclei 289-297Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass numbers of the projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  15. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform, taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
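The two stages can be sketched as a toy decision rule: a Bayesian log-odds update combining a private signal with the observed choices of other animals (each treated as an independent witness of a given reliability), followed by probability matching. The parameterization below is an illustration, not the paper's exact model:

```python
import math
import random

def choose(p_private, n_a, n_b, reliability, rng):
    """Stage 1: Bayesian estimate of P(option A is best) from a private
    signal p_private and n_a vs. n_b observed choices for A vs. B, each
    choice weighted by an assumed witness reliability.
    Stage 2: probability matching -- pick A with exactly that probability."""
    log_odds = math.log(p_private / (1.0 - p_private))
    log_odds += (n_a - n_b) * math.log(reliability / (1.0 - reliability))
    p_a = 1.0 / (1.0 + math.exp(-log_odds))
    return ("A" if rng.random() < p_a else "B"), p_a
```

With an uninformative private signal and no social information the estimate stays at 0.5; each extra animal observed choosing A tilts the posterior toward A.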

  16. Quantum return probability of a system of N non-interacting lattice fermions

    Science.gov (United States)

    Krapivsky, P. L.; Luck, J. M.; Mallick, K.

    2018-02-01

    We consider N non-interacting fermions performing continuous-time quantum walks on a one-dimensional lattice. The system is launched from a most compact configuration where the fermions occupy neighboring sites. We calculate exactly the quantum return probability (sometimes referred to as the Loschmidt echo) of observing the very same compact state at a later time t. Remarkably, this probability depends on the parity of the fermion number—it decays as a power of time for even N, while for odd N it exhibits periodic oscillations modulated by a decaying power law. The exponent also depends slightly on the parity of N, and is roughly half of what it would be in the continuum limit. We also consider the same problem, and obtain similar results, in the presence of an impenetrable wall at the origin constraining the particles to remain on the positive half-line. We derive closed-form expressions for the amplitudes of the power-law decay of the return probability in all cases. The key point in the derivation is the use of Mehta integrals, which are limiting cases of the Selberg integral.
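For a single walker (N = 1) the return probability is J_0(2t)^2, which decays as a power of time, and this can be checked numerically by diagonalizing the hopping Hamiltonian on a lattice large enough that the wavefront never reaches the boundary. A sketch under those assumptions (lattice size and times are illustrative):

```python
import numpy as np

def return_probability(t, n_sites=401):
    """|<x0| exp(-i H t) |x0>|^2 for one continuous-time quantum walker on a
    1D lattice, with H the nearest-neighbor hopping matrix and x0 the center
    site (open boundaries; n_sites chosen so the wavefront stays inside)."""
    H = np.diag(np.ones(n_sites - 1), 1) + np.diag(np.ones(n_sites - 1), -1)
    vals, vecs = np.linalg.eigh(H)
    x0 = n_sites // 2
    # Spectral decomposition: amp = sum_k |<x0|k>|^2 exp(-i E_k t)
    amp = vecs[x0] @ (np.exp(-1j * vals * t) * vecs[x0])
    return abs(amp) ** 2
```

At t = 0 the walker is certainly at its starting site; by t = 5 the return probability has already dropped well below 10%, consistent with the power-law decay.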

  17. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
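The qualitative effect of combining a stated probability with a prior can be sketched by shrinking the stated value toward a Beta prior mean. The pseudo-count parameterization below is an illustration of that shrinkage, not the paper's fitted model:

```python
def weight(p, strength=4.0, alpha=1.0, beta=1.0):
    """Treat the stated probability p as `strength` pseudo-observations and
    combine them with a Beta(alpha, beta) prior; the posterior mean shrinks
    p toward the prior mean, producing the inverse-S weighting pattern."""
    return (strength * p + alpha) / (strength + alpha + beta)
```

With a symmetric prior, small probabilities are pulled up (overweighted), large ones are pulled down (underweighted), and p = 0.5 is a fixed point.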

  18. Eliciting conditional and unconditional rank correlations from conditional probabilities

    International Nuclear Information System (INIS)

    Morales, O.; Kurowicka, D.; Roelen, A.

    2008-01-01

    Causes of uncertainties may be interrelated and may introduce dependencies. Ignoring these dependencies may lead to large errors. A number of graphical models in probability theory such as dependence trees, vines and (continuous) Bayesian belief nets [Cooke RM. Markov and entropy properties of tree and vine-dependent variables. In: Proceedings of the ASA section on Bayesian statistical science, 1997; Kurowicka D, Cooke RM. Distribution-free continuous Bayesian belief nets. In: Proceedings of mathematical methods in reliability conference, 2004; Bedford TJ, Cooke RM. Vines-a new graphical model for dependent random variables. Ann Stat 2002; 30(4):1031-68; Kurowicka D, Cooke RM. Uncertainty analysis with high dimensional dependence modelling. New York: Wiley; 2006; Hanea AM, et al. Hybrid methods for quantifying and analyzing Bayesian belief nets. In: Proceedings of the 2005 ENBIS5 conference, 2005; Shachter RD, Kenley CR. Gaussian influence diagrams. Manage Sci 1998; 35(5).] have been developed to capture dependencies between random variables. The inputs for these models are various marginal distributions and dependence information, usually in the form of conditional rank correlations. Often expert elicitation is required. This paper focuses on dependence representation and dependence elicitation. The techniques presented are illustrated with an application from aviation safety.

  19. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  20. Cueing spatial attention through timing and probability.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, to which extent unattended information affects attentional control is not fully understood. Here we provide evidences of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven modality within a single task. Using a spatial cueing paradigm we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more robust occurrence. Conversely, cuing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on asynchronies' probability and not on their duration. Our findings clearly demonstrate that through a fine decision-making, performed trial-by-trial, the brain utilizes implicit information to decide whether or not voluntarily shifting spatial attention. As if according to a cost-planning strategy, the cognitive effort of shifting attention depending on the cue is performed only when the expected advantages are higher. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Energy dependence of the zero-range DWBA normalization of the ⁵⁸Ni(³He,α)⁵⁷Ni reaction [15 to 205 MeV, finite-range and nonlocality corrections]

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, J R; Zimmerman, W R; Kraushaar, J J [Colorado Univ., Boulder (USA). Dept. of Physics and Astrophysics

    1977-01-04

    Strong transitions in the ⁵⁸Ni(³He,α)⁵⁷Ni reaction were analyzed using both the zero-range and exact finite-range DWBA. The data considered covered a range of bombarding energies from 15 to 205 MeV. The zero-range DWBA described all data well when finite-range and non-locality corrections were included in the local energy approximation. Comparison of zero-range and exact finite-range calculations showed the local energy approximation correction to be very accurate over the entire energy region. Empirically determined D₀ values showed no energy dependence. A theoretical D₀ value calculated using an α wave function which reproduced the measured α rms charge radius and the elastic electron scattering form factor agreed well with the empirical values. Comparison was made between these values and D₀ values quoted previously in the literature.

  2. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  3. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
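The convolution behind a composite bundle measure is the Poisson-binomial distribution, and multiplying the per-element probability generating functions G_i(z) = (1 - p_i) + p_i z yields its exact pmf. A minimal sketch with hypothetical per-element compliance probabilities:

```python
def exact_bundle_pmf(ps):
    """Exact pmf of the number of compliant elements among independent
    Bernoulli(p_i) items, computed by coefficient-wise multiplication of
    their probability generating functions (1 - p_i) + p_i * z."""
    pmf = [1.0]
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, c in enumerate(pmf):
            nxt[k] += c * (1.0 - p)      # this element non-compliant
            nxt[k + 1] += c * p          # this element compliant
        pmf = nxt
    return pmf

pmf = exact_bundle_pmf([0.9, 0.8, 0.7])  # hypothetical per-element rates
```

The coefficients sum to one, and the all-compliant and none-compliant probabilities equal the corresponding products, with no series approximation involved.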

  4. Diagnostic probability function for acute coronary heart disease garnered from experts' tacit knowledge.

    Science.gov (United States)

    Steurer, Johann; Held, Ulrike; Miettinen, Olli S

    2013-11-01

    Knowing about a diagnostic probability requires general knowledge about the way in which the probability depends on the diagnostic indicators involved in the specification of the case at issue. Diagnostic probability functions (DPFs) are generally unavailable at present. Our objective was to illustrate how diagnostic experts' case-specific tacit knowledge about diagnostic probabilities could be garnered in the form of DPFs. Focusing on diagnosis of acute coronary heart disease (ACHD), we presented doctors with extensive experience in hospitals' emergency departments a set of hypothetical cases specified in terms of an inclusive set of diagnostic indicators. We translated the medians of these experts' case-specific probabilities into a logistic DPF for ACHD. The principal result was the experts' typical diagnostic probability for ACHD as a joint function of the set of diagnostic indicators. A related result of note was the finding that the experts' probabilities in any given case had a surprising degree of variability. Garnering diagnostic experts' case-specific tacit knowledge about diagnostic probabilities in the form of DPFs is feasible to accomplish. Thus, once the methodology of this type of work has been "perfected," practice-guiding diagnostic expert systems can be developed. Copyright © 2013 Elsevier Inc. All rights reserved.
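A logistic DPF of the kind fitted to the experts' probabilities maps a case's indicator profile to a probability through the log-odds. A hedged sketch; the indicator values and coefficients below are hypothetical, not the fitted ACHD function:

```python
import math

def logistic_dpf(indicators, coefs, intercept):
    """P(ACHD | case) = logistic(b0 + sum_i b_i * x_i), where x_i are the
    diagnostic-indicator values of the case and b_i are fitted coefficients
    (the numbers used here are purely illustrative)."""
    z = intercept + sum(b * x for b, x in zip(coefs, indicators))
    return 1.0 / (1.0 + math.exp(-z))
```

With all indicators at zero and a zero intercept the function returns 0.5; raising an indicator with a positive coefficient raises the diagnostic probability.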

  5. Effect of velocity variation on secondary-ion-emission probability: Quantum stationary approach

    International Nuclear Information System (INIS)

    Goldberg, E.C.; Ferron, J.; Passeggi, M.C.G.

    1989-01-01

    The ion-velocity dependence of the ionization probability for an atom ejected from a surface is examined by using a quantum approach in which the coupled motion between electrons and the outgoing nucleus is followed along the whole trajectory by solving the stationary Schroedinger equation. We choose a very-small-cluster-model system in which the motion of the atom is restricted to one dimension, and with energy potential curves corresponding to the involved channels varying appreciably with the atom position. We found an exponential dependence on the inverse of the asymptotic ion velocity for high emission energies, and a smoother behavior with slight oscillations at low energies. These results are compared with those obtained within a dynamical-trajectory approximation using either a constant velocity equal to the asymptotic ionic value, or expressions for the velocity derived from the eikonal approximation and from the classical limit of the current vector. Both approaches give similar results provided the velocity is allowed to adjust self-consistently to potential energies and transition-amplitude variations. Strong oscillations are observed in the low-emission-energy range either if the transitions are neglected, or a constant velocity along the whole path is assumed for the ejected particle

  6. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  8. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  9. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  10. Soil pollution at outdoor shooting ranges: Health effects, bioavailability and best management practices.

    Science.gov (United States)

    Fayiga, A O; Saha, U K

    2016-09-01

    The total lead (Pb) concentrations of the surface soil, subsurface soil, vegetation and surface waters of outdoor shooting ranges are extremely high and above regulatory limits. Lead is dangerous at high concentrations and can cause a variety of serious health problems. Shooters and range workers are exposed to lead dust and can even take Pb dust home to their families, while some animals around the shooting range can ingest the Pb bullets. The toxicity of Pb depends on its bioavailability, which has been determined to be influenced greatly by the geochemical properties of each site. The bioavailability of Pb in shooting ranges has been found to be higher than in other metal-contaminated soils, probably because of its very low residual Pb fraction in soil; migration of Pb within shooting ranges and offsite has been reported in the literature. Best management practices to reduce the mobility of Pb in shooting ranges involve an integrated Pb management program, which is described in the paper. The adoption of the non-toxic "green bullet", which has been developed to replace Pb bullets, may reduce or prevent environmental pollution at shooting ranges. However, the contaminated soil resulting from decades of operation of several shooting ranges still needs to be restored to its natural state. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  12. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    Science.gov (United States)

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  14. Temperature dependence of the short-range order parameter and the concentration dependence of the order disorder temperature for Ni-Pt and Ni-Fe systems in the improved statistical pseudopotential approximation

    International Nuclear Information System (INIS)

    Khwaja, F.A.

    1980-08-01

    The calculations for the temperature dependence of the first-shell short-range order (SRO) parameter for Ni3Fe using the cubic approximation of Tahir-Kheli, and the concentration dependence of the order-disorder temperature Tsub(c) for Ni-Fe and Ni-Pt systems using the linear approximation, have been carried out in the framework of pseudopotential theory. It is shown that the cubic approximation yields good agreement between the theoretical prediction of the α1 parameter and the experimental data. Results for the concentration dependence of Tsub(c) show that improvements in the statistical pseudopotential approach are essential to achieve good agreement with experiment. (author)

  15. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  16. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  17. Framing of decision problem in short and long term and probability perception

    Directory of Open Access Journals (Sweden)

    Anna Wielicka-Regulska

    2010-01-01

    Full Text Available Consumer preferences depend on problem framing and time perspective. For the experiment's participants, avoiding a loss seemed less probable in a distant time perspective than in the near term. Conversely, achieving a gain in the near future seemed less probable than in the remote future. Different reactions can therefore be expected when a problem is presented in terms of gains rather than losses. This can be exploited in promoting highly desired social behaviours such as saving for retirement, keeping a good diet, investing in learning, and other advantageous activities that consumers usually put off.

  18. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    International Nuclear Information System (INIS)

    Caleyo, F.; Velazquez, J.C.; Valor, A.; Hallen, J.M.

    2009-01-01

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
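
    The extreme-value reasoning in this record can be illustrated with a small Monte Carlo sketch. The pit-growth law, soil scatter, and all parameter values below are hypothetical stand-ins (the record does not give its model constants): we draw per-segment maximum pit depths, fit a Gumbel distribution by the method of moments, and use it for an exceedance probability.

```python
import math
import random

EULER_GAMMA = 0.57721566490153286

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to a sample of maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi      # scale parameter
    mu = mean - EULER_GAMMA * beta             # location parameter
    return mu, beta

random.seed(1)
# Hypothetical power-law pit growth d = k * t**0.4 (mm) with lognormal
# scatter in k; 50 pits per segment, 200 segments, 15 years of exposure.
t_years = 15.0
segment_maxima = [
    max(random.lognormvariate(0.0, 0.3) * t_years ** 0.4 for _ in range(50))
    for _ in range(200)
]
mu, beta = gumbel_fit(segment_maxima)
# Probability that a segment's deepest pit exceeds a critical depth (mm):
d_crit = 5.0
p_exceed = 1.0 - math.exp(-math.exp(-(d_crit - mu) / beta))
print(round(mu, 3), round(beta, 3), round(p_exceed, 4))
```

    A reliability analyst would compare such fits across the three maximal extreme-value families (Weibull, Fréchet, Gumbel) and exposure ages, as the record describes; the Gumbel fit here is just the simplest of the three to write by hand.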

  20. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  1. Fossil preservation and the stratigraphic ranges of taxa

    Science.gov (United States)

    Foote, M.; Raup, D. M.

    1996-01-01

    The incompleteness of the fossil record hinders the inference of evolutionary rates and patterns. Here, we derive relationships among true taxonomic durations, preservation probability, and observed taxonomic ranges. We use these relationships to estimate original distributions of taxonomic durations, preservation probability, and completeness (proportion of taxa preserved), given only the observed ranges. No data on occurrences within the ranges of taxa are required. When preservation is random and the original distribution of durations is exponential, the inference of durations, preservability, and completeness is exact. However, reasonable approximations are possible given non-exponential duration distributions and temporal and taxonomic variation in preservability. Thus, the approaches we describe have great potential in studies of taphonomy, evolutionary rates and patterns, and genealogy. Analyses of Upper Cambrian-Lower Ordovician trilobite species, Paleozoic crinoid genera, Jurassic bivalve species, and Cenozoic mammal species yield the following results: (1) The preservation probability inferred from stratigraphic ranges alone agrees with that inferred from the analysis of stratigraphic gaps when data on the latter are available. (2) Whereas median durations based on simple tabulations of observed ranges are biased by stratigraphic resolution, our estimates of median duration, extinction rate, and completeness are not biased. (3) The shorter geologic ranges of mammalian species relative to those of bivalves cannot be attributed to a difference in preservation potential. However, we cannot rule out the contribution of taxonomic practice to this difference. (4) In the groups studied, completeness (proportion of species [trilobites, bivalves, mammals] or genera [crinoids] preserved) ranges from 60% to 90%. The higher estimates of completeness at smaller geographic scales support previous suggestions that the incompleteness of the fossil record reflects loss of
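
    The relationship between true durations, per-interval preservation probability, and observed ranges can be made concrete with a toy simulation (the exponential mean, preservation odds, and interval length below are hypothetical, not values from the paper): each taxon's true duration is discretized into intervals, each interval is preserved independently, and the observed range runs from the first to the last preserved interval.

```python
import random

def simulate_observed(n_taxa, mean_duration, p_preserve, dt=1.0):
    """Draw exponential true durations, discretize into dt-long intervals,
    preserve each interval independently with probability p_preserve, and
    return observed ranges plus the completeness (fraction preserved)."""
    observed = []
    for _ in range(n_taxa):
        duration = random.expovariate(1.0 / mean_duration)
        k = max(1, round(duration / dt))
        hits = [i for i in range(k) if random.random() < p_preserve]
        if hits:
            observed.append((hits[-1] - hits[0] + 1) * dt)
    return observed, len(observed) / n_taxa

random.seed(4)
# Hypothetical parameters: mean duration 5 Myr, 50% preservation odds
# per 1-Myr interval.
obs_ranges, completeness = simulate_observed(10_000, 5.0, 0.5)
print(round(completeness, 3), round(sum(obs_ranges) / len(obs_ranges), 2))
```

    The mean observed range comes out shorter than the mean true duration, which is exactly the bias the authors' method corrects for when inverting observed ranges to recover durations and preservability.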

  2. Fusing probability density function into Dempster-Shafer theory of evidence for the evaluation of water treatment plant.

    Science.gov (United States)

    Chowdhury, Shakhawat

    2013-05-01

    The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including, human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts, which are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can impart subjective biases in the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using the probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function, which were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgradation for the WTP.
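
    Dempster's rule of combination, which the DST evaluation above relies on to merge evidence from multiple experts, can be sketched directly. The status segments and the two BPA assignments below are made-up illustrations, not the paper's data: masses on intersecting hypothesis sets are multiplied and summed, and the conflicting mass is renormalized away.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts: frozenset -> mass)
    with Dempster's rule, renormalizing away the conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Hypothetical status segments for one WTP factor: good (G), fair (F),
# poor (P); "GF" represents the ambiguous set {good, fair}.
G, F, P = frozenset("G"), frozenset("F"), frozenset("P")
GF = frozenset("GF")
expert1 = {G: 0.6, GF: 0.3, P: 0.1}
expert2 = {G: 0.5, F: 0.3, GF: 0.2}
print(dempster_combine(expert1, expert2))
```

    The paper's contribution is to replace the subjective expert masses (`expert1`, `expert2` here) with cumulative probabilities read off fitted PDFs, so the combination step itself stays the same.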

  3. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  4. Long-range correlations of different EEG derivations in rats: sleep stage-dependent generators may play a key role

    International Nuclear Information System (INIS)

    Fang, Guangzhan; Xia, Yang; Lai, Yongxiu; You, Zili; Yao, Dezhong

    2010-01-01

    For the electroencephalogram (EEG), topographic differences in the long-range temporal correlations would imply that these signals might be affected by specific mechanisms related to the generation of a given neuronal process. The properties of the generators of various EEG oscillations might therefore be investigated via the spatial differences of their long-range temporal correlations. In the present study, these correlations were characterized with respect to their topography during different vigilance states by detrended fluctuation analysis (DFA). The results indicated that (1) most of the scaling exponents acquired from different EEG derivations for various oscillations were significantly different in each vigilance state; these differences might result from the different quantities and different locations of sleep stage-dependent generators of various neuronal processes; (2) there might be multiple generators of delta and theta over the brain, many of them sleep stage-dependent; (3) the best site of the frontal electrode in a fronto-parietal bipolar electrode for sleep staging might be above the anterior midline cortex. We suggest that DFA can be used to explore the properties of the generators of a given neuronal oscillation, and the localizations of these generators if more electrodes are involved
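
    The DFA procedure used in this record can be written out as a minimal first-order implementation (window scales and the white-noise check are our own choices, not the study's settings): integrate the signal, detrend it linearly in non-overlapping windows, and read the scaling exponent off the slope of log F(n) against log n.

```python
import math
import random

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """First-order detrended fluctuation analysis: return the scaling
    exponent alpha as the slope of log F(n) versus log n."""
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:                          # integrated (cumulative-sum) profile
        s += v - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in scales:
        sq_sum, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            t_mean, s_mean = (n - 1) / 2.0, sum(seg) / n
            denom = sum((t - t_mean) ** 2 for t in range(n))
            slope = sum((t - t_mean) * (v - s_mean)
                        for t, v in enumerate(seg)) / denom
            sq_sum += sum((v - (s_mean + slope * (t - t_mean))) ** 2
                          for t, v in enumerate(seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq_sum / count))
    n_mean = sum(log_n) / len(log_n)
    f_mean = sum(log_f) / len(log_f)
    return sum((a - n_mean) * (b - f_mean) for a, b in zip(log_n, log_f)) / \
           sum((a - n_mean) ** 2 for a in log_n)

random.seed(2)
white = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa_exponent(white)
print(round(alpha, 2))  # near 0.5 for uncorrelated noise
```

    Exponents near 0.5 indicate uncorrelated fluctuations, while values approaching 1 indicate long-range temporal correlations; the study's topographic comparisons are built from exponents of exactly this kind computed per derivation and vigilance state.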

  5. On the long-range dependence properties of annual precipitation using a global network of instrumental measurements

    Science.gov (United States)

    Tyralis, Hristos; Dimitriadis, Panayiotis; Koutsoyiannis, Demetris; O'Connell, Patrick Enda; Tzouka, Katerina; Iliopoulou, Theano

    2018-01-01

    The long-range dependence (LRD) is considered an inherent property of geophysical processes, whose presence increases uncertainty. Here we examine the spatial behaviour of LRD in precipitation by regressing the Hurst parameter estimates of mean annual precipitation instrumental data, which span 1916-2015 and cover a large part of the earth's surface, on location characteristics of the instrumental data stations. Furthermore, we apply the Mann-Kendall test under the LRD assumption (MKt-LRD) to reassess the significance of observed trends. To summarize the results, the LRD is spatially clustered and seems to depend mostly on the location of the stations, while the predictive value of the regression model is good. Thus, when investigating LRD properties we recommend that local characteristics be considered. The application of the MKt-LRD suggests that no significant monotonic trend appears in global precipitation, excluding the climate type D (snow) regions, in which significant positive trends appear.

  6. SU-F-T-221: An Assessment of the Potential for Improved Local Control of Skull- Base Chordomas Via Reduction of the Proton Beam Range Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Muller, L; Soldner, A; Kirk, M; Fager, M; Solberg, T; Robert, L; Dolney, D [University of Pennsylvania, Philadelphia, PA (United States)

    2016-06-15

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that such a reduction could realize for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, “Curr”), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. TCP and NTCP values for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman [3] formalism and published model parameters (Terahara [4]; QUANTEC; Burman, Int. J. Radiat. Oncol. Biol. Phys. 21, 123). Our plan optimization strategy was to achieve PTV coverage close to prescription while maintaining OAR NTCP values at or better than the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55% and 65%, respectively. The improvement in TCP for individual patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
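
    The Lyman-Kutcher-Burman NTCP calculation named in this record reduces a dose-volume histogram to a generalized equivalent uniform dose (gEUD) and maps it through a normal CDF. The sketch below uses made-up dose-volume data and illustrative parameter values, not the clinical model parameters cited in the abstract.

```python
import math

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP: reduce a dose-volume histogram to a
    generalized EUD and map it through the standard normal CDF."""
    total = sum(volumes)
    geud = sum(v / total * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical brainstem dose-volume data (dose in cGy, relative volume)
# and illustrative LKB parameters; not clinical values.
doses = [1000.0, 3000.0, 5000.0, 6000.0]
volumes = [0.4, 0.3, 0.2, 0.1]
ntcp = lkb_ntcp(doses, volumes, td50=6500.0, m=0.14, n=0.16)
print(round(ntcp, 4))
```

    A small volume parameter `n` makes the gEUD track the maximum dose, which is why serial organs such as the brainstem and cord constrain how tightly the target can be covered as margins shrink.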

  7. A fluctuation relation for the probability of energy backscatter

    Science.gov (United States)

    Vela-Martin, Alberto; Jimenez, Javier

    2017-11-01

    We simulate the large scales of an inviscid turbulent flow in a triply periodic box using a dynamic Smagorinsky model for the sub-grid stresses. The flow, which is forced to constant kinetic energy, is fully reversible and can develop a sustained inverse energy cascade. However, due to the large number of degrees of freedom, the probability of a spontaneous mean inverse energy flux is negligible. In order to quantify the probability of inverse energy cascades, we test a local fluctuation relation of the form log P(A) = -c(V,t) A, where P(A) = p(⟨Cs⟩(V,t) = A) / p(⟨Cs⟩(V,t) = -A), p is probability, and ⟨Cs⟩(V,t) is the average of the least-squared dynamic model coefficient over volume V and time t. This is confirmed when Cs is averaged over sufficiently large domains and long times, and c is found to depend linearly on V and t. In the limit in which V^(1/3) is of the order of the integral scale and t is of the order of the eddy-turnover time, we recover a global fluctuation relation that predicts a negligible probability of a sustained inverse energy cascade. For smaller V and t, the local fluctuation relation provides useful predictions on the occurrence of local energy backscatter. Funded by the ERC COTURB project.
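
    The linear form of the fluctuation relation holds exactly when the averaged observable is Gaussian, which makes it easy to check numerically. The sketch below is a generic illustration (the Gaussian stand-in and its parameters are ours, not the simulation's data): for samples with mean mu and variance sigma**2, log p(a)/p(-a) = (2*mu/sigma**2)*a, linear in a.

```python
import math
import random

random.seed(5)
# Stand-in for the volume-averaged model coefficient: Gaussian with a
# negative mean (forward cascade on average).
mu, sigma = -0.5, 1.0
samples = [random.gauss(mu, sigma) for _ in range(500_000)]

def log_density_ratio(a, width=0.05):
    """Estimate log[p(A = a) / p(A = -a)] from the sample histogram."""
    pos = sum(1 for x in samples if abs(x - a) < width)
    neg = sum(1 for x in samples if abs(x + a) < width)
    return math.log(pos / neg)

# For a Gaussian, log p(a)/p(-a) = (2*mu/sigma**2) * a, i.e. linear in a
# with slope -c = 2*mu/sigma**2 = -1.0 here.
for a in (0.5, 1.0, 1.5):
    print(a, round(log_density_ratio(a) / a, 2))
```

    The estimated slope stays near 2*mu/sigma**2 across a, mirroring how the paper's coefficient c scales once ⟨Cs⟩ is averaged over large enough volumes and times for the statistics to become near-Gaussian.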

  8. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  9. Temperature dependence of the Schottky-barrier heights of n-type semiconductors in the temperature range of 7 to 300 K

    International Nuclear Information System (INIS)

    Chen, T.P.; Lee, T.C.; Fung, S.; Beling, C.D.

    1994-01-01

    In this note we present the results of the temperature dependence of the SBH in Au/n-Si, Ag/n-GaAs, and Au/n-GaAs in the temperature range of 7 to 300 K from our internal photoemission measurements. (orig.)

  10. Optical transition probabilities in electron-vibration-rotation spectra of diatomic molecules

    International Nuclear Information System (INIS)

    Kuznetsova, L.A.; Kuz'menko, N.E.; Kuzyakov, Yu.Ya.; Plastinin, Yu.A.

    1974-01-01

    The present review systematizes the data on the absolute probabilities of electron transitions in diatomic molecules published from the beginning of 1961 to the end of 1973, and the data on relative transition probabilities published from the beginning of 1966 to the end of 1973. The review discusses the theoretical relationships underlying the experimental techniques for determining absolute transition probabilities. Modifications of the techniques under discussion are not specially examined; the details of interest can be found in the references cited. The factual material, such as the values of the absolute probabilities of electron transitions, the dependences of the electron transition moments on the internuclear distance, and the values of the Franck-Condon factors, is presented in Tables 1, 2 and 4, respectively, embracing all the relevant works known to the present authors. Along with a complete systematization of the transition probability data, the authors have attempted a critical analysis of the available data in order to select the most reliable results. The recommended values of the squared matrix elements of the electron transition dipole moments are given in Table 3. The last chapter of the work compares the results of calculations of the Franck-Condon factors obtained with the different molecular potentials.
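
    In the simplest textbook case, Franck-Condon factors of the kind tabulated in such reviews can be computed in closed form: for two displaced harmonic oscillators of equal frequency, the 0 → n factors follow a Poisson progression in the Huang-Rhys factor. The value of s below is a made-up illustration, not data from the review.

```python
import math

def franck_condon_0n(s, n):
    """Franck-Condon factor |<0|n'>|**2 for two displaced harmonic
    oscillators of equal frequency, with Huang-Rhys factor s."""
    return math.exp(-s) * s ** n / math.factorial(n)

s = 1.3  # hypothetical Huang-Rhys factor
factors = [franck_condon_0n(s, n) for n in range(8)]
print([round(f, 4) for f in factors])  # Poisson progression peaking near n ~ s
```

    The factors sum to unity over all n, and the brightest vibronic band sits near n ≈ s; real molecular potentials (Morse, RKR) shift these values, which is exactly the comparison the review's last chapter makes.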

  11. Assembly for the measurement of the most probable energy of directed electron radiation

    International Nuclear Information System (INIS)

    Geske, G.

    1987-01-01

    This invention relates to a setup for the measurement of the most probable energy of directed electron radiation up to 50 MeV. The known energy-range relationship for the absorption of electron radiation in matter is exploited by an absorber with two groups of interconnected radiation detectors embedded in it. The most probable electron beam energy is derived from the quotient of the two groups' signals.
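
    The energy-range relationship this device exploits is commonly written, for water, in the empirical quadratic form of ICRU Report 35; the helper below evaluates that relation (we use it here as an illustration of the relationship, not as a description of the patented detector arrangement).

```python
def most_probable_energy_mev(rp_cm):
    """Empirical electron energy-range relation in water (ICRU Report 35
    form): E_p0 = 0.22 + 1.98*Rp + 0.0025*Rp**2, with Rp in cm."""
    return 0.22 + 1.98 * rp_cm + 0.0025 * rp_cm ** 2

# A practical range of 10 cm in water corresponds to roughly 20 MeV electrons.
print(round(most_probable_energy_mev(10.0), 2))
```

    Because the relation is nearly linear over the clinical range, a two-detector quotient that localizes the depth of absorption, as in the setup above, maps almost directly onto the most probable energy.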

  12. Quantification of a decision-making failure probability of the accident management using cognitive analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Yoshitaka; Ohtani, Masanori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan); Fujita, Yushi [TECNOVA Corp., Tokyo (Japan)

    2002-09-01

    In the nuclear power plant, much knowledge is acquired through probabilistic safety assessment (PSA) of severe accidents, and accident management (AM) is prepared. It is necessary to evaluate the effectiveness of AM in PSA using the decision-making failure probability of the emergency organization, the operation failure probability of operators, the success criteria of AM, and the reliability of AM equipment. However, there has been no suitable quantification method for PSA to obtain the decision-making failure probability, because the decision-making failure of an emergency organization involves knowledge-based error. In this work, we developed a new method for quantifying the decision-making failure probability of an emergency organization deciding an AM strategy during a severe accident, using a cognitive analysis model, and applied it to a typical pressurized water reactor (PWR) plant. As a result: (1) General analysts, who do not necessarily possess professional human factors knowledge, can quantify a decision-making failure probability suited to PSA by choosing suitable values for a basic failure probability and an error factor. (2) In a trial evaluation based on severe accident analysis of a typical PWR plant, the decision-making failure probabilities of six AMs were in the range of 0.23 to 0.41 using the screening evaluation method and 0.10 to 0.19 using the detailed evaluation method; in a sensitivity analysis of the conservative assumptions, the failure probability decreased by about 50%. (3) Theoretically, the failure probability from the screening evaluation method exceeds that from the detailed evaluation method with 99% probability, and in this study it did so in 100% of cases. From this result, it was shown that the decision-making failure probability was more conservative than the detailed evaluation method, and the screening evaluation method satisfied

  14. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π in calculating the particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  15. Numerical modelling of local deposition patterns, activity distributions and cellular hit probabilities of inhaled radon progenies in human airways

    International Nuclear Information System (INIS)

    Farkas, A.; Balashazy, I.; Szoeke, I.

    2003-01-01

    The general objective of our research is modelling the biophysical processes underlying the effects of inhaled radon progenies. This effort is related to the rejection or support of the linear no-threshold (LNT) dose-effect hypothesis, which seems to be one of the most challenging tasks of current radiation protection. Our approach and results may also serve as a useful tool for lung cancer models. In this study, deposition patterns, activity distributions and alpha-hit probabilities of inhaled radon progenies in the large airways of the human tracheobronchial tree are computed. The airflow fields and related particle deposition patterns strongly depend on the shape of the airway geometry and the breathing pattern. Computed deposition patterns of attached and unattached radon progenies are strongly inhomogeneous, creating hot spots at the carinal regions and downstream of the inner sides of the daughter airways. The results suggest that in the vicinity of the carinal regions the multiple-hit probabilities are quite high even at low average doses and increase exponentially in the low-dose range. Thus, even so-called low doses may present high doses to large clusters of cells. The cell transformation probabilities are much higher in these regions, and this phenomenon cannot be modelled with average burdens. (authors)
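
    The multiple-hit argument above can be quantified with a simple Poisson hit model (the mean hit numbers below are hypothetical, chosen only to contrast a uniformly exposed cell with one at a deposition hot spot): if hits to a nucleus are Poisson with mean lam, then P(>=2 hits) = 1 - exp(-lam)*(1 + lam).

```python
import math

def hit_probabilities(lam):
    """Poisson model for alpha-particle hits to a cell nucleus: return
    P(at least one hit) and P(two or more hits) for mean hit number lam."""
    p0 = math.exp(-lam)
    return 1.0 - p0, 1.0 - p0 * (1.0 + lam)

# Hypothetical mean hit numbers: a uniformly exposed cell versus a cell
# at a carinal hot spot with 20-fold local deposition enhancement.
for lam in (0.05, 1.0):
    p_ge1, p_ge2 = hit_probabilities(lam)
    print(f"lam={lam}: P(>=1 hit)={p_ge1:.4f}, P(>=2 hits)={p_ge2:.4f}")
```

    The multiple-hit probability grows roughly as lam**2/2 for small lam, which is why hot spots with modest enhancement factors dominate the multiple-hit statistics even when the average dose is low.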

  16. Electron transport in furfural: dependence of the electron ranges on the cross sections and the energy loss distribution functions

    Science.gov (United States)

    Ellis-Gibbings, L.; Krupa, K.; Colmenares, R.; Blanco, F.; Muñoz, A.; Mendes, M.; Ferreira da Silva, F.; Limão-Vieira, P.; Jones, D. B.; Brunger, M. J.; García, G.

    2016-09-01

    Recent theoretical and experimental studies have provided a complete set of differential and integral electron scattering cross section data from furfural over a broad energy range. The energy loss distribution functions have been determined in this study by averaging electron energy loss spectra for different incident energies and scattering angles. All these data have been used as input parameters for an event-by-event Monte Carlo simulation procedure to obtain the electron energy deposition patterns and electron ranges in liquid furfural. The dependence of these results on the input cross sections is then analysed to determine the uncertainty of the simulated values.

  17. A stochastic model for the probability of malaria extinction by mass drug administration.

    Science.gov (United States)

    Pemberton-Ross, Peter; Chitnis, Nakul; Pothin, Emilie; Smith, Thomas A

    2017-09-18

    Mass drug administration (MDA) has been proposed as an intervention to achieve local extinction of malaria. Although its effect on the reproduction number is short lived, extinction may subsequently occur in a small population due to stochastic fluctuations. This paper examines how the probability of stochastic extinction depends on population size, MDA coverage and the reproduction number under control, R c . A simple compartmental model is developed which is used to compute the probability of extinction using probability generating functions. The expected time to extinction in small populations after MDA for various scenarios in this model is calculated analytically. The results indicate that, firstly, R c must be sustained below 1 and, secondly, MDA coverage must exceed 95% to have a non-negligible probability of successful elimination. Stochastic fluctuations only significantly affect the probability of extinction in populations of about 1000 individuals or less. The expected time to extinction via stochastic fluctuation is less than 10 years only in populations less than about 150 individuals. Clustering of secondary infections and of MDA distribution both contribute positively to the potential probability of success, indicating that MDA would most effectively be administered at the household level. There are very limited circumstances in which MDA will lead to local malaria elimination with a substantial probability.
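As a hedged illustration of the generating-function approach, the extinction probability of a simple branching process with Poisson-distributed secondary infections (a textbook simplification, not the paper's compartmental model) can be computed by fixed-point iteration on the probability generating function:

```python
import math

def extinction_probability(Rc, tol=1e-12):
    """Smallest root of q = exp(Rc*(q - 1)), the extinction probability of a
    branching process whose offspring (secondary infection) count is
    Poisson(Rc). Found by fixed-point iteration starting from q = 0."""
    q = 0.0
    while True:
        q_new = math.exp(Rc * (q - 1.0))
        if abs(q_new - q) < tol:
            return q_new
        q = q_new

print(extinction_probability(0.5))  # Rc < 1: extinction is certain
print(extinction_probability(2.0))  # Rc > 1: extinction probability < 1
```

Starting the iteration from 0 guarantees convergence to the smallest root, which is the relevant extinction probability.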

  18. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. By performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex, we confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events.
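The Poisson-Gamma conjugate update at the heart of the method can be sketched as follows; the prior hyperparameters and event counts below are illustrative, not the paper's empirical-Bayes estimates:

```python
import math

def landslide_rate_posterior(n_events, t_years, alpha0=1.0, beta0=1.0):
    """Conjugate update: a Gamma(alpha0, beta0) prior on the Poisson rate
    lambda combined with n_events observed over t_years. Hyperparameters
    here are illustrative placeholders."""
    alpha = alpha0 + n_events
    beta = beta0 + t_years
    return alpha, beta, alpha / beta  # posterior mean as point estimate

def prob_at_least_one(rate, horizon):
    """P(at least one event within `horizon` years) for a Poisson process."""
    return 1.0 - math.exp(-rate * horizon)

a, b, lam = landslide_rate_posterior(n_events=4, t_years=10000)
print(lam, prob_at_least_one(lam, 100))
```

Plugging in the posterior mean ignores the uncertainty in λ; the full posterior predictive (a negative binomial) would widen the estimate, which is exactly the uncertainty range the study emphasizes.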

  19. An M-estimator of multivariate tail dependence

    NARCIS (Netherlands)

    Krajina, A.

    2010-01-01

    AN M-ESTIMATOR OF TAIL DEPENDENCE. Extreme value theory is the part of probability and statistics that provides the theoretical background for modeling events that almost never happen. The estimation of the dependence between two or more such unlikely events (tail dependence) is the topic of this thesis.

  20. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  1. Electron-trapping probability in natural dosemeters as a function of irradiation temperature

    DEFF Research Database (Denmark)

    Wallinga, J.; Murray, A.S.; Wintle, A.G.

    2002-01-01

    The electron-trapping probability in OSL traps as a function of irradiation temperature is investigated for sedimentary quartz and feldspar. A dependency was found for both minerals; this phenomenon could give rise to errors in dose estimation when the irradiation temperature used in laboratory procedures is different from that in the natural environment. No evidence was found for the existence of shallow-trap saturation effects that could give rise to a dose-rate dependency of electron trapping.

  2. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  3. Has David Howden Vindicated Richard von Mises’s Definition of Probability?

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-11-01

    In my recent article on these pages (Crovelli 2009) I argued that members of the Austrian School of economics have adopted and defended a faulty definition of probability. I argued that the definition of probability necessarily depends upon the nature of the world in which we live. I claimed that if the nature of the world is such that every event and phenomenon which occurs has a cause of some sort, then probability must be defined subjectively; that is, “as a measure of our uncertainty about the likelihood of occurrence of some event or phenomenon, based upon evidence that need not derive solely from past frequencies of ‘collectives’ or ‘classes.’” I further claimed that the nature of the world is indeed such that all events and phenomena have prior causes, and that this fact compels us to adopt a subjective definition of probability. David Howden has recently published what he claims is a refutation of my argument in his article “Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations” (Howden 2009). Unfortunately, Mr. Howden appears not to have understood my argument, and his purported refutation of my subjective definition consequently amounts to nothing more than a concatenation of confused and fallacious ideas that are completely irrelevant to my argument. David Howden has thus failed in his attempt to vindicate Richard von Mises’s definition of probability.

  4. Effects of variability in probable maximum precipitation patterns on flood losses

    Science.gov (United States)

    Zischg, Andreas Paul; Felder, Guido; Weingartner, Rolf; Quinn, Niall; Coxon, Gemma; Neal, Jeffrey; Freer, Jim; Bates, Paul

    2018-05-01

    The assessment of the impacts of extreme floods is important for dealing with residual risk, particularly for critical infrastructure management and for insurance purposes. Thus, modelling of the probable maximum flood (PMF) from probable maximum precipitation (PMP) by coupling hydrological and hydraulic models has gained interest in recent years. Herein, we examine whether variability in precipitation patterns exceeds or is below selected uncertainty factors in flood loss estimation and if the flood losses within a river basin are related to the probable maximum discharge at the basin outlet. We developed a model experiment with an ensemble of probable maximum precipitation scenarios created by Monte Carlo simulations. For each rainfall pattern, we computed the flood losses with a model chain and benchmarked the effects of variability in rainfall distribution with other model uncertainties. The results show that flood losses vary considerably within the river basin and depend on the timing and superimposition of the flood peaks from the basin's sub-catchments. In addition to the flood hazard component, the other components of flood risk, exposure, and vulnerability contribute remarkably to the overall variability. This leads to the conclusion that the estimation of the probable maximum expectable flood losses in a river basin should not be based exclusively on the PMF. Consequently, the basin-specific sensitivities to different precipitation patterns and the spatial organization of the settlements within the river basin need to be considered in the analyses of probable maximum flood losses.

  5. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
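A minimal sketch of working with the Johnson SU family via `scipy.stats.johnsonsu`; the shape parameters below are arbitrary illustrative values, whereas the paper fits them to theoretically computed moments of the radiative transfer model:

```python
from scipy import stats

# Johnson SU shape parameters (a, b); values here are purely illustrative.
dist = stats.johnsonsu(a=1.0, b=2.0)

# The SU family addresses all four moments: mean, variance, skewness,
# and kurtosis, covering a wide skewness/kurtosis region.
mean, var, skew, kurt = dist.stats(moments='mvsk')
print(float(mean), float(var), float(skew), float(kurt))

# Quantiles and tail probabilities are then immediate, which is what a
# detection-performance (ROC) calculation needs:
print(float(dist.ppf(0.95)))
```

In a moment-matching fit, one would solve for (a, b, loc, scale) so that these four moments equal the theoretically computed moments of the radiance.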

  6. Evolution of an array of elements with logistic transition probability

    International Nuclear Information System (INIS)

    Majernik, Vladimir; Surda, Anton

    1996-01-01

    The paper addresses the problem of how the state of an array of elements changes if the transition probabilities of its elements are chosen in the form of a logistic map. This problem leads to a special type of discrete-time Markov chain, which we simulated numerically for different transition probabilities and numbers of elements in the array. We show that the time evolution of the array exhibits a wide range of behavior depending on the total number of its elements and on the logistic constant a. We point out that this problem can be applied to the description of a spin system with a certain type of mean field and of multispecies ecosystems with an internal noise. (authors)
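One plausible reading of this model, sketched under a mean-field assumption of our own (each element flips on with a probability given by the logistic map evaluated at the current fraction of active elements; the paper's exact rule is not specified in the abstract):

```python
import random

def evolve(states, a, steps, rng=random.Random(1)):
    """Evolve a binary array: at each step the logistic map a*x*(1-x),
    evaluated at the active fraction x, gives every element's probability
    of being active in the next step. Valid probability for 0 <= a <= 4."""
    n = len(states)
    history = []
    for _ in range(steps):
        x = sum(states) / n
        p = a * x * (1.0 - x)  # transition probability from the logistic map
        states = [1 if rng.random() < p else 0 for _ in range(n)]
        history.append(sum(states) / n)
    return history

print(evolve([1] * 50 + [0] * 50, a=3.7, steps=10))
```

Varying a and the array size in this sketch reproduces the qualitative point of the abstract: the active fraction can settle, oscillate, or fluctuate noisily.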

  7. Magnetic field and temperature dependence of the fluorescence lifetime of Cr sup(3+) in GdAlO sub(3)

    International Nuclear Information System (INIS)

    Helman, J.S.; Caride, A.O.; Basso, H.C.; Terrile, M.C.; Carvalho, R.A.

    1991-01-01

    The fluorescence lifetime of Cr sup(3+) in GdAlO sub(3) was measured in the range 1.8-4.2 K in magnetic fields up to 6 T. The results show a remarkable dependence of the transition probabilities on magnetic order. A model based on the exchange interaction between Cr sup(3+) in highly excited states and the Gd sup(3+) ions is proposed. (author)

  8. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index that depends neither on the material the structure is constructed of nor on the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted, and the characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. The Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results, whereas the increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger required partial factor. The importance of hidden safeties in judging the reliability is discussed for wind actions on low-rise structures.

  9. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  10. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
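The coupled infinite-slope Monte Carlo calculation can be sketched as below; the parameter distributions are illustrative placeholders, whereas the Landlab component draws them from mapped soil and vegetation data and a macroscale hydrologic model:

```python
import math
import random

def factor_of_safety(c, phi, theta, z, m, gamma=18.0, gamma_w=9.81):
    """Infinite-slope factor of safety: cohesion c (kPa), friction angle
    phi (rad), slope angle theta (rad), soil depth z (m), relative
    wetness m (0..1), unit weights in kN/m^3."""
    num = c + (gamma - m * gamma_w) * z * math.cos(theta) ** 2 * math.tan(phi)
    den = gamma * z * math.sin(theta) * math.cos(theta)
    return num / den

def annual_failure_probability(n=10000, rng=random.Random(42)):
    """Monte Carlo estimate of P(FS < 1) for one grid cell, sampling
    illustrative (not data-derived) parameter distributions."""
    fails = 0
    for _ in range(n):
        c = rng.uniform(2.0, 10.0)                    # kPa
        phi = math.radians(rng.uniform(28.0, 40.0))
        z = rng.uniform(0.5, 2.0)                     # m
        m = min(1.0, rng.expovariate(2.0))            # relative wetness
        if factor_of_safety(c, phi, math.radians(35.0), z, m) < 1.0:
            fails += 1
    return fails / n

print(annual_failure_probability())
```

Running this per grid cell, with wetness driven by the annual maximum recharge, yields the spatial map of annual landslide initiation probability described above.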

  11. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  12. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    Science.gov (United States)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using results of three NDI methods obtained by inspecting actual aircraft engine compressor disks which contained service induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size as well as the 90/95 percent crack length vary depending on the statistical method used and the type of data. The distribution function as well as the parameter estimation procedure used for determining POD and the confidence bound must be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that is not from the inspection data. The maximum likelihood estimators (MLE) method does not require such information and the POD results are more reasonable. The log-logistic function appears to model POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented on a common spreadsheet program.
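A minimal stand-in for the MLE fit of a log-logistic POD curve to hit/miss data, using plain gradient ascent on synthetic crack sizes (the study itself used real compressor-disk inspection results, and production fits typically use Newton-type solvers):

```python
import math

def pod_logistic_mle(sizes, hits, iters=500, lr=0.1):
    """Fit POD(a) = 1/(1 + exp(-(b0 + b1*ln a))) to hit/miss data by
    gradient ascent on the Bernoulli log-likelihood."""
    b0, b1 = 0.0, 1.0
    xs = [math.log(a) for a in sizes]
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, hits):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p            # d(loglik)/d(b0)
            g1 += (y - p) * x      # d(loglik)/d(b1)
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Hypothetical hit/miss data: crack sizes (mm) and detection outcomes.
sizes = [0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0]
hits  = [0,   0,   0,   1,   0,   1,   1,   1]
b0, b1 = pod_logistic_mle(sizes, hits)
pod = lambda a: 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))
print(pod(0.5), pod(5.0))  # POD rises with crack size
```

Quantities like the 90/95 crack length would then come from inverting the fitted curve and adding a confidence bound on the parameters, which is where the choice of distribution and estimation procedure matters.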

  13. Risk-taking in disorders of natural and drug rewards: neural correlates and effects of probability, valence, and magnitude.

    Science.gov (United States)

    Voon, Valerie; Morris, Laurel S; Irvine, Michael A; Ruck, Christian; Worbe, Yulia; Derbyshire, Katherine; Rankov, Vladan; Schreiber, Liana Rn; Odlaug, Brian L; Harrison, Neil A; Wood, Jonathan; Robbins, Trevor W; Bullmore, Edward T; Grant, Jon E

    2015-03-01

    Pathological behaviors toward drug and food rewards have underlying commonalities. Risk-taking shows a fourfold pattern, varying as a function of probability and valence, which leads to nonlinear probability weighting: small probabilities are overweighted and large probabilities underweighted. Here we assess these influences on risk-taking in patients with pathological behaviors toward drug and food rewards and examine structural neural correlates of nonlinearity of probability weighting in healthy volunteers. In the anticipation of rewards, subjects with binge eating disorder show greater risk-taking, similar to substance-use disorders. Methamphetamine-dependent subjects had greater nonlinearity of probability weighting along with impaired subjective discrimination of probability and reward magnitude. Ex-smokers also had lower risk-taking to rewards compared with non-smokers. In the anticipation of losses, obesity without binge eating had a similar pattern to other substance-use disorders. Obese subjects with binge eating also have impaired discrimination of subjective value similar to that of the methamphetamine-dependent subjects. Nonlinearity of probability weighting was associated with lower gray matter volume in dorsolateral and ventromedial prefrontal cortex and orbitofrontal cortex in healthy volunteers. Our findings support a distinct subtype of binge eating disorder in obesity with similarities in risk-taking in the reward domain to substance use disorders. The results dovetail with the current approach of defining mechanistically based dimensional approaches rather than categorical approaches to psychiatric disorders. The relationship to risk probability and valence may underlie the propensity toward pathological behaviors toward different types of rewards.
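The nonlinear probability weighting referred to above is commonly modelled with the Tversky-Kahneman one-parameter form; the gamma value below is their published median estimate for gains, cited only for illustration rather than taken from this study:

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function
    w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma)."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

print(weight(0.01))  # small probability overweighted: w(p) > p
print(weight(0.99))  # large probability underweighted: w(p) < p
```

A smaller gamma bends the curve further from the diagonal, so the "greater nonlinearity" found in methamphetamine-dependent subjects corresponds to a lower fitted gamma.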

  14. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible.

    Science.gov (United States)

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated.

  15. Topological probability and connection strength induced activity in complex neural networks

    International Nuclear Information System (INIS)

    Du-Qu, Wei; Bo, Zhang; Dong-Yuan, Qiu; Xiao-Shu, Luo

    2010-01-01

    Recent experimental evidence suggests that some brain activities can be assigned to small-world networks. In this work, we investigate how the topological probability p and connection strength C affect the activities of discrete neural networks with small-world (SW) connections. Network elements are described by two-dimensional map neurons (2DMNs) with parameter values at which no activity occurs. It is found that when p is too small or too large, there are no active neurons in the network regardless of the connection strength; for a given appropriate connection strength, there is an intermediate range of topological probability where the activity of the 2DMN network is induced and enhanced. On the other hand, for a given intermediate topological probability level, there exists an optimal value of connection strength such that the frequency of activity reaches its maximum. The possible mechanism behind the action of topological probability and connection strength is addressed based on the bifurcation method. Furthermore, the effects of noise and transmission delay on the activity of the neural network are also studied. (general)
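The role of the topological probability p can be illustrated with a Watts-Strogatz-style rewiring sketch (our own assumption about the construction; the abstract does not specify how the small-world connections are generated):

```python
import random

def small_world_edges(n, k, p, rng=random.Random(0)):
    """Watts-Strogatz-style construction: a ring of n nodes, each linked
    to its k nearest clockwise neighbours, with every edge rewired to a
    random endpoint with probability p (the topological probability)."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            edges.add((i, (i + j) % n))
    rewired = set()
    for (i, j) in edges:
        if rng.random() < p:
            new = rng.randrange(n)
            while new == i or (i, new) in rewired or (new, i) in rewired:
                new = rng.randrange(n)
            rewired.add((i, new))
        else:
            rewired.add((i, j))
    return rewired

net = small_world_edges(n=100, k=2, p=0.1)
print(len(net))
```

With p = 0 the network is a regular ring, with p = 1 it is essentially random; intermediate p gives the short path lengths plus high clustering that characterize the small-world regime studied above.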

  16. Research advances in probability of causation calculation of radiogenic neoplasms

    International Nuclear Information System (INIS)

    Ning Jing; Yuan Yong; Xie Xiangdong; Yang Guoshan

    2009-01-01

    Probability of causation (PC) was used to facilitate the adjudication of compensation claims for cancers diagnosed following exposure to ionizing radiation. In this article, the excess cancer risk assessment models used for PC calculation are reviewed. Cancer risk transfer models between different populations, dependence of cancer risk on dose and dose rate, modification by epidemiological risk factors and application of PC are also discussed in brief. (authors)
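The assigned-share formula underlying PC calculations is short enough to state directly: given an excess relative risk ERR at the claimant's dose (itself the output of the risk models reviewed above),

```python
def probability_of_causation(err):
    """Standard assigned-share formula PC = ERR / (1 + ERR), where ERR is
    the excess relative risk at the claimant's dose."""
    return err / (1.0 + err)

print(probability_of_causation(1.0))   # risk doubled (ERR = 1) -> PC = 0.5
print(probability_of_causation(0.25))  # ERR = 0.25 -> PC = 0.2
```

The modelling difficulties discussed in the review (risk transfer between populations, dose-rate dependence, epidemiological modifiers) all enter through the ERR estimate, not through this final step.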

  17. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after the Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  18. Multiple cyber attacks against a target with observation errors and dependent outcomes: Characterization and optimization

    International Nuclear Information System (INIS)

    Hu, Xiaoxiao; Xu, Maochao; Xu, Shouhuai; Zhao, Peng

    2017-01-01

    In this paper we investigate a cybersecurity model: An attacker can launch multiple attacks against a target with a termination strategy that says that the attacker will stop after observing a number of successful attacks or when the attacker is out of attack resources. However, the attacker's observation of the attack outcomes (i.e., random variables indicating whether the target is compromised or not) has an observation error that is specified by both a false-negative and a false-positive probability. The novelty of the model we study is the accommodation of the dependence between the attack outcomes, because the dependence was assumed away in the literature. In this model, we characterize the monotonicity and bounds of the compromise probability (i.e., the probability that the target is compromised). In addition to extensively showing the impact of dependence on quantities such as compromise probability and attack cost, we give methods for finding the optimal strategy that leads to maximum compromise probability or minimum attack cost. This study highlights that the dependence between random variables cannot be assumed away, because the results will be misleading. - Highlights: • A novel cybersecurity model is proposed to accommodate the dependence among attack outcomes. • The monotonicity and bounds of the compromise probability are studied. • The dependence effect on the compromise probability and attack cost is discussed via simulation. • The optimal strategy that leads to maximum compromise probability or minimum attack cost is presented.
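A toy one-factor dependence model (our own simplification, not the dependence structure analysed in the paper) shows why the independence assumption is misleading: dependence between attack outcomes erodes the benefit of repeating attacks.

```python
import random

def compromise_probability(n_attacks, p_success, rho, trials=20000,
                           rng=random.Random(7)):
    """P(target compromised by at least one of n_attacks) when, with
    probability rho, all attacks share a single common outcome draw
    (fully dependent round) and otherwise are independent."""
    hits = 0
    for _ in range(trials):
        if rng.random() < rho:
            ok = rng.random() < p_success          # one draw for all attacks
        else:
            ok = any(rng.random() < p_success for _ in range(n_attacks))
        hits += ok
    return hits / trials

print(compromise_probability(5, 0.3, rho=0.0))  # independent outcomes
print(compromise_probability(5, 0.3, rho=0.9))  # strongly dependent
```

Under independence the compromise probability is 1 - (1 - p)^n; strong dependence pulls it back toward the single-attack value p, which is the qualitative effect the paper's bounds capture.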

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  20. A closer look at the probabilities of the notorious three prisoners.

    Science.gov (United States)

    Falk, R

    1992-06-01

    The "problem of three prisoners", a counterintuitive teaser, is analyzed. It is representative of a class of probability puzzles where the correct solution depends on explication of underlying assumptions. Spontaneous beliefs concerning the problem and intuitive heuristics are reviewed. The psychological background of these beliefs is explored. Several attempts to find a simple criterion to predict whether and how the probability of the target event will change as a result of obtaining evidence are examined. However, despite the psychological appeal of these attempts, none proves to be valid in general. A necessary and sufficient condition for change in the probability of the target event, following observation of new data, is proposed. That criterion is an extension of the likelihood-ratio principle (which holds in the case of only two complementary alternatives) to any number of alternatives. Some didactic implications concerning the significance of the chance set-up and reliance on analogies are discussed.

  1. Stress Testing with Student's t Dependence

    NARCIS (Netherlands)

    H.J.W.G. Kole (Erik); C.G. Koedijk (Kees); M.J.C.M. Verbeek (Marno)

    2003-01-01

    textabstractIn this study we propose the use of the Student's t dependence function to model dependence between asset returns when conducting stress tests. To properly include stress testing in a risk management system, it is important to have accurate information about the (joint) probabilities of

  2. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  3. Probability of Criticality for MOX SNF

    International Nuclear Information System (INIS)

    P. Gottlieb

    1999-01-01

    The purpose of this calculation is to provide a conservative (upper bound) estimate of the probability of criticality for mixed oxide (MOX) spent nuclear fuel (SNF) of the Westinghouse pressurized water reactor (PWR) design that has been proposed for use with the Plutonium Disposition Program (Ref. 1, p. 2). This calculation uses a Monte Carlo technique similar to that used for ordinary commercial SNF (Ref. 2, Sections 2 and 5.2). Several scenarios, covering a range of parameters, are evaluated for criticality. Parameters specifying the loss of fission products and iron oxide from the waste package are particularly important. This calculation is associated with disposal of MOX SNF.

  4. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
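Prelec's weighting function and its fixed points are easy to check numerically; the value of α below is an arbitrary illustrative choice, not one estimated in the study:

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting function
    w(p) = exp(-(-ln p)**alpha), with 0 < alpha < 1.
    Satisfies w(0) = 0, w(1/e) = 1/e, w(1) = 1 for every alpha."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))
```

For α < 1 the function overweights small probabilities (w(p) > p for small p) and underweights large ones, the inverse-S shape central to prospect theory.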

  5. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
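The Bayesian updating the study applies can be sketched as a likelihood-ratio update of the pretest odds, assuming conditional independence of the test results (the condition the authors verified). The sensitivities and specificities below are hypothetical placeholders, not the pooled literature values the authors used:

```python
def posttest_probability(pretest, results):
    """Update a pretest disease probability with Bayes' theorem.
    `results` is a list of (sensitivity, specificity, positive?) triples,
    one per test, assumed conditionally independent given disease state.
    Each test multiplies the odds by its likelihood ratio."""
    odds = pretest / (1.0 - pretest)
    for sens, spec, positive in results:
        lr = sens / (1.0 - spec) if positive else (1.0 - sens) / spec
        odds *= lr
    return odds / (1.0 + odds)
```

For example, a single positive test with (hypothetical) sensitivity 0.8 and specificity 0.9 has likelihood ratio 8, moving a 0.5 pretest probability to 8/9.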

  6. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation.

  7. Posterior probability of linkage and maximal lod score.

    Science.gov (United States)

    Génin, E; Martinez, M; Clerget-Darpoux, F

    1995-01-01

    To detect linkage between a trait and a marker, Morton (1955) proposed to calculate the lod score z(theta 1) at a given value theta 1 of the recombination fraction. If z(theta 1) reaches +3 then linkage is concluded. However, in practice, lod scores are calculated for different values of the recombination fraction between 0 and 0.5 and the test is based on the maximum value of the lod score Zmax. The impact of this deviation of the test on the probability that in fact linkage does not exist, when linkage was concluded, is documented here. This posterior probability of no linkage can be derived by using Bayes' theorem. It is less than 5% when the lod score at a predetermined theta 1 is used for the test. But, for a Zmax of +3, we showed that it can reach 16.4%. Thus, considering a composite alternative hypothesis instead of a single one decreases the reliability of the test. The reliability decreases rapidly when Zmax is less than +3. Given a Zmax of +2.5, there is a 33% chance that linkage does not exist. Moreover, the posterior probability depends not only on the value of Zmax but also jointly on the family structures and on the genetic model. For a given Zmax, the chance that linkage exists may then vary.
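The fixed-theta case of this Bayes computation can be sketched as follows. A lod score z corresponds to a likelihood ratio of 10^z; the prior odds of linkage of 1:50 used here are a commonly cited assumption, not a figure taken from the abstract:

```python
def posterior_no_linkage(lod, prior_linkage=1.0 / 51.0):
    """Posterior probability that linkage does NOT exist, given a lod
    score `lod` computed at a fixed recombination fraction. The default
    prior (odds of linkage 1:50) is an illustrative assumption."""
    lr = 10.0 ** lod                              # likelihood ratio for linkage
    prior_odds = prior_linkage / (1.0 - prior_linkage)
    post_odds = prior_odds * lr                   # posterior odds for linkage
    return 1.0 / (1.0 + post_odds)
```

With these prior odds, a lod score of +3 at a fixed theta gives a posterior probability of no linkage of 1/21, just under 5%, consistent with the abstract; the maximized statistic Zmax inflates this error, which is the abstract's point.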

  8. Calculation of Fire Severity Factors and Fire Non-Suppression Probabilities For A DOE Facility Fire PRA

    International Nuclear Information System (INIS)

    Elicson, Tom; Harwood, Bentley; Lucek, Heather; Bouchard, Jim

    2011-01-01

    Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: Development of time-dependent fire heat release rate profiles (required as input to CFAST), Calculation of fire severity factors based on CFAST detailed fire modeling, and Calculation of fire non-suppression probabilities.
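A minimal sketch of the time-dependent non-suppression step, assuming the exponential form P_ns(t) = exp(-λt) commonly used in NUREG/CR-6850-style analyses, where t is the time available for suppression (component damage time minus detection time). The rate λ and the sampled damage times below are illustrative stand-ins for the plant-specific data and CFAST/LHS outputs:

```python
import math

def mean_non_suppression(lam, damage_times, detect_time=0.0):
    """Average non-suppression probability over sampled damage times.
    Assumes P_ns(t) = exp(-lam * t), with t the window between fire
    detection and component damage; lam is a manual-suppression rate
    (per minute) taken here as a hypothetical constant."""
    pns = [math.exp(-lam * max(t - detect_time, 0.0)) for t in damage_times]
    return sum(pns) / len(pns)
```

In the full analysis the damage times would come from the Latin Hypercube samples of the CFAST runs, so this average is effectively a Monte Carlo integration of P_ns over the fire-modeling uncertainty.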

  9. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  10. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  11. The considering of the slowing down effect in the formalism of probability tables. Application to the effective cross section calculation

    International Nuclear Information System (INIS)

    Bouhelal, O.K.A.

    1990-01-01

    The exact determination of effective multigroup cross sections requires the numerical solution of the slowing-down equation on a very fine energy mesh. Given the complexity of these calculations, different approximation methods have been developed, but without a satisfactory treatment of the slowing-down effect. The usual methods are essentially based on interpolations using precalculated tables. Models that use probability tables reduce the amount of data and the computational effort. A variety of methods, proposed first by the Soviets, then by the Americans, and finally the French method based on the ''moments of a probability distribution'', are incontestably valid within the framework of the statistical hypothesis. This hypothesis stipulates that the collision densities do not depend on the cross section, so there is no ambiguity in the effective cross section calculation. The objective of our work is to show that non-statistical phenomena, such as the slowing-down effect, can also be described by probability tables able to represent the neutronic quantities and collision densities. The formalism used under the statistical hypothesis is based on Gauss quadrature of the cross-section moments. Under the non-statistical hypothesis we introduce crossed probability tables, using quadratures of double integrals of the cross sections. Moreover, a mathematical formalism establishing a relationship between the crossed probability tables and the collision densities was developed. This method was applied to uranium-238 in the range of resolved resonances, where the slowing-down effect is significant. The validity of the method and the analysis of the obtained results are studied through a reference calculation based on the solution of a discretized slowing-down equation using a very fine mesh, in which each microgroup can be correctly defined via the statistical probability tables. 42 figs., 32 tabs., 49 refs. (author)

  12. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs, we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
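Probability estimation with k-NN, as described above, reduces to a local relative frequency: the estimated class posterior P(Y = c | x) is the fraction of the k nearest training points with label c. A minimal sketch, with hypothetical data and plain Euclidean distance:

```python
import numpy as np

def knn_prob(x_train, y_train, x_query, k=5):
    """k-nearest-neighbour class-probability estimate: a nonparametric
    regression estimate of P(Y = c | x) given by the label frequencies
    among the k nearest training points. Works for dichotomous or
    multicategory integer labels (0, 1, 2, ...)."""
    d = np.linalg.norm(x_train - x_query, axis=1)   # distances to query
    nearest = np.argsort(d)[:k]                     # indices of k nearest
    labels = y_train[nearest]
    n_classes = int(y_train.max()) + 1
    return np.bincount(labels, minlength=n_classes) / k
```

Bagged nearest neighbours and random forests refine the same idea by averaging such local frequencies over resampled training sets.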

  13. A study of the angular momentum dependence of the phase shift for finite range and Coulomb potentials

    International Nuclear Information System (INIS)

    Valluri, S.R.; Romo, W.J.

    1989-01-01

    The dependence of the phase shift δ l (k) on the angular momentum l is investigated. An analytic expression for the derivative of the phase shift with respect to angular momentum is derived for a class of potentials that includes complex and real potentials. The potentials behave like the finite range potential for small r and like a Coulomb potential for large r. Specific examples like the square well, the pure point charge Coulomb and a combination of a square well and the Coulomb potential are analytically treated. Possible applications are briefly indicated. (orig.)

  14. Multi-scale evaluation of the environmental controls on burn probability in a southern Sierra Nevada landscape

    Science.gov (United States)

    Sean A. Parks; Marc-Andre Parisien; Carol Miller

    2011-01-01

    We examined the scale-dependent relationship between spatial fire likelihood or burn probability (BP) and some key environmental controls in the southern Sierra Nevada, California, USA. Continuous BP estimates were generated using a fire simulation model. The correspondence between BP (dependent variable) and elevation, ignition density, fuels and aspect was evaluated...

  15. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    Science.gov (United States)

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or non-random lack of some information in a subgroup of the population. The aim is to provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and children's IQ at 7 years. This methodology corrects the analysis by weighting the observations by the inverse of the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure is as follows: first, we consider the entire population at study and estimate the probability of non-missing information using a logistic regression model, where the response is non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. The analysis is then performed only on the non-missing observations, using a weighted model. IPW is a technique that allows the selection process to be embedded in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on the verbal intelligence quotient of children is stronger than the effect shown by the analysis performed without regard to the selection process.
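The weighting step itself is simple once the selection probabilities are available. The sketch below assumes they are known exactly, whereas in practice they would be predicted by the logistic regression described above; the data-generating setup is a hypothetical illustration:

```python
import numpy as np

def ipw_mean(y_obs, p_select):
    """Inverse-probability-weighted (Hajek) mean: each observed outcome
    is weighted by 1 / P(selected), so under-represented units count
    proportionally more. Selection probabilities are taken as known here;
    in practice they are predicted for the whole study population."""
    w = 1.0 / np.asarray(p_select)
    return float(np.sum(w * y_obs) / np.sum(w))

# Hypothetical illustration: units with larger y are more likely observed,
# so the naive mean of the observed data is biased upward.
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, 50_000)              # true population mean is 0
p = 0.1 + 0.8 / (1.0 + np.exp(-2.0 * y))      # selection prob in [0.1, 0.9]
sel = rng.random(50_000) < p                  # non-random selection
naive = float(y[sel].mean())                  # biased estimate
corrected = ipw_mean(y[sel], p[sel])          # close to the true mean
```

Bounding the selection probabilities away from zero, as above, also keeps the weights stable, a practical concern with IPW.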

  16. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  17. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  18. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    Science.gov (United States)

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls that are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  19. Searching for long-range dependence in real effective exchange rate: towards parity?

    Directory of Open Access Journals (Sweden)

    André M. Marques

    2015-12-01

    Full Text Available Abstract After the widespread adoption of flexible exchange rate regimes since 1973, the volatility of exchange rates has increased as a consequence of greater trade openness and financial integration. As a result, it has become difficult to find evidence for the purchasing power parity hypothesis (PPP). This study investigates the possibility of a fall in the persistence of the real exchange rate as a consequence of financial and commercial integration, employing the monthly real effective exchange rate dataset provided by the International Monetary Fund (IMF). Beginning with an exploratory data analysis in the frequency domain, the fractional coefficient d is estimated with a bias-reduced estimator on a sample of 20 countries over the period from 1975 to 2011. As the main novelty, this study applies a bias-reduced log-periodogram regression estimator instead of the traditional method proposed by GPH; it eliminates the first- and higher-order biases by a data-dependent plug-in method for selecting the number of frequencies so as to minimize the asymptotic mean-squared error (MSE). Additionally, this study estimates a moving window of fifteen years to observe the path of the fractional coefficient in each country. No evidence was found of a statistically significant change in the persistence of the real exchange rate.
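The plain GPH log-periodogram estimator (the baseline method, not the bias-reduced variant applied in the study) can be sketched as follows; the bandwidth m = n^0.5 is a conventional illustrative choice, whereas the study selects it by a data-dependent plug-in rule:

```python
import numpy as np

def gph_estimate(x, power=0.5):
    """GPH log-periodogram estimate of the long-memory parameter d:
    regress log I(lambda_j) on -log(4 sin^2(lambda_j / 2)) over the
    first m = n**power Fourier frequencies; the OLS slope estimates d
    (d = 0 for short memory, 0 < d < 0.5 for long-range dependence)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** power)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n         # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    y = np.log(I)
    X = -np.log(4.0 * np.sin(lam / 2.0) ** 2)
    Xc, yc = X - X.mean(), y - y.mean()
    return float(Xc @ yc / (Xc @ Xc))                   # OLS slope = d-hat
```

Applied to a real effective exchange rate series, an estimate of d near 1 indicates a unit root (no reversion toward parity), while a fall in d over moving windows would signal declining persistence.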

  20. Probability of expected climate stresses in North America in the next one My

    International Nuclear Information System (INIS)

    Kukla, G.

    1979-01-01

    Climates one million years ahead were predicted on the assumption that the natural climate variability of the past My will continue. The response of environment and climate in the Basin and Range province of the western USA to global fluctuations was reconstructed; the most remarkable change was the filling of closed basins with large freshwater lakes. Probabilities of permanent ice cover and floods are discussed. It is believed that a site with minimal probability of climate-related breach can be selected.

  1. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  2. The probability of containment failure by steam explosion in a PWR

    International Nuclear Information System (INIS)

    Briggs, A.J.

    1983-12-01

    The study of the risk associated with operation of a PWR includes assessment of severe accidents in which a combination of faults results in melting of the core. Probabilistic methods are used in such assessment, hence it is necessary to estimate the probability of key events. One such event is the occurrence of a large steam explosion when molten core debris slumps into the base of the reactor vessel. This report considers recent information, and recommends an upper limit to the range of probability values for containment failure by steam explosion for risk assessment for a plant such as the proposed Sizewell B station. (U.K.)

  3. Probability with applications in engineering, science, and technology

    CERN Document Server

    Carlton, Matthew A

    2017-01-01

    This updated and revised first-course textbook in applied probability provides a contemporary and lively post-calculus introduction to the subject of probability. The exposition reflects a desirable balance between fundamental theory and many applications involving a broad range of real problem scenarios. It is intended to appeal to a wide audience, including mathematics and statistics majors, prospective engineers and scientists, and those business and social science majors interested in the quantitative aspects of their disciplines. The textbook contains enough material for a year-long course, though many instructors will use it for a single term (one semester or one quarter). As such, three course syllabi with expanded course outlines are now available for download on the book’s page on the Springer website. A one-term course would cover material in the core chapters (1-4), supplemented by selections from one or more of the remaining chapters on statistical inference (Ch. 5), Markov chains (Ch. 6), stoch...

  4. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  5. Probability of success for phase III after exploratory biomarker analysis in phase II.

    Science.gov (United States)

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows one to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to get the necessary information about the risk and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
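    The weighting of power by a treatment-effect distribution described in this abstract can be sketched numerically. The following is an illustrative Monte Carlo sketch, not the authors' simulation: it assumes a two-arm, one-sided z-test for phase III and a normal distribution centred at the phase II estimate; the sample sizes, effect values, and standard errors are hypothetical.

    ```python
    import math
    import random

    def power(theta, n, sigma=1.0, alpha=0.025):
        """Power of a one-sided two-arm z-test (n patients per arm) at true
        effect theta; z_a is the 0.975 normal quantile matching alpha=0.025."""
        z_a = 1.959963984540054
        arg = z_a - theta * math.sqrt(n / 2.0) / sigma
        return 0.5 * math.erfc(arg / math.sqrt(2.0))  # = Phi(-arg)

    def probability_of_success(theta_hat, se, n_phase3, draws=20_000, seed=1):
        """Average the phase III power over a normal distribution centred
        at the phase II estimate theta_hat with standard error se."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(draws):
            theta = rng.gauss(theta_hat, se)
            total += power(theta, n_phase3)
        return total / draws

    pos_small = probability_of_success(0.2, 0.1, 200)
    pos_large = probability_of_success(0.4, 0.1, 200)
    print(round(pos_small, 3), round(pos_large, 3))
    ```

    A larger (possibly selection-inflated) phase II estimate directly inflates the estimated probability of success, which is the bias mechanism the abstract investigates.
    
    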

  6. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  7. Uniform Estimate of the Finite-Time Ruin Probability for All Times in a Generalized Compound Renewal Risk Model

    Directory of Open Access Journals (Sweden)

    Qingwu Gao

    2012-01-01

    Full Text Available We discuss the uniformly asymptotic estimate of the finite-time ruin probability for all times in a generalized compound renewal risk model, where the interarrival times of successive accidents and all the claim sizes caused by an accident are two sequences of random variables following a wide dependence structure. This wide dependence structure allows random variables to be either negatively dependent or positively dependent.

  8. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.

  9. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  10. Measurement of the spark probability in single gap parallel plate chambers

    International Nuclear Information System (INIS)

    Arefiev, A.; Bencze, Gy.L.; Choumilov, E.; Civinini, C.; Dalla Santa, F.; D'Alessandro, R.; Ferrando, A.; Fouz, M.C.; Golovkin, V.; Kholodenko, A.; Iglesias, A.; Ivochkin, V.; Josa, M.I.; Malinin, A.; Meschini, M.; Misyura, S.; Pojidaev, V.; Salicio, J.M.

    1996-01-01

    We present results on the measurements of the spark probability with CO₂ and CF₄/CO₂ (80/20) mixtures, at atmospheric pressure, using 1.5 mm gas gap parallel plate chambers working at a gas gain ranging from 4.5 × 10² to 3.3 × 10⁴. (orig.)

  11. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    … either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still …

  12. Collective fluctuations in magnetized plasma: Transition probability approach

    International Nuclear Information System (INIS)

    Sosenko, P.P.

    1997-01-01

    Statistical plasma electrodynamics is elaborated with special emphasis on the transition probability approach and quasi-particles, and on modern applications to magnetized plasmas. Fluctuation spectra in the magnetized plasma are calculated in the range of low frequencies (with respect to the cyclotron one), and the conditions for the transition from incoherent to collective fluctuations are established. The role of finite-Larmor-radius effects and particle polarization drift in such a transition is explained. The ion collective features in fluctuation spectra are studied. 63 refs., 30 figs

  13. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  14. Modeling Spatial Dependence of Rainfall Extremes Across Multiple Durations

    Science.gov (United States)

    Le, Phuong Dong; Leonard, Michael; Westra, Seth

    2018-03-01

    Determining the probability of a flood event in a catchment given that another flood has occurred in a nearby catchment is useful in the design of infrastructure such as road networks that have multiple river crossings. These conditional flood probabilities can be estimated by calculating conditional probabilities of extreme rainfall and then transforming rainfall to runoff through a hydrologic model. Each catchment's hydrological response times are unlikely to be the same, so in order to estimate these conditional probabilities one must consider the dependence of extreme rainfall both across space and across critical storm durations. To represent these types of dependence, this study proposes a new approach for combining extreme rainfall across different durations within a spatial extreme value model using max-stable process theory. This is achieved in a stepwise manner. The first step defines a set of common parameters for the marginal distributions across multiple durations. The parameters are then spatially interpolated to develop a spatial field. Storm-level dependence is represented through the max-stable process for rainfall extremes across different durations. The dependence model shows a reasonable fit between the observed pairwise extremal coefficients and the theoretical pairwise extremal coefficient function across all durations. The study demonstrates how the approach can be applied to develop conditional maps of the return period and return level across different durations.

  15. Comparison of three different concepts of high dynamic range and dependability optimised current measurement digitisers for beam loss systems

    CERN Document Server

    Viganò, W; Effinger, E; Venturini, G G; Zamantzas, C

    2012-01-01

    Three different concepts of high dynamic range and dependability optimised current measurement digitisers for beam loss systems are compared in this paper. The first concept is based on current-to-frequency conversion, enhanced with an ADC for extending the dynamic range and decreasing the response time. A summary of 3 years' worth of operational experience with such a system for LHC beam loss monitoring is given. The second principle is based on an adaptive current-to-frequency converter implemented in an ASIC. The basic parameters of the circuit are discussed and compared with measurements. Several measures are taken to harden both circuits against single event effects and to make them tolerant of operation in radioactive environments. The third circuit is based on a fully differential integrator for enhanced dynamic range; laboratory and test installation measurements are presented. All circuits are designed to avoid any dead time in the acquisition and have reliability and fail safe...

  16. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest-link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties, due to variances in material strengths and analytical uncertainties, of between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia, respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles.
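    The Weibull description of failure pressure quoted above can be illustrated with a small quantile calculation. This is only a sketch: the shape parameter below is chosen ad hoc, and the scale is back-solved so the median matches the quoted 132.4 psia; the abstract does not give the fitted parameters.

    ```python
    import math

    def weibull_quantile(q, shape, scale):
        """Inverse CDF of a two-parameter Weibull: F(p) = 1 - exp(-(p/scale)**shape)."""
        return scale * (-math.log(1.0 - q)) ** (1.0 / shape)

    # Illustrative parameters only: shape is ad hoc; scale is back-solved so the
    # median equals the 132.4 psia quoted in the abstract.
    shape = 15.0
    scale = 132.4 / math.log(2.0) ** (1.0 / shape)

    p05 = weibull_quantile(0.05, shape, scale)
    p50 = weibull_quantile(0.50, shape, scale)
    p95 = weibull_quantile(0.95, shape, scale)
    print(f"5%: {p05:.1f} psia  median: {p50:.1f} psia  95%: {p95:.1f} psia")
    ```

    The printed percentiles can be compared against the quoted 5-95 percentile values; the left-skew of the Weibull (median closer to the upper percentile) is what the abstract describes.
    
    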

  17. Scale Dependence of Spatiotemporal Intermittence of Rain

    Science.gov (United States)

    Kundu, Prasun K.; Siddani, Ravi K.

    2011-01-01

    It is a common experience that rainfall is intermittent in space and time. This is reflected by the fact that the statistics of area- and/or time-averaged rain rate is described by a mixed distribution with a nonzero probability of having a sharp value zero. In this paper we have explored the dependence of the probability of zero rain on the averaging space and time scales in large multiyear data sets based on radar and rain gauge observations. A stretched exponential formula fits the observed scale dependence of the zero-rain probability. The proposed formula makes it apparent that the space-time support of the rain field is not quite a set of measure zero as is sometimes supposed. We also give an explanation of the observed behavior in terms of a simple probabilistic model based on the premise that the rainfall process has an intrinsic memory.
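    The scale dependence described above can be sketched with a generic stretched-exponential form. The abstract does not give the fitted formula, so the functional form p0 = exp(-(L/s0)**gamma) and the parameter values below are purely illustrative assumptions, not the paper's fit.

    ```python
    import math

    def zero_rain_probability(scale_km, s0=5.0, gamma=0.4):
        """Hypothetical stretched-exponential dependence of the zero-rain
        probability on the averaging length scale L:
            p0(L) = exp(-(L / s0) ** gamma),  0 < gamma < 1.
        s0 and gamma are illustrative values, not the paper's fitted ones."""
        return math.exp(-((scale_km / s0) ** gamma))

    # Larger averaging scales make "no rain anywhere in the box" less likely,
    # but the sub-unity exponent makes the decay much slower than exponential.
    for L in (1, 10, 100, 1000):
        print(L, round(zero_rain_probability(L), 4))
    ```

    The slow decay for gamma < 1 is what keeps the zero-rain probability nonzero even at large scales, consistent with the abstract's point about the support of the rain field.
    
    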

  18. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  19. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
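    The maximum entropy assignment under a mean constraint mentioned above has the familiar Gibbs/Boltzmann form. As a minimal sketch (discrete outcomes, a single mean constraint, Lagrange multiplier found by bisection; the outcome values and target mean are invented for illustration):

    ```python
    import math

    def maxent_distribution(values, target_mean, tol=1e-10):
        """Maximum-entropy probabilities over discrete outcomes subject to a
        fixed mean: p_i proportional to exp(-lam * x_i), with the multiplier
        lam found by bisection (the mean is decreasing in lam)."""
        def mean_for(lam):
            w = [math.exp(-lam * x) for x in values]
            z = sum(w)
            return sum(x * wi for x, wi in zip(values, w)) / z

        lo, hi = -50.0, 50.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if mean_for(mid) > target_mean:
                lo = mid   # mean too large -> need larger lam
            else:
                hi = mid
        lam = 0.5 * (lo + hi)
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return [wi / z for wi in w]

    # Illustrative: four energy-like levels, constrained mean below uniform.
    p = maxent_distribution([0, 1, 2, 3], target_mean=1.0)
    print([round(pi, 4) for pi in p])
    ```

    With the constrained mean below the uniform mean, the multiplier is positive and the probabilities decay geometrically, which is the classical-limit behaviour the abstract alludes to.
    
    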

  20. The Effect of Incremental Changes in Phonotactic Probability and Neighborhood Density on Word Learning by Preschool Children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability or neighborhood density has predominately been defined through the use of gross distinctions (i.e., low vs. high). In the current studies, the authors examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The authors examined the full range of…

  1. Experimental determination of the steady-state charging probabilities and particle size conservation in non-radioactive and radioactive bipolar aerosol chargers in the size range of 5–40 nm

    Energy Technology Data Exchange (ETDEWEB)

    Kallinger, Peter, E-mail: peter.kallinger@univie.ac.at; Szymanski, Wladyslaw W. [University of Vienna, Faculty of Physics (Austria)

    2015-04-15

    Three bipolar aerosol chargers, an AC-corona (Electrical Ionizer 1090, MSP Corp.), a soft X-ray (Advanced Aerosol Neutralizer 3087, TSI Inc.), and an α-radiation-based ²⁴¹Am charger (tapcon & analysesysteme), were investigated for their performance in charging airborne nanoparticles. The charging probabilities for negatively and positively charged particles and the particle size conservation were measured in the diameter range of 5–40 nm using sucrose nanoparticles. Chargers were operated under various flow conditions in the range of 0.6–5.0 liters per minute. For particular experimental conditions, some deviations from the chosen theoretical model were found for all chargers. For very small particle sizes, the AC-corona charger showed particle losses at low flow rates and did not reach steady-state charge equilibrium at high flow rates. However, for all chargers, operating conditions were identified where the bipolar charge equilibrium was achieved. Practically excellent particle size conservation was found for all three chargers.

  2. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  3. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control; a Cpk > 1.33 for a centered process corresponds statistically to fewer than 63 defective units per million, equivalent to an acceptance probability of >99.99%.
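    The quoted link between a Cpk of 1.33 and roughly 63 defective units per million follows directly from normal tail areas. A minimal stdlib check, assuming a centered process with two-sided specification limits:

    ```python
    import math

    def defects_per_million(cpk):
        """Two-sided defect rate (ppm) for a centred normal process with
        capability index cpk = (USL - mu) / (3 * sigma)."""
        z = 3.0 * cpk                               # sigmas to each spec limit
        tail = 0.5 * math.erfc(z / math.sqrt(2.0))  # one-sided tail P(Z > z)
        return 2.0 * tail * 1e6

    # Cpk = 4/3 puts each limit 4 sigma from the mean: about 63 ppm defective.
    print(round(defects_per_million(4 / 3), 1))
    ```

    Subtracting the defect rate from one recovers the >99.99% acceptance probability the abstract cites.
    
    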

  4. Application of geometric probability to the existence of faults in anisotropic media

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Donath, F.A.

    1980-01-01

    Three primary aspects of faults which relate to their potential for degradation of a repository site are: the possibility of an existing but undetected fault intersecting the repository site; the potential for a new fault occurring and propagating through the repository site; the ability of any such fault to transmit groundwater. Given that a fault might be present in the region surrounding the site, the probability that it intersects the site depends primarily on its orientation and on the density of faulting in the area. Once these parameters are known, a model can be developed to determine the probability that an existing but undetected fault will intersect the repository site. Similar techniques can be used to estimate the potential for new faults occurring and intersecting the site, or intersection from propagation along existing faults. However, additional data including in situ stress measurements and records of seismic activity would be needed. One can estimate the stress level at which the strength in the surrounding media will be exceeded, and thus determine a time-dependent probability of movement along a pre-existing fault or of a new fault occurring, from a predicted rate of change in local stresses. In situ stress measurements taken at intervals of time could aid in determining the rate of stress change in the surrounding media, although measurable changes might not occur over the available period of observation. In situ stress measurements might also aid in assessing the ability of existing faults to transmit fluids.

  5. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  6. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  7. Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence

    Directory of Open Access Journals (Sweden)

    C. C. Wu

    2011-04-01

    Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  8. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability.

  9. Tail-weighted dependence measures with limit being the tail dependence coefficient

    KAUST Repository

    Lee, David; Joe, Harry; Krupskii, Pavel

    2017-12-02

    For bivariate continuous data, measures of monotonic dependence are based on the rank transformations of the two variables. For bivariate extreme value copulas, there is a family of estimators (Formula presented.), for (Formula presented.), of the extremal coefficient, based on a transform of the absolute difference of the α power of the ranks. In the case of general bivariate copulas, we obtain the probability limit (Formula presented.) of (Formula presented.) as the sample size goes to infinity and show that (i) (Formula presented.) for (Formula presented.) is a measure of central dependence with properties similar to Kendall's tau and Spearman's rank correlation, (ii) (Formula presented.) is a tail-weighted dependence measure for large α, and (iii) the limit as (Formula presented.) is the upper tail dependence coefficient. We obtain asymptotic properties for the rank-based measure (Formula presented.) and estimate tail dependence coefficients through extrapolation on (Formula presented.). A data example illustrates the use of the new dependence measures for tail inference.

  11. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function, and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted, which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability with sufficient accuracy at various operating conditions. The trapped mass obtained by the acoustical model was compared under steady conditions using a fuel balance and a lambda sensor, and differences below 1% were found.
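    The core idea above, a deterministic Arrhenius-like autoignition criterion driven by Gaussian noise on the in-cylinder temperature, can be sketched by Monte Carlo. This is not the paper's model: the rate constants, noise level, and knock threshold below are all illustrative assumptions.

    ```python
    import math
    import random

    def knock_probability(t_mean, t_std, threshold, a=1.0, b=3000.0,
                          draws=50_000, seed=7):
        """Monte Carlo sketch: an Arrhenius-like autoignition rate
        k = a * exp(-b / T) driven by Gaussian noise on the in-cylinder
        temperature T (kelvin); a cycle 'knocks' when k exceeds a threshold.
        All constants are illustrative, not calibrated engine values."""
        rng = random.Random(seed)
        knocks = 0
        for _ in range(draws):
            t = rng.gauss(t_mean, t_std)
            if a * math.exp(-b / t) > threshold:
                knocks += 1
        return knocks / draws

    p_cool = knock_probability(750.0, 20.0, threshold=0.02)
    p_hot = knock_probability(900.0, 20.0, threshold=0.02)
    print(p_cool, p_hot)
    ```

    Even with a fully deterministic criterion, the temperature noise alone produces a smoothly varying knock probability, which is the mechanism the abstract proposes for the apparent randomness of knock.
    
    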

  12. Improved method for estimating particle scattering probabilities to finite detectors for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Mickael, M.; Gardner, R.P.; Verghese, K.

    1988-01-01

    An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude
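    The simplest case mentioned in the abstract, isotropic scattering in the Lab system, has a closed-form probability for a circular detector centred on the symmetry axis: the fraction of the full solid angle, (1 - cos θ)/2. This sketch shows only that baseline, not the paper's transformed-pdf method; the detector radius and distance are made-up numbers.

    ```python
    import math

    def isotropic_hit_probability(half_angle):
        """Probability that an isotropically scattered particle falls inside a
        cone of the given half-angle (radians): Omega / (4*pi) = (1 - cos)/2."""
        return 0.5 * (1.0 - math.cos(half_angle))

    # A circular detector of radius r at distance d on the axis subtends a
    # half-angle atan(r/d) as seen from the scattering point.
    r, d = 2.0, 10.0
    p = isotropic_hit_probability(math.atan(r / d))
    print(round(p, 5))
    ```

    The limiting cases are easy sanity checks: a half-angle of π/2 (hemisphere) gives 0.5 and π (full sphere) gives 1, which is one way to validate the more general angular integrals the abstract describes.
    
    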

  13. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  14. No evidence of enemy release in pathogen and microbial communities of common wasps (Vespula vulgaris) in their native and introduced range.

    Directory of Open Access Journals (Sweden)

    Philip J Lester

    When invasive species move to new environments they typically experience population bottlenecks that limit the probability that pathogens and parasites are also moved. The invasive species may thus be released from biotic interactions that can be a major source of density-dependent mortality, referred to as enemy release. We examined for evidence of enemy release in populations of the common wasp (Vespula vulgaris), which attains high densities and represents a major threat to biodiversity in its invaded range. Mass spectrometry proteomic methods were used to compare the microbial communities in wasp populations in the native (Belgium and England) and invaded range (Argentina and New Zealand). We found no evidence of enemy release, as the number of microbial taxa was similar in both the introduced and native range. However, some evidence of distinctiveness in the microbial communities was observed between countries. The pathogens observed were similar to a variety of taxa observed in honey bees. These taxa included Nosema, Paenibacillus, and Yersinia spp. Genomic methods confirmed a diversity of Nosema spp., Actinobacteria, and the Deformed wing and Kashmir bee viruses. We also analysed published records of bacteria, viruses, nematodes and fungi from both V. vulgaris and the related invader V. germanica. Thirty-three different microorganism taxa have been associated with wasps, including Kashmir bee virus and entomophagous fungi such as Aspergillus flavus. There was no evidence that the presence or absence of these microorganisms depended on the region of the wasp samples (i.e. their native or invaded range). Given the similarity of the wasp pathogen fauna to that from honey bees, the lack of enemy release in wasp populations is probably related to spill-over or spill-back from bees and other social insects. Social insects appear to form a reservoir of generalist parasites and pathogens, which makes the management of wasp and bee disease difficult.

  16. Fast adaptation of the internal model of gravity for manual interceptions: evidence for event-dependent learning.

    Science.gov (United States)

    Zago, Myrka; Bosco, Gianfranco; Maffei, Vincenzo; Iosa, Marco; Ivanenko, Yuri P; Lacquaniti, Francesco

    2005-02-01

    We studied how subjects learn to deal with two conflicting sensory environments as a function of the probability of each environment and the temporal distance between repeated events. Subjects were asked to intercept a visual target moving downward on a screen with randomized laws of motion. We compared five protocols that differed in the probability of constant speed (0g) targets and accelerated (1g) targets. Probability ranged from 9 to 100%, and the time interval between consecutive repetitions of the same target ranged from about 1 to 20 min. We found that subjects systematically timed their responses consistent with the assumption of gravity effects, for both 1 and 0g trials. With training, subjects rapidly adapted to 0g targets by shifting the time of motor activation. Surprisingly, the adaptation rate was independent of both the probability of 0g targets and their temporal distance. Very few 0g trials sporadically interspersed as catch trials during immersive practice with 1g trials were sufficient for learning and consolidation in long-term memory, as verified by retesting after 24 h. We argue that the memory store for adapted states of the internal gravity model is triggered by individual events and can be sustained for prolonged periods of time separating sporadic repetitions. This form of event-related learning could depend on multiple-stage memory, with exponential rise and decay in the initial stages followed by a sample-and-hold module.

  17. Mass dependence of positive pion-induced fission

    International Nuclear Information System (INIS)

    Khan, H.A.; Khan, N.A.; Peterson, R.J.

    1991-01-01

    Fission cross sections for a range of targets have been measured by solid-state track detectors following 80 and 100 MeV π⁺ bombardment. Fission probabilities have been inferred by comparison to computed reaction cross sections. Fission probabilities for heavy targets agree with those for other probes of comparable energy and with statistical calculations. Probabilities for lighter targets are well above those previously observed or computed. Ternary fission cross sections and multiplicities of light fragments have also been determined.

  18. On the Full-range β Dependence of Ion-scale Spectral Break in the Solar Wind Turbulence

    Science.gov (United States)

    Wang, Xin; Tu, Chuanyi; He, Jiansen; Wang, Linghua

    2018-04-01

    The power spectrum of magnetic fluctuations has a break at the high-frequency end of the inertial range. Beyond this break, the spectrum becomes steeper than the Kolmogorov law f^(-5/3). The break frequency was found to be associated with plasma beta (β). However, the full-range β dependence of the ion-scale spectral break has not been presented before in observational studies. Here we show the continuous variation of the break frequency with β over its full range in solar wind turbulence. Using measurements from the WIND and Ulysses spacecraft, we show the break frequency (f_b) normalized, respectively, by the frequencies corresponding to the ion inertial length (f_di), the ion gyroradius (f_ρi), and the cyclotron resonance scale (f_ri) as a function of β for 1306 intervals. Their β values spread from 0.005 to 20, which nearly covers the full β range of the observed solar wind turbulence. It is found that f_b/f_di (f_b/f_ρi) generally decreases (increases) with β, while f_b/f_ri is nearly constant. We perform a linear fit on the statistical result and obtain the empirical formulas f_b/f_di ∼ β^(-1/4), f_b/f_ρi ∼ β^(1/4), and f_b/f_ri ∼ 0.90 to describe the relation between f_b and β. We also compare our observations with a numerical simulation and the prediction of ion cyclotron resonance theory. Our result favors the idea that cyclotron resonance is an important mechanism for energy dissipation at the spectral break. When β ≪ 1 and β ≫ 1, the break at f_di and f_ρi may also be associated with other processes.
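The three empirical fits reported above are simple power laws; a small helper makes the scalings explicit (the exponents and the 0.90 constant are taken directly from the abstract, and the returned quantities are dimensionless ratios):

```python
def normalized_break_frequencies(beta):
    # Empirical fits quoted in the abstract:
    #   f_b/f_di    ~ beta**(-1/4)   (ion inertial length normalization)
    #   f_b/f_rho_i ~ beta**(+1/4)   (ion gyroradius normalization)
    #   f_b/f_ri    ~ 0.90           (cyclotron resonance scale: ~constant)
    return {
        "fb_over_fdi": beta ** -0.25,
        "fb_over_frho": beta ** 0.25,
        "fb_over_fri": 0.90,
    }
```

At β = 1 the first two ratios coincide, and the opposite quarter-power trends reproduce the observed decrease (increase) of f_b/f_di (f_b/f_ρi) with β.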

  19. Quantifying the range of cross-correlated fluctuations using a q-L dependent AHXA coefficient

    Science.gov (United States)

    Wang, Fang; Wang, Lin; Chen, Yuming

    2018-03-01

    Recently, based on analogous height cross-correlation analysis (AHXA), a cross-correlation coefficient ρ_×(L) has been proposed to quantify the level of cross-correlation on different temporal scales for bivariate series. A limitation of this coefficient is that it cannot capture the full information about cross-correlations across fluctuation amplitudes. In fact, it only detects the cross-correlation at a specific order of fluctuation, which may neglect important information carried by other orders. To overcome this disadvantage, in this work, based on the scaling of the qth-order covariance with time delay L, we define a two-parameter cross-correlation coefficient ρ_q(L) to detect and quantify the range and level of cross-correlations. This new ρ_q(L) coefficient gives rise to a ρ_q(L) surface, which not only quantifies the level of cross-correlations, but also allows us to identify the range of fluctuation amplitudes that are correlated in two given signals. Applications to the classical ARFIMA models and the binomial multifractal series illustrate the feasibility of the new coefficient. In addition, a statistical test is proposed to assess the existence of cross-correlations between two given series. Applying our method to empirical data from the 1999-2000 California electricity market, we find that the California power crisis in 2000 destroyed the cross-correlation between the price and load series but did not affect the correlation of the load series during and before the crisis.

  20. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  1. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
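The gradient property is easy to verify for the simplest RUM, the multinomial logit, whose CPGF is the log-sum-exp function; differentiating it numerically reproduces the softmax choice probabilities. A minimal sketch (logit is chosen for illustration; the paper's framework is far more general):

```python
import math

def logit_cpgf(v):
    # Log-sum-exp: the choice-probability generating function of the
    # multinomial logit model (numerically stabilized).
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probabilities(v, eps=1e-6):
    # Gradient of the CPGF via central differences; for logit this
    # equals the softmax of the utilities v.
    probs = []
    for i in range(len(v)):
        up = list(v); up[i] += eps
        dn = list(v); dn[i] -= eps
        probs.append((logit_cpgf(up) - logit_cpgf(dn)) / (2 * eps))
    return probs
```

The gradient components are non-negative and sum to one, exactly as choice probabilities must.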

  2. A combined stochastic analysis of mean daily temperature and diurnal temperature range

    Science.gov (United States)

    Sirangelo, B.; Caloiero, T.; Coscarelli, R.; Ferrari, E.

    2018-03-01

    In this paper, a stochastic model, previously proposed for the maximum daily temperature, has been improved for the combined analysis of mean daily temperature and diurnal temperature range. In particular, the procedure applied to each variable sequentially performs the deseasonalization, by means of truncated Fourier series expansions, and the normalization of the temperature data, with the use of proper transformation functions. Then, a joint stochastic analysis of both the climatic variables has been performed by means of a FARIMA model, taking into account the stochastic dependency between the variables, namely introducing a cross-correlation between the standardized noises. The model has been applied to five daily temperature series of southern Italy. After the application of a Monte Carlo simulation procedure, the return periods of the joint behavior of the mean daily temperature and the diurnal temperature range have been evaluated. Moreover, the annual maxima of the temperature excursions in consecutive days have been analyzed for the synthetic series. The results obtained showed different behaviors probably linked to the distance from the sea and to the latitude of the station.
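The deseasonalization step, fitting a truncated Fourier series and subtracting it, can be sketched with plain least squares. This covers only that one step of the procedure; the normalization, transformation functions, and FARIMA fit are omitted, and the period and harmonic count are assumptions.

```python
import math

def deseasonalize(series, period=365.25, n_harmonics=2):
    # Least-squares fit of a truncated Fourier series (constant plus
    # n_harmonics sin/cos pairs) and removal of the fitted seasonal cycle.
    n = len(series)
    cols = [[1.0] * n]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * math.pi * k / period
        cols.append([math.sin(w * t) for t in range(n)])
        cols.append([math.cos(w * t) for t in range(n)])
    m = len(cols)
    # normal equations A c = b, solved by Gaussian elimination
    a = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(m)]
         for i in range(m)]
    b = [sum(ci * y for ci, y in zip(cols[i], series)) for i in range(m)]
    for i in range(m):
        for j in range(i + 1, m):
            f = a[j][i] / a[i][i]
            for k2 in range(i, m):
                a[j][k2] -= f * a[i][k2]
            b[j] -= f * b[i]
    coef = [0.0] * m
    for i in range(m - 1, -1, -1):
        coef[i] = (b[i] - sum(a[i][j] * coef[j] for j in range(i + 1, m))) / a[i][i]
    seasonal = [sum(coef[i] * cols[i][t] for i in range(m)) for t in range(n)]
    return [y - s for y, s in zip(series, seasonal)], seasonal
```

Applied to a purely sinusoidal synthetic series, the residuals vanish to numerical precision, confirming the fit.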

  3. Outage Probability Analysis of FSO Links over Foggy Channel

    KAUST Repository

    Esmail, Maged Abdullah

    2017-02-22

    Outdoor free-space optical (FSO) communication systems are sensitive to atmospheric impairments such as turbulence and fog, in addition to being subject to pointing errors. Fog is particularly severe because it induces an attenuation that may vary from a few dB up to a few hundred dB per kilometer. Pointing errors also distort the link alignment and cause signal fading. In this paper, we investigate and analyze FSO system performance under fog conditions and pointing errors in terms of outage probability. We then study the impact of several effective mitigation techniques that can improve system performance, including multi-hop, transmit laser selection (TLS), and hybrid RF/FSO transmission. Closed-form expressions for the outage probability are derived, and practical, comprehensive numerical examples are provided to assess the results. We found that the FSO system's limited performance prevents applying FSO in wireless microcells with a 500 m minimum cell radius. The performance degrades further when pointing errors appear. Increasing the transmitted power can improve performance under light to moderate fog; however, under thick and dense fog the improvement is negligible. Mitigation techniques can play a major role in improving the range and outage probability.
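The outage calculation can be illustrated with a Monte Carlo over a simple dB link budget. The truncated-Gaussian model for the specific fog attenuation below is an illustrative assumption, not the paper's channel model, and no pointing-error term is included.

```python
import random

def fso_outage_probability(p_tx_dbm, sens_dbm, link_km,
                           mean_db_per_km, sigma_db_per_km,
                           n=20000, seed=1):
    # Monte Carlo sketch: an outage occurs when the received power falls
    # below the receiver sensitivity.  Specific fog attenuation (dB/km)
    # is drawn from a Gaussian truncated at zero (illustrative model).
    rng = random.Random(seed)
    outages = 0
    for _ in range(n):
        atten = max(0.0, rng.gauss(mean_db_per_km, sigma_db_per_km))
        p_rx = p_tx_dbm - atten * link_km  # link budget in dB
        if p_rx < sens_dbm:
            outages += 1
    return outages / n
```

With a fixed fog severity, stretching the link length drives the outage probability toward one, which is the qualitative behavior that motivates multi-hop and hybrid RF/FSO mitigation.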

  5. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    International Nuclear Information System (INIS)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-01-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 - at|² e^(-Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
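The quoted decay law, S(t) = |1 - at|² e^(-Γ_EP t/ℏ), is straightforward to evaluate; a minimal helper in units where ℏ = 1 by default, with the complex constant a left as a free, illustrative parameter:

```python
import math

def survival_probability(t, a, gamma_ep, hbar=1.0):
    # S(t) = |1 - a*t|^2 * exp(-Gamma_EP * t / hbar).  The polynomial
    # prefactor |1 - a*t|^2 is the fingerprint of the exceptional point,
    # replacing the plain exponential decay of an isolated resonance.
    return abs(1.0 - a * t) ** 2 * math.exp(-gamma_ep * t / hbar)
```

S(0) = 1 by construction, and for complex a the prefactor produces a non-exponential dip before the exponential tail takes over.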

  7. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    International Nuclear Information System (INIS)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure
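The seismic-risk setup described above, dependent peak responses checked against strengths and combined through a Boolean system function, can be sketched with a Monte Carlo in which a shared earthquake severity induces the correlation between component responses. The distributions, strengths, and fault logic below are all illustrative assumptions.

```python
import random

def plant_failure_probability(n=20000, seed=2):
    # Each component survives if its peak random response stays below its
    # strength; the plant fails according to a Boolean function of the
    # component states.  A common earthquake load factor correlates the
    # component responses (common-cause effect).
    rng = random.Random(seed)
    failures = 0
    strengths = [1.5, 1.6, 1.4]  # illustrative deterministic strengths
    for _ in range(n):
        quake = rng.gauss(1.0, 0.3)                            # shared severity
        resp = [abs(rng.gauss(quake, 0.2)) for _ in range(3)]  # peak responses
        fail = [r > s for r, s in zip(resp, strengths)]
        # illustrative fault logic: component 0 alone, or 1 and 2 together
        if fail[0] or (fail[1] and fail[2]):
            failures += 1
    return failures / n
```

Because all three responses share the earthquake severity, joint failures are more likely than independence would suggest, which is exactly why the dependency must be modelled.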

  8. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  9. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  10. The sticking probability for H-2 on some transition metals at a hydrogen pressure of 1 bar

    DEFF Research Database (Denmark)

    Johansson, Martin; Lytken, Ole; Chorkendorff, Ib

    2008-01-01

    The sticking probability for hydrogen on films of Co, Ni, Cu, Ru, Rh, Pd, Ir, and Pt supported on graphite has been measured at a hydrogen pressure of 1 bar in the temperature range 40–200 °C. The sticking probability is found to increase in the order Ni, Co, Ir, Pd, Pt, Rh, and Ru at temperature...

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  12. Fluctuations in Fission Characteristics in the Resonance Range

    International Nuclear Information System (INIS)

    Fort, E.; Courcelle, A.

    2006-01-01

    In the resonance range, experimental data exhibit meaningful fluctuations of the number of prompt neutrons ν_p(E) and γ-rays emitted in fission. Fluctuations of the delayed-neutron multiplicity ν_d(E) are also expected. Although these fluctuations may have a non-negligible impact on reactor integral parameters (such as k_eff and β_eff), they are usually not described in the current nuclear-data libraries ENDF, JENDL or JEFF (except for the ²³⁹Pu evaluation in JEFF.1). Experiments by Hambsch et al. on ²³⁵U have justified the fluctuations of the total kinetic energy of fission fragments [i.e. TKE(E)] by the fluctuations in the mass distributions. An interesting channel-mode formalism, described by Furman, provides a methodology to assess the fluctuations of fission characteristics in the resonance range. This approach is based on ideas relating fission channels or transition states, as proposed by Bohr, and fission modes, as parameterized for instance by Brosa et al. This formalism requires the knowledge of physical parameters rarely measured up to now, such as P_JK(E), the energy-dependent probability to form a transition state with spin J and its projection K along the deformation axis, and w_m^JK, the probability to feed the fission mode m from a (J,K) transition state. Nevertheless, in the case of the 3⁻ and 4⁻ resonances of ²³⁵U, various experiments permit these data to be extracted. The present study proposes a tentative evaluation of ν_p for ²³⁵U based on these ideas. The evaluation of ν_p for ²³⁹Pu, performed in the 1980s for the JEF library, was also revisited. At that time, the model was based on the existence of pre-fission gammas (the so-called (n,γf) effect) as well as a spin effect (prescription of different ν_p values for each spin state 0⁺ and 1⁺). This paper emphasizes the need for further measurements to provide more accurate information on the parameters used in this formalism, and to improve the present work. (authors)

  13. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
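The two probability modules correspond to textbook identities, the law of total probability and Bayes' rule, which the strand-displacement circuits evaluate chemically. A plain-arithmetic sketch of what the modules compute (all values illustrative):

```python
def total_probability(priors, conditionals):
    # Law of total probability: P(B) = sum_i P(A_i) * P(B|A_i),
    # the quantity a 'total probability module' would compute.
    return sum(p * c for p, c in zip(priors, conditionals))

def conditional(priors, conditionals, i):
    # Bayes' rule: P(A_i|B) = P(A_i) P(B|A_i) / P(B),
    # the 'conditional probability derivation'.
    return priors[i] * conditionals[i] / total_probability(priors, conditionals)
```

For a uniform prior over two hypotheses with likelihoods 0.2 and 0.8, the marginal is 0.5 and the posterior on the second hypothesis is 0.8.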

  14. Stochastic time-dependent vehicle routing problem: Mathematical models and ant colony algorithm

    Directory of Open Access Journals (Sweden)

    Zhengyu Duan

    2015-11-01

    This article addresses the stochastic time-dependent vehicle routing problem. Two mathematical models, a robust optimal schedule time model and a minimum expected schedule time model, are proposed for the stochastic time-dependent vehicle routing problem; both guarantee delivery within the customers' time windows. The robust optimal schedule time model requires only the variation range of the link travel time, which can be conveniently derived from historical traffic data. In addition, the robust optimal schedule time model, based on the robust optimization method, can be converted into a time-dependent vehicle routing problem. Moreover, an ant colony optimization algorithm is designed to solve the stochastic time-dependent vehicle routing problem. Owing to improvements in the initial solution and the transition probability, the ant colony optimization algorithm converges well. Through computational instances and Monte Carlo simulation tests, the robust optimal schedule time model is shown to be better than the minimum expected schedule time model in computational efficiency and in coping with travel time fluctuations. Therefore, the robust optimal schedule time model is applicable in real road networks.
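The transition-probability improvement builds on the standard ant-colony rule, in which the probability of moving to a node is proportional to pheromone^α × heuristic^β over the nodes still allowed. A minimal sketch of that standard rule (parameter values illustrative; the paper's specific improvements are not reproduced):

```python
def transition_probabilities(pheromone, heuristic, allowed, alpha=1.0, beta=2.0):
    # Standard ACO transition rule: p_j proportional to
    # tau_j**alpha * eta_j**beta over the still-allowed (unvisited) nodes.
    weights = {j: (pheromone[j] ** alpha) * (heuristic[j] ** beta)
               for j in allowed}
    total = sum(weights.values())
    return {j: w / total for j, w in weights.items()}
```

Nodes with stronger pheromone trails or better heuristic values receive proportionally higher selection probability, and the probabilities always sum to one.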

  15. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
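The sample-reuse idea highlighted above can be sketched directly: draw one set of crack-size samples, compute the POF, then recompute it with the POD curve bumped in each region, reusing the same samples so the region sensitivities cost almost nothing extra. The POD curve, crack-size distribution, region limits, and critical crack size below are all illustrative assumptions.

```python
import math
import random

def pod(a):
    # Illustrative log-logistic probability-of-detection curve
    # (an assumption, not the paper's POD data).
    return 1.0 / (1.0 + math.exp(-(math.log(a) - math.log(2.0)) / 0.4))

def pof_with_bump(samples, region=None, bump=0.0, a_crit=3.0):
    # POF = probability that a crack beyond the critical size a_crit is
    # missed at inspection; optionally bump the POD inside one region.
    tot = 0.0
    for a in samples:
        p_det = pod(a)
        if region is not None and region[0] <= a < region[1]:
            p_det = min(1.0, p_det + bump)
        if a > a_crit:
            tot += 1.0 - p_det
    return tot / len(samples)

def region_sensitivities(n=20000, seed=3, bump=0.01):
    # Reuse one Monte Carlo sample set to estimate dPOF/d(POD region)
    # for the lower tail, middle, and upper tail of the POD curve.
    rng = random.Random(seed)
    samples = [rng.lognormvariate(0.5, 0.6) for _ in range(n)]
    base = pof_with_bump(samples)
    regions = [(0.0, 1.0), (1.0, 4.0), (4.0, float("inf"))]
    sens = [(pof_with_bump(samples, r, bump) - base) / bump for r in regions]
    return base, sens
```

In this toy setup the lower-tail sensitivity is exactly zero (small cracks never reach the critical size), while the upper-tail sensitivity is negative: improving detection of large cracks reduces the POF, identifying the important region of the POD curve.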

  16. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  17. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  18. Probable causes of increasing brucellosis in free-ranging elk of the Greater Yellowstone Ecosystem

    Science.gov (United States)

    Cross, P.C.; Cole, E.K.; Dobson, A.P.; Edwards, W.H.; Hamlin, K.L.; Luikart, G.; Middleton, A.D.; Scurlock, B.M.; White, P.J.

    2010-01-01

    While many wildlife species are threatened, some populations have recovered from previous overexploitation, and data linking these population increases with disease dynamics are limited. We present data suggesting that free-ranging elk (Cervus elaphus) are a maintenance host for Brucella abortus in new areas of the Greater Yellowstone Ecosystem (GYE). Brucellosis seroprevalence in free-ranging elk increased from 0-7% in 1991-1992 to 8-20% in 2006-2007 in four of six herd units around the GYE. These levels of brucellosis are comparable to some herd units where elk are artificially aggregated on supplemental feeding grounds. There are several possible mechanisms for this increase that we evaluated using statistical and population modeling approaches. Simulations of an age-structured population model suggest that the observed levels of seroprevalence are unlikely to be sustained by dispersal from supplemental feeding areas with relatively high seroprevalence or an older age structure. Increases in brucellosis seroprevalence and the total elk population size in areas with feeding grounds have not been statistically detectable. Meanwhile, the rate of seroprevalence increase outside the feeding grounds was related to the population size and density of each herd unit. Therefore, the data suggest that enhanced elk-to-elk transmission in free-ranging populations may be occurring due to larger winter elk aggregations. Elk populations inside and outside of the GYE that traditionally did not maintain brucellosis may now be at risk due to recent population increases. In particular, some neighboring populations of Montana elk were 5-9 times larger in 2007 than in the 1970s, with some aggregations comparable to the Wyoming feeding-ground populations. Addressing the unintended consequences of these increasing populations is complicated by limited hunter access to private lands, which places many ungulate populations out of administrative control. Agency-landowner hunting access

  19. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  1. Zn I spectra in the 1300–6500 cm−1 range

    International Nuclear Information System (INIS)

    Civiš, S.; Ferus, M.; Chernov, V.E.; Zanozina, E.M.; Juha, L.

    2014-01-01

    We study spectra of a plasma created by the laser ablation of ZnS targets in a vacuum and report 47 (not observed previously) Zn I lines in the range of 1300–6400 cm⁻¹. From the recorded spectra we determine energies of 5g, 6g, 7f, 6h, 7h and 8h Zn I levels. We also calculate a large list of transition probabilities and oscillator strengths for Zn I in the observed spectral range. -- Highlights: • We report 47 new Zn I lines in the range of 1300–6400 cm⁻¹. • We determine energies of 5g, 6g, 7f, 6h, 7h and 8h states of Zn I. • Using quantum defect theory, we calculate a large list of transition probabilities

  2. Planning of technical flood retention measures in large river basins under consideration of imprecise probabilities of multivariate hydrological loads

    Directory of Open Access Journals (Sweden)

    D. Nijssen

    2009-08-01

    Full Text Available As a result of the severe floods in Europe at the turn of the millennium, the ongoing shift from safety oriented flood control towards flood risk management was accelerated. With regard to technical flood control measures it became evident that the effectiveness of flood control measures depends on many different factors, which cannot be considered with single events used as design floods for planning. The multivariate characteristics of the hydrological loads have to be considered to evaluate complex flood control measures. The effectiveness of spatially distributed flood control systems differs for varying flood events. Event-based characteristics such as the spatial distribution of precipitation, the shape and volume of the resulting flood waves or the interactions of flood waves with the technical elements, e.g. reservoirs and flood polders, result in varying efficiency of these systems. Considering these aspects a flood control system should be evaluated with a broad range of hydrological loads to get a realistic assessment of its performance under different conditions. The consideration of this variety in flood control planning design was one particular aim of this study. Hydrological loads were described by multiple criteria. A statistical characterization of these criteria is difficult, since the data base is often not sufficient to analyze the variety of possible events. Hydrological simulations were used to solve this problem. Here a deterministic-stochastic flood generator was developed and applied to produce a large quantity of flood events which can be used as scenarios of possible hydrological loads. However, these simulations imply many uncertainties. The results will be biased by the basic assumptions of the modeling tools. In flood control planning probabilities are applied to characterize uncertainties. The probabilities of the simulated flood scenarios differ from probabilities which would be derived from long time series

  3. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Contents: Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  4. Short-range fundamental forces

    International Nuclear Information System (INIS)

    Antoniadis, I.; Baessler, S.; Buchner, M.; Fedorov, V.V.; Hoedl, S.; Nesvizhevsky, V.V.; Pignol, G.; Protasov, K.V.; Lambrecht, A.; Reynaud, S.; Sobolev, Y.

    2010-01-01

    We consider theoretical motivations to search for extra short-range fundamental forces as well as experiments constraining their parameters. The forces could be of two types: 1) spin-independent forces; 2) spin-dependent axion-like forces. Different experimental techniques are sensitive in respective ranges of characteristic distances. The techniques include measurements of gravity at short distances, searches for extra interactions on top of the Casimir force, precision atomic and neutron experiments. We focus on neutron constraints, thus the range of characteristic distances considered here corresponds to the range accessible for neutron experiments

  5. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
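    As a toy illustration of the distinction drawn above, transitional probabilities can be estimated from a tone sequence by counting pitch-to-pitch transitions. The sketch below is not the study's analysis code; the H/L encoding and the `transition_matrix` helper are illustrative assumptions. It shows that in a strictly alternating stream the repeat transitions H→H and L→L have zero estimated probability even though both pitches are individually frequent:

```python
import numpy as np

def transition_matrix(seq, states=("H", "L")):
    """Count-based estimate of P(next | current) over a symbol sequence."""
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    # Normalize each row; rows with no observations stay zero.
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)

# A strictly alternating stream of high (H) and low (L) tones:
stream = list("HL" * 75)
P = transition_matrix(stream)
# P(L|H) and P(H|L) are 1; the repeat transitions H->H, L->L never occur.
```

    A repeat tone inserted into such a stream would therefore be a low-transitional-probability event even though the overall pitch probabilities are almost unchanged.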

  6. Low Probability of Intercept Laser Range Finder

    Science.gov (United States)

    2017-07-19

    time of arrival, and it may also include wavelength, pulse width, and pulse repetition frequency (PRF). Second photodetector 38 in conjunction with... conjunction with lens 32 and telescope 36 that can correct for turbulence along the free space path. [0024] In all embodiments, the time interval

  7. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify the uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  8. Probability of inadvertent operation of electrical components in harsh environments

    International Nuclear Information System (INIS)

    Knoll, A.

    1989-01-01

    Harsh environment, which means humidity and high temperature, may and will affect unsealed electrical components by causing leakage ground currents in ungrounded direct current systems. The concern in a nuclear power plant is that such harsh environment conditions could cause inadvertent operation of normally deenergized components, which may have a safety-related isolation function. Harsh environment is a common cause failure, and one way to approach the problem is to assume that all the unsealed electrical components will simultaneously and inadvertently energize as a result of the environmental common cause failure. This assumption is unrealistically conservative. Test results indicated that the insulation resistances of any terminal block in harsh environments have a random distribution in the range of 1 to 270 kΩ, with a mean value of ∼59 kΩ. The objective of this paper is to evaluate a realistic conditional failure probability for inadvertent operation of electrical components in harsh environments. This value will be used thereafter in probabilistic safety evaluations of harsh environment events and will replace both the overconservative common cause probability of 1 and the random failure probability used for mild environments

  9. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  10. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes and other nodes are, respectively, treated as the absorbing nodes and transient nodes in the absorbing Markov chain. Then, the expected number of times from each transient node to all other transient nodes can be used to represent the saliency value of this node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, which is called learnt transition probability matrix. Although the performance is significantly promoted, salient objects are not uniformly highlighted very well. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of AMC and boundary maps, we rearrange the global orderings (saliency value) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.
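    The "expected number of times" quantity used above comes from standard absorbing-Markov-chain theory: for the transient-to-transient transition block Q, the fundamental matrix N = (I − Q)⁻¹ holds expected visit counts, and its row sums give absorbed times. A minimal sketch on a toy two-node graph (the paper's learnt, deep-feature-based transition matrix is not reproduced here):

```python
import numpy as np

# Toy chain: 2 transient nodes and 1 absorbing (boundary) node.
# Q is the transient-to-transient block of the full transition matrix.
Q = np.array([[0.5, 0.3],
              [0.2, 0.4]])

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visit counts
absorbed_time = N.sum(axis=1)      # per-node absorbed time (saliency proxy)
```

    In the saliency model, a larger absorbed time means a node is "further" from the boundary absorbing nodes along the weighted graph, and is therefore scored as more salient.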

  11. Impact-parameter-averaged probability of 3dσ - Vacancy sharing in heavy systems

    International Nuclear Information System (INIS)

    Marble, D.K.; McDaniel, F.D.; Zoran, V.; Szilagyi, Z.; Piticu, I.; Fluerasu, D.; Enulescu, A.; Dumitriu, D.; Bucur, B.I.; Ciortea, C.

    1993-01-01

    The probabilities for the 3dσ molecular vacancy sharing in the 0.08 - 1.75 MeV/u F, Co, Ni, Cu + Bi collisions have been estimated by using integral X-ray spectrum measurements. The analytic two-state exponential model of Nikitin has been applied to 3dσ-2p3/2 vacancy sharing in these collision systems. It describes the velocity dependence satisfactorily at low energies, < 0.5 MeV/u, but around 1 MeV/u the velocity dependence changes its character, indicating departure from the hypotheses of the model. (Author)

  12. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  13. Study of the Wavelength Dependence in Laser Ablation of Advanced Ceramics and Glass-Ceramic Materials in the Nanosecond Range.

    Science.gov (United States)

    Sola, Daniel; Peña, Jose I

    2013-11-19

    In this work, geometrical dimensions and ablation yields as a function of the machining method and reference position were studied when advanced ceramics and glass-ceramic materials were machined with pulsed lasers in the nanosecond range. Two laser systems, emitting at 1064 and 532 nm, were used. It was shown that the features obtained depend on whether the substrate is processed by means of pulse bursts or by grooves. In particular, when the samples were processed by grooves, machined depth, removed volume and ablation yields reached their maximum, placing the sample out of focus. It was shown that these characteristics do not depend on the processing conditions, the wavelength or the optical configuration, and that this is intrinsic behavior of the processing method. Furthermore, the existence of a close relation between material hardness and ablation yields was demonstrated.

  14. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  15. Long-range dependence in returns and volatility of global gold market amid financial crises

    Science.gov (United States)

    Omane-Adjepong, Maurice; Boako, Gideon

    2017-04-01

    Using sampled historical daily gold market data from 07-03-1985 to 06-01-2015, and building on a related work by Bentes (2016), this paper examines the presence of long-range dependence (LRD) in the world's gold market returns and volatility, accounting for structural breaks. The sampled gold market data was divided into subsamples based on four global crises: the September 1992 collapse of the European Exchange Rate Mechanism (ERM), the Asian financial crisis of mid-1997, the Subprime meltdown of 2007, and the recent European sovereign debt crisis, which hit the world's markets with varying effects. The LRD test was carried out on the full-sample and subsample periods using three semiparametric methods, before and after adjusting for structural breaks. The results show insignificant evidence of LRD in gold returns. However, very diminutive evidence is found for periods characterized by financial/economic shocks, with no significant detections for post-shock periods. Collectively, this indicates that the gold market is less speculative, and hence could be somewhat less risky for hedging and portfolio diversification.
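    One classical semiparametric LRD diagnostic of the kind applied to such return series is rescaled-range (R/S) estimation of the Hurst exponent, where H ≈ 0.5 indicates no long-range dependence and H > 0.5 indicates persistence. The sketch below is illustrative only; the paper's actual estimators and structural-break adjustments are not reproduced:

```python
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128)):
    """Estimate the Hurst exponent H from the rescaled-range statistic.

    For each window size n, average R/S over non-overlapping chunks,
    then fit log(R/S) against log(n); the slope estimates H.
    """
    log_n, log_rs = [], []
    for n in windows:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()      # range of cumulative deviations
            s = chunk.std()
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(0)
# i.i.d. "returns" have no LRD; the R/S estimate is near 0.5 (with the
# well-known small-sample upward bias of this statistic).
h_iid = hurst_rs(rng.standard_normal(4096))
```

    On genuinely long-memory data (e.g. fractional Gaussian noise), the same estimator would return H noticeably above 0.5.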

  16. Measurements of excited-state-to-excited-state transition probabilities and photoionization cross-sections using laser-induced fluorescence and photoionization signals

    International Nuclear Information System (INIS)

    Shah, M.L.; Sahoo, A.C.; Pulhani, A.K.; Gupta, G.P.; Dikshit, B.; Bhatia, M.S.; Suri, B.M.

    2014-01-01

    Laser-induced photoionization and fluorescence signals were simultaneously observed in atomic samarium using Nd:YAG-pumped dye lasers. Two-color, three-photon photoionization and two-color fluorescence signals were recorded simultaneously as a function of the second-step laser power for two photoionization pathways. The density matrix formalism has been employed to analyze these signals. Two-color laser-induced fluorescence signal depends on the laser powers used for the first and second-step transitions as well as the first and second-step transition probability whereas two-color, three-photon photoionization signal depends on the third-step transition cross-section at the second-step laser wavelength along with the laser powers and transition probability for the first and second-step transitions. Two-color laser-induced fluorescence was used to measure the second-step transition probability. The second-step transition probability obtained was used to infer the photoionization cross-section. Thus, the methodology combining two-color, three-photon photoionization and two-color fluorescence signals in a single experiment has been established for the first time to measure the second-step transition probability as well as the photoionization cross-section. - Highlights: • Laser-induced photoionization and fluorescence signals have been simultaneously observed. • The density matrix formalism has been employed to analyze these signals. • Two-color laser-induced fluorescence was used to measure the second-step transition probability. • The second-step transition probability obtained was used to infer the photoionization cross-section. • Transition probability and photoionization cross-section have been measured in a single experiment

  17. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
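    The core mechanism described, inferring a time-varying transition-probability matrix, can be caricatured with exponentially discounted transition counts, where the decay rate plays the role of the model's single free parameter. Everything in this sketch (the binary coding, the Laplace prior, the `sequence_surprise` helper) is an illustrative assumption, not the authors' exact equations:

```python
import numpy as np

def sequence_surprise(seq, decay=0.9):
    """Track leaky transition counts over a binary sequence (0/1) and
    return, per transition, the surprise -log2 P(observed | inferred)."""
    counts = np.ones((2, 2))          # Laplace prior on transition counts
    surprise = []
    for prev, nxt in zip(seq, seq[1:]):
        p = counts[prev, nxt] / counts[prev].sum()
        surprise.append(-np.log2(p))
        counts *= decay               # leaky integration (forgetting)
        counts[prev, nxt] += 1
    return np.array(surprise)

# After a long run of alternations, the first repetition is surprising:
s = sequence_surprise([0, 1] * 20 + [1, 1])
```

    After many alternations the learner assigns repetitions very low probability, so the first repeat yields a large surprise while the preceding alternation yields almost none, qualitatively reproducing the repetition/alternation asymmetry mentioned above.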

  18. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. A comparison is then made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases in which the random variate follows a normal law as well as a Bernoulli law.
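    For context, a plain Wald SPRT for a Bernoulli parameter looks as follows; the article's modification, which constrains the sum of the error probabilities, is not reproduced here, and the hypotheses and thresholds are illustrative:

```python
import math
import random

def sprt(stream, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald SPRT for H0: p = p0 vs H1: p = p1 on a 0/1 stream.

    Returns ('H0' | 'H1' | 'undecided', number of samples used).
    """
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr, n = 0.0, 0
    for x in stream:
        n += 1
        # Log-likelihood ratio increment for this observation.
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

random.seed(1)
# Data truly Bernoulli(0.7): the test should accept H1 after few samples.
decision, n_used = sprt(random.random() < 0.7 for _ in range(1000))
```

    The appeal of the sequential procedure, and the reason the article compares against a fixed sample size, is that it typically terminates well before the fixed-sample-size test would.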

  19. Ozone deposition velocities, reaction probabilities and product yields for green building materials

    Science.gov (United States)

    Lamble, S. P.; Corsi, R. L.; Morrison, G. C.

    2011-12-01

    Indoor surfaces can passively remove ozone that enters buildings, reducing occupant exposure without an energy penalty. However, reactions between ozone and building surfaces can generate and release aerosols and irritating and carcinogenic gases. To identify desirable indoor surfaces, the deposition velocity, reaction probability and carbonyl product yields of building materials considered green (listed, recycled, sustainable, etc.) were quantified. Nineteen separate floor, wall or ceiling materials were tested in a 10 L, flow-through laboratory reaction chamber. Inlet ozone concentrations were maintained between 150 and 200 ppb (generally much lower in chamber air), relative humidity at 50%, temperature at 25 °C, and exposure occurred over 24 h. Deposition velocities ranged from 0.25 m h⁻¹ for a linoleum-style flooring up to 8.2 m h⁻¹ for a clay-based paint; reaction probabilities ranged from 8.8 × 10⁻⁷ to 6.9 × 10⁻⁵, respectively. For all materials, product yields of C₁ through C₁₂ saturated n-aldehydes, plus acetone, ranged from undetectable to greater than 0.70. The most promising material was a clay wall plaster which exhibited a high deposition velocity (5.0 m h⁻¹) and a low product yield (

  20. The dependence level analysis between the human actions in NPP Operation

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Florescu, G.; Prisecaru, Ilie

    2009-01-01

    Human Reliability Analysis (HRA) is an important method in Probabilistic Safety Assessment (PSA) studies and supports concrete improvement of man-machine-organization interfaces, reliability and safety. An important step in HRA, within the quantitative analysis of human error probabilities, is the analysis of the dependence level between human actions performed by the same person or by different persons. The purpose of this paper is to develop a model to analyze the dependence level between human actions in Nuclear Power Plant (NPP) operation. The model estimates conditional human error probabilities (CHEP) and joint human error probabilities (JHEP). Sensitivity analyses determine how sensitive human performance is to systematic variations in the dependence level between human actions. The human error probabilities estimated in this paper are suitable for integration both in HRA and in PSA performed for NPPs. This type of analysis helps in finding and analyzing ways of reducing the likelihood of human errors, so that the impact of the human factor on system availability, reliability and safety can be realistically estimated. To demonstrate the usability of the model, an analysis is performed of the dependences between the human actions needed to mitigate the consequences of LOCA events, particularly for the case of the Cernavoda NPP. (authors)
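    One widely used scheme for turning a dependence level into a conditional human error probability is the THERP set of equations; whether this paper uses exactly these formulas is an assumption here, so the sketch is purely illustrative:

```python
def chep(p, level):
    """Conditional HEP of a subsequent action with basic HEP p, given
    its dependence level on the preceding action (THERP equations)."""
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](p)

# Joint HEP of two actions, each with basic HEP 1e-2, under high
# dependence: the second failure is conditioned on the first.
p = 1e-2
jhep = p * chep(p, "high")
```

    Note how strongly dependence matters: under the zero-dependence assumption the joint HEP would be p² = 1e-4, while high dependence raises it by a factor of about fifty.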

  1. Systematic variations in multi-spectral lidar representations of canopy height profiles and gap probability

    Science.gov (United States)

    Chasmer, L.; Hopkinson, C.; Gynan, C.; Mahoney, C.; Sitar, M.

    2015-12-01

    Airborne and terrestrial lidar are increasingly used in forest attribute modeling for carbon, ecosystem and resource monitoring. The near infra-red wavelength at 1064nm has been utilised most in airborne applications due to, for example, diode manufacture costs, surface reflectance and eye safety. Foliage reflects well at 1064nm and most of the literature on airborne lidar forest structure is based on data from this wavelength. However, lidar systems also operate at wavelengths further from the visible spectrum (e.g. 1550nm) for eye safety reasons. This corresponds to a water absorption band and can be sensitive to attenuation if surfaces contain moisture. Alternatively, some systems operate in the visible range (e.g. 532nm) for specialised applications requiring simultaneous mapping of terrestrial and bathymetric surfaces. All these wavelengths provide analogous 3D canopy structure reconstructions and thus offer the potential to be combined for spatial comparisons or temporal monitoring. However, a systematic comparison of wavelength-dependent foliage profile and gap probability (index of transmittance) is needed. Here we report on two multispectral lidar missions carried out in 2013 and 2015 over conifer, deciduous and mixed stands in Ontario, Canada. The first used separate lidar sensors acquiring comparable data at three wavelengths, while the second used a single sensor with 3 integrated laser systems. In both cases, wavelengths sampled were 532nm, 1064nm and 1550nm. The experiment revealed significant differences in proportions of returns at ground level, the vertical foliage distribution and gap probability across wavelengths. Canopy attenuation was greatest at 532nm due to photosynthetic plant tissue absorption. Relative to 1064nm, foliage was systematically undersampled at the 10% to 60% height percentiles at both 1550nm and 532nm (this was confirmed with coincident terrestrial lidar data). When using all returns to calculate gap probability, all

  2. Matrix product representation of the stationary state of the open zero range process

    Science.gov (United States)

    Bertin, Eric; Vanicat, Matthieu

    2018-06-01

    Many one-dimensional lattice particle models with open boundaries, like the paradigmatic asymmetric simple exclusion process (ASEP), have their stationary states represented in the form of a matrix product, with matrices that do not explicitly depend on the lattice site. In contrast, the stationary state of the open 1D zero-range process (ZRP) takes an inhomogeneous factorized form, with site-dependent probability weights. We show that in spite of the absence of correlations, the stationary state of the open ZRP can also be represented in a matrix product form, where the matrices are site-independent, non-commuting and determined from algebraic relations resulting from the master equation. We recover the known distribution of the open ZRP in two different ways: first, using an explicit representation of the matrices and boundary vectors; second, from the sole knowledge of the algebraic relations satisfied by these matrices and vectors. Finally, an interpretation of the relation between the matrix product form and the inhomogeneous factorized form is proposed within the framework of hidden Markov chains.

  3. The dependence of radiation response on the dose per fraction

    International Nuclear Information System (INIS)

    Joiner, M.C.

    1989-01-01

    The linear-quadratic (LQ) model explains the dependence of total dose in a fractionated course on the dose per fraction, in a very wide range of tumour and normal tissue studies, provided the dose per fraction remains above 2 Gy. In the range 2-1 Gy per fraction, some experimental studies show less increase in total dose than predicted by LQ; a probable explanation is incomplete repair between closely spaced fractions. A further departure from LQ predictions is seen between 1 and 0.1 Gy per fraction. This cannot be explained by incomplete repair; a modified LQ model in which α decreases sharply with increasing dose per fraction in the range 0-1 Gy fits these data. The basic LQ model describes data from neutron fractionation studies, so the relationship between relative biological effectiveness (RBE) and X-ray dose per fraction can be expressed in terms of LQ parameters and fitted directly to RBE data. Results from different experiments, different assays and both top-up and full-course fractionation techniques can all be included in one analysis. (author)
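    The LQ isoeffect bookkeeping referred to above can be made concrete via the biologically effective dose, BED = n·d·(1 + d/(α/β)): two fractionation schedules are isoeffective when their BEDs match. A minimal sketch (the α/β value and the schedules below are illustrative, not the paper's data):

```python
def bed(n, d, alpha_beta):
    """Biologically effective dose for n fractions of d Gy."""
    return n * d * (1 + d / alpha_beta)

def isoeffective_total_dose(ref_n, ref_d, new_d, alpha_beta):
    """Total dose at new_d Gy/fraction matching the reference BED."""
    return bed(ref_n, ref_d, alpha_beta) / (1 + new_d / alpha_beta)

# Example: reference schedule 30 x 2 Gy (60 Gy total) for a
# late-responding tissue with alpha/beta = 3 Gy; what total dose at
# 1 Gy per fraction gives the same effect?
total_at_1gy = isoeffective_total_dose(30, 2.0, 1.0, 3.0)  # 75 Gy
```

    This is exactly the regime the abstract discusses: below about 2 Gy per fraction, experiments find smaller isoeffective-dose increases than this formula predicts, motivating the modified model with a dose-dependent α.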

  4. Dependence of O{sub 2} diffusion dynamics on pressure and temperature in silica nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Iovino, G., E-mail: giuseppe.iovino@unipa.it; Agnello, S., E-mail: simonpietro.agnello@unipa.it; Gelardi, F. M., E-mail: franco.gelardi@unipa.it [University of Palermo, Department of Physics and Chemistry (Italy)

    2013-10-15

    An experimental study of the molecular O{sub 2} diffusion process in high-purity non-porous silica nanoparticles, having a 50 m{sup 2}/g BET specific surface and 20 nm average radius, was carried out in the temperature range from 127 to 177 °C at O{sub 2} pressures in the range from 0.2 to 66 bar. The study was performed by measuring the volume-averaged interstitial O{sub 2} concentration with a Raman and photoluminescence technique, using a 1,064 nm excitation laser to detect the singlet-to-triplet emission at 1,272 nm of molecular oxygen in silica. A dependence of the diffusion kinetics on the absolute O{sub 2} pressure, in addition to the temperature dependence, was found. The kinetics can be fit by the solution of Fick's diffusion equation using an effective diffusion coefficient related to temperature and external O{sub 2} pressure. The fit results show that the temperature and pressure dependencies can be disentangled and that the pressure effects are more pronounced at lower temperatures. An Arrhenius temperature law is determined for the effective diffusion coefficient, and the activation energy and pre-exponential factor are found in the explored experimental range. The reported findings have not been evidenced previously in studies of bulk silica and probably originate from the reduced spatial extension of the considered system.
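The combination of an Arrhenius-activated effective diffusion coefficient with the Fick-equation uptake solution can be sketched as follows. The values of D0, Ea and the geometry here are illustrative assumptions, not the fitted values of the study:

```python
import math

# Hedged sketch: Arrhenius-activated effective diffusion coefficient plus the
# classical series solution of Fick's equation for uptake in a sphere.
# D0, Ea and the radius are illustrative assumptions.

KB_EV = 8.617e-5  # Boltzmann constant in eV/K

def d_eff(temp_c, d0=1.0e-8, ea_ev=1.0):
    """Effective diffusion coefficient D = D0 * exp(-Ea / kT)."""
    return d0 * math.exp(-ea_ev / (KB_EV * (temp_c + 273.15)))

def fractional_uptake(t, radius, diff, terms=200):
    """Fractional uptake C(t)/C_eq for a sphere, truncated series solution."""
    s = sum(math.exp(-(n * math.pi / radius) ** 2 * diff * t) / n ** 2
            for n in range(1, terms + 1))
    return 1.0 - (6.0 / math.pi ** 2) * s

# Diffusion speeds up with temperature across the study's range (127-177 C)
assert d_eff(177) > d_eff(127)
```

In the study the pressure dependence enters through the effective coefficient as well; only the temperature part is sketched here.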

  5. A transmission/escape probabilities model for neutral particle transport in the outer regions of a diverted tokamak

    International Nuclear Information System (INIS)

    Stacey, W.M.

    1992-12-01

    A new computational model for neutral particle transport in the outer regions of a diverted tokamak plasma chamber is presented. The model is based on the calculation of transmission and escape probabilities using first-flight integral transport theory and the balancing of fluxes across the surfaces bounding the various regions. The geometrical complexity of the problem is included in precomputed probabilities which depend only on the mean free path of the region

  6. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  7. The sticking probability for H-2 in presence of CO on some transition metals at a hydrogen pressure of 1 bar

    DEFF Research Database (Denmark)

    Johansson, Martin; Lytken, Ole; Chorkendorff, Ib

    2008-01-01

    The sticking probability for H-2 on Ni, Co, Cu, Rh, Ru, Pd, Ir and Pt metal films supported on graphite has been investigated in a gas mixture consisting of 10 ppm carbon monoxide in hydrogen at a total pressure of 1 bar, in the temperature range 40-200 degrees C. Carbon monoxide inhibits the sticking probability significantly for all the metals, even at 200 degrees C. In the presence of 10 ppm CO, the sticking probability increases in the order Ir, Pt, Ni, Co, Pd, Rh, Ru, whereas for Cu, it is below the detection limit of the measurement, even in pure H2.

  8. Effect of Urban Green Spaces and Flooded Area Type on Flooding Probability

    Directory of Open Access Journals (Sweden)

    Hyomin Kim

    2016-01-01

    Full Text Available Countermeasures to urban flooding should consider long-term perspectives, because climate change impacts are unpredictable and complex. Urban green spaces have emerged as a potential option to reduce urban flood risks, and their effectiveness has been highlighted in notable urban water management studies. In this study, flooded areas in Seoul, Korea, were divided into four flooded area types by cluster analysis based on topographic and physical characteristics and verified using discriminant analysis. After division by flooded area type, logistic regression analysis was performed to determine how the flooding probability changes with variations in green space area. Type 1 included regions where flooding occurred in a drainage basin that had a flood risk management infrastructure (FRMI). In Type 2, the slope was steep; the TWI (Topographic Wetness Index) was relatively low; and soil drainage was favorable. Type 3 represented the gentlest sloping areas, and these were associated with the highest TWI values. In addition, these areas had the worst soil drainage. Type 4 had moderate slopes, imperfect soil drainage and lower than average TWI values. We found that green spaces exerted a considerable influence on urban flooding probabilities in Seoul, and flooding probabilities could be reduced by over 50% depending on the green space area and the locations where green spaces were introduced. Increasing the area of green spaces was the most effective method of decreasing flooding probability in Type 3 areas. In Type 2 areas, the maximum hourly precipitation affected the flooding probability significantly, and the flooding probability in these areas was high despite the extensive green space area. These findings can contribute towards establishing guidelines for urban spatial planning to respond to urban flooding.
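The logistic-regression step described above maps green-space area into a flooding probability. A minimal sketch of that mapping, with an intercept and coefficient that are illustrative assumptions rather than the coefficients estimated in the study:

```python
import math

# Hedged sketch: turning a fitted logistic-regression coefficient into a
# flooding probability. b0 and b1 below are illustrative assumptions.

def flood_probability(green_space_ratio, intercept=0.5, coef=-3.0):
    """P(flood) = 1 / (1 + exp(-(b0 + b1 * green_space_ratio)))."""
    z = intercept + coef * green_space_ratio
    return 1.0 / (1.0 + math.exp(-z))

# With a negative coefficient, more green space lowers the flood probability.
assert flood_probability(0.5) < flood_probability(0.1)
```

Separate coefficient sets would be fitted per flooded-area type, which is how the study can report different green-space effects for Types 2 and 3.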

  9. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  10. Cognitive-psychology expertise and the calculation of the probability of a wrongful conviction.

    Science.gov (United States)

    Rouder, Jeffrey N; Wixted, John T; Christenfeld, Nicholas J S

    2018-05-08

    Cognitive psychologists are familiar with how their expertise in understanding human perception, memory, and decision-making is applicable to the justice system. They may be less familiar with how their expertise in statistical decision-making and their comfort working in noisy real-world environments is just as applicable. Here we show how this expertise in ideal-observer models may be leveraged to calculate the probability of guilt of Gary Leiterman, a man convicted of murder on the basis of DNA evidence. We show by common probability theory that Leiterman is likely a victim of a tragic contamination event rather than a murderer. Making any calculation of the probability of guilt necessarily relies on subjective assumptions. The conclusion about Leiterman's innocence is not overly sensitive to the assumptions-the probability of innocence remains high for a wide range of reasonable assumptions. We note that cognitive psychologists may be well suited to make these calculations because as working scientists they may be comfortable with the role a reasonable degree of subjectivity plays in analysis.
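The calculation described above is an application of Bayes' rule with an explicit contamination hypothesis. A minimal sketch, in which every number is an illustrative assumption rather than a value from the Leiterman analysis:

```python
# Hedged sketch: Bayes' rule when a DNA match can arise either from guilt or
# from a laboratory contamination event. All numbers are illustrative.

def posterior_innocence(prior_guilt, p_match_given_guilt, p_contamination):
    """P(innocent | DNA match) under a two-hypothesis model."""
    prior_innocent = 1.0 - prior_guilt
    p_match = prior_guilt * p_match_given_guilt + prior_innocent * p_contamination
    return prior_innocent * p_contamination / p_match

# A tiny prior probability of guilt plus a plausible contamination rate leaves
# a high posterior probability of innocence over a wide range of assumptions,
# echoing the robustness claim in the abstract.
assert all(posterior_innocence(1e-6, 1.0, c) > 0.9 for c in (1e-4, 1e-3, 1e-2))
```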

  11. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-01-01

    In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  12. Synaptic convergence regulates synchronization-dependent spike transfer in feedforward neural networks.

    Science.gov (United States)

    Sailamul, Pachaya; Jang, Jaeson; Paik, Se-Bum

    2017-12-01

    Correlated neural activities such as synchronizations can significantly alter the characteristics of spike transfer between neural layers. However, it is not clear how this synchronization-dependent spike transfer is affected by the structure of convergent feedforward wiring. To address this question, we implemented computer simulations of model neural networks: a source and a target layer connected with different types of convergent wiring rules. In the Gaussian-Gaussian (GG) model, both the connection probability and the strength are given as Gaussian distributions of spatial distance. In the Uniform-Constant (UC) and Uniform-Exponential (UE) models, the connection probability density is a uniform constant within a certain range, but the connection strength is set as a constant value or an exponentially decaying function, respectively. We then examined how the spike transfer function is modulated under these conditions, while static or synchronized input patterns were introduced to simulate different levels of feedforward spike synchronization. We observed that the synchronization-dependent modulation of the transfer function appeared noticeably different for each convergence condition. The modulation of the spike transfer function was largest in the UC model and smallest in the UE model. Our analysis showed that this difference was induced by the different spike weight distributions that were generated from the convergent synapses in each model. Our results suggest that the structure of feedforward convergence is a crucial factor for correlation-dependent spike control and must therefore be considered in understanding the mechanism of information transfer in the brain.
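The three wiring rules can be sketched as weight-sampling procedures. The spatial scales (sigma, radius) and the function name are illustrative assumptions, not the parameters used in the simulations:

```python
import math
import random

# Hedged sketch of the three convergent wiring rules (GG, UC, UE) described
# above. Spatial scales are illustrative assumptions.

def sample_weights(model, distances, rng, sigma=1.0, radius=2.0):
    """Draw feedforward synaptic weights from source neurons at given distances."""
    out = []
    for d in distances:
        if model == "GG":    # Gaussian connection probability, Gaussian strength
            p = math.exp(-d ** 2 / (2 * sigma ** 2))
            w = math.exp(-d ** 2 / (2 * sigma ** 2))
        elif model == "UC":  # uniform probability within range, constant strength
            p, w = (1.0 if d <= radius else 0.0), 1.0
        else:                # "UE": uniform probability, exponentially decaying strength
            p, w = (1.0 if d <= radius else 0.0), math.exp(-d)
        if rng.random() < p:
            out.append(w)
    return out

rng = random.Random(0)
dists = [0.1 * i for i in range(30)]
# UC: every realized synapse carries the same weight
assert all(w == 1.0 for w in sample_weights("UC", dists, rng))
```

The differing weight distributions these rules generate (uniform for UC, spread-out for GG and UE) are what the abstract identifies as the source of the different synchronization-dependent modulation.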

  13. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology

  14. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  15. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  16. ENVIRONMENTAL DEPENDENCE OF THE GALAXY MERGER RATE IN A ΛCDM UNIVERSE

    International Nuclear Information System (INIS)

    Jian, Hung-Yu; Chiueh, Tzihong; Lin Lihwai

    2012-01-01

    We make use of four galaxy catalogs based on four different semi-analytical models (SAMs) implemented in the Millennium Simulation to study the environmental effects and the model dependence of the galaxy merger rate. We begin the analyses by finding that the galaxy merger rate in SAMs has a mild redshift evolution with luminosity-selected samples in the evolution-corrected B-band magnitude range –21 ≤ M_B^e ≤ –19, consistent with the results of previous works. To study the environmental dependence of the galaxy merger rate, we adopt two estimators, the local overdensity (1 + δ_n), defined as the surface density from the nth nearest neighbor (n = 6 is chosen in this study), and the host halo mass M_h. We find that the galaxy merger rate F_mg shows a strong dependence on the local overdensity (1 + δ_n) and the dependence is similar at all redshifts. For the overdensity estimator, the merger rate F_mg is found to be about twenty times larger in the densest regions than in underdense ones in two of the four SAMs, while it is roughly four times higher in the other two. In other words, the discrepancies of the merger rate difference between the two extremes can differ by a factor of ∼5 depending on the SAMs adopted. On the other hand, for the halo mass estimator, F_mg does not monotonically increase with the host halo mass M_h but peaks in the M_h range between 10^12 and 10^13 h^–1 M_☉, which corresponds to group environments. The high merger rate in high local density regions corresponds primarily to the high merger rate in group environments. In addition, we also study the merger probability of 'close pairs' identified using the projected separation and the line-of-sight velocity difference, C_mg, and the merger timescale, T_mg; these are two important quantities for observations to convert the pair fraction N_c into the galaxy merger rate. We discover that T_mg has a weak dependence on environment and different SAMs, and is about 2 Gyr at z

  17. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
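The idea of interval-augmented probability plots can be illustrated with a Monte Carlo sketch. Note the hedge: the paper constructs exact *simultaneous* 1-α intervals, whereas the simulation below only produces *pointwise* per-order-statistic bands:

```python
import random
from statistics import NormalDist

# Hedged sketch: Monte Carlo pointwise intervals for the points of a normal
# probability plot. The paper's simultaneous intervals are not reproduced here.

def order_stat_bands(n, alpha=0.05, reps=2000, seed=1):
    """Empirical (alpha/2, 1-alpha/2) envelope for each normal order statistic."""
    rng = random.Random(seed)
    samples = [sorted(rng.gauss(0.0, 1.0) for _ in range(n)) for _ in range(reps)]
    lo, hi = int(reps * alpha / 2), int(reps * (1 - alpha / 2)) - 1
    bands = []
    for i in range(n):
        col = sorted(s[i] for s in samples)
        bands.append((col[lo], col[hi]))
    return bands

# Under normality, the usual plotting positions fall inside the bands.
n = 20
bands = order_stat_bands(n)
positions = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
assert all(lo < p < hi for (lo, hi), p in zip(bands, positions))
```

Simultaneous bands would have to be widened (e.g., by calibrating alpha) so that *all* points fall inside jointly with probability 1-α, which is the paper's contribution.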

  18. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

    This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by the probability of detection. The effects of real part geometries on sound propagation are accounted for, and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed that can detect flaws with attributes ranging over an extensive class. The detection decision is considered to be a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude-thresholding decision rule. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given.

  19. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  20. Dependent Neyman type A processes based on common shock Poisson approach

    Science.gov (United States)

    Kadilar, Gamze Özel; Kadilar, Cem

    2016-04-01

    The Neyman type A process is used for describing clustered data, since the Poisson process is insufficient for clustering of events. In a multivariate setting, there may be dependencies between multivariate Neyman type A processes. In this study, a dependent form of the Neyman type A process is considered under the common shock approach, and the joint probability function is derived for dependent Neyman type A Poisson processes. An application based on forest fires in Turkey is then given. The results show that the joint probability function of the dependent Neyman type A processes obtained in this study provides a good probabilistic fit for the total number of burned trees in Turkey.
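The common-shock construction can be sketched by simulation: two Neyman type A counts share a third Neyman type A component, which induces positive dependence. The rate parameters below are illustrative assumptions, not values fitted to the fire data:

```python
import math
import random

# Hedged sketch: dependent Neyman type A counts via a common Poisson shock.
# Rate parameters are illustrative assumptions.

def poisson(rng, lam):
    """Knuth's method; adequate for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < limit:
            return k
        k += 1

def neyman_type_a(rng, lam, phi):
    """Poisson(lam) clusters, each contributing a Poisson(phi) count."""
    return sum(poisson(rng, phi) for _ in range(poisson(rng, lam)))

def dependent_pair(rng, lam1, lam2, lam0, phi):
    """Common-shock construction: the shared component couples the two counts."""
    shock = neyman_type_a(rng, lam0, phi)
    return (neyman_type_a(rng, lam1, phi) + shock,
            neyman_type_a(rng, lam2, phi) + shock)

rng = random.Random(0)
pairs = [dependent_pair(rng, 1.0, 1.0, 1.0, 2.0) for _ in range(3000)]
mx = sum(x for x, _ in pairs) / len(pairs)
my = sum(y for _, y in pairs) / len(pairs)
cov = sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)
assert cov > 0.0  # the common shock makes the two counts positively correlated
```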

  1. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects and failure data (involved into the model via application of Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain: the critical crack's length and depth, the failure probability of the defected zone thickness, dependency of the failure probability on the age of the natural gas transmission pipeline. A model's uncertainty analysis and uncertainty propagation analysis are performed, as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanic analysis of the pipe with crack. • Stress evaluation of the pipe with critical crack. • Deterministic-probabilistic structural integrity analysis of gas pipeline. • Integrated estimation of pipeline failure probability by Bayesian method.

  2. Estimation of the common cause failure probabilities on the component group with mixed testing scheme

    International Nuclear Information System (INIS)

    Hwang, Meejeong; Kang, Dae Il

    2011-01-01

    Highlights: ► This paper presents a method to estimate the common cause failure probabilities on the common cause component group with mixed testing schemes. ► The CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing. ► There are many CCCGs with specific mixed testing schemes in real plant operation. ► Therefore, a general formula which is applicable to both alternate periodic testing scheme and train level mixed testing scheme was derived. - Abstract: This paper presents a method to estimate the common cause failure (CCF) probabilities on the common cause component group (CCCG) with mixed testing schemes such as the train level mixed testing scheme or the alternate periodic testing scheme. In the train level mixed testing scheme, the components are tested in a non-staggered way within the same train, but the components are tested in a staggered way between the trains. The alternate periodic testing scheme indicates that all components in the same CCCG are tested in a non-staggered way during the planned maintenance period, but they are tested in a staggered way during normal plant operation. Since the CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing, CCF estimators have two kinds of formulas in accordance with the testing schemes. Thus, there are general formulas to estimate the CCF probability on the staggered testing scheme and non-staggered testing scheme. However, in real plant operation, there are many CCCGs with specific mixed testing schemes. Recently, Barros () and Kang () proposed a CCF factor estimation method to reflect the alternate periodic testing scheme and the train level mixed testing scheme. In this paper, a general formula which is applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived.

  3. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  4. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  5. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  6. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    International Nuclear Information System (INIS)

    Lehua Pan; G.S. Bodvarsson

    2001-01-01

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions

  7. Reactive scattering theory for molecular transitions in time-dependent fields

    International Nuclear Information System (INIS)

    Peskin, U.; Miller, W.H.

    1995-01-01

    A new approach is introduced for computing probabilities of molecular transitions in time-dependent fields. The method is based on the stationary (t,t') representation of the Schroedinger equation and is shown to be equivalent to infinite order time-dependent perturbation theory. Bound-to-bound (i.e., photoexcitation) and bound-to-continuum (i.e., photoreaction) transitions are regarded as reactive collisions with the ''time coordinate'' as the reaction coordinate in an extended Hilbert space. A numerical method based on imposing absorbing boundary conditions for the time coordinate in a discrete variable representation framework is introduced. A single operation of the Green's operator provides all the state-specific transition probabilities as well as partial state-resolved (inclusive) reaction probabilities. Illustrative numerical applications are given for model systems

  8. The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.

    Science.gov (United States)

    Kühberger; Schulte-Mecklenbeck; Perner

    1999-06-01

    A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.

  9. Theoretical determination of gamma spectrometry systems efficiency based on probability functions. Application to self-attenuation correction factors

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Manuel, E-mail: manuel.barrera@uca.es [Escuela Superior de Ingeniería, University of Cadiz, Avda, Universidad de Cadiz 10, 11519 Puerto Real, Cadiz (Spain); Suarez-Llorens, Alfonso [Facultad de Ciencias, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cadiz (Spain); Casas-Ruiz, Melquiades; Alonso, José J.; Vidal, Juan [CEIMAR, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cádiz (Spain)

    2017-05-11

    A generic theoretical methodology for calculating the efficiency of gamma spectrometry systems is introduced in this work. The procedure is valid for any type of source and detector and can be applied to determine the full-energy-peak and total efficiency of any source-detector system. The methodology is based on the idea of an underlying probability of detection, which describes the physical model for the detection of gamma radiation in the particular situation studied. This probability depends explicitly on the direction of the gamma radiation, and this dependence allows the development of more realistic and complex models than the traditional ones based on point-source integration. The probability function employed in practice must reproduce the relevant characteristics of the detection process in the situation studied. Once the probability is defined, the efficiency calculations can in general be performed using numerical methods; Monte Carlo integration is especially useful when complex probability functions are used. The methodology can be used for the direct determination of the efficiency and also for the calculation of corrections that require this determination, such as coincidence-summing, geometric, or self-attenuation corrections. In particular, we have applied the procedure to obtain some of the classical self-attenuation correction factors usually employed to correct for sample attenuation in cylindrical-geometry sources. The methodology clarifies the theoretical basis and the approximations associated with each factor by making explicit the probability that is generally hidden and implicit in each model. It is shown that most of these self-attenuation correction factors can be derived from a common underlying probability, whose complexity grows as it reproduces the detection process more precisely.

  10. A range of equipment for dental radiography

    International Nuclear Information System (INIS)

    Bergman, G.P.M.; Clement, S.L.

    1980-01-01

    A brief review of the history of dental radiography is followed by a description of the latest Philips equipment, ranging from compact units for intra-oral radiography to advanced systems for panoramic techniques and skull radiography. The advantages of automatic exposure control and automatic film processing are also discussed. In conclusion, some probable future trends are forecast. (Auth.)

  11. Time-Dependent Quantum Wave Packet Study of the Si + OH → SiO + H Reaction: Cross Sections and Rate Constants.

    Science.gov (United States)

    Rivero Santamaría, Alejandro; Dayou, Fabrice; Rubayo-Soneira, Jesus; Monnerville, Maurice

    2017-03-02

    The dynamics of the Si(³P) + OH(X²Π) → SiO(X¹Σ⁺) + H(²S) reaction is investigated by means of the time-dependent wave packet (TDWP) approach using an ab initio potential energy surface recently developed by Dayou et al. (J. Chem. Phys. 2013, 139, 204305) for the ground X²A' electronic state. Total reaction probabilities have been calculated for the first 15 rotational states j = 0-14 of OH(v = 0, j) at a total angular momentum J = 0 up to a collision energy of 1 eV. Integral cross sections and state-selected rate constants for the temperature range 10-500 K were obtained within the J-shifting approximation. The reaction probabilities display highly oscillatory structures, indicating the contribution of long-lived quasibound states supported by the deep SiOH/HSiO wells. The cross sections behave with collision energy as expected for a barrierless reaction and are only slightly sensitive to the initial rotational excitation of OH. The thermal rate constants show a marked temperature dependence below 200 K, with a maximum value around 15 K. The TDWP results globally agree with the results of earlier quasi-classical trajectory (QCT) calculations carried out by Rivero-Santamaria et al. (Chem. Phys. Lett. 2014, 610-611, 335-340) with the same potential energy surface. In particular, the thermal rate constants display a similar temperature dependence, with TDWP values smaller than the QCT ones over the whole temperature range.

  12. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  13. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors.

  14. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes.

  15. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  16. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  17. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    …the desired number of fields are sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noise in possibly realistic ranges. In all cases examined, the proportionator…

  18. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
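    The report's treatment of conditional probability and Bayes' theorem lends itself to a small worked example of the kind it describes. The scenario and numbers below are invented for illustration, not taken from the report:

    ```python
    from fractions import Fraction

    # Suppose a detector flags a contaminated sample 95% of the time,
    # gives a false alarm on a clean sample 2% of the time, and 1% of
    # all samples are contaminated. Bayes' theorem gives the probability
    # that a flagged sample is actually contaminated.
    p_contaminated = Fraction(1, 100)
    p_flag_given_cont = Fraction(95, 100)
    p_flag_given_clean = Fraction(2, 100)

    # Total probability of a flag, then Bayes' rule.
    p_flag = (p_flag_given_cont * p_contaminated
              + p_flag_given_clean * (1 - p_contaminated))
    p_cont_given_flag = p_flag_given_cont * p_contaminated / p_flag

    # P(contaminated | flagged) = 95/293, roughly 0.32: most flags are
    # false alarms because contamination is rare.
    assert p_cont_given_flag == Fraction(95, 293)
    ```

    Exact rational arithmetic (`fractions.Fraction`) keeps the small numbers free of rounding error.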

  19. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  20. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
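    The inverted-S weighting functions recovered from the monkeys' choices can be illustrated with the standard one-parameter Tversky-Kahneman form. This is a generic sketch of that functional family, not the specific parameterization fitted in the study:

    ```python
    import math

    def weight(p: float, gamma: float = 0.61) -> float:
        """One-parameter probability weighting function w(p).

        For gamma < 1 the curve has the classic inverted-S shape:
        low probabilities are overweighted, high ones underweighted.
        gamma = 0.61 is an illustrative value, not a fitted parameter.
        """
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    # Distortion pattern described in the abstract:
    assert weight(0.05) > 0.05   # low probability overweighted
    assert weight(0.95) < 0.95   # high probability underweighted
    ```

    Fitting `gamma` to choice data is what "parametric modeling of the choices" refers to in the abstract.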

  1. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors.
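    The "statistically optimal solution" the abstract refers to, a Bayes-rule posterior over four latent causes, can be sketched as follows. The likelihoods and observations here are invented for illustration; the study's actual generative model is not specified in the abstract:

    ```python
    import math

    # Four hypothetical latent causes, each generating one of three
    # observation types with different probabilities (numbers invented).
    prior = [0.25, 0.25, 0.25, 0.25]
    likelihood = [  # likelihood[c][o] = P(observation o | cause c)
        [0.7, 0.2, 0.1],
        [0.1, 0.8, 0.1],
        [0.2, 0.2, 0.6],
        [0.4, 0.3, 0.3],
    ]

    def posterior(observations, prior, likelihood):
        """Bayes' rule: accumulate log-likelihoods per cause, normalize."""
        log_post = [math.log(p) for p in prior]
        for obs in observations:
            for c in range(len(prior)):
                log_post[c] += math.log(likelihood[c][obs])
        m = max(log_post)                      # subtract max for stability
        unnorm = [math.exp(lp - m) for lp in log_post]
        z = sum(unnorm)
        return [u / z for u in unnorm]

    # After seeing observations 0, 0, 1, the belief concentrates on cause 0.
    belief = posterior([0, 0, 1], prior, likelihood)
    assert abs(sum(belief) - 1.0) < 1e-12
    ```

    Note that the log-transformed version of this posterior is exactly the quantity the fMRI analysis found best explained OFC activity.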

  2. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  3. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  4. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable because of the large scale and high complexity of the system. As optical-network-based distributed computing systems are applied extensively to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the reduction in application failure probability achieved by different backup strategies can be compared, so that the different requirements of different clients can be satisfied. When an application, modeled as a DAG (directed acyclic graph), is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure-probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
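    The core of a task-based failure analysis can be sketched in a few lines under simplifying assumptions: tasks fail independently, the application fails if any task fails, and a backup reduces a task's failure probability because both primary and backup must fail. The probabilities below are invented for illustration and are not from the paper:

    ```python
    from typing import Iterable, Optional

    def task_failure(p_primary: float, p_backup: Optional[float] = None) -> float:
        """Failure probability of one task; an independent backup must also fail."""
        return p_primary if p_backup is None else p_primary * p_backup

    def application_failure(task_probs: Iterable[float]) -> float:
        """Application fails if any of its (assumed independent) tasks fails."""
        ok = 1.0
        for p in task_probs:
            ok *= 1.0 - p
        return 1.0 - ok

    # Three tasks, no backups:
    no_backup = application_failure([0.01, 0.02, 0.01])
    # Same application with a backup for the least reliable task:
    with_backup = application_failure([0.01, task_failure(0.02, 0.02), 0.01])
    assert with_backup < no_backup
    ```

    Comparing `no_backup` against `with_backup` for different backup placements is the kind of trade-off the multi-objective scheduling algorithm navigates.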

  5. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    Science.gov (United States)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease of access for verifying discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods, including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability among the methods. Regardless of the method employed, the comparisons revealed encouraging results depending on the flow conditions and the absence or presence of ice.

  6. Calculating failure probabilities for TRISO-coated fuel particles using an integral formulation

    International Nuclear Information System (INIS)

    Miller, Gregory K.; Maki, John T.; Knudson, Darrell L.; Petti, David A.

    2010-01-01

    The fundamental design for a gas-cooled reactor relies on the safe behavior of the coated particle fuel. The coating layers surrounding the fuel kernels in these spherical particles, termed the TRISO coating, act as a pressure vessel that retains fission products. The quality of the fuel is reflected in the number of particle failures that occur during reactor operation, where failed particles become a source for fission products that can then diffuse through the fuel element. The failure probability for any batch of particles, which has traditionally been calculated using the Monte Carlo method, depends on statistical variations in design parameters and on variations in the strengths of coating layers among particles in the batch. An alternative approach to calculating failure probabilities is developed herein that uses direct numerical integration of a failure probability integral. Because this is a multiple integral where the statistically varying parameters become integration variables, a fast numerical integration approach is also developed. In sample cases analyzed involving multiple failure mechanisms, results from the integration methods agree closely with Monte Carlo results. Additionally, the fast integration approach, particularly, is shown to significantly improve efficiency of failure probability calculations. These integration methods have been implemented in the PARFUME fuel performance code along with the Monte Carlo method, where each serves to verify accuracy of the others.
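    The idea of turning the statistically varying parameters into integration variables, and checking the result against Monte Carlo, can be illustrated with a one-dimensional toy version. Here the varying parameter is an applied stress and layer strength follows a Weibull law; all parameter values are invented for the sketch and are not TRISO data, and the real PARFUME integral is multidimensional:

    ```python
    import math
    import random

    M, S0 = 6.0, 400.0      # Weibull modulus and scale for strength (MPa)
    MU, SIG = 250.0, 30.0   # mean and s.d. of the applied stress (MPa)

    def p_fail_given_stress(s: float) -> float:
        """Weibull CDF: probability that layer strength < applied stress s."""
        return 1.0 - math.exp(-((max(s, 0.0) / S0) ** M))

    def stress_pdf(s: float) -> float:
        """Normal density modeling the statistical variation of the stress."""
        return math.exp(-0.5 * ((s - MU) / SIG) ** 2) / (SIG * math.sqrt(2 * math.pi))

    # Direct numerical integration (trapezoid rule over +/- 6 sigma).
    n, lo, hi = 2000, MU - 6 * SIG, MU + 6 * SIG
    h = (hi - lo) / n
    vals = [p_fail_given_stress(lo + i * h) * stress_pdf(lo + i * h)
            for i in range(n + 1)]
    p_int = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

    # Monte Carlo estimate of the same failure probability.
    random.seed(0)
    N = 200_000
    p_mc = sum(p_fail_given_stress(random.gauss(MU, SIG)) for _ in range(N)) / N

    # The two estimates agree closely, as in the sample cases analyzed.
    assert abs(p_int - p_mc) < 5e-3
    ```

    In this 1-D case the quadrature is trivially cheap; the paper's contribution is making the analogous multiple integral fast when several parameters and failure mechanisms vary at once.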

  7. Postobductional extension along and within the Frontal Range of the Eastern Oman Mountains

    Science.gov (United States)

    Mattern, Frank; Scharf, Andreas

    2018-04-01

    The Oman Mountains formed by late Cretaceous obduction of the Tethys-derived Semail Ophiolite. This study concerns the postobductional extension on the northern flank of the mountain belt. Nine sites at the northern margins of the Jabal Akhdar/Nakhl and Saih Hatat domes of the Eastern Oman ("Hajar") Mountains were investigated. The northern margins are marked by a system of major interconnected extensional faults, the "Frontal Range Fault". While the vertical displacements along the Saih Hatat and westerly located Jabal Nakhl domes measure 2.25-6.25 km, 0.5-4.5 km and 4-7 km, respectively, it amounts to 1-5 km along the Jabal Akhdar Dome. Extension had started during the late Cretaceous, towards the end of ophiolite emplacement. Two stages of extension can be ascertained (late Cretaceous to early Eocene and probably Oligocene) at the eastern part of the Frontal Range Fault System (Wadi Kabir and Fanja Graben faults of similar strike). Along the intervening and differently striking fault segments at Sad and Sunub the same two stages of deformation are deduced. The first stage is characterized again by extension. The second stage is marked by dextral motion, including local transtension. Probable Oligocene extension affected the Batinah Coast Fault while it also affected the Wadi Kabir Fault and the Fanja Graben. It is unclear whether the western portion of the Frontal Range Fault also went through two stages of deformation. Bedding-parallel ductile and brittle deformation is a common phenomenon. Hot springs and listwaenite are associated with dextral releasing bends within the fault system, as well as a basalt intrusion of probable Oligocene age. A structural transect through the Frontal Range along the superbly exposed Wadi Bani Kharous (Jabal Akhdar Dome) revealed that extension affected the Frontal Range at least 2.5 km south of the Frontal Range Fault. Also here, bedding-parallel shearing is important, but not exclusive. A late Cretaceous thrust was

  8. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  9. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  10. Anisotropy evidence for the K-shell ionization probability in the use of Ag(p,p)Ag reaction

    International Nuclear Information System (INIS)

    Andriamonje, S.

    1976-01-01

    The ionization probability of silver by 1 MeV protons has been measured at large angles up to 110°. The experimental results were obtained using the coincidence between scattered protons and K X-rays. The angular dependence of the ionization probability at small impact parameters indicates an anisotropy, as expected by Ciocchetti and Molinari in their theoretical study of the K-shell ionization probability associated with nuclear reactions. The results have been compared to the predictions of the BEA (binary encounter approximation) method, including relativistic corrections for deflection and binding energy. The anisotropy coefficient deduced from the comparison of experimental and theoretical results is in good agreement with expected values. [fr]

  11. Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-05-23

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded-configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) for qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having the potential to exceed a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and they apply to DOE SNF types codisposed with high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded-configuration screening, with probability analysis as one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded-mode criticality analysis internal to the waste package.

  12. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  13. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
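
    The core computation behind such an app is simply converting a probability into complementary slice angles; a minimal sketch (function name illustrative, not from the BMIL app):

```python
# Convert a probability p into the two slice angles (in degrees) of a
# probability wheel: a "favorable" slice and its complement.
def wheel_slices(p):
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must lie in [0, 1]")
    return p * 360.0, (1.0 - p) * 360.0

fav, unfav = wheel_slices(0.25)
print(fav, unfav)  # 90.0 270.0
```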

  14. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
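
    As a minimal illustration of how probability enters: a qubit state a|0> + b|1> yields outcome 0 with probability |a|^2 / (|a|^2 + |b|^2) under the Born rule, and the sampling loop below simply draws from that distribution (function name and parameters are illustrative).

```python
import math
import random

# Born-rule sketch: compute P(outcome 0) for a qubit a|0> + b|1> and
# compare with the frequency observed over repeated simulated measurements.
def measure(a, b, trials=100000, seed=1):
    norm = abs(a) ** 2 + abs(b) ** 2
    p0 = abs(a) ** 2 / norm
    rng = random.Random(seed)
    zeros = sum(rng.random() < p0 for _ in range(trials))
    return p0, zeros / trials

p0, freq = measure(1 / math.sqrt(2), 1 / math.sqrt(2))
print(round(p0, 3))  # 0.5 for the equal-superposition state
```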

  15. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  16. Survival probability of diffusion with trapping in cellular neurobiology

    Science.gov (United States)

    Holcman, David; Marchewka, Avi; Schuss, Zeev

    2005-09-01

    The problem of diffusion with absorption and trapping sites arises in the theory of molecular signaling inside and on the membranes of biological cells. In particular, this problem arises in the case of spine-dendrite communication, where the number of calcium ions, modeled as random particles, is regulated across the spine microstructure by pumps, which play the role of killing sites, while the end of the dendritic shaft is an absorbing boundary. We develop a general mathematical framework for diffusion in the presence of absorption and killing sites and apply it to the computation of the time-dependent survival probability of ions. We also compute the ratio of the number of absorbed particles at a specific location to the number of killed particles. We show that the ratio depends on the distribution of killing sites. The biological consequence is that the position of the pumps regulates the fraction of calcium ions that reach the dendrite.
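
    A toy Monte Carlo version of this setup, with a per-step killing probability standing in for the pumps and an absorbing site for the end of the dendritic shaft (all geometry and rates are illustrative, not taken from the paper):

```python
import random

# Toy model: a particle random-walks on sites 0..L with a reflecting wall
# at 0 (the spine head), is absorbed at site L (the dendritic shaft), and
# at each step is killed with probability kill_p (the pumps).
def absorbed_fraction(L=10, kill_p=0.02, walkers=20000, seed=7):
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(walkers):
        x = 0
        while True:
            if rng.random() < kill_p:   # killed by a pump
                break
            x = max(x + rng.choice((-1, 1)), 0)
            if x == L:                  # reaches the absorbing boundary
                absorbed += 1
                break
    return absorbed / walkers

# Stronger pumping (larger kill_p) lets fewer particles reach the dendrite.
print(absorbed_fraction(kill_p=0.02), absorbed_fraction(kill_p=0.1))
```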

  17. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area's hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted: it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, and the magnitude of uncertainty affecting hydrological parameter estimation was assessed. This paper's conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may be an obligatory procedure in the future. Their potential lies in treating scarce information, representing a robust modelling strategy for non-seasonal stochastic modelling conditions.
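
    One ingredient of such a model can be sketched under hypothetical counts (not the study's data): a binomial confidence interval for the probability of exceeding the 70 mm daily threshold, whose endpoints can then serve as p-box bounds.

```python
import math

# 95% Wilson score interval for a binomial proportion: k exceedance days
# out of n observed days. The counts below are illustrative only.
def wilson_interval(k, n, z=1.96):
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Hypothetical: 3 days above 70 mm in roughly three years of daily records.
lo, hi = wilson_interval(3, 1095)
print(round(lo, 4), round(hi, 4))
```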

  18. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

    An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
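
    A Monte Carlo sketch of the k-sum construction: the number of particles collected is Poisson-distributed (constant number concentration) and each particle's activity is lognormal, so intake is a k-fold lognormal sum. The parameter values below are illustrative, not from the paper.

```python
import math
import random

def poisson_sample(rng, lam):
    """Poisson draw via Knuth's method; adequate for small lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_intakes(lam, mu, sigma, trials=10000, seed=3):
    """Each intake is a sum of k lognormal activities, k ~ Poisson(lam)."""
    rng = random.Random(seed)
    return [sum(rng.lognormvariate(mu, sigma)
                for _ in range(poisson_sample(rng, lam)))
            for _ in range(trials)]

intakes = simulate_intakes(lam=5.0, mu=0.0, sigma=0.5)
print(sum(intakes) / len(intakes))  # near 5 * exp(0.125), about 5.67
```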

  19. On the regularity of the extinction probability of a branching process in varying and random environments

    International Nuclear Information System (INIS)

    Alili, Smail; Rugh, Hans Henrik

    2008-01-01

    We consider a supercritical branching process in a time-dependent environment ξ. We assume that the offspring distributions depend regularly (C^k or real-analytically) on real parameters λ. We show that the extinction probability q_λ(ξ), given the environment ξ, 'inherits' this regularity whenever the offspring distributions satisfy a condition of contraction-type. Our proof makes use of the Poincaré metric on the complex unit disc and a real-analytic implicit function theorem.
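
    For a fixed environment and a time-homogeneous offspring law, the extinction probability is the smallest fixed point of the offspring probability generating function, which iteration from 0 recovers; the offspring distribution below is illustrative, not from the paper.

```python
# Extinction probability of a Galton-Watson process: iterate the offspring
# p.g.f. f from 0; the iterates increase to the smallest root of q = f(q).
def extinction_prob(pgf, iterations=1000):
    q = 0.0
    for _ in range(iterations):
        q = pgf(q)
    return q

# Illustrative offspring law: P(0)=0.2, P(1)=0.3, P(2)=0.5 (mean 1.3 > 1,
# so the process is supercritical and extinction is not certain).
def f(s):
    return 0.2 + 0.3 * s + 0.5 * s * s

print(round(extinction_prob(f), 4))  # 0.4 (roots of q = f(q) are 0.4 and 1)
```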

  20. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
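
    The classical probability model mentioned above (favorable outcomes over equally likely outcomes) can be illustrated with a standard example: the probability that two fair dice sum to 7.

```python
from fractions import Fraction
from itertools import product

# Classical model: count favorable outcomes among equally likely ones.
outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs
favorable = [o for o in outcomes if sum(o) == 7]  # 6 favorable pairs
p = Fraction(len(favorable), len(outcomes))
print(p)  # 1/6
```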