WorldWideScience

Sample records for range dependent probability

  1. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and the acceptance guidelines of Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) for judging the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
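
    The conditional probabilities attached to the five NUREG/CR-1278 (THERP) dependence levels have a simple closed form. The sketch below assumes the standard THERP dependence equations and an illustrative nominal HEP; it is not taken from this paper.

```python
# Conditional human error probability (HEP) under the five THERP dependence
# levels of NUREG/CR-1278 (ZD, LD, MD, HD, CD).  Illustrative sketch only.

def conditional_hep(nominal_hep: float, level: str) -> float:
    """HEP of a task conditioned on failure of the immediately preceding task."""
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[level](nominal_hep)

if __name__ == "__main__":
    p = 1.0e-3  # hypothetical nominal (independent) HEP
    for level in ("ZD", "LD", "MD", "HD", "CD"):
        print(level, conditional_hep(p, level))
```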

  2. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
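
    The conditional-probability construction underlying such a time-dependency model can be written generically (this is the ordinary chain rule, not the report's specific model): the probability of a joint event or an ordered event sequence factorizes as

$$P(E_1 \cap E_2 \cap \cdots \cap E_n) = P(E_1)\,P(E_2 \mid E_1)\cdots P(E_n \mid E_1 \cap \cdots \cap E_{n-1}),$$

    so that common-cause and time-dependent failure mechanisms enter through the conditional factors rather than through a product of unconditional probabilities.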

  3. Impact parameter dependence of inner-shell ionization probabilities

    International Nuclear Information System (INIS)

    Cocke, C.L.

    1974-01-01

    The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed

  4. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    textabstractFor the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  5. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors

  6. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  7. Time dependent non-extinction probability for prompt critical systems

    International Nuclear Information System (INIS)

    Gregson, M. W.; Prinja, A. K.

    2009-01-01

    The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)

  8. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on an ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats in the descending schedule, but only MK-801 (0.03mg/kg) increased risk taking in rats on an ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
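
    Probability discounting in such procedures is commonly quantified with a hyperbolic function of the odds against reinforcement. The sketch below assumes that standard form and uses made-up parameter values; it is not taken from this study.

```python
# Hyperbolic discounting of a probabilistic reinforcer as a function of the
# odds against its delivery, theta = (1 - p) / p.  Sketch with made-up values.

def subjective_value(amount: float, p: float, h: float) -> float:
    """Discounted value of an uncertain reinforcer (larger h = steeper discounting)."""
    theta = (1.0 - p) / p          # odds against reinforcement
    return amount / (1.0 + h * theta)

if __name__ == "__main__":
    for p in (1.0, 0.5, 0.25, 0.125):   # probabilities used across trial blocks
        print(p, round(subjective_value(amount=4.0, p=p, h=1.5), 3))
```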

  9. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible

  10. Voltage dependency of transmission probability of aperiodic DNA molecule

    Science.gov (United States)

    Wiliyanti, V.; Yudiarsah, E.

    2017-07-01

    Characteristics of electron transport in aperiodic DNA molecules have been studied. A double-stranded DNA model with the base sequence GCTAGTACGTGACGTAGCTAGGATATGCCTGA on one chain and its complement on the other chain has been used. A tight-binding Hamiltonian is used to model the DNA molecule. In the model, we consider that the on-site energy of the bases depends linearly on the applied electric field. The Slater-Koster scheme is used to model the electron hopping constants between bases. The transmission probability of an electron from one electrode to the other is calculated using the transfer matrix technique and the scattering matrix method simultaneously. The results show that, generally, a higher voltage gives a slightly larger transmission probability. The applied voltage seems to shift extended states to lower energy. Meanwhile, the transmission increases as the twisting motion frequency increases.
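
    For a single tight-binding chain coupled to one-dimensional leads, the transfer-matrix calculation of the transmission probability can be sketched as follows. This is a simplified one-channel illustration with hypothetical on-site energies, not the paper's twisted double-stranded model.

```python
import numpy as np

def transmission(energy, onsite, t=1.0):
    """Transmission probability through a 1D tight-binding chain with site
    energies `onsite`, coupled to identical 1D leads with hopping t."""
    cos_k = energy / (2.0 * t)
    if abs(cos_k) >= 1.0:          # outside the lead band: no propagating state
        return 0.0
    k = np.arccos(cos_k)
    # Transfer matrix of the scattering region: (psi_{N+1}, psi_N) = M (psi_1, psi_0)
    M = np.eye(2, dtype=complex)
    for e_n in onsite:
        M = np.array([[(energy - e_n) / t, -1.0], [1.0, 0.0]], dtype=complex) @ M
    # Match an incoming + reflected wave on the left to a transmitted wave on the right
    u = np.array([np.exp(1j * k), 1.0])      # right-moving plane wave
    v = np.array([np.exp(-1j * k), 1.0])     # left-moving plane wave
    A = np.column_stack([u, -M @ v])
    tau, _r = np.linalg.solve(A, M @ u)
    return abs(tau) ** 2

if __name__ == "__main__":
    sites = [0.4, -0.2, 0.1, 0.3]            # hypothetical on-site energies (units of t)
    print(transmission(energy=0.5, onsite=sites))
```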

  11. Propagation in a waveguide with range-dependent seabed properties.

    Science.gov (United States)

    Holland, Charles W

    2010-11-01

    The ocean environment contains features affecting acoustic propagation that vary on a wide range of time and space scales. A significant body of work over recent decades has aimed at understanding the effects of water column spatial and temporal variability on acoustic propagation. Much less is understood about the impact of spatial variability of seabed properties on propagation, which is the focus of this study. Here, a simple, intuitive expression for propagation with range-dependent boundary properties and uniform water depth is derived. It is shown that incoherent range-dependent propagation depends upon the geometric mean of the seabed plane-wave reflection coefficient and the arithmetic mean of the cycle distance. Thus, only the spatial probability distributions (pdfs) of the sediment properties are required. Also, it is shown that the propagation over a range-dependent seabed tends to be controlled by the lossiest, not the hardest, sediments. Thus, range-dependence generally leads to higher propagation loss than would be expected, due for example to lossy sediment patches and/or nulls in the reflection coefficient. In a few instances, propagation over a range-dependent seabed can be calculated using range-independent sediment properties. The theory may be useful for other (non-oceanic) waveguides.
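
    A hedged sketch of the scaling described above: over a range r the field samples seabed patches according to their spatial pdf, and the incoherent loss behaves like the geometric mean of the plane-wave reflection coefficient applied once per arithmetic-mean cycle distance. All numbers and the simple dB bookkeeping below are illustrative assumptions, not values from the paper.

```python
import numpy as np

p = np.array([0.6, 0.3, 0.1])        # spatial pdf of sediment types (assumed)
R = np.array([0.90, 0.70, 0.40])     # |reflection coefficient| per type at the relevant angle
d = np.array([4.0e3, 3.0e3, 2.5e3])  # ray-cycle distance per type (m)

r = 50.0e3                                  # propagation range (m)
mean_cycle = np.sum(p * d)                  # arithmetic mean cycle distance
geo_mean_R = np.exp(np.sum(p * np.log(R)))  # geometric mean reflection coefficient

n_bounces = r / mean_cycle
bottom_loss_db = -20.0 * n_bounces * np.log10(geo_mean_R)
print(f"{n_bounces:.1f} bottom interactions, ~{bottom_loss_db:.1f} dB bottom loss")
```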

  12. Stochastic processes and long range dependence

    CERN Document Server

    Samorodnitsky, Gennady

    2016-01-01

    This monograph is a gateway for researchers and graduate students to explore the profound, yet subtle, world of long-range dependence (also known as long memory). The text is organized around the probabilistic properties of stationary processes that are important for determining the presence or absence of long memory. The first few chapters serve as an overview of the general theory of stochastic processes which gives the reader sufficient background, language, and models for the subsequent discussion of long memory. The later chapters devoted to long memory begin with an introduction to the subject along with a brief history of its development, followed by a presentation of what is currently the best known approach, applicable to stationary processes with a finite second moment. The book concludes with a chapter devoted to the author’s own, less standard, point of view of long memory as a phase transition, and even includes some novel results. Most of the material in the book has not previously been publis...

  13. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
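
    The traditional 2D Pc that the paper compares against reduces to a bivariate Gaussian integrated over the combined hard-body circle in the encounter plane. The sketch below illustrates that baseline calculation with hypothetical numbers; it is not CARA's software and does not implement the 3D time-dependent integral described above.

```python
import numpy as np
from scipy import integrate

def pc_2d(miss, cov, radius):
    """Conventional 2D collision probability: bivariate Gaussian (mean = miss
    vector, covariance = cov, both in the encounter plane) integrated over the
    combined hard-body circle of the given radius."""
    inv = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))

    def density(y, x):
        d = np.array([x, y]) - miss
        return norm * np.exp(-0.5 * d @ inv @ d)

    val, _err = integrate.dblquad(
        density, -radius, radius,
        lambda x: -np.sqrt(radius**2 - x**2),
        lambda x: np.sqrt(radius**2 - x**2))
    return val

if __name__ == "__main__":
    # Hypothetical encounter-plane numbers (metres)
    miss = np.array([120.0, 40.0])
    cov = np.array([[200.0**2, 0.0], [0.0, 80.0**2]])
    print(pc_2d(miss, cov, radius=20.0))
```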

  14. Using the probability method for multigroup calculations of reactor cells in a thermal energy range

    International Nuclear Information System (INIS)

    Rubin, I.E.; Pustoshilova, V.S.

    1984-01-01

    The possibility of using the transmission probability method with interpolation for determining the spatial-energy neutron flux distribution in cells of thermal heterogeneous reactors is considered. The results of multigroup calculations of several uranium-water plane and cylindrical cells with different fuel enrichments in the thermal energy range are given. A high accuracy of results is obtained with low computer time consumption. The use of the transmission probability method is particularly reasonable in algorithms of programmes compiled for computers with a significant reserve of internal memory

  15. Probability tables and gauss quadrature: application to neutron cross-sections in the unresolved energy range

    International Nuclear Information System (INIS)

    Ribon, P.; Maillard, J.M.

    1986-09-01

    The idea of describing neutron cross-section fluctuations by sets of discrete values, called "probability tables", was formulated some 15 years ago. We propose to define the probability tables from moments by equating the moments of the actual cross-section distribution in a given energy range to the moments of the table. This definition introduces PADE approximants, orthogonal polynomials and GAUSS quadrature. This mathematical basis applies very well to the total cross-section. Some difficulties appear when partial cross-sections are taken into account, linked to the ambiguity of the definition of multivariate PADE approximants. Nevertheless we propose solutions and choices which appear to be satisfactory. Comparisons are made with other definitions of probability tables and an example of the calculation of a mixture of nuclei is given. 18 refs
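
    The moment-matching step can be made concrete with the classical Golub-Welsch route: build the Hankel matrix of moments, extract the three-term recurrence of the associated orthogonal polynomials, and read the discrete values and weights off the Jacobi matrix. The sketch below is a generic illustration of that technique (not the authors' code) and uses the moments of a uniform density as a check.

```python
import numpy as np

def probability_table_from_moments(moments):
    """n-point discrete 'probability table' (values and weights) whose first 2n
    moments match the given ones, via Gauss quadrature.  `moments` = mu_0..mu_{2n}."""
    n = (len(moments) - 1) // 2
    # Hankel matrix of moments and its Cholesky factor (M = R^T R, R upper triangular)
    M = np.array([[moments[i + j] for j in range(n + 1)] for i in range(n + 1)])
    R = np.linalg.cholesky(M).T
    # Three-term recurrence coefficients of the orthogonal polynomials
    a = np.array([R[j, j + 1] / R[j, j] - (R[j - 1, j] / R[j - 1, j - 1] if j else 0.0)
                  for j in range(n)])
    b = np.array([R[j + 1, j + 1] / R[j, j] for j in range(n - 1)])
    # Nodes and weights from the symmetric tridiagonal (Jacobi) matrix
    J = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
    vals, vecs = np.linalg.eigh(J)
    weights = moments[0] * vecs[0, :] ** 2
    return vals, weights

if __name__ == "__main__":
    # Check: moments of the uniform density on [-1, 1] reproduce 3-point Gauss-Legendre
    mu = [1.0, 0.0, 1/3, 0.0, 1/5, 0.0, 1/7]      # mu_0 .. mu_6
    print(probability_table_from_moments(mu))
```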

  16. Probability tables and gauss quadrature: application to neutron cross-sections in the unresolved energy range

    International Nuclear Information System (INIS)

    Ribon, P.; Maillard, J.M.

    1986-01-01

    The idea of describing neutron cross-section fluctuations by sets of discrete values, called probability tables, was formulated some 15 years ago. The authors propose to define the probability tables from moments by equating the moments of the actual cross-section distribution in a given energy range to the moments of the table. This definition introduces PADE approximants, orthogonal polynomials and GAUSS quadrature. This mathematical basis applies very well to the total cross-section. Some difficulties appear when partial cross-sections are taken into account, linked to the ambiguity of the definition of multivariate PADE approximants. Nevertheless the authors propose solutions and choices which appear to be satisfactory. Comparisons are made with other definitions of probability tables and an example of the calculation of a mixture of nuclei is given

  17. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  18. Measuring sensitivity in pharmacoeconomic studies. Refining point sensitivity and range sensitivity by incorporating probability distributions.

    Science.gov (United States)

    Nuijten, M J

    1999-07-01

    The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, to deal with uncertainty in economic studies. The original method was refined by the incorporation of probability distributions which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables and that the overall sensitivity measure was based on a subjectively chosen range which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method. The response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes that better reflect the real uncertainty in economic studies.
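
    The bootstrap step described above can be sketched in a few lines: resample the limited data to build a distribution for a fixed input variable, then propagate it through the model instead of sweeping a subjectively chosen range. The response-rate values and the toy cost model below are hypothetical placeholders, not the study's Markov model of depression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Limited sample of a model input (hypothetical response rates)
observed_response_rates = np.array([0.52, 0.61, 0.47, 0.58, 0.55])

def cost_model(response_rate):
    """Toy stand-in for the economic model being evaluated."""
    return 10_000.0 * (1.0 - response_rate) + 2_500.0

# Bootstrap the input, then propagate each resampled mean through the model
boot_means = np.array([
    rng.choice(observed_response_rates, size=observed_response_rates.size,
               replace=True).mean()
    for _ in range(5_000)])
costs = cost_model(boot_means)
print("mean cost:", costs.mean(), "95% interval:", np.percentile(costs, [2.5, 97.5]))
```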

  19. An analytical evaluation for spatial-dependent intra-pebble Dancoff factor and escape probability

    International Nuclear Information System (INIS)

    Kim, Songhyun; Kim, Hong-Chul; Kim, Jong Kyung; Kim, Soon Young; Noh, Jae Man

    2009-01-01

    The analytical evaluation of spatial-dependent intra-pebble Dancoff factors and their escape probabilities is pursued with the model developed in this study. Intra-pebble Dancoff factors and their escape probabilities are calculated as a function of fuel kernel radius, number of fuel kernels, and fuel region radius. The method in this study can be easily utilized to analyze the tendency of the spatial-dependent intra-pebble Dancoff factor and the spatial-dependent fuel region escape probability for various geometries, because it is faster than the MCNP method while retaining good accuracy. (author)

  20. Antitrust Enforcement Under Endogenous Fines and Price-Dependent Detection Probabilities

    NARCIS (Netherlands)

    Houba, H.E.D.; Motchenkova, E.; Wen, Q.

    2010-01-01

    We analyze the effectiveness of antitrust regulation in a repeated oligopoly model in which both fines and detection probabilities depend on the cartel price. Such fines are closer to actual guidelines than the commonly assumed fixed fines. Under a constant detection probability, we confirm the

  1. Unresolved resonance range cross section, probability tables and self shielding factor

    International Nuclear Information System (INIS)

    Sublet, J.Ch.; Blomquist, R.N.; Goluoglu, S.; Mac Farlane, R.E.

    2009-07-01

    The performance and methodology of 4 processing codes have been compared in the unresolved resonance range for a selected set of isotopes. Those isotopes have been chosen to encompass most cases encountered in the unresolved energy range contained in major libraries like Endf/B-7 or Jeff-3.1.1. The comparison of code results is accompanied by an examination of data formats and formalisms and a study of the codes' fine interpretation. After some improvements, the results showed generally good, although not perfect, agreement for infinitely dilute cross-sections. However, much larger differences occur when self-shielded effective cross-sections are compared. The infinitely dilute cross-sections are often plot-checked, but it is the probability-table-derived and self-shielded cross-sections that are used and interpreted in criticality and transport calculations. This suggests that the current evaluation data format and formalism in the unresolved resonance range should be tightened up and ambiguities removed. In addition, production of the self-shielded cross-sections should be converged to a much greater accuracy. (author)
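
    Once a probability table is available, the self-shielded effective cross-section reduces to a small weighted sum. The sketch below assumes the usual narrow-resonance (Bondarenko) flux approximation and made-up band values; it only illustrates why self-shielded results are far more sensitive to the table than the infinitely dilute average.

```python
# Self-shielded effective cross-section from a probability table
# (band probabilities p_i, band cross-sections sigma_i) with the
# narrow-resonance / Bondarenko flux weighting phi ~ 1 / (sigma_t + sigma_0).

def effective_xs(p, sigma, sigma0):
    num = sum(pi * si / (si + sigma0) for pi, si in zip(p, sigma))
    den = sum(pi / (si + sigma0) for pi, si in zip(p, sigma))
    return num / den

p = [0.2, 0.5, 0.3]            # hypothetical band probabilities
sigma = [5.0, 20.0, 300.0]     # hypothetical band total cross-sections (barns)
print(effective_xs(p, sigma, sigma0=1e5))   # near infinite dilution: ~ average sigma
print(effective_xs(p, sigma, sigma0=10.0))  # strongly self-shielded value
```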

  2. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4x10^-4 for the exponential distribution and 2.3x10^-4 for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5x10^-4, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km^3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km^3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6x10^-4. For erupted volumes ≥10 km^3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
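
    For the exponential (Poisson) model quoted above, the annual probability follows directly from the mean repose interval; for rate·Δt << 1 it is approximately rate·Δt, which is how annual probabilities of order 10^-4 arise. The recurrence interval in the sketch is a hypothetical value, not one of the paper's estimates.

```python
import numpy as np

def annual_probability(mean_interval_years: float, dt: float = 1.0) -> float:
    """P(at least one event in the next dt years) for an exponential/Poisson
    model with the given mean recurrence interval."""
    rate = 1.0 / mean_interval_years
    return 1.0 - np.exp(-rate * dt)

# Hypothetical mean recurrence interval of 7000 years -> annual probability ~1.4e-4
print(annual_probability(mean_interval_years=7000.0))
```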

  3. Evolution of density-dependent movement during experimental range expansions.

    Science.gov (United States)

    Fronhofer, E A; Gut, S; Altermatt, F

    2017-12-01

    Range expansions and biological invasions are prime examples of transient processes that are likely impacted by rapid evolutionary changes. As a spatial process, range expansions are driven by dispersal and movement behaviour. Although it is widely accepted that dispersal and movement may be context-dependent, for instance density-dependent, and best represented by reaction norms, the evolution of density-dependent movement during range expansions has received little experimental attention. We therefore tested current theory predicting the evolution of increased movement at low densities at range margins using highly replicated and controlled range expansion experiments across multiple genotypes of the protist model system Tetrahymena thermophila. Although rare, we found evolutionary changes during range expansions even in the absence of initial standing genetic variation. Range expansions led to the evolution of negatively density-dependent movement at range margins. In addition, we report the evolution of increased intrastrain competitive ability and concurrently decreased population growth rates in range cores. Our findings highlight the importance of understanding movement and dispersal as evolving reaction norms and plastic life-history traits of central relevance for range expansions, biological invasions and the dynamics of spatially structured systems in general. © 2017 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2017 European Society For Evolutionary Biology.

  4. Testing for long-range dependence in world stock markets

    OpenAIRE

    Cajueiro, Daniel Oliveira; Tabak, Benjamin Miranda

    2008-01-01

    In this paper, we show a novel approach to rank stock market indices in terms of weak form efficiency using state of the art methodology in statistical physics. We employ the R/S and V/S methodologies to test for long-range dependence in equity returns and volatility. Empirical results suggests that although emerging markets possess stronger long-range dependence in equity returns than developed economies, this is not true for volatility. In the case of volatility, Hurst exponents...
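
    The rescaled-range (R/S) statistic mentioned above can be estimated with a few lines of code: split the series into windows, compute the range of the mean-adjusted cumulative sums rescaled by the window standard deviation, and fit the log-log slope. This is a generic textbook implementation (the paper's V/S variant is not shown), applied here to synthetic i.i.d. data as a sanity check.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by classical rescaled-range (R/S) analysis:
    slope of log E[R/S] versus log window size."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())   # mean-adjusted cumulative sums
            s = block.std(ddof=1)
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        if rs:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs)))
    slope, _intercept = np.polyfit(log_n, log_rs, 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    returns = rng.standard_normal(4096)            # i.i.d. noise -> H near 0.5
    print(hurst_rs(returns, window_sizes=[16, 32, 64, 128, 256, 512]))
```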

  5. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, Scott [Applied Biomathematics, Setauket, NY (United States); Nelsen, Roger B. [Lewis & Clark College, Portland OR (United States); Hajagos, Janos [Applied Biomathematics, Setauket, NY (United States); Berleant, Daniel J. [Iowa State Univ., Ames, IA (United States); Zhang, Jianzhong [Iowa State Univ., Ames, IA (United States); Tucker, W. Troy [Applied Biomathematics, Setauket, NY (United States); Ginzburg, Lev R. [Applied Biomathematics, Setauket, NY (United States); Oberkampf, William L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
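
    A central ingredient of such bounding approaches is the Fréchet-Hoeffding inequality, quoted here as background (a standard result, not a summary of the report's methods): for any joint distribution H of X and Y with margins F_X and F_Y,

$$\max\bigl(F_X(x)+F_Y(y)-1,\;0\bigr)\;\le\;H(x,y)\;\le\;\min\bigl(F_X(x),\,F_Y(y)\bigr),$$

    with the two envelopes attained under perfectly negative and perfectly positive dependence; probability bounds analysis propagates such envelopes through arithmetic operations when the dependence model is unknown or incomplete.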

  6. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    Energy Technology Data Exchange (ETDEWEB)

    Korhonen, Marko [Department of Mathematics and Statistics, University of Helsinki, FIN-00014 (Finland); Lee, Eunghyun [Centre de Recherches Mathématiques (CRM), Université de Montréal, Quebec H3C 3J7 (Canada)

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula of the transition probability of the q-TAZRP via the Bethe ansatz. By using the transition probability we find the probability distribution of the left-most particle's position at time t. To find the probability for the left-most particle's position we find a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state that all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  7. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
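
    How a distributed failure probability induces dependence can be seen in a toy calculation (an illustration of the idea, not the report's data): when two similar components share the same 'environment', the probability that both fail is the environment-average of p squared, which exceeds the square of the average p.

```python
# Toy illustration of the Distributed Failure Probability idea: the component
# failure probability p varies over 'environments' with weights w.  Sharing an
# environment couples two components, so P(both fail) = E[p^2] >= (E[p])^2.
w = [0.7, 0.2, 0.1]                 # hypothetical environment weights
p = [1e-4, 1e-3, 1e-2]              # hypothetical failure probability in each

single = sum(wi * pi for wi, pi in zip(w, p))        # E[p]
joint = sum(wi * pi * pi for wi, pi in zip(w, p))    # E[p^2]
print("independent estimate of dual failure:", single ** 2)
print("shared-environment (DFP) estimate  :", joint)
print("implied conditional failure probability:", joint / single)
```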

  8. On Z-dependence of probability of atomic capture of mesons in matter

    International Nuclear Information System (INIS)

    Vasil'ev, V.A.; Petrukhin, V.I.; Suvorov, V.M.; Khorvat, D.

    1976-01-01

    All experimental data available on the atomic capture of negative muons and pions are systematically studied to find a more appropriate empirical expression for the capture probability as a function of the atomic number. It is shown that the Z-dependence, as a rule, does not hold. A Z^(1/3)-dependence gives more satisfactory results. A modified Z^(1/3)-dependence is proposed which is more appropriate for hydrogen-containing compounds

  9. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
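
    The modelling step can be sketched with a random forest that outputs class probabilities from the two predictors named above. The data below are synthetic placeholders (not the Nevada data set), and the simple labelling rule is an assumption used only to make the example run.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic training data: water-table depth (m) and an aridity index (P/PET),
# with labels marking locations assumed to host groundwater-dependent ecosystems.
n = 500
water_table_depth = rng.uniform(0.0, 100.0, n)
aridity_index = rng.uniform(0.05, 1.0, n)
is_gde = ((water_table_depth < 15.0) & (aridity_index < 0.4)).astype(int)

X = np.column_stack([water_table_depth, aridity_index])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, is_gde)

# Probability (rather than a hard class) that new locations host a GDE
new_sites = np.array([[5.0, 0.2], [60.0, 0.3]])
print(model.predict_proba(new_sites)[:, 1])
```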

  10. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
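
    The abstract refers to a general class of (inverted) S-shaped probability weighting functions; one commonly used member of that class is the one-parameter Tversky-Kahneman form sketched below. The specific form and the gamma value are assumptions for illustration, not necessarily those estimated in the study; in the hierarchical Bayesian treatment each individual's parameter would be drawn from a weakly informative group-level prior rather than fixed.

```python
import numpy as np

def weight_tk(p, gamma):
    """Inverted-S probability weighting function (Tversky-Kahneman form):
    small probabilities are over-weighted, large ones under-weighted."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

for p in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(p, round(weight_tk(p, gamma=0.6), 3))   # gamma = 0.6 is a made-up value
```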

  11. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  12. Probability of primordial black hole formation and its dependence on the radial profile of initial configurations

    International Nuclear Information System (INIS)

    Hidalgo, J. C.; Polnarev, A. G.

    2009-01-01

    In this paper we derive the probability of the radial profiles of spherically symmetric inhomogeneities in order to provide an improved estimation of the number density of primordial black holes (PBHs). We demonstrate that the probability of PBH formation depends sensitively on the radial profile of the initial configuration. We do this by characterizing this profile with two parameters chosen heuristically: the amplitude of the inhomogeneity and the second radial derivative, both evaluated at the center of the configuration. We calculate the joint probability of initial cosmological inhomogeneities as a function of these two parameters and then find a correspondence between these parameters and those used in numerical computations of PBH formation. Finally, we extend our heuristic study to evaluate the probability of PBH formation taking into account for the first time the radial profile of curvature inhomogeneities.

  13. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.

  14. Search for an intermediate-range composition-dependent force

    International Nuclear Information System (INIS)

    Boynton, P.E.; Crosby, D.; Ekstrom, P.; Szumilo, A.

    1987-01-01

    We have conducted an experiment to detect a composition-dependent force with range λ between 10 m and 1 km, and find a statistically significant effect. If interpreted as arising from a new force, this result and other recent measurements would be consistent in strength only if the coupling were predominantly to nuclear isospin

  15. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Directory of Open Access Journals (Sweden)

    Michael R W Dawson

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
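
    A minimal sketch of the probability-matching behaviour described above (one cue active at a time, hypothetical reward probabilities, a simple delta-rule update with a logistic output unit; not the authors' exact simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Four one-hot cues, each signalling a different (made-up) reward probability
cues = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
reward_prob = np.array([0.2, 0.4, 0.6, 0.8])

w, b, lr = np.zeros(4), 0.0, 0.05
for _ in range(20_000):
    i = rng.integers(4)
    x = cues[i]
    y = float(rng.random() < reward_prob[i])      # stochastic binary reward
    out = 1.0 / (1.0 + np.exp(-(w @ x + b)))      # logistic output unit
    w += lr * (y - out) * x                       # delta-rule weight update
    b += lr * (y - out)

# Output activity per cue converges toward the reward probabilities
print(np.round(1.0 / (1.0 + np.exp(-(cues @ w + b))), 2))
```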

  16. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability

    Science.gov (United States)

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent’s environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned. PMID:28212422

  17. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Science.gov (United States)

    Dawson, Michael R W; Gupta, Maya

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.

  18. Measurements of transition probabilities in the range from vacuum ultraviolet to infrared

    International Nuclear Information System (INIS)

    Peraza Fernandez, M.C.

    1992-01-01

    In this report we describe the design, testing and calibration of different spectrometers to measure transition probabilities from the vacuum ultraviolet to the infrared spectral region. For the infrared measurements we have designed and built a phase-sensitive detection system using an InGaAs photodiode as detector. With this system we have determined the transition probabilities of infrared lines of KrI and XeI. For these lines no previous measurements were found. In the vacuum ultraviolet spectral region we have designed a 3 m normal incidence monochromator in which we have installed an optical multichannel analyzer. We have verified its correct operation by obtaining the absorption spectrum of KrI. In the visible region we have obtained the emission spectrum of Al using different spectral sources: a hollow-cathode lamp and an Nd:YAG laser-produced Al plasma. From these spectra we have determined different atomic parameters such as transition probabilities and electron temperatures. (author). 83 refs

  19. Effect of field-dependent mobility on the escape probability. I. Electrons photoinjected in neopentane

    International Nuclear Information System (INIS)

    Mozumder, A.; Carmichael, I.

    1978-01-01

    A general procedure is described for calculating the escape probability of an electron against neutralization in the presence of an external field after it has been ejected into a dielectric liquid from a planar surface. The present paper utilizes the field-dependent electron mobility measurement in neopentane by Bakale and Schmidt. The calculated escape probability, upon averaging over the initial distribution, is compared with the current efficiency measurement of Holroyd et al. The median thermalization length, inferred from this comparison, depends in general upon the assumed form of initial distribution. It is less than the value obtained when the field dependence of the mobility is ignored but greater than that applicable to the high energy irradiation case. A plausible explanation is offered

  20. Off-diagonal long-range order, cycle probabilities, and condensate fraction in the ideal Bose gas.

    Science.gov (United States)

    Chevallier, Maguelonne; Krauth, Werner

    2007-11-01

    We discuss the relationship between the cycle probabilities in the path-integral representation of the ideal Bose gas, off-diagonal long-range order, and Bose-Einstein condensation. Starting from the Landsberg recursion relation for the canonic partition function, we use elementary considerations to show that in a box of size L^3 the sum of the cycle probabilities of length k > L^2 equals the off-diagonal long-range order parameter in the thermodynamic limit. For arbitrary systems of ideal bosons, the integer derivative of the cycle probabilities is related to the probability of condensing k bosons. We use this relation to derive the precise form of the π_k in the thermodynamic limit. We also determine the function π_k for arbitrary systems. Furthermore, we use the cycle probabilities to compute the probability distribution of the maximum-length cycles both at T=0, where the ideal Bose gas reduces to the study of random permutations, and at finite temperature. We close with comments on the cycle probabilities in interacting Bose gases.
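
    The Landsberg recursion mentioned above, and the cycle probabilities built from it, can be computed directly; the sketch below uses a hypothetical harmonic-trap spectrum with a level cutoff purely for illustration (the paper treats the gas in a box).

```python
import numpy as np

def canonical_partition_functions(n_max, beta, levels, degeneracy):
    """Landsberg recursion for N ideal bosons:
    Z_N = (1/N) * sum_{k=1..N} z(k*beta) * Z_{N-k},  Z_0 = 1,
    where z(beta) is the single-particle partition function."""
    def z(b):
        return np.sum(degeneracy * np.exp(-b * levels))
    Z = np.zeros(n_max + 1)
    Z[0] = 1.0
    for N in range(1, n_max + 1):
        Z[N] = sum(z(k * beta) * Z[N - k] for k in range(1, N + 1)) / N
    return Z

if __name__ == "__main__":
    # Hypothetical single-particle spectrum: 3D harmonic trap, truncated
    n = np.arange(0, 60)
    levels = n.astype(float)                 # energies in units of hbar*omega
    degeneracy = (n + 1) * (n + 2) / 2.0     # 3D harmonic-oscillator degeneracy
    N, beta = 40, 0.5
    Z = canonical_partition_functions(N, beta, levels, degeneracy)
    # Cycle probabilities pi_k = z(k*beta) * Z_{N-k} / (N * Z_N); they sum to 1
    zk = lambda k: np.sum(degeneracy * np.exp(-k * beta * levels))
    pi = np.array([zk(k) * Z[N - k] / (N * Z[N]) for k in range(1, N + 1)])
    print(pi.sum())   # ~ 1.0
```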

  1. Class dependency of fuzzy relational database using relational calculus and conditional probability

    Science.gov (United States)

    Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya

    2018-03-01

    In this paper, we propose a design of a fuzzy relational database to deal with a conditional probability relation using fuzzy relational calculus. Previously, there have been several studies of equivalence classes in fuzzy databases using similarity or approximate relations. It is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as ’projection’, ’selection’, ’injection’ and ’natural join’. Using the fuzzy relational calculus and conditional probabilities, we introduce notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.

  2. Energy dependence of polymer gels in the orthovoltage energy range

    Directory of Open Access Journals (Sweden)

    Yvonne Roed

    2014-03-01

    Purpose: Orthovoltage energies are often used for treatment of patients’ superficial lesions, and also for small-animal irradiations. Polymer-gel dosimeters such as MAGAT (methacrylic acid gel and THPC) are finding increasing use for 3-dimensional verification of radiation doses in a given treatment geometry. For mega-voltage beams, MAGAT has been quoted as nearly energy-independent. In the kilo-voltage range, there is hardly any literature to shed light on its energy dependence. Methods: MAGAT was used to measure depth-dose for a 250 kVp beam. Comparison with ion-chamber data showed a discrepancy increasing significantly with depth. An over-response of as much as 25% was observed at a depth of 6 cm. Results and Conclusion: The investigation concluded that 6 cm of water in the beam resulted in a half-value-layer (HVL) change from 1.05 to 1.32 mm Cu. This amounts to an effective-energy change from 81.3 to 89.5 keV. Response measurements of MAGAT at these two energies explained the observed discrepancy in the depth-dose measurements. Dose-calibration curves of MAGAT for (i) a 250 kVp beam, and (ii) a 250 kVp beam through 6 cm of water are presented, showing significant energy dependence.

  3. Long-range dependence and sea level forecasting

    CERN Document Server

    Ercan, Ali; Abbasov, Rovshan K

    2013-01-01

    This study shows that the Caspian Sea level time series possess long range dependence even after removing linear trends, based on analyses of the Hurst statistic, the sample autocorrelation functions, and the periodogram of the series. Forecasting performance of ARMA, ARIMA, ARFIMA and Trend Line-ARFIMA (TL-ARFIMA) combination models are investigated. The forecast confidence bands and the forecast updating methodology, provided for ARIMA models in the literature, are modified for the ARFIMA models. Sample autocorrelation functions are utilized to estimate the differencing lengths of the ARFIMA
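
    The fractional-differencing filter (1 - B)^d at the heart of ARFIMA models can be applied with a short recursion for its binomial weights; the sketch below is a generic illustration on a toy series, not the Caspian Sea level data or the TL-ARFIMA combination model.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional-differencing filter (1 - B)^d used in ARFIMA models,
    with binomial weights computed by the recursion pi_j = pi_{j-1} (j-1-d)/j."""
    x = np.asarray(x, dtype=float)
    w = np.zeros(len(x))
    w[0] = 1.0
    for j in range(1, len(x)):
        w[j] = w[j - 1] * (j - 1 - d) / j
    # y_t = sum_{j=0..t} w_j * x_{t-j}
    return np.array([w[:t + 1] @ x[t::-1] for t in range(len(x))])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = np.cumsum(rng.standard_normal(500))   # toy nonstationary series
    print(frac_diff(series, d=0.4)[:5])
```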

  4. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c 2 approximation is used and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
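
    The inversion step (recovering the probabilities from a generating function sampled on the unit circle) can be sketched generically with a discrete Fourier transform; the Poisson generating function below is only a self-check, not the reactor model's generating function.

```python
import numpy as np

def probabilities_from_generating_function(G, n_terms):
    """Recover p_0..p_{n_terms-1} from a probability generating function G(z)
    by sampling it on the unit circle and applying an inverse DFT:
    p_n = (1/N) * sum_k G(exp(2*pi*i*k/N)) * exp(-2*pi*i*k*n/N)."""
    N = n_terms
    z = np.exp(2j * np.pi * np.arange(N) / N)
    samples = np.array([G(zk) for zk in z])
    p = np.fft.fft(samples) / N      # np.fft.fft applies the exp(-2*pi*i*k*n/N) kernel
    return p.real                    # valid when the tail beyond N is negligible

if __name__ == "__main__":
    # Self-check with a Poisson distribution, whose generating function is known
    lam = 2.5
    G = lambda z: np.exp(lam * (z - 1.0))
    p = probabilities_from_generating_function(G, 32)
    print(p[:5])   # ~ exp(-lam) * lam**n / n!
```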

  5. Probable causes of increasing brucellosis in free-ranging elk of the Greater Yellowstone Ecosystem

    Science.gov (United States)

    Cross, P.C.; Cole, E.K.; Dobson, A.P.; Edwards, W.H.; Hamlin, K.L.; Luikart, G.; Middleton, A.D.; Scurlock, B.M.; White, P.J.

    2010-01-01

    While many wildlife species are threatened, some populations have recovered from previous overexploitation, and data linking these population increases with disease dynamics are limited. We present data suggesting that free-ranging elk (Cervus elaphus) are a maintenance host for Brucella abortus in new areas of the Greater Yellowstone Ecosystem (GYE). Brucellosis seroprevalence in free-ranging elk increased from 0-7% in 1991-1992 to 8-20% in 2006-2007 in four of six herd units around the GYE. These levels of brucellosis are comparable to some herd units where elk are artificially aggregated on supplemental feeding grounds. There are several possible mechanisms for this increase that we evaluated using statistical and population modeling approaches. Simulations of an age-structured population model suggest that the observed levels of seroprevalence are unlikely to be sustained by dispersal from supplemental feeding areas with relatively high seroprevalence or an older age structure. Increases in brucellosis seroprevalence and the total elk population size in areas with feeding grounds have not been statistically detectable. Meanwhile, the rate of seroprevalence increase outside the feeding grounds was related to the population size and density of each herd unit. Therefore, the data suggest that enhanced elk-to-elk transmission in free-ranging populations may be occurring due to larger winter elk aggregations. Elk populations inside and outside of the GYE that traditionally did not maintain brucellosis may now be at risk due to recent population increases. In particular, some neighboring populations of Montana elk were 5-9 times larger in 2007 than in the 1970s, with some aggregations comparable to the Wyoming feeding-ground populations. Addressing the unintended consequences of these increasing populations is complicated by limited hunter access to private lands, which places many ungulate populations out of administrative control. Agency-landowner hunting access

  6. Landau parameters for finite range density dependent nuclear interactions

    International Nuclear Information System (INIS)

    Farine, M.

    1997-01-01

    The Landau parameters represent the effective particle-hole interaction at the Fermi level. Since there is a direct relation between the physical observables and the Landau parameters, their derivation from an effective interaction is of great interest. The parameter F₀ determines the incompressibility K of the system. The parameter F₁ determines the effective mass (which controls the level density at the Fermi level). In addition, F₀′ determines the symmetry energy, G₀ the magnetic susceptibility, and G₀′ the pion condensation threshold in nuclear matter. This paper is devoted to a general derivation of the Landau parameters for an interaction with density-dependent finite-range terms. Particular care is devoted to the inclusion of rearrangement terms. This report is part of a larger project which aims at defining a new nuclear interaction improving the well-known D1 force of Gogny et al. for describing average nuclear properties and exotic nuclei and satisfying, in addition, the sum rules

  7. Prediction suppression in monkey inferotemporal cortex depends on the conditional probability between images.

    Science.gov (United States)

    Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R

    2016-01-01

    When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.
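A small, self-contained illustration (not the cited experiments' data) of the distinction drawn above: two hypothetical training regimens that share the same joint probability P(A,B) of an image pair but differ in the conditional probability P(B|A).

```python
from collections import Counter

def pair_statistics(sequence, leading="A", trailing="B"):
    """Joint and conditional probabilities of a leading/trailing image pair
    in a sequence of (leading_image, trailing_image) presentation trials."""
    n = len(sequence)
    pairs = Counter(sequence)
    p_joint = pairs[(leading, trailing)] / n                        # P(A,B)
    n_leading = sum(c for (lead, _), c in pairs.items() if lead == leading)
    p_cond = pairs[(leading, trailing)] / n_leading                 # P(B|A)
    return p_joint, p_cond

# Regimen 1: A is always followed by B.
regimen1 = [("A", "B")] * 20 + [("C", "D")] * 20
# Regimen 2: same count of A->B pairs (same P(A,B)), but A is often followed by other images.
regimen2 = [("A", "B")] * 20 + [("A", "D")] * 20

print(pair_statistics(regimen1))   # P(A,B) = 0.5, P(B|A) = 1.0
print(pair_statistics(regimen2))   # P(A,B) = 0.5, P(B|A) = 0.5
```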

  8. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Science.gov (United States)

    Song, Wenhua; Lai, JianCheng; Ghassemlooy, Zabih; Gu, Zhiyong; Yan, Wei; Wang, Chunyong; Li, Zhenhua

    2018-02-01

    This paper presents theoretical investigations of the probability density distribution (PDD) of ranging data for an imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on the physical model of the reflected laser pulses from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed for different fog concentrations, which offers improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models within a given margin of error of less than 1%.

  9. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Directory of Open Access Journals (Sweden)

    Wenhua Song

    2018-02-01

    Full Text Available This paper presents theoretical investigations of the probability density distribution (PDD) of ranging data for an imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on the physical model of the reflected laser pulses from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed for different fog concentrations, which offers improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models within a given margin of error of less than 1%.

  10. The temperature dependence of the BK channel activity - kinetics, thermodynamics, and long-range correlations.

    Science.gov (United States)

    Wawrzkiewicz-Jałowiecka, Agata; Dworakowska, Beata; Grzywna, Zbigniew J

    2017-10-01

    Large-conductance, voltage-dependent, Ca²⁺-activated potassium channels (BK) are transmembrane proteins that regulate many biological processes by controlling potassium flow across cell membranes. Here, we investigate to what extent temperature (in the range of 17-37°C with a ΔT=5°C step) is a regulating parameter of the kinetic properties of channel gating and of the memory effect in the series of dwell-times of successive channel states, at membrane depolarization and hyperpolarization. The obtained results indicate that temperature strongly affects the BK channels' gating but, counterintuitively, exerts no effect on the long-range correlations, as measured by the Hurst coefficient. Quantitative differences between the dependencies of the appropriate channel characteristics on temperature are evident for different voltage regimes. Examining the characteristics of BK channel activity as a function of temperature allows one to estimate the net activation energy (E_act) and the changes of thermodynamic parameters (ΔH, ΔS, ΔG) upon channel opening. A larger E_act corresponds to channel activity at membrane hyperpolarization. The analysis of entropy and enthalpy changes of the closed-to-open channel transition suggests the entropy-driven nature of the increase of open-state probability during voltage activation and supports the hypothesis about the voltage-dependent geometry of the channel vestibule. Copyright © 2017 Elsevier B.V. All rights reserved.
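A minimal sketch (not the authors' analysis) of how a net activation energy can be extracted from temperature-dependent gating rates via an Arrhenius fit; the rate values below are made-up placeholders spanning the 17-37°C range mentioned above.

```python
import numpy as np

R_GAS = 8.314  # J/(mol*K)

def activation_energy(temps_celsius, rates):
    """Arrhenius fit: ln(rate) = ln(A) - E_act / (R * T).
    Returns the net activation energy in kJ/mol."""
    T = np.asarray(temps_celsius, float) + 273.15
    slope, _ = np.polyfit(1.0 / T, np.log(rates), 1)   # slope = -E_act / R
    return -slope * R_GAS / 1000.0

# Hypothetical channel-opening rates (1/s) at the temperatures used in the study.
temps = [17, 22, 27, 32, 37]
rates = [120.0, 160.0, 215.0, 280.0, 370.0]
print("E_act ~", round(activation_energy(temps, rates), 1), "kJ/mol")
```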

  11. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    Science.gov (United States)

    Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-yr probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-yr Poisson probability of M >7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (ratio time-dependent to time-independent) on the southern strands of the North Anatolian Fault Zone.
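A minimal sketch (with placeholder fault parameters, not the study's values) of the Brownian Passage Time (BPT) conditional probability calculation: the probability of rupture within the next 30 years given the time elapsed since the last event. The BPT distribution is mapped onto scipy's inverse-Gaussian parameterization as noted in the comments.

```python
from scipy.stats import invgauss

def bpt_conditional_probability(mean_recurrence, aperiodicity, elapsed, horizon=30.0):
    """P(rupture within `horizon` years | no rupture for `elapsed` years),
    with recurrence times following a BPT (inverse Gaussian) distribution.
    scipy's invgauss(mu, scale) has mean mu*scale and coefficient of
    variation sqrt(mu), so mu = alpha**2 and scale = T_mean / alpha**2."""
    alpha = aperiodicity
    dist = invgauss(mu=alpha**2, scale=mean_recurrence / alpha**2)
    survive = 1.0 - dist.cdf(elapsed)
    return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / survive

# Placeholder values: 250-yr mean recurrence, aperiodicity 0.5, 200 yr since the last event.
print(round(bpt_conditional_probability(250.0, 0.5, 200.0), 3))
```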

  12. HELIOS: transformation laws for multiple-collision probabilities with angular dependence

    International Nuclear Information System (INIS)

    Villarino, E.A.; Stamm'ler, R.J.J.

    1996-01-01

    In the lattice code HELIOS, neutron and gamma transport in a given system is treated by the CCCP (current-coupling collision-probability) method. The system is partitioned into space elements which are coupled by currents. Inside the space elements first-flight probabilities are used to obtain the coefficients of the coupling equation and of the equations for the fluxes. The calculation of these coefficients is expensive in CPU time on two scores: the evaluation of the first-flight probabilities, and the matrix inversion to convert these probabilities into the desired coefficients. If the cross sections of two geometrically equal space elements, or of the same element at an earlier burnup level, differ less than a small fraction, considerable CPU time can be saved by using transformation laws. Previously, such laws were derived for first-flight probabilities; here, they are derived for the multiple-collision coefficients of the CCCP equations. They avoid not only the expensive calculations of the first-flight probabilities, but also the subsequent matrix inversion. Various examples illustrate the savings achieved by using these new transformation laws - or by directly using earlier calculated coefficients, if the cross section differences are negligible. (author)

  13. Linker-dependent Junction Formation Probability in Single-Molecule Junctions

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Pil Sun; Kim, Taekyeong [Hankuk University of Foreign Studies, Yongin (Korea, Republic of)]

    2015-01-15

    We compare the junction formation probabilities of single-molecule junctions with different linker molecules by using a scanning tunneling microscope-based break-junction technique. We found that the junction formation probability varies as SH > SMe > NH₂ for the benzene backbone molecule with different types of anchoring groups, through quantitative statistical analysis. These results are attributed to different bonding forces according to the linker groups formed with Au atoms in the electrodes, which is consistent with previous works. Our work allows a better understanding of the contact chemistry in the metal-molecule junction for future molecular electronic devices.

  14. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    Full Text Available The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
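A minimal sketch (independent of the article's test functions) of the three sampling schemes compared above, each producing standard Gaussian vectors of the kind an Asymptotic Sampling run consumes; scipy.stats.qmc supplies the LHS and Sobol' designs, while the PAE-optimized LHS of the paper is not reproduced here.

```python
import numpy as np
from scipy.stats import norm, qmc

def gaussian_samples(n, dim, scheme="mc", seed=0):
    """Standard-normal design points via crude Monte Carlo, Latin Hypercube,
    or Sobol' sequences (uniform designs mapped through the normal quantile)."""
    if scheme == "mc":
        return np.random.default_rng(seed).standard_normal((n, dim))
    if scheme == "lhs":
        u = qmc.LatinHypercube(d=dim, seed=seed).random(n)
    elif scheme == "sobol":
        u = qmc.Sobol(d=dim, scramble=True, seed=seed).random(n)
    else:
        raise ValueError(scheme)
    return norm.ppf(u)                        # map U(0,1) to N(0,1)

for scheme in ("mc", "lhs", "sobol"):
    x = gaussian_samples(256, 4, scheme)      # 256 is a power of 2, as Sobol' prefers
    print(scheme, x.shape, round(float(x.mean()), 3))
```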

  15. Exploring flavor-dependent long-range forces in long-baseline neutrino oscillation experiments

    Science.gov (United States)

    Chatterjee, Sabya Sachi; Dasgupta, Arnab; Agarwalla, Sanjib Kumar

    2015-12-01

    The Standard Model gauge group can be extended with minimal matter content by introducing an anomaly-free U(1) symmetry, such as L_e - L_μ or L_e - L_τ. If the neutral gauge boson corresponding to this abelian symmetry is ultra-light, then it will give rise to a flavor-dependent long-range leptonic force, which can have a significant impact on neutrino oscillations. For instance, the electrons inside the Sun can generate a flavor-dependent long-range potential at the Earth's surface, which can suppress the ν_μ → ν_e appearance probability in terrestrial experiments. The sign of this potential is opposite for anti-neutrinos, and it affects the oscillations of (anti-)neutrinos in different fashion. This feature invokes a fake CP-asymmetry like the SM matter effect and can severely affect leptonic CP-violation searches in long-baseline experiments. In this paper, we study in detail the possible impacts of these long-range flavor-diagonal neutral-current interactions due to the L_e - L_μ symmetry, when (anti-)neutrinos travel from Fermilab to Homestake (1300 km) and from CERN to Pyhäsalmi (2290 km) in the context of the future high-precision superbeam facilities DUNE and LBNO, respectively. If there is no signal of a long-range force, DUNE (LBNO) can place a stringent constraint on the effective gauge coupling α_eμ < 1.9 × 10⁻⁵³ (7.8 × 10⁻⁵⁴) at 90% C.L., which is almost 30 (70) times better than the existing bound from the Super-Kamiokande experiment. We also observe that if α_eμ ≥ 2 × 10⁻⁵², the CP-violation discovery reach of these future facilities vanishes completely. The mass hierarchy measurement remains robust in DUNE (LBNO) if α_eμ < 5 × 10⁻⁵² (10⁻⁵²).

  16. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    Science.gov (United States)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron-capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments, reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound-nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the necessary experimental data to constrain models. In this work, dual arrays of silicon telescope particle-identification detectors and photovoltaic (solar) cell fission-fragment detectors have been used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction - a surrogate for 239Pu(n, f) - at a beam energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to

  17. Attempts to reduce alcohol intake and treatment needs among people with probable alcohol dependence in England: a general population survey.

    Science.gov (United States)

    Dunne, Jacklyn; Kimergård, Andreas; Brown, Jamie; Beard, Emma; Buykx, Penny; Michie, Susan; Drummond, Colin

    2018-03-25

    To compare the proportion of people in England with probable alcohol dependence [Alcohol Use Disorders Identification Test (AUDIT) score ≥ 20] with those with other drinking patterns (categorized by AUDIT scores) in terms of motivation to reduce drinking and use of alcohol support resources. A combination of random probability and simple quota sampling was used to conduct monthly cross-sectional household computer-assisted interviews between March 2014 and August 2017. The general population in all nine regions of England. Participants in the Alcohol Toolkit Study (ATS), a monthly household survey of alcohol consumption among people aged 16 years and over in England (n = 69 826). The mean age was 47 years [standard deviation (SD) = 18.78; 95% confidence interval (CI) = 46.8-47] and 51% (n = 35 560) were female. χ² tests were used to investigate associations with demographic variables, motivation to quit drinking, attempts to quit drinking, general practitioner (GP) engagement and types of support accessed in the last 12 months across AUDIT risk zones. A total of 0.6% were classified as people with probable alcohol dependence (95% CI = 0.5-0.7). Motivation to quit differed significantly across AUDIT risk zones (χ² = 1692.27). People with probable dependence were more likely than other ATS participants to have made a past-year attempt to cut down or quit (51.8%) and to have received a specialist referral from their GP about drinking (13.7%), and less likely to report no motivation to reduce their drinking (26.2%). Those with probable dependence had higher use of self-help books and mobile applications (apps) than other ATS participants; however, 27.7% did not access any resources during their most recent attempt to cut down. Adults in England with probable alcohol dependence, measured through the Alcohol Use Disorders Identification Test, demonstrate higher motivation to quit drinking and greater use of both specialist treatment and self-driven support compared with those in other

  18. Experimental impact-parameter-dependent probabilities for K-shell vacancy production by fast heavy-ion projectiles

    International Nuclear Information System (INIS)

    Randall, R.R.; Bednar, J.A.; Curnutte, B.; Cocke, C.L.

    1976-01-01

    The impact-parameter dependence of the probability for production of target K x rays has been measured for oxygen projectiles on copper and for carbon and fluorine projectiles on argon at scaled velocities near 0.5. The O-on-Cu data were taken for 1.56-, 1.88-, and 2.69-MeV/amu O beams incident upon thin Cu foils. A thin Ar-gas target was used for 1.56-MeV/amu C and F beams, permitting measurements to be made for charge-pure C⁴⁺, C⁶⁺, F⁹⁺ and F⁵⁺ projectiles. Ar and Cu K x rays were observed with a Si(Li) detector and scattered projectiles with a collimated surface-barrier detector. Comparison of the shapes of the measured K-vacancy-production probability curves with predictions of the semiclassical Coulomb approximation (SCA) shows adequate agreement for the O-on-Cu system. For the higher ratio of projectile-to-target nuclear charge (Z₁/Z₂) characterizing the C-on-Ar and F-on-Ar systems, the SCA predictions are entirely inadequate in describing the observed impact-parameter dependence. In particular, they cannot account for large probabilities found at large impact parameters. Furthermore, the dependence of the shapes on the projectile charge state is found to become pronounced at larger Z₁/Z₂. Attempts to account for this behavior in terms of alternative vacancy-production processes are discussed

  19. Long-range dependence in returns and volatility of Central European Stock Indices

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2010-01-01

    Roč. 2010, č. 3 (2010), s. 1-19 R&D Projects: GA ČR GD402/09/H045 Institutional research plan: CEZ:AV0Z10750506 Keywords : long-range dependence * rescaled range * modified rescaled range * bootstrapping Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-long-range dependence in returns and volatility of central european stock indices.pdf

  20. A better understanding of long-range temporal dependence of traffic flow time series

    Science.gov (United States)

    Feng, Shuo; Wang, Xingmin; Sun, Haowei; Zhang, Yi; Li, Li

    2018-02-01

    Long-range temporal dependence is an important research perspective for modelling of traffic flow time series. Various methods have been proposed to depict the long-range temporal dependence, including autocorrelation function analysis, spectral analysis and fractal analysis. However, few researches have studied the daily temporal dependence (i.e. the similarity between different daily traffic flow time series), which can help us better understand the long-range temporal dependence, such as the origin of crossover phenomenon. Moreover, considering both types of dependence contributes to establishing more accurate model and depicting the properties of traffic flow time series. In this paper, we study the properties of daily temporal dependence by simple average method and Principal Component Analysis (PCA) based method. Meanwhile, we also study the long-range temporal dependence by Detrended Fluctuation Analysis (DFA) and Multifractal Detrended Fluctuation Analysis (MFDFA). The results show that both the daily and long-range temporal dependence exert considerable influence on the traffic flow series. The DFA results reveal that the daily temporal dependence creates crossover phenomenon when estimating the Hurst exponent which depicts the long-range temporal dependence. Furthermore, through the comparison of the DFA test, PCA-based method turns out to be a better method to extract the daily temporal dependence especially when the difference between days is significant.
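A small sketch (not the paper's code) of the PCA-based extraction of the daily pattern described above: daily profiles are stacked into a matrix and the leading principal component is taken as the shared daily component, which can then be removed before running DFA on the residual. The 5-minute sampling used in the demo is an assumption, not taken from the study.

```python
import numpy as np

def daily_component_pca(flow, samples_per_day):
    """Split a traffic-flow series into day-long rows and return the
    first principal component as the common daily pattern, plus the
    series with that daily component removed."""
    n_days = len(flow) // samples_per_day
    X = np.asarray(flow[:n_days * samples_per_day], float).reshape(n_days, samples_per_day)
    Xc = X - X.mean(axis=0)                      # center each time-of-day column
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    pc1 = vt[0]                                   # leading daily pattern (unit norm)
    scores = Xc @ pc1                             # strength of the pattern on each day
    residual = Xc - np.outer(scores, pc1)         # daily-dependence-removed fluctuations
    return pc1, residual.ravel()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(30 * 288)                       # 30 days of hypothetical 5-minute counts
    series = 100 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size)
    pattern, resid = daily_component_pca(series, 288)
    print(pattern.shape, resid.shape)
```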

  1. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the

  2. OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success

    Directory of Open Access Journals (Sweden)

    Cheng Zhu

    2014-01-01

    Full Text Available Focusing on the on-line multiagent scheduling problem, this paper considers the time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity loss-decentralized Markov Decision Processes) model to include opportunity loss in the scheduling decision and improve overall performance. The success probability of job processing as well as the processing duration is dependent on the time at which the processing is started. The probability of completing the assigned job by an agent would be higher when the process is started earlier, but the opportunity loss could also be high due to the longer engaging duration. As a result, the OL-DEC-MDP model introduces a reward function considering the opportunity loss, which is estimated based on the prediction of the upcoming jobs by a sampling method on the job arrivals. Heuristic strategies are introduced for computing the best starting time for an incoming job by each agent, and an incoming job will always be scheduled to the agent with the highest reward among all agents with their best starting policies. The simulation experiments show that the OL-DEC-MDP model improves the overall scheduling performance compared with models not considering opportunity loss in a heavy-loading environment.

  3. Betting on change: modeling transitional probabilities to guide therapy development for opioid dependence.

    Science.gov (United States)

    Carpenter, Kenneth M; Jiang, Huiping; Sullivan, Maria A; Bisaga, Adam; Comer, Sandra D; Raby, Wilfrid Noel; Brooks, Adam C; Nunes, Edward V

    2009-03-01

    This study investigated the process of change by modeling transitions among four clinical states encountered in 64 detoxified opiate-dependent individuals treated with daily oral naltrexone: no opiate use, blocked opiate use (i.e., opiate use while adhering to oral naltrexone), unblocked opiate use (i.e., opiate use after having discontinued oral naltrexone), and treatment dropout. The effects of baseline characteristics and two psychosocial interventions of differing intensity, behavioral naltrexone therapy (BNT) and compliance enhancement (CE), on these transitions were studied. Participants using greater quantities of opiates were more likely than other participants to be retained in BNT relative to CE. Markov modeling indicated a transition from abstinence to treatment dropout was approximately 3.56 times greater among participants in CE relative to participants in BNT, indicating the more comprehensive psychosocial intervention kept participants engaged in treatment longer. Transitions to stopping treatment were more likely to occur after unblocked opiate use in both treatments. Continued opiate use while being blocked accounted for a relatively low proportion of transitions to abstinence and may have more deleterious effects later in a treatment episode. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
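A minimal sketch (with invented state sequences, not the study's data) of the Markov-modeling step: counting transitions among the four clinical states and normalizing the rows into transition probabilities.

```python
import numpy as np

STATES = ["no_use", "blocked_use", "unblocked_use", "dropout"]
INDEX = {s: i for i, s in enumerate(STATES)}

def transition_matrix(sequences):
    """Row-stochastic matrix of first-order transition probabilities
    estimated from a list of per-participant state sequences."""
    counts = np.zeros((len(STATES), len(STATES)))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[INDEX[a], INDEX[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Hypothetical weekly state sequences for three participants.
demo = [
    ["no_use", "no_use", "blocked_use", "no_use", "no_use"],
    ["no_use", "blocked_use", "unblocked_use", "dropout"],
    ["no_use", "no_use", "no_use", "unblocked_use", "dropout"],
]
print(np.round(transition_matrix(demo), 2))
```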

  4. A Modified Generalized Fisher Method for Combining Probabilities from Dependent Tests

    Directory of Open Access Journals (Sweden)

    Hongying (Daisy) Dai

    2014-02-01

    Full Text Available Rapid developments in molecular technology have yielded a large amount of high throughput genetic data to understand the mechanism for complex traits. The increase of genetic variants requires hundreds and thousands of statistical tests to be performed simultaneously in analysis, which poses a challenge to control the overall Type I error rate. Combining p-values from multiple hypothesis testing has shown promise for aggregating effects in high-dimensional genetic data analysis. Several p-value combining methods have been developed and applied to genetic data; see [Dai, et al. 2012b] for a comprehensive review. However, there is a lack of investigations conducted for dependent genetic data, especially for weighted p-value combining methods. Single nucleotide polymorphisms (SNPs) are often correlated due to linkage disequilibrium. Other genetic data, including variants from next generation sequencing, gene expression levels measured by microarray, protein and DNA methylation data, etc. also contain complex correlation structures. Ignoring correlation structures among genetic variants may lead to severe inflation of Type I error rates for omnibus testing of p-values. In this work, we propose modifications to the Lancaster procedure by taking the correlation structure among p-values into account. The weight function in the Lancaster procedure allows meaningful biological information to be incorporated into the statistical analysis, which can increase the power of the statistical testing and/or remove the bias in the process. Extensive empirical assessments demonstrate that the modified Lancaster procedure largely reduces the Type I error rates due to correlation among p-values, and retains considerable power to detect signals among p-values. We applied our method to reassess published renal transplant data, and identified a novel association between B cell pathways and allograft tolerance.
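A minimal sketch of the (unweighted) Fisher method and the weighted Lancaster generalization referenced above, without the correlation adjustment that is the paper's actual contribution; the weights are placeholders standing in for biological information.

```python
import numpy as np
from scipy.stats import chi2

def fisher_combined(pvals):
    """Fisher's method: -2*sum(log p) ~ chi-square with 2k df under H0."""
    pvals = np.asarray(pvals, float)
    stat = -2.0 * np.log(pvals).sum()
    return chi2.sf(stat, df=2 * len(pvals))

def lancaster_combined(pvals, weights):
    """Lancaster's generalization: transform each p-value into a chi-square
    quantile with its own (weight) degrees of freedom, then sum.
    With all weights equal to 2 this reduces to Fisher's method."""
    pvals = np.asarray(pvals, float)
    weights = np.asarray(weights, float)
    stat = chi2.isf(pvals, df=weights).sum()   # chi2.isf(p, df) = upper-tail quantile
    return chi2.sf(stat, df=weights.sum())

p = [0.01, 0.20, 0.03, 0.50]
w = [4.0, 1.0, 3.0, 1.0]                       # hypothetical biological weights
print("Fisher:", round(fisher_combined(p), 4))
print("Lancaster:", round(lancaster_combined(p, w), 4))
```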

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  6. On discriminating between long-range dependence and changes in mean

    OpenAIRE

    Berkes, István; Horváth, Lajos; Kokoszka, Piotr; Shao, Qi-Man

    2006-01-01

    We develop a testing procedure for distinguishing between a long-range dependent time series and a weakly dependent time series with change-points in the mean. In the simplest case, under the null hypothesis the time series is weakly dependent with one change in mean at an unknown point, and under the alternative it is long-range dependent. We compute the CUSUM statistic Tn, which allows us to construct an estimator k̂ of a change-point. We then compute the statistic Tn,1 based on the observa...

  7. Ruin Probabilities in a Dependent Discrete-Time Risk Model With Gamma-Like Tailed Insurance Risks

    Directory of Open Access Journals (Sweden)

    Xing-Fang Huang

    2017-03-01

    Full Text Available This paper considered a dependent discrete-time risk model, in which the insurance risks are represented by a sequence of independent and identically distributed real-valued random variables with a common Gamma-like tailed distribution; the financial risks are denoted by another sequence of independent and identically distributed positive random variables with a finite upper endpoint, but a general dependence structure exists between each pair of the insurance risks and the financial risks. Following the works of Yang and Yuen in 2016, we derive some asymptotic relations for the finite-time and infinite-time ruin probabilities. As a complement, we demonstrate our obtained result through a Crude Monte Carlo (CMC) simulation with asymptotics.
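A crude Monte Carlo sketch (not the paper's asymptotic formulas) of the finite-time ruin probability in one common formulation of such a discrete-time risk model: discounted insurance losses are accumulated and ruin occurs when they exceed the initial surplus. The gamma and uniform distributions below are placeholders for the Gamma-like insurance risks and the bounded financial risks, and the independence between them is a simplifying assumption.

```python
import numpy as np

def finite_time_ruin_probability(u, horizon, n_paths=200_000, seed=0):
    """Crude Monte Carlo estimate of P(ruin by `horizon`) for initial surplus u.
    Insurance risks X_k ~ Gamma(2, 1); financial (discount) risks Y_k ~ U(0.6, 1.0)."""
    rng = np.random.default_rng(seed)
    X = rng.gamma(shape=2.0, scale=1.0, size=(n_paths, horizon))
    Y = rng.uniform(0.6, 1.0, size=(n_paths, horizon))
    discount = np.cumprod(Y, axis=1)                     # prod_{j<=k} Y_j
    discounted_losses = np.cumsum(X * discount, axis=1)  # running discounted loss
    ruined = discounted_losses.max(axis=1) > u           # ruin at any time <= horizon
    return ruined.mean()

print(finite_time_ruin_probability(u=20.0, horizon=10))
```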

  8. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probabilistic calculations, often used for aftershock hazard assessment, and on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
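A minimal sketch (parameter values are placeholders, not those fitted in the study) of the probability calculation built from the two laws named above: the Omori-Utsu law gives the expected number of events above a target magnitude in a future window, and a Poisson assumption converts that count into a probability.

```python
import numpy as np

def expected_events(t1, t2, K, c, p, b, m_ref, m_target):
    """Expected number of events with magnitude >= m_target in (t1, t2) days
    after the mainshock: Omori-Utsu rate K/(t+c)^p calibrated for M >= m_ref,
    scaled by the Gutenberg-Richter factor 10^(-b*(m_target - m_ref))."""
    if np.isclose(p, 1.0):
        integral = np.log((t2 + c) / (t1 + c))
    else:
        integral = ((t1 + c) ** (1.0 - p) - (t2 + c) ** (1.0 - p)) / (p - 1.0)
    return K * integral * 10.0 ** (-b * (m_target - m_ref))

def probability_of_at_least_one(n_expected):
    """Nonstationary Poisson assumption: P(>=1 event) = 1 - exp(-N)."""
    return 1.0 - np.exp(-n_expected)

# Placeholder parameters: M>=4 productivity K, slow decay p = 0.5, b = 1,
# forecast window from ~day 445 (2012 May 30) to 3 years later.
n = expected_events(t1=445, t2=445 + 3 * 365, K=10.0, c=0.1, p=0.5, b=1.0,
                    m_ref=4.0, m_target=7.0)
print("expected M>=7 events:", round(n, 3),
      " probability:", round(probability_of_at_least_one(n), 3))
```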

  9. Normal tissue complication probabilities: dependence on choice of biological model and dose-volume histogram reduction scheme

    International Nuclear Information System (INIS)

    Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake

    2000-01-01

    Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high dose region so that the obtained cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values because of the methods used for DVH reduction and radiobiological models and their input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use
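A minimal sketch of the Lyman (LKB) NTCP calculation underlying the comparisons above, using a generalized-EUD form of DVH reduction; the parameter values and the toy DVH are placeholders, not the liver or spinal-cord fits used in the study.

```python
import numpy as np
from scipy.stats import norm

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose from a differential DVH
    (doses in Gy, fractional volumes summing to 1, volume parameter n)."""
    v = np.asarray(volumes, float) / np.sum(volumes)
    return float(np.sum(v * np.asarray(doses, float) ** (1.0 / n)) ** n)

def ntcp_lyman(doses, volumes, TD50, m, n):
    """Lyman-Kutcher-Burman NTCP: probit of (gEUD - TD50) / (m * TD50)."""
    eud = gEUD(doses, volumes, n)
    t = (eud - TD50) / (m * TD50)
    return float(norm.cdf(t))

# Toy differential DVH: half of the organ at 30 Gy, half at 45 Gy (placeholder values).
doses = [30.0, 45.0]
volumes = [0.5, 0.5]
print(round(ntcp_lyman(doses, volumes, TD50=45.0, m=0.15, n=0.7), 3))
```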

  10. Testing for long-range dependence in the Brazilian term structure of interest rates

    International Nuclear Information System (INIS)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2009-01-01

    This paper presents empirical evidence of fractional dynamics in interest rates for different maturities in Brazil. A variation of a newly developed test for long-range dependence, the V/S statistic, with a post-blackening bootstrap is employed. Results suggest that Brazilian interest rates possess strong long-range dependence in volatility, even when considering the structural break in 1999. These findings imply that the development of policy models that give rise to long-range dependence in interest-rate volatility could be very useful. The spread between long- and short-term interest rates has strong long-range dependence, which suggests that traditional tests of the expectation hypothesis of the term structure of interest rates may be misspecified.

  11. Improving Delay-Range-Dependent Stability Condition for Systems with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Wei Qian

    2013-01-01

    Full Text Available This paper discusses delay-range-dependent stability for systems with interval time-varying delay. By defining a new Lyapunov-Krasovskii functional (LKF) and estimating its derivative through the introduction of new vectors, free matrices and a reciprocally convex approach, new delay-range-dependent stability conditions are obtained. Two well-known examples are given to illustrate the reduced conservatism of the proposed theoretical results.

  12. A novel nuclear dependence of nucleon–nucleon short-range correlations

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Hongkai [College of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070 (China); Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Wang, Rong, E-mail: rwang@impcas.ac.cn [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Lanzhou University, Lanzhou 730000 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Huang, Yin [Lanzhou University, Lanzhou 730000 (China); Chen, Xurong [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2017-06-10

    A linear correlation is found between the magnitude of nucleon–nucleon short-range correlations and the nuclear binding energy per nucleon with the pairing energy removed. By using this relation, the strengths of nucleon–nucleon short-range correlations of some unmeasured nuclei are predicted. Discussions on nucleon–nucleon pairing energy and nucleon–nucleon short-range correlations are presented. The found nuclear dependence of nucleon–nucleon short-range correlations may shed some light on the short-range structure of the nucleus.

  13. Registration-Based Range-Dependence Compensation for Bistatic STAP Radars

    Directory of Open Access Journals (Sweden)

    Lapierre Fabian D

    2005-01-01

    Full Text Available We address the problem of detecting slow-moving targets using space-time adaptive processing (STAP radar. Determining the optimum weights at each range requires data snapshots at neighboring ranges. However, in virtually all configurations, snapshot statistics are range dependent, meaning that snapshots are nonstationary with respect to range. This results in poor performance. In this paper, we propose a new compensation method based on registration of clutter ridges and designed to work on a single realization of the stochastic snapshot at each range. The method has been successfully tested on simulated, stochastic snapshots. An evaluation of performance is presented.

  14. Long-range dependence in returns and volatility of Central European Stock Indices

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2010-01-01

    Roč. 17, č. 27 (2010), s. 50-67 ISSN 1212-074X R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965 Grant - others:GA UK(CZ) 5183/2010 Institutional research plan: CEZ:AV0Z10750506 Keywords : long-range dependence * bootstrapping * rescaled range analysis * rescaled variance analysis Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-long-range dependence in returns and volatility of central european stock indices bces.pdf

  15. Common long-range dependence in a panel of hourly Nord Pool electricity prices and loads

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre; Haldrup, Niels; Rodríguez-Caballero, Carlos Vladimir

    data approaches to analyse the time series and the cross-sectional dependence of hourly Nord Pool electricity spot prices and loads for the period 2000-2013. Hourly electricity prices and loads data are characterized by strong serial long-range dependence in the time series dimension in addition to strong seasonal periodicity, and along the cross-sectional dimension, i.e. the hours of the day, there is a strong dependence which necessarily has to be accounted for in order to avoid spurious inference when focusing on the time series dependence alone. The long-range dependence is modelled in terms of a fractionally integrated panel data model and it is shown that both prices and loads consist of common factors with long memory and with loadings that vary considerably during the day. Due to the competitiveness of the Nordic power market the aggregate supply curve approximates well the marginal costs

  16. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Sissay, Adonay [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J. [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Lopata, Kenneth, E-mail: klopata@lsu.edu [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  17. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    International Nuclear Information System (INIS)

    Sissay, Adonay; Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J.; Lopata, Kenneth

    2016-01-01

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  18. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor

    2015-01-01

    SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Because it is a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I on the angle of incidence, wavelength and polarization of the incident light was not known. Our targeted application area for this sensor is next-generation PET detector modules, where it will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm wide wavelength interval around the characteristic emission peak (λ=420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, where it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced, beginning only from 50°.

  19. Trait mindfulness, reasons for living and general symptom severity as predictors of suicide probability in males with substance abuse or dependence.

    Directory of Open Access Journals (Sweden)

    Parvaneh Mohammadkhani

    2015-03-01

    Full Text Available The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate the predictors of suicide probability based on trait mindfulness, reasons for living and severity of general psychiatric symptoms. Participants were 324 individuals with substance abuse or dependence in an outpatient setting and prison. The Reasons for Living Questionnaire, the Mindfulness Attention Awareness Scale and the Suicide Probability Scale were used as instruments. The sample was selected based on a convenience sampling method. Data were analyzed using SPSS and AMOS. The life-time prevalence of suicide attempt in the outpatient setting was 35% and it was 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, not reasons-for-living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs, and child-related concerns significantly predicted suicide probability (p<0.001). It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability.

  20. Fractality Evidence and Long-Range Dependence on Capital Markets: a Hurst Exponent Evaluation

    Science.gov (United States)

    Oprean, Camelia; Tănăsescu, Cristina

    2014-07-01

    Since the existence of market memory could imply the rejection of the efficient market hypothesis, the aim of this paper is to find any evidence that selected emergent capital markets (eight European and BRIC markets, namely Hungary, Romania, Estonia, Czech Republic, Brazil, Russia, India and China) exhibit long-range dependence or follow the random walk hypothesis. In this paper, the Hurst exponent, as calculated by R/S fractal analysis and Detrended Fluctuation Analysis, is our measure of long-range dependence in the series. The results reinforce our previous findings and suggest that if stock returns present long-range dependence, the random walk hypothesis is no longer valid and neither is the market efficiency hypothesis.
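A minimal DFA sketch (not the authors' code) of the second measure mentioned above: the detrended fluctuation function F(s) is computed over box sizes s and its log-log slope is the DFA exponent, which plays the role of the Hurst exponent for stationary series. The white-noise demo series is a placeholder for market returns.

```python
import numpy as np

def dfa_exponent(x, box_sizes=(16, 32, 64, 128, 256), order=1):
    """Detrended Fluctuation Analysis: slope of log F(s) versus log s."""
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())                 # integrated, mean-removed series
    log_s, log_f = [], []
    for s in box_sizes:
        n_boxes = len(profile) // s
        if n_boxes < 2:
            continue
        ms_residuals = []
        for b in range(n_boxes):
            seg = profile[b * s:(b + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial trend
            ms_residuals.append(np.mean((seg - trend) ** 2))
        log_s.append(np.log(s))
        log_f.append(0.5 * np.log(np.mean(ms_residuals)))      # F(s) = sqrt(mean sq. residual)
    slope, _ = np.polyfit(log_s, log_f, 1)
    return slope

rng = np.random.default_rng(0)
print("DFA exponent (white noise, expect ~0.5):",
      round(dfa_exponent(rng.standard_normal(8192)), 3))
```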

  1. Rolling estimations of long range dependence volatility for high frequency S&P500 index

    Science.gov (United States)

    Cheong, Chin Wen; Pei, Tan Pei

    2015-10-01

    This study evaluates the time-varying long-range dependence behavior of the S&P500 volatility index using the modified rescaled adjusted range (R/S) statistic. For better computational results, high-frequency rolling bipower-variation realized-volatility estimates are used to avoid possible abrupt jumps. The empirical findings allow us to better understand the informational efficiency of the market before and after the subprime mortgage crisis.

  2. Multi-configuration time-dependent density-functional theory based on range separation

    DEFF Research Database (Denmark)

    Fromager, E.; Knecht, S.; Jensen, Hans Jørgen Aagaard

    2013-01-01

    Multi-configuration range-separated density-functional theory is extended to the time-dependent regime. An exact variational formulation is derived. The approximation consists in combining a long-range Multi-Configuration Self-Consistent Field (MCSCF) treatment with an adiabatic short-range GGA (srGGA) approximation. As expected, when modeling long-range interactions with the MCSCF model instead of the adiabatic Buijse-Baerends density-matrix functional as recently proposed by Pernal [J. Chem. Phys. 136, 184105 (2012); DOI: 10.1063/1.4712019], the description of both the 1D doubly-excited state

  3. Functional framework and hardware platform for dependability study in short range wireless embedded systems

    NARCIS (Netherlands)

    Senouci, B.; Annema, Anne J.; Bentum, Marinus Jan; Kerkhoff, Hans G.

    2011-01-01

    A new direction in short-range wireless applications has appeared in the form of high-speed data communication devices for distances of a few meters. Behind these embedded applications, a complex Hardware/Software architecture is built. Dependability is one of the major challenges in these systems.

  4. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062

  5. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
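A short simulation sketch (placeholder parameters, not the papers' settings) illustrating the kind of bias described above: estimated probabilities from an overdispersed (beta-binomial) model are pushed through the arcsine and log-odds transformations and compared with the transformed true probability, for a few values of the intracluster correlation ρ.

```python
import numpy as np

def transformation_bias(p=0.2, n=50, rho=0.1, n_sims=200_000, seed=0):
    """Simulate clustered binomial data with intracluster correlation rho
    (beta-binomial), then measure the bias of arcsine and log-odds
    transformations of the estimated probability p_hat."""
    rng = np.random.default_rng(seed)
    a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho   # Beta(a, b) has ICC rho
    p_i = rng.beta(a, b, size=n_sims)                        # group-level probabilities
    x = rng.binomial(n, p_i)                                 # observed counts per group
    p_hat = (x + 0.5) / (n + 1.0)                            # continuity-corrected estimate
    arcsine_bias = np.mean(np.arcsin(np.sqrt(p_hat))) - np.arcsin(np.sqrt(p))
    logodds_bias = np.mean(np.log(p_hat / (1 - p_hat))) - np.log(p / (1 - p))
    return arcsine_bias, logodds_bias

for rho in (0.01, 0.05, 0.1):
    a_bias, l_bias = transformation_bias(rho=rho)
    print(f"rho={rho}: arcsine bias={a_bias:+.4f}, log-odds bias={l_bias:+.4f}")
```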

  6. System Estimation of Panel Data Models under Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    A general dynamic panel data model is considered that incorporates individual and interactive fixed effects allowing for contemporaneous correlation in model innovations. The model accommodates general stationary or nonstationary long-range dependence through interactive fixed effects...... and innovations, removing the necessity to perform a priori unit-root or stationarity testing. Moreover, persistence in innovations and interactive fixed effects allows for cointegration; innovations can also have vector-autoregressive dynamics; deterministic trends can be featured. Estimations are performed...

  7. Breakdown of long-range temporal dependence in default mode and attention networks during deep sleep.

    Science.gov (United States)

    Tagliazucchi, Enzo; von Wegner, Frederic; Morzelewski, Astrid; Brodbeck, Verena; Jahnke, Kolja; Laufs, Helmut

    2013-09-17

    The integration of segregated brain functional modules is a prerequisite for conscious awareness during wakeful rest. Here, we test the hypothesis that temporal integration, measured as long-term memory in the history of neural activity, is another important quality underlying conscious awareness. For this aim, we study the temporal memory of blood oxygen level-dependent signals across the human nonrapid eye movement sleep cycle. Results reveal that this property gradually decreases from wakefulness to deep nonrapid eye movement sleep and that such decreases affect areas identified with default mode and attention networks. Although blood oxygen level-dependent spontaneous fluctuations exhibit nontrivial spatial organization, even during deep sleep, they also display a decreased temporal complexity in specific brain regions. Conversely, this result suggests that long-range temporal dependence might be an attribute of the spontaneous conscious mentation performed during wakeful rest.

  8. Long-range spatial dependence in fractured rock. Empirical evidence and implications for tracer transport

    International Nuclear Information System (INIS)

    Painter, S.

    1999-02-01

    Nonclassical stochastic continuum models incorporating long-range spatial dependence are evaluated as models for fractured crystalline rock. Open fractures and fracture zones are not modeled explicitly in this approach. The fracture zones and intact rock are modeled as a single stochastic continuum. The large contrasts between the fracture zones and unfractured rock are accounted for by making use of random field models specifically designed for highly variable systems. Hydraulic conductivity data derived from packer tests in the vicinity of the Aespoe Hard Rock Laboratory form the basis for the evaluation. The Aespoe log K data were found to be consistent with a fractal scaling model based on bounded fractional Levy motion (bfLm), a model that has been used previously to model highly variable sedimentary formations. However, the data are not sufficient to choose between this model, a fractional Brownian motion model for the normal-score transform of log K, and a conventional geostatistical model. Stochastic simulations conditioned by the Aespoe data coupled with flow and tracer transport calculations demonstrate that the models with long-range dependence predict earlier arrival times for contaminants. This demonstrates the need to evaluate this class of models when assessing the performance of proposed waste repositories. The relationship between intermediate-scale and large-scale transport properties in media with long-range dependence is also addressed. A new Monte Carlo method for stochastic upscaling of intermediate-scale field data is proposed
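    For readers who want to experiment with random fields of this kind, the sketch below (an illustrative simplification, not the bfLm model used in the report; the Hurst index, mean and scale of log K are placeholder values) generates a one-dimensional log-conductivity profile with long-range spatial dependence as fractional Brownian motion, using the exact Cholesky factorization of its covariance:

    import numpy as np

    def fbm(n, H, rng):
        """Fractional Brownian motion on points 1..n via Cholesky factorization of its covariance."""
        t = np.arange(1, n + 1, dtype=float)
        cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                     - np.abs(t[:, None] - t[None, :]) ** (2 * H))
        return np.linalg.cholesky(cov) @ rng.standard_normal(n)

    rng = np.random.default_rng(6)
    log_k = -5.0 + 1.5 * fbm(200, H=0.8, rng=rng)   # hypothetical mean and scale of log K
    print(log_k[:5])

    Conditioning on measured values and reproducing the bounded Lévy increments of bfLm would require additional machinery beyond this sketch.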

  9. Estimation of failure probability of the end induced current depending on uncertain parameters of a transmission line

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper addresses the risk analysis of an EMC fault using a statistical approach based on reliability methods. A probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties on the input parameters that influence extreme interference levels in the context of transmission lines. Results are compared to Monte Carlo simulation (MCS). (authors)

  10. Determination of Age-Dependent Reference Ranges for Coagulation Tests Performed Using Destiny Plus.

    Science.gov (United States)

    Arslan, Fatma Demet; Serdar, Muhittin; Merve Ari, Elif; Onur Oztan, Mustafa; Hikmet Kozcu, Sureyya; Tarhan, Huseyin; Cakmak, Ozgur; Zeytinli, Merve; Yasar Ellidag, Hamit

    2016-06-01

    In order to apply the right treatment for hemostatic disorders in pediatric patients, laboratory data should be interpreted with age-appropriate reference ranges. The purpose of this study was to determine age-dependent reference range values for prothrombin time (PT), activated partial thromboplastin time (aPTT), fibrinogen tests, and D-dimer tests. A total of 320 volunteers were included in the study with the following ages: 1 month - 1 year (n = 52), 2 - 5 years (n = 50), 6 - 10 years (n = 48), 11 - 17 years (n = 38), and 18 - 65 years (n = 132). Each volunteer completed a survey to exclude hemostatic system disorder. Using a nonparametric method, the lower and upper limits, including 95% distribution and 90% confidence intervals, were calculated. No statistically significant differences were found between PT and aPTT values in the groups consisting of children. Thus, the reference ranges were separated into child and adult age groups. PT and aPTT values were significantly higher in the children than in the adults. Fibrinogen values in the 6 - 10 age group and the adult age group were significantly higher than in the other groups. D-dimer levels were significantly lower in those aged 2 - 17; thus, a separate reference range was established. These results support other findings related to developmental hemostasis, confirming that adult and pediatric age groups should be evaluated using different reference ranges.

  11. Fluctuations and pseudo long range dependence in network flows: A non-stationary Poisson process model

    International Nuclear Information System (INIS)

    Yu-Dong, Chen; Li, Li; Yi, Zhang; Jian-Ming, Hu

    2009-01-01

    In the study of complex networks (systems), the scaling phenomenon of flow fluctuations refers to a certain power law between the mean flux (activity) ⟨F_i⟩ of the i-th node and its variance σ_i, of the form σ_i ∝ ⟨F_i⟩^α. Such scaling laws are found to be prevalent both in natural and man-made network systems, but the understanding of their origins still remains limited. This paper proposes a non-stationary Poisson process model to give an analytical explanation of the non-universal scaling phenomenon: the exponent α varies between 1/2 and 1 depending on the size of the sampling time window and the relative strength of the external/internal driving forces of the systems. The crossover behaviour and the relation of fluctuation scaling with pseudo long range dependence are also accounted for by the model. Numerical experiments show that the proposed model can recover the multi-scaling phenomenon. (general)
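    A minimal simulation in the spirit of this scaling law (the node rates, window count and the form of the external modulation are assumptions, not the paper's specification) illustrates how the fitted exponent α moves from 1/2 toward 1 as a common external driving force is added:

    import numpy as np

    rng = np.random.default_rng(4)
    rates = np.logspace(0, 3, 200)          # mean activity of 200 nodes
    T = 2000                                # number of sampling windows

    def scaling_exponent(external_strength):
        # common modulation of all node rates mimics an external driving force
        modulation = 1.0 + external_strength * rng.uniform(-0.5, 0.5, size=(T, 1))
        counts = rng.poisson(modulation * rates)          # node activity per window
        mean, std = counts.mean(axis=0), counts.std(axis=0)
        alpha, _ = np.polyfit(np.log(mean), np.log(std), 1)
        return alpha

    print("internal only  :", round(scaling_exponent(0.0), 2))   # ~0.5 (pure Poisson)
    print("strong external:", round(scaling_exponent(1.0), 2))   # moves toward 1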

  12. Memory effects, two color percolation, and the temperature dependence of Mott variable-range hopping

    Science.gov (United States)

    Agam, Oded; Aleiner, Igor L.

    2014-06-01

    There are three basic processes that determine hopping transport: (a) hopping between normally empty sites (i.e., having exponentially small occupation numbers at equilibrium), (b) hopping between normally occupied sites, and (c) transitions between normally occupied and unoccupied sites. In conventional theories all these processes are considered Markovian and the correlations of occupation numbers of different sites are believed to be small (i.e., not exponential in temperature). We show that, contrary to this belief, memory effects suppress the processes of type (c) and manifest themselves in a subleading exponential temperature dependence of the variable-range hopping conductivity. This temperature dependence originates from the property that sites of type (a) and (b) form two independent resistor networks that are weakly coupled to each other by processes of type (c). This leads to a two-color percolation problem which we solve in the critical region.

  13. Generalized Efficient Inference on Factor Models with Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    A dynamic factor model is considered that contains stochastic time trends allowing for stationary and nonstationary long-range dependence. The model nests standard I(0) and I(1) behaviour smoothly in common factors and residuals, removing the necessity of a priori unit-root and stationarity testing. Short-memory dynamics are allowed in the common factor structure and possibly heteroskedastic error term. In the estimation, a generalized version of the principal components (PC) approach is proposed to achieve efficiency. Asymptotics for efficient common factor and factor loading as well as long......

  14. The watercolor effect: quantitative evidence for luminance-dependent mechanisms of long-range color assimilation.

    Science.gov (United States)

    Devinck, Frédéric; Delahunt, Peter B; Hardy, Joseph L; Spillmann, Lothar; Werner, John S

    2005-05-01

    When a dark chromatic contour delineating a figure is flanked on the inside by a brighter chromatic contour, the brighter color will spread into the entire enclosed area. This is known as the watercolor effect (WCE). Here we quantified the effect of color spreading using both color-matching and hue-cancellation tasks. Over a wide range of stimulus chromaticities, there was a reliable shift in color appearance that closely followed the direction of the inducing contour. When the contours were equated in luminance, the WCE was still present, but weak. The magnitude of the color spreading increased with increases in luminance contrast between the two contours. Additionally, as the luminance contrast between the contours increased, the chromaticity of the induced color more closely resembled that of the inside contour. The results support the hypothesis that the WCE is mediated by luminance-dependent mechanisms of long-range color assimilation.

  15. Short-range correlations in an extended time-dependent mean-field theory

    International Nuclear Information System (INIS)

    Madler, P.

    1982-01-01

    The time-dependent mean-field theory is generalized by an explicit inclusion of strong short-range correlations on a level of microscopic reversibility, relating them to realistic nucleon-nucleon forces. Invoking a least action principle for correlated trial wave functions, equations of motion for the correlation functions and the single-particle model wave function are derived in lowest order of the FAHT cluster expansion. Higher order effects as well as long-range correlations are considered only to the extent to which they contribute to the mean field via a readjusted phenomenological effective two-body interaction. The corresponding correlated stationary problem is investigated and appropriate initial conditions to describe a heavy ion reaction are proposed. The single-particle density matrix is evaluated

  16. A long range dependent model with nonlinear innovations for simulating daily river flows

    Directory of Open Access Journals (Sweden)

    P. Elek

    2004-01-01

    We present an analysis aimed at the estimation of flood risks of the Tisza River in Hungary on the basis of daily river discharge data registered in the last 100 years. The deseasonalised series has a skewed and leptokurtic distribution, and various methods suggest that it possesses substantial long memory. This motivates the attempt to fit a fractional ARIMA model with non-Gaussian innovations as a first step. Synthetic streamflow series can then be generated from the bootstrapped innovations. However, there remains a significant difference between the empirical and the synthetic density functions as well as the quantiles. This brings attention to the fact that the innovations are not independent: both their squares and absolute values are autocorrelated. Furthermore, the innovations display non-seasonal periods of high and low variances. This behaviour is characteristic of generalised autoregressive conditional heteroscedastic (GARCH) models. However, when the innovations are simulated as GARCH processes, the quantiles and extremes of the discharge series are heavily overestimated. We therefore suggest fitting a smooth transition GARCH process to the innovations. In a standard GARCH model the dependence of the variance on the lagged innovation is quadratic, whereas in our proposed model it is a bounded function. While preserving long memory and eliminating the correlation from both the generating noise and from its square, the new model is superior to the previously mentioned ones in approximating the probability density, the high quantiles and the extremal behaviour of the empirical river flows.
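    As a rough illustration of the long-memory step described above (a minimal sketch, not the authors' code; the toy series and the value of d are assumptions, and the smooth transition GARCH stage is omitted), the snippet below applies the fractional differencing operator (1 - B)^d to a deseasonalised series to extract the innovations that would then be modelled further:

    import numpy as np

    def frac_diff_weights(d, n):
        """Binomial weights of the fractional differencing operator (1 - B)^d."""
        w = np.empty(n)
        w[0] = 1.0
        for k in range(1, n):
            w[k] = -w[k - 1] * (d - k + 1) / k
        return w

    def frac_diff(x, d):
        """Apply (1 - B)^d to a series x (truncated binomial expansion)."""
        w = frac_diff_weights(d, len(x))
        return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

    # toy usage: a long-memory-looking series with d assumed to be about 0.3
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.standard_normal(500)) * 0.05 + rng.standard_normal(500)
    innovations = frac_diff(x, d=0.3)
    print(innovations[:5])

    In practice d would first be estimated from the data (for example by a log-periodogram regression) and the resulting innovations passed to a conditional-variance model.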

  17. Long Range Dependence Prognostics for Bearing Vibration Intensity Chaotic Time Series

    Directory of Open Access Journals (Sweden)

    Qing Li

    2016-01-01

    According to the chaotic features and typical fractional-order characteristics of bearing vibration intensity time series, a forecasting approach based on long range dependence (LRD) is proposed. In order to reveal the internal chaotic properties, the vibration intensity time series are reconstructed in phase space based on chaos theory: the delay time is computed with the C-C method, and the optimal embedding dimension and saturated correlation dimension are calculated via the Grassberger–Procaccia (G-P) method, so that the chaotic characteristics of the series can be jointly determined by the largest Lyapunov exponent and the phase plane trajectory; the largest Lyapunov exponent is calculated by the Wolf method and the phase plane trajectory is illustrated using the Duffing-Holmes Oscillator (DHO). The Hurst exponent and an LRD prediction method are used to verify the typical fractional-order features and to improve the prediction accuracy of the bearing vibration intensity time series, respectively. Experiments show that the vibration intensity time series have chaotic properties and that the LRD prediction method outperforms the other prediction methods (largest Lyapunov, autoregressive moving average (ARMA) and BP neural network (BPNN) models) in prediction accuracy and performance, which provides a new approach for running-tendency prediction of rotating machinery and offers some guidance for engineering practice.
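    The Hurst exponent mentioned above can be estimated in several ways; the sketch below (a generic rescaled-range implementation, not the method used in the paper; the window sizes and test series are arbitrary) returns the slope of log(R/S) against log(window size):

    import numpy as np

    def hurst_rs(x, min_chunk=8):
        """Estimate the Hurst exponent of a 1-D series via rescaled-range (R/S) analysis."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes = np.unique(np.floor(n / np.arange(2, n // min_chunk + 1)).astype(int))
        sizes = sizes[sizes >= min_chunk]
        log_s, log_rs = [], []
        for s in sizes:
            rs_vals = []
            for start in range(0, n - s + 1, s):
                seg = x[start:start + s]
                dev = np.cumsum(seg - seg.mean())
                r = dev.max() - dev.min()
                sd = seg.std(ddof=1)
                if sd > 0:
                    rs_vals.append(r / sd)
            if rs_vals:
                log_s.append(np.log(s))
                log_rs.append(np.log(np.mean(rs_vals)))
        # slope of log(R/S) versus log(window size) estimates the Hurst exponent
        return np.polyfit(log_s, log_rs, 1)[0]

    rng = np.random.default_rng(1)
    print(hurst_rs(rng.standard_normal(4096)))   # ~0.5 for white noise

    Values of H significantly above 0.5 indicate the kind of long range dependence the LRD predictor exploits.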

  18. Approximation for the Finite-Time Ruin Probability of a General Risk Model with Constant Interest Rate and Extended Negatively Dependent Heavy-Tailed Claims

    Directory of Open Access Journals (Sweden)

    Yang Yang

    2011-01-01

    We propose a general continuous-time risk model with a constant interest rate. In this model, claims arrive according to an arbitrary counting process, while their sizes have dominantly varying tails and fulfill an extended negative dependence structure. We obtain an asymptotic formula for the finite-time ruin probability, which extends a corresponding result of Wang (2008).

  19. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich

    1995-01-01

    A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps that are determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena, including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. The rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.
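    For reference, Omori's aftershock decay law referred to above is commonly written in its modified form n(t) = K/(c + t)^p; the toy evaluation below uses illustrative parameter values, not values from the simulations:

    import numpy as np

    K, c, p = 100.0, 0.1, 1.0                  # illustrative Omori parameters
    t = np.array([0.1, 1.0, 10.0, 100.0])      # days after the mainshock
    print(K / (c + t) ** p)                    # aftershock rate decaying roughly as 1/t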

  20. Delay-range-dependent exponential H∞ synchronization of a class of delayed neural networks

    International Nuclear Information System (INIS)

    Karimi, Hamid Reza; Maass, Peter

    2009-01-01

    This article aims to present a multiple delayed state-feedback control design for the exponential H ∞ synchronization problem of a class of delayed neural networks with multiple time-varying discrete delays. On the basis of the drive-response concept, and by introducing a descriptor technique and using a Lyapunov-Krasovskii functional, new delay-range-dependent sufficient conditions for exponential H ∞ synchronization of the drive-response structure of neural networks are derived in terms of linear matrix inequalities (LMIs). The explicit expressions of the controller gain matrices are parameterized based on the solvability conditions such that the drive system and the response system can be exponentially synchronized. A numerical example is included to illustrate the applicability of the proposed design method.

  1. Impact-parameter dependence of the total probability for electromagnetic electron-positron pair production in relativistic heavy-ion collisions

    International Nuclear Information System (INIS)

    Hencken, K.; Trautmann, D.; Baur, G.

    1995-01-01

    We calculate the impact-parameter-dependent total probability P_total(b) for the electromagnetic production of electron-positron pairs in relativistic heavy-ion collisions in lowest order. We study especially impact parameters smaller than the Compton wavelength of the electron, where the equivalent-photon approximation cannot be used. Calculations with and without a form factor for the heavy ions are done; the influence is found to be small. The lowest-order results are found to violate unitarity and are used for the calculation of multiple-pair production probabilities with the help of the approximate Poisson distribution already found in earlier publications.
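    The Poisson ansatz mentioned in the abstract treats the (unitarity-violating) lowest-order probability as the mean number of produced pairs; a small sketch (the value of P_total(b) is hypothetical) then gives the N-pair probabilities:

    import math

    def n_pair_probability(p_lowest_order, n_pairs):
        """Poisson ansatz: the lowest-order P(b) plays the role of the mean pair multiplicity."""
        mean = p_lowest_order
        return mean ** n_pairs * math.exp(-mean) / math.factorial(n_pairs)

    p_b = 1.6                                   # hypothetical lowest-order P_total(b) at small b
    for n in range(5):
        print(n, round(n_pair_probability(p_b, n), 4))
    print("probability of at least one pair:", round(1 - math.exp(-p_b), 4))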

  2. The temperature dependence of intermediate range oxygen-oxygen correlations in liquid water

    International Nuclear Information System (INIS)

    Schlesinger, Daniel; Pettersson, Lars G. M.; Wikfeldt, K. Thor; Skinner, Lawrie B.; Benmore, Chris J.; Nilsson, Anders

    2016-01-01

    We analyze the recent temperature dependent oxygen-oxygen pair-distribution functions from experimental high-precision x-ray diffraction data of bulk water by Skinner et al. [J. Chem. Phys. 141, 214507 (2014)] with particular focus on the intermediate range where small, but significant, correlations are found out to 17 Å. The second peak in the pair-distribution function at 4.5 Å is connected to tetrahedral coordination and was shown by Skinner et al. to change behavior with temperature below the temperature of minimum isothermal compressibility. Here we show that this is associated also with a peak growing at 11 Å which strongly indicates a collective character of fluctuations leading to the enhanced compressibility at lower temperatures. We note that the peak at ∼13.2 Å exhibits a temperature dependence similar to that of the density with a maximum close to 277 K or 4 °C. We analyze simulations of the TIP4P/2005 water model in the same manner and find excellent agreement between simulations and experiment albeit with a temperature shift of ∼20 K.

  3. The temperature dependence of intermediate range oxygen-oxygen correlations in liquid water

    Energy Technology Data Exchange (ETDEWEB)

    Schlesinger, Daniel; Pettersson, Lars G. M., E-mail: Lars.Pettersson@fysik.su.se [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Wikfeldt, K. Thor [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Science Institute, University of Iceland, VR-III, 107 Reykjavik (Iceland); Skinner, Lawrie B.; Benmore, Chris J. [X-ray Science Division, Advanced Photon Source, Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nilsson, Anders [Department of Physics, AlbaNova University Center, Stockholm University, SE-106 91 Stockholm (Sweden); Stanford Synchrotron Radiation Lightsource, SLAC National Accelerator Laboratory, Menlo Park, California 94025 (United States)

    2016-08-28

    We analyze the recent temperature dependent oxygen-oxygen pair-distribution functions from experimental high-precision x-ray diffraction data of bulk water by Skinner et al. [J. Chem. Phys. 141, 214507 (2014)] with particular focus on the intermediate range where small, but significant, correlations are found out to 17 Å. The second peak in the pair-distribution function at 4.5 Å is connected to tetrahedral coordination and was shown by Skinner et al. to change behavior with temperature below the temperature of minimum isothermal compressibility. Here we show that this is associated also with a peak growing at 11 Å which strongly indicates a collective character of fluctuations leading to the enhanced compressibility at lower temperatures. We note that the peak at ∼13.2 Å exhibits a temperature dependence similar to that of the density with a maximum close to 277 K or 4 °C. We analyze simulations of the TIP4P/2005 water model in the same manner and find excellent agreement between simulations and experiment albeit with a temperature shift of ∼20 K.

  4. Context-dependent JPEG backward-compatible high-dynamic range image compression

    Science.gov (United States)

    Korshunov, Pavel; Ebrahimi, Touradj

    2013-10-01

    High-dynamic range (HDR) imaging is expected, together with ultrahigh definition and high-frame-rate video, to become a technology that may change the photo, TV, and film industries. Many cameras and displays capable of capturing and rendering both HDR images and video are already available in the market. The popularity and full public adoption of HDR content are, however, hindered by the lack of standards for quality evaluation, file formats, and compression, as well as by the large legacy base of low-dynamic range (LDR) displays that are unable to render HDR. To facilitate the widespread use of HDR, backward compatibility of HDR with commonly used legacy technologies for storage, rendering, and compression of video and images is necessary. Although many tone-mapping algorithms have been developed for generating viewable LDR content from HDR, there is no consensus on which algorithm to use and under which conditions. Via a series of subjective evaluations, we demonstrate the dependence of the perceptual quality of tone-mapped LDR images on the context: environmental factors, display parameters, and the image content itself. Based on the results of the subjective tests, we propose to extend the JPEG file format, the most popular image format, in a backward-compatible manner to also accommodate HDR images. An architecture to achieve such backward compatibility with JPEG is proposed. A simple implementation of lossy compression demonstrates the efficiency of the proposed architecture compared with state-of-the-art HDR image compression.

  5. NEUTRON-PROTON EFFECTIVE RANGE PARAMETERS AND ZERO-ENERGY SHAPE DEPENDENCE.

    Energy Technology Data Exchange (ETDEWEB)

    HACKENBURG, R.W.

    2005-06-01

    A completely model-independent effective range theory fit to the available, unpolarized, np scattering data below 3 MeV determines the zero-energy free proton cross section σ_0 = 20.4287 ± 0.0078 b, the singlet apparent effective range r_s = 2.754 ± 0.018(stat) ± 0.056(syst) fm, and slightly improves the error on the parahydrogen coherent scattering length, a_c = -3.7406 ± 0.0010 fm. The triplet and singlet scattering lengths and the triplet mixed effective range are calculated to be a_t = 5.4114 ± 0.0015 fm, a_s = -23.7153 ± 0.0043 fm, and ρ_t(0,-ε_t) = 1.7468 ± 0.0019 fm. The model-independent analysis also determines the zero-energy effective ranges by treating them as separate fit parameters without the constraint from the deuteron binding energy ε_t. These are determined to be ρ_t(0,0) = 1.705 ± 0.023 fm and ρ_s(0,0) = 2.665 ± 0.056 fm. This determination of ρ_t(0,0) and ρ_s(0,0) is most sensitive to the sparse data between about 20 and 600 keV, where the correlation between the determined values of ρ_t(0,0) and ρ_s(0,0) is at a minimum. This correlation is responsible for the large systematic error in r_s. More precise data in this range are needed. The present data do not even determine (with confidence) that ρ_t(0,0) ≠ ρ_t(0,-ε_t), referred to here as "zero-energy shape dependence". The widely used measurement of σ_0 = 20.491 ± 0.014 b from W. Dilg, Phys. Rev. C 11, 103 (1975), is argued to be in error.
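    As a quick numerical cross-check of the quoted values (a sketch only; it assumes the standard zero-energy relations σ_0 = π(3a_t² + a_s²) and, for hydrogen, a_c = (3a_t + a_s)/2, the latter including the bound-to-free factor (A+1)/A = 2):

    import math

    a_t = 5.4114     # triplet np scattering length (fm), from the fit above
    a_s = -23.7153   # singlet np scattering length (fm), from the fit above

    sigma_0 = math.pi * (3 * a_t**2 + a_s**2)            # zero-energy cross section in fm^2
    print("sigma_0 =", round(sigma_0 / 100.0, 4), "b")   # 1 b = 100 fm^2, gives about 20.43 b

    a_c = (3 * a_t + a_s) / 2.0                          # parahydrogen coherent scattering length
    print("a_c =", round(a_c, 4), "fm")                  # about -3.74 fm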

  6. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    International Nuclear Information System (INIS)

    Lehua Pan; G.S. Bodvarsson

    2001-01-01

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions

  7. Quantifying the range of cross-correlated fluctuations using a q- L dependent AHXA coefficient

    Science.gov (United States)

    Wang, Fang; Wang, Lin; Chen, Yuming

    2018-03-01

    Recently, based on analogous height cross-correlation analysis (AHXA), a cross-correlation coefficient ρ×(L) has been proposed to quantify the level of cross-correlation on different temporal scales for bivariate series. A limitation of this coefficient is that it cannot capture the full information on how cross-correlations depend on the amplitude of the fluctuations. In fact, it only detects the cross-correlation at a specific fluctuation order, which might neglect important information carried by other orders. To overcome this disadvantage, in this work, based on the scaling of the qth-order covariance with the time delay L, we define a two-parameter cross-correlation coefficient ρq(L) to detect and quantify the range and level of cross-correlations. This new coefficient defines a ρq(L) surface, which not only quantifies the level of cross-correlations but also allows us to identify the range of fluctuation amplitudes that are correlated in two given signals. Applications to classical ARFIMA models and the binomial multifractal series illustrate the feasibility of the new coefficient ρq(L). In addition, a statistical test is proposed to assess the existence of cross-correlations between two given series. Applying our method to real-life empirical data from the 1999-2000 California electricity market, we find that the California power crisis in 2000 destroys the cross-correlation between the price and the load series but does not affect the correlation of the load series during and before the crisis.

  8. Long-range dependence in returns and volatility of global gold market amid financial crises

    Science.gov (United States)

    Omane-Adjepong, Maurice; Boako, Gideon

    2017-04-01

    Using sampled historical daily gold market data from 07-03-1985 to 06-01-2015, and building on a related work by Bentes (2016), this paper examines the presence of long-range dependence (LRD) in the world's gold market returns and volatility, accounting for structural breaks. The sampled gold market data were divided into subsamples based on four global crises: the September 1992 collapse of the European Exchange Rate Mechanism (ERM), the Asian financial crisis of mid-1997, the Subprime meltdown of 2007, and the recent European sovereign debt crisis, which hit the world's markets with varying effects. The LRD test was carried out on the full-sample and subsample periods using three semiparametric methods, before and after adjusting for structural breaks. The results show insignificant evidence of LRD in gold returns. However, very slight evidence is found for periods characterized by financial/economic shocks, with no significant detections for post-shock periods. Collectively, this indicates that the gold market is less speculative, and hence could be somewhat less risky for hedging and portfolio diversification.

  9. Searching for long-range dependence in real effective exchange rate: towards parity?

    Directory of Open Access Journals (Sweden)

    André M. Marques

    2015-12-01

    After the widespread adoption of the flexible exchange rate regime in 1973, the volatility of exchange rates has increased as a consequence of greater trade openness and financial integration. As a result, it has become difficult to find evidence for the purchasing power parity (PPP) hypothesis. This study investigates the possibility of a fall in the persistence of the real exchange rate as a consequence of financial and commercial integration, employing the monthly real effective exchange rate dataset provided by the International Monetary Fund (IMF). Beginning with an exploratory data analysis in the frequency domain, the fractional coefficient d was estimated with a bias-reduced estimator on a sample of 20 countries over the period from 1975 to 2011. As its main novelty, this study applies a bias-reduced log-periodogram regression estimator instead of the traditional GPH method; the estimator eliminates the first- and higher-order biases through a data-dependent plug-in method for selecting the number of frequencies so as to minimize the asymptotic mean-squared error (MSE). Additionally, the study estimates the model over a moving window of fifteen years to observe the path of the fractional coefficient in each country. No evidence was found of a statistically significant change in the persistence of the real exchange rate.
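    The classical GPH log-periodogram regression that the bias-reduced estimator refines can be sketched in a few lines (a baseline illustration only, not the bias-reduced estimator of the paper; the bandwidth rule and the white-noise test series are arbitrary choices):

    import numpy as np

    def gph_estimate(x, power=0.5):
        """Classical GPH log-periodogram estimate of the fractional parameter d."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        m = int(n ** power)                              # number of low frequencies used
        freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
        fft = np.fft.fft(x - x.mean())
        periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2.0 * np.pi * n)
        regressor = -2.0 * np.log(2.0 * np.sin(freqs / 2.0))
        slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
        return slope                                     # estimate of d

    rng = np.random.default_rng(2)
    print(gph_estimate(rng.standard_normal(2048)))       # ~0 for white noise

    For a stationary long-memory series the estimate of d relates to the Hurst exponent through H = d + 1/2.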

  10. Protection against post-irradiation oxygen-dependent damage in barley seeds by catalase and hydrogen peroxide: probable radiation chemistry

    International Nuclear Information System (INIS)

    Singh, S.P.; Kesavan, P.C.

    1990-01-01

    The influence of varying concentrations of catalase and H2O2, administered individually and in combination during post-hydration, on the oxygen-dependent and -independent pathways of damage was assessed in dry barley seeds irradiated in vacuo with 350 Gy of 60Co gamma rays. Both catalase (100 to 500 units/ml) and H2O2 (0.001 to 0.1 mM) afforded significant radioprotection against the post-irradiation O2-dependent damage. However, a combination treatment (300 units/ml of catalase and 0.01 mM of H2O2) afforded significantly more protection than either of the additives individually. None of the concentrations of catalase exerted any effect on the O2-independent pathway, whereas H2O2 at higher concentrations (1 and 10 mM) significantly potentiated both the O2-dependent and the -independent components of radiation damage. These observations are better explicable in terms of radiation chemistry. (author). 16 refs., 3 tabs

  11. Time-dependent fracture probability of bilayer, lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation

    Science.gov (United States)

    Anusavice, Kenneth J.; Jadaan, Osama M.; Esquivel–Upshaw, Josephine

    2013-01-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. Objective: The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Materials and methods: Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2) and ceramic veneer (Empress 2 Veneer Ceramic). Results: Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). Conclusion: CARES/Life results support the proposed crown design and load orientation hypotheses. Significance: The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. PMID:24060349

  12. Time-dependent fracture probability of bilayer, lithium-disilicate-based, glass-ceramic, molar crowns as a function of core/veneer thickness ratio and load orientation.

    Science.gov (United States)

    Anusavice, Kenneth J; Jadaan, Osama M; Esquivel-Upshaw, Josephine F

    2013-11-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). CARES/Life results support the proposed crown design and load orientation hypotheses. The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. Copyright © 2013 Academy of Dental Materials. All rights reserved.

  13. Membrane topology of Golgi-localized probable S-adenosylmethionine-dependent methyltransferase in tobacco (Nicotiana tabacum) BY-2 cells.

    Science.gov (United States)

    Liu, Jianping; Hayashi, Kyoko; Matsuoka, Ken

    2015-01-01

    S-adenosylmethionine (SAM)-dependent methyltransferases (MTases) transfer methyl groups to substrates. In this study, a novel putative tobacco SAM-MTase termed Golgi-localized methyl transferase 1 (GLMT1) has been characterized. GLMT1 is comprised of 611 amino acids with short N-terminal region, putative transmembrane region, and C-terminal SAM-MTase domain. Expression of monomeric red fluorescence protein (mRFP)-tagged protein in tobacco BY-2 cell indicated that GLMT1 is a Golgi-localized protein. Analysis of the membrane topology by protease digestion suggested that both C-terminal catalytic region and N-terminal region seem to be located to the cytosolic side of the Golgi apparatus. Therefore, GLMT1 might have a different function than the previously studied SAM-MTases in plants.

  14. Ethanolic extract of Aconiti Brachypodi Radix attenuates nociceptive pain probably via inhibition of voltage-dependent Na⁺ channel.

    Science.gov (United States)

    Ren, Wei; Yuan, Lin; Li, Jun; Huang, Xian-Ju; Chen, Su; Zou, Da-Jiang; Liu, Xiangming; Yang, Xin-Zhou

    2012-01-01

    Aconiti Brachypodi Radix, belonging to the genus Aconitum (family Ranunculaceae), is used clinically as an anti-rheumatic, anti-inflammatory and anti-nociceptive agent in traditional Chinese medicine. However, its mechanism and influence on the nociceptive threshold are unknown and need further investigation. The analgesic effects of an ethanolic extract of Aconiti Brachypodi Radix (EABR) were therefore studied in vivo and in vitro. Three pain models in mice were used to assess the effect of EABR on the nociceptive threshold. An in vitro study was conducted to clarify the modulation by the extract of the tetrodotoxin-sensitive (TTX-S) sodium currents in rat dorsal root ganglion (DRG) neurons using the whole-cell patch clamp technique. The results showed that EABR (5-20 mg/kg, i.g.) produced a dose-dependent analgesic effect in hot-plate tests as well as in the writhing response induced by acetic acid. In addition, administration of 2.5-10 mg/kg EABR (i.g.) caused a significant decrease in pain responses in the first and second phases of the formalin test without altering the PGE₂ production in the hind paw of the mice. Moreover, EABR (10 µg/ml - 1 mg/ml) suppressed TTX-S voltage-gated sodium currents in a dose-dependent way, indicating the underlying electrophysiological mechanism of the analgesic effect of this folk plant medicine. Collectively, our results indicate that EABR has analgesic properties in three pain models and an inhibitory effect on TTX-S sodium currents in DRG neurons, suggesting that interference with pain messages through the modulation of TTX-S sodium currents in DRG neurons may explain some of its analgesic effect.

  15. Conductance of partially disordered graphene: crossover from temperature-dependent to field-dependent variable-range hopping

    International Nuclear Information System (INIS)

    Cheah, C Y; Jaurigue, L C; Kaiser, A B; Gómez-Navarro, C

    2013-01-01

    We report an analysis of low-temperature measurements of the conductance of partially disordered reduced graphene oxide, finding that the data follow a simple crossover scenario. At room temperature, the conductance is dominated by two-dimensional (2D) electric-field-assisted, thermally driven (Pollak–Riess) variable-range hopping (VRH) through highly disordered regions. However, at lower temperatures T, we find a smooth crossover to the exp[−(E_0/E)^(1/3)] field-driven (Shklovskii) 2D VRH conductance behaviour when the electric field E exceeds a specific crossover value E_C(T)^(2D) = (E_a E_0^(1/3)/3)^(3/4), determined by the scale factors E_0 and E_a for the high-field and intermediate-field regimes, respectively. Our crossover scenario also accounts well for experimental data reported by other authors for three-dimensional disordered carbon networks, suggesting wide applicability. (paper)
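    A minimal numerical sketch of the crossover condition quoted above (the field scales E_0 and E_a are placeholder values, not taken from the paper):

    import numpy as np

    # placeholder scale factors (V/m) for the high-field (E_0) and intermediate-field (E_a) regimes
    E_0 = 5.0e6
    E_a = 2.0e4

    # crossover field from the abstract: E_C(T)^2D = (E_a * E_0**(1/3) / 3)**(3/4)
    E_C = (E_a * E_0 ** (1.0 / 3.0) / 3.0) ** (3.0 / 4.0)
    print(f"crossover field E_C ~ {E_C:.3g} V/m")

    # field-driven (Shklovskii) 2D VRH factor, relevant for fields above E_C
    E = np.logspace(np.log10(E_C), np.log10(E_C) + 2, 5)
    print(np.exp(-(E_0 / E) ** (1.0 / 3.0)))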

  16. Extended parametric gain range in photonic crystal fibers with strongly frequency-dependent field distributions.

    Science.gov (United States)

    Petersen, Sidsel R; Alkeskjold, Thomas T; Olausson, Christina B; Lægsgaard, Jesper

    2014-08-15

    The parametric gain range of a degenerate four-wave mixing process is determined in the undepleted pump regime. The gain range is considered with and without taking the mode field distributions of the four-wave mixing components into account. It is found that the mode field distributions have to be included to evaluate the parametric gain correctly in dispersion-tailored speciality fibers and that mode profile engineering can provide a way to increase the parametric gain range.

  17. Dependability investigation of wireless short range embedded systems: hardware platform oriented approach

    NARCIS (Netherlands)

    Senouci, B.; Kerkhoff, Hans G.; Annema, Anne J.; Bentum, Marinus Jan

    2015-01-01

    A new direction in short-range wireless applications has appeared in the form of high-speed data communication devices for distances of hundreds of meters. Behind these embedded applications, a complex heterogeneous architecture is built. Moreover, these short range communications are introduced into

  18. Extended parametric gain range in photonic crystal fibers with strongly frequency-dependent field distributions

    DEFF Research Database (Denmark)

    Petersen, Sidsel Rübner; Alkeskjold, Thomas Tanggaard; Olausson, Christina Bjarnal Thulin

    2014-01-01

    The parametric gain range of a degenerate four-wave mixing process is determined in the undepleted pump regime. The gain range is considered with and without taking the mode field distributions of the four-wave mixing components into account. It is found that the mode field distributions have...

  19. Range dependent characteristics in the head-related transfer functions of a bat-head cast: part 2. Binaural characteristics

    International Nuclear Information System (INIS)

    Kim, S; Allen, R; Rowan, D

    2012-01-01

    Further innovations in bio-inspired engineering based on biosonar systems, such as bats, may arise from more detailed understanding of the underlying acoustic processes. This includes the range-dependent properties of bat heads and ears, particularly at the higher frequencies of bat vocalizations. In a companion paper (Kim et al 2012 Bioinspir. Biomim.), range-dependent head-related transfer functions of a bat head cast were investigated up to 100 kHz at either ear (i.e. monaural features). The current paper extends this to consider range-dependent spectral and temporal disparities between the two ears (i.e. binaural features), using experimental data and a spherical model of a bat head to provide insights into the physical basis for these features. It was found that binaural temporal and high-frequency binaural spectral features are approximately independent of distance, having the effect of decreasing their angular resolution at close range. In contrast, low-frequency binaural spectral features are strongly distance-dependent, such that angular sensitivity can be maintained by lowering the frequency of the echolocation emission at close range. Together with the companion paper by Kim et al, we speculate that distance-dependent low-frequency monaural and binaural features at short range might help explain why some species of bats drop the frequency of their calls while approaching a target. This also provides an impetus for the design of effective emissions in sonar engineering applied to similar tasks. (paper)

  20. Effect of spin-orbit coupling on the wave vector and spin dependent transmission probability for the GaN/AlGaN/GaN heterostructure

    International Nuclear Information System (INIS)

    Li, M; Zhao, Z B; Fan, L B

    2015-01-01

    The effect of the Rashba and Dresselhaus spin–orbit coupling (SOC) on the transmission of electrons through the GaN/AlGaN/GaN heterostructure is studied. It is found that the Dresselhaus SOC causes an evident dependence of the transmission probability on the spin polarization and the in-plane wave vector of the electrons, and also induces evident spin splitting of the resonant peaks in the (E_z-k) plane. Because the magnitude of the Rashba SOC is relatively small, its effect on the transmission of electrons is much smaller. As k increases, the peaks of the transmission probability for spin-up electrons (T+) shift to a higher energy region and increase in magnitude, while the peaks of the transmission probability for spin-down electrons (T−) shift to a lower energy region and decrease in magnitude. The polarization efficiency (P) is found to peak at the resonant energies and increases with the in-plane wave vector. Moreover, the built-in electric field caused by the spontaneous and piezoelectric polarization can increase the amplitude of P. The results obtained here are helpful for efficient spin injection into III-nitride heterostructures by nonmagnetic means from the device point of view. (paper)

  1. Temperature Dependence of Short-Range Order in β-Brass

    DEFF Research Database (Denmark)

    Dietrich, O.W.; Als-Nielsen, Jens Aage

    1967-01-01

    Critical scattering of neutrons around the superlattice reflections (1, 0, 0) and (1, 1, 1) from a single crystal of beta-brass has been measured at temperatures from 2 to 25 °C above the transition temperature. The temperature dependence of the critical peak intensity, proportional to the susc......

  2. Dependence of four-body observables on the range of UPA-like effective interactions

    International Nuclear Information System (INIS)

    Perne, R.; Sandhas, W.

    1977-07-01

    A generalized unitary pole approximation (UPA) concerning the three-body amplitudes in the kernel of four-body integral equations is introduced. We furthermore study the dependence of the 4He binding energy and of four-body cross sections upon a position-space cut-off parameter in the effective interactions. (orig.)

  3. Development of a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence after curative radiotherapy/chemo-radiotherapy in head and neck cancer

    International Nuclear Information System (INIS)

    Wopken, Kim; Bijl, Hendrik P.; Schaaf, Arjen van der; Laan, Hans Paul van der; Chouvalova, Olga; Steenbakkers, Roel J.H.M.; Doornaert, Patricia; Slotman, Ben J.; Oosting, Sjoukje F.; Christianen, Miranda E.M.C.; Laan, Bernard F.A.M. van der; Roodenburg, Jan L.N.; René Leemans, C.; Verdonck-de Leeuw, Irma M.; Langendijk, Johannes A.

    2014-01-01

    Background and purpose: Curative radiotherapy/chemo-radiotherapy for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence 6 months (TUBE M6 ) after definitive radiotherapy, radiotherapy plus cetuximab or concurrent chemoradiation based on pre-treatment and treatment characteristics. Materials and methods: The study included 355 patients with HNC. TUBE M6 was scored prospectively in a standard follow-up program. To design the prediction model, the penalized learning method LASSO was used, with TUBE M6 as the endpoint. Results: The prevalence of TUBE M6 was 10.7%. The multivariable model with the best performance consisted of the variables: advanced T-stage, moderate to severe weight loss at baseline, accelerated radiotherapy, chemoradiation, radiotherapy plus cetuximab, the mean dose to the superior and inferior pharyngeal constrictor muscle, to the contralateral parotid gland and to the cricopharyngeal muscle. Conclusions: We developed a multivariable NTCP model for TUBE M6 to identify patients at risk for tube feeding dependence. The dosimetric variables can be used to optimize radiotherapy treatment planning aiming at prevention of tube feeding dependence and to estimate the benefit of new radiation technologies

  4. Dependence of conductivity on thickness within the variable-range hopping regime for Coulomb glasses

    Directory of Open Access Journals (Sweden)

    M. Caravaca

    In this paper, we provide some computational evidence concerning the dependence of conductivity on the system thickness for Coulomb glasses. We also verify the Efros–Shklovskii law and deal with the calculation of its characteristic parameter as a function of the thickness. Our results strengthen the link between theoretical and experimental fields. Keywords: Coulomb glass, Conductivity, Density of states, Efros–Shklovskii law

  5. Evidence of long range dependence in Asian equity markets: the role of liquidity and market restrictions

    Science.gov (United States)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2004-11-01

    In this paper, the efficient market hypothesis is tested for China, Hong Kong and Singapore by means of the long-memory dependence approach. We find evidence suggesting that Hong Kong is the most efficient market, followed by Chinese A-type shares and Singapore, and finally by Chinese B-type shares, indicating that liquidity and capital restrictions may play a role in explaining the results of market efficiency tests.

  6. Range-separated time-dependent density-functional theory with a frequency-dependent second-order Bethe-Salpeter correlation kernel

    Energy Technology Data Exchange (ETDEWEB)

    Rebolini, Elisa, E-mail: elisa.rebolini@kjemi.uio.no; Toulouse, Julien, E-mail: julien.toulouse@upmc.fr [Laboratoire de Chimie Théorique, Sorbonne Universités, UPMC Univ Paris 06, CNRS, 4 place Jussieu, F-75005 Paris (France)

    2016-03-07

    We present a range-separated linear-response time-dependent density-functional theory (TDDFT) which combines a density-functional approximation for the short-range response kernel and a frequency-dependent second-order Bethe-Salpeter approximation for the long-range response kernel. This approach goes beyond the adiabatic approximation usually used in linear-response TDDFT and aims at improving the accuracy of calculations of electronic excitation energies of molecular systems. A detailed derivation of the frequency-dependent second-order Bethe-Salpeter correlation kernel is given using many-body Green-function theory. Preliminary tests of this range-separated TDDFT method are presented for the calculation of excitation energies of the He and Be atoms and small molecules (H2, N2, CO2, H2CO, and C2H4). The results suggest that the addition of the long-range second-order Bethe-Salpeter correlation kernel overall slightly improves the excitation energies.

  7. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  8. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  9. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  10. Temperature dependence of thermal expansion of cadmium sulfide in the temperature range 20 - 820 K

    International Nuclear Information System (INIS)

    Oskotskij, V.S.; Kobyakov, I.B.; Solodukhin, A.V.

    1980-01-01

    The linear thermal expansion of cadmium sulfide was measured perpendicular (α1) and parallel (α3) to the hexagonal axis in the temperature range from 20 to 820 K. The anisotropy is low up to 80 K and rises at higher temperatures; at 300 K the α1/α3 ratio is 1.8, and at 820 K it is 2.4. The thermal expansion is negative at temperatures below 104.5 K (α1) and 126.0 K (α3), and reaches its minimum at 43.6 K (α1) and 52.5 K (α3). The theory of thermal expansion is developed in the Debye approximation, with cadmium sulfide treated as an isotropic crystal with averaged elastic constants. Two parameters of the theory are determined by the position and value of the minimum of the volumetric thermal expansion of the model isotropic crystal. The theoretical curve agrees well with the experimental one at temperatures up to 160 K, i.e. in the range of applicability of the Debye approximation and the isotropic model

  11. AUDIT-C scores as a scaled marker of mean daily drinking, alcohol use disorder severity, and probability of alcohol dependence in a U.S. general population sample of drinkers.

    Science.gov (United States)

    Rubinsky, Anna D; Dawson, Deborah A; Williams, Emily C; Kivlahan, Daniel R; Bradley, Katharine A

    2013-08-01

Brief alcohol screening questionnaires are increasingly used to identify alcohol misuse in routine care, but clinicians also need to assess the level of consumption and the severity of misuse so that appropriate intervention can be offered. Information provided by a patient's alcohol screening score might provide a practical tool for assessing the level of consumption and severity of misuse. This post hoc analysis of data from the 2001 to 2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) included 26,546 U.S. adults who reported drinking in the past year and answered additional questions about their consumption, including the Alcohol Use Disorders Identification Test-Consumption questionnaire (AUDIT-C) alcohol screening. Linear or logistic regression models and postestimation methods were used to estimate mean daily drinking, the number of endorsed alcohol use disorder (AUD) criteria ("AUD severity"), and the probability of alcohol dependence associated with each individual AUDIT-C score (1 to 12), after testing for effect modification by gender and age. Among eligible past-year drinkers, mean daily drinking, AUD severity, and the probability of alcohol dependence increased exponentially across increasing AUDIT-C scores. Mean daily drinking and the probability of alcohol dependence varied widely across scores, and the resulting estimates could be used to estimate patient-specific consumption and severity based on age, gender, and alcohol screening score. This information could be integrated into electronic decision support systems to help providers estimate and provide feedback about patient-specific risks and identify those patients most likely to benefit from further diagnostic assessment. Copyright © 2013 by the Research Society on Alcoholism.

  12. Mass dependence of short-range correlations in nuclei and the EMC effect

    Directory of Open Access Journals (Sweden)

    Cosyn Wim

    2014-03-01

We sketch an approximate method to quantify the number of correlated pairs in any nucleus A. It is based on counting independent-particle model (IPM) nucleon-nucleon pairs in a relative S-state with no radial excitation. We show that IPM pairs with those quantum numbers are most prone to short-range correlations and are at the origin of the high-momentum tail of the nuclear momentum distributions. Our method allows us to compute the a2 ratios extracted from inclusive electron scattering. Furthermore, our results reproduce the observed linear correlation between the number of correlated pairs and the magnitude of the EMC effect. We show that the width of the pair center-of-mass distribution in exclusive two-nucleon knockout yields information on the quantum numbers of the pairs.

  13. The dependence of the nuclear charge form factor on short range correlations and surface fluctuation effects

    International Nuclear Information System (INIS)

    Massen, S. E.; Garistov, V. P.; Grypeos, M. E.

    1996-01-01

The effects of nuclear surface fluctuations on the harmonic-oscillator elastic charge form factor of light nuclei are investigated, with the short-range correlations simultaneously approximated through a Jastrow correlation factor. Inclusion of the surface fluctuation effects within this description, by truncating the cluster expansion at the two-body part, is found to improve somewhat the fit to the elastic charge form factor of {sup 16}O and {sup 40}Ca. However, the convergence of the cluster expansion is expected to deteriorate. An additional finding is that surface-fluctuation correlations produce a drastic change in the asymptotic behaviour of the point-proton form factor, which now falls off quite slowly (i.e. as const × q{sup -4}) at large values of the momentum transfer q

  14. Communication: Anomalous temperature dependence of the intermediate range order in phosphonium ionic liquids

    International Nuclear Information System (INIS)

    Hettige, Jeevapani J.; Kashyap, Hemant K.; Margulis, Claudio J.

    2014-01-01

In a recent article by the Castner and Margulis groups [Faraday Discuss. 154, 133 (2012)], we described in detail the structure of the tetradecyltrihexylphosphonium bis(trifluoromethylsulfonyl)amide ionic liquid as a function of temperature using X-ray scattering and theoretical partitions of the computationally derived structure function. Interestingly, and in contrast to most other ionic liquids, the first sharp diffraction peak or prepeak appears to increase in intensity as the temperature is increased. This phenomenon is counterintuitive, as one would expect intermediate range order to fade as temperature increases. This Communication shows that a loss of hydrophobic tail organization at higher temperatures is counterbalanced by better organization of the polar components, giving rise to the increase in intensity of the prepeak

  15. Dependence of Coulomb Sum Rule on the Short Range Correlation by Using Av18 Potential

    Science.gov (United States)

    Modarres, M.; Moeini, H.; Moshfegh, H. R.

The Coulomb sum rule (CSR) and structure factor are calculated for inelastic electron scattering from nuclear matter at zero and finite temperature in the nonrelativistic limit. The effect of short-range correlations (SRC) is included by using the lowest-order constrained variational (LOCV) method and the Argonne Av18 and Δ-Reid soft-core potentials. The effects of different potentials as well as of temperature are investigated. It is found that the nonrelativistic version of Bjorken scaling approximately sets in at a momentum transfer of about 1.1 to 1.2 GeV/c, and that an increase in temperature causes it to decrease. While different potentials do not significantly change the CSR, the SRC improves the Coulomb sum rule, and we obtain results reasonably close to both the experimental data and other theoretical predictions.

  16. Generalized Cauchy model of sea level fluctuations with long-range dependence

    Science.gov (United States)

    Li, Ming; Li, Jia-Yue

    2017-10-01

This article offers two main contributions. The first is to propose a novel model of sea level fluctuations (sea level for short), called the generalized Cauchy (GC) process. It provides a new outlook for the description of the local and global behaviors of sea level from the viewpoint of fractals, in that the fractal dimension D, which measures the local behavior of sea level, and the Hurst parameter H, which characterizes its global behavior, are independent of each other. The second is to show that sea level appears multi-fractal in both space and time. Such a meaning of multi-fractal is new in the sense that the pair of fractal parameters (D, H) of sea level varies with measurement site and time. This research exhibits that the ranges of D and H of sea level, in general, are 1 ≤ D sea level, we shall show that H > 0.96 for all data records at all measurement sites, implying that strong LRD may be a general phenomenon of sea level. On the other hand, regarding the local behavior, we reveal that D = 1 or D ≈ 1 for data records at a few stations and at some times, but D > 0.96 at most stations and at most times, meaning that sea level may appear highly locally irregular more frequently than weakly so.
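A small sketch of the GC autocorrelation structure may help make the decoupling of D and H concrete. It assumes the commonly used parametrization r(τ) = (1 + |τ|^α)^(−β/α) with α = 2(2 − D) and β = 2(1 − H); the paper's own estimation procedure for sea level data is not reproduced here.

```python
import numpy as np

def gc_autocorrelation(lags, D, H):
    """Autocorrelation of a generalized Cauchy (GC) process.

    Assumes the common parametrization r(tau) = (1 + |tau|**alpha)**(-beta/alpha)
    with alpha = 2*(2 - D) controlling the fractal dimension D (local roughness)
    and beta = 2*(1 - H) controlling the Hurst parameter H (long-range dependence).
    D and H enter through separate parameters, so they can be chosen independently.
    """
    alpha = 2.0 * (2.0 - D)   # 0 < alpha <= 2  requires 1 <= D < 2
    beta = 2.0 * (1.0 - H)    # 0 < beta < 1    gives LRD, i.e. H > 0.5
    lags = np.abs(np.asarray(lags, dtype=float))
    return (1.0 + lags**alpha) ** (-beta / alpha)

# Example: strong LRD (H = 0.97) combined with near-smooth local behaviour (D = 1.05).
print(gc_autocorrelation([1, 10, 100, 1000], D=1.05, H=0.97))
```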

  17. Experimental validation of gallium production and isotope-dependent positron range correction in PET

    Energy Technology Data Exchange (ETDEWEB)

    Fraile, L.M., E-mail: lmfraile@ucm.es [Grupo de Física Nuclear, Dpto. Física Atómica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Herraiz, J.L.; Udías, J.M.; Cal-González, J.; Corzo, P.M.G.; España, S.; Herranz, E.; Pérez-Liva, M.; Picado, E.; Vicente, E. [Grupo de Física Nuclear, Dpto. Física Atómica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Muñoz-Martín, A. [Centro de Microanálisis de Materiales, Universidad Autónoma de Madrid, E-28049 Madrid (Spain); Vaquero, J.J. [Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid (Spain)

    2016-04-01

    Positron range (PR) is one of the important factors that limit the spatial resolution of positron emission tomography (PET) preclinical images. Its blurring effect can be corrected to a large extent if the appropriate method is used during the image reconstruction. Nevertheless, this correction requires an accurate modelling of the PR for the particular radionuclide and materials in the sample under study. In this work we investigate PET imaging with {sup 68}Ga and {sup 66}Ga radioisotopes, which have a large PR and are being used in many preclinical and clinical PET studies. We produced a {sup 68}Ga and {sup 66}Ga phantom on a natural zinc target through (p,n) reactions using the 9-MeV proton beam delivered by the 5-MV CMAM tandetron accelerator. The phantom was imaged in an ARGUS small animal PET/CT scanner and reconstructed with a fully 3D iterative algorithm, with and without PR corrections. The reconstructed images at different time frames show significant improvement in spatial resolution when the appropriate PR is applied for each frame, by taking into account the relative amount of each isotope in the sample. With these results we validate our previously proposed PR correction method for isotopes with large PR. Additionally, we explore the feasibility of PET imaging with {sup 68}Ga and {sup 66}Ga radioisotopes in proton therapy.

  18. Metabolomic unveiling of a diverse range of green tea (Camellia sinensis) metabolites dependent on geography.

    Science.gov (United States)

    Lee, Jang-Eun; Lee, Bum-Jin; Chung, Jin-Oh; Kim, Hak-Nam; Kim, Eun-Hee; Jung, Sungheuk; Lee, Hyosang; Lee, Sang-Jun; Hong, Young-Shick

    2015-05-01

    Numerous factors such as geographical origin, cultivar, climate, cultural practices, and manufacturing processes influence the chemical compositions of tea, in the same way as growing conditions and grape variety affect wine quality. However, the relationships between these factors and tea chemical compositions are not well understood. In this study, a new approach for non-targeted or global analysis, i.e., metabolomics, which is highly reproducible and statistically effective in analysing a diverse range of compounds, was used to better understand the metabolome of Camellia sinensis and determine the influence of environmental factors, including geography, climate, and cultural practices, on tea-making. We found a strong correlation between environmental factors and the metabolome of green, white, and oolong teas from China, Japan, and South Korea. In particular, multivariate statistical analysis revealed strong inter-country and inter-city relationships in the levels of theanine and catechin derivatives found in green and white teas. This information might be useful for assessing tea quality or producing distinct tea products across different locations, and highlights simultaneous identification of diverse tea metabolites through an NMR-based metabolomics approach. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  20. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  1. Communication: Orbital instabilities and triplet states from time-dependent density functional theory and long-range corrected functionals

    Science.gov (United States)

    Sears, John S.; Koerzdoerfer, Thomas; Zhang, Cai-Rong; Brédas, Jean-Luc

    2011-10-01

    Long-range corrected hybrids represent an increasingly popular class of functionals for density functional theory (DFT) that have proven to be very successful for a wide range of chemical applications. In this Communication, we examine the performance of these functionals for time-dependent (TD)DFT descriptions of triplet excited states. Our results reveal that the triplet energies are particularly sensitive to the range-separation parameter; this sensitivity can be traced back to triplet instabilities in the ground state coming from the large effective amounts of Hartree-Fock exchange included in these functionals. As such, the use of standard long-range corrected functionals for the description of triplet states at the TDDFT level is not recommended.

  2. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  3. Range dependent characteristics in the head-related transfer functions of a bat-head cast: part 1. Monaural characteristics

    International Nuclear Information System (INIS)

    Kim, S; Allen, R; Rowan, D

    2012-01-01

Knowledge of biological sonar systems has revolutionized many aspects of sonar engineering, and further advances will benefit from a more detailed understanding of their underlying acoustical processes. The anatomically diverse, complex and dynamic heads and ears of bats are known to be important for echolocation, although their range-dependent properties are not well understood, particularly across the wide frequency range of some bats' vocalizations. The aim of this and a companion paper, Kim et al (2012 Bioinspir. Biomim.), is to investigate bat-head acoustics as a function of bat-target distance, based on measurements up to 100 kHz and a more robust examination of hardware characteristics than previously reported, using a cast of a bat head. In this first paper, we consider the spectral features at either ear (i.e. the monaural head-related transfer functions). The results show, for example, that at relatively low frequencies there is both a higher magnitude and a stronger effect of distance at close range. This might explain, at least in part, why bats adopt a strategy of changing the frequency range of their vocalizations while approaching a target. There is also a potential advantage, in the design of bio-inspired receivers, in using range-dependent HRTFs and exploiting their distinct frequency characteristics over distance. (paper)

  4. Electron transport in furfural: dependence of the electron ranges on the cross sections and the energy loss distribution functions

    Science.gov (United States)

Ellis-Gibbings, L.; Krupa, K.; Colmenares, R.; Blanco, F.; Muñoz, A.; Mendes, M.; Ferreira da Silva, F.; Limão-Vieira, P.; Jones, D. B.; Brunger, M. J.; García, G.

    2016-09-01

    Recent theoretical and experimental studies have provided a complete set of differential and integral electron scattering cross section data from furfural over a broad energy range. The energy loss distribution functions have been determined in this study by averaging electron energy loss spectra for different incident energies and scattering angles. All these data have been used as input parameters for an event by event Monte Carlo simulation procedure to obtain the electron energy deposition patterns and electron ranges in liquid furfural. The dependence of these results on the input cross sections is then analysed to determine the uncertainty of the simulated values.
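To illustrate what an event-by-event Monte Carlo of electron transport does with cross-section and energy-loss inputs, a deliberately simplified toy sketch follows; the cross section, branching fraction, energy-loss sampler and number density below are placeholders, not the furfural data or the simulation code used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_electron_range(e0_eV, sigma_total, p_inelastic, sample_energy_loss,
                       n_density, e_cut_eV=10.0):
    """Toy event-by-event Monte Carlo track length for one electron.

    All physics inputs are placeholders: sigma_total(E) is a total cross section
    in cm^2, p_inelastic(E) the inelastic branching fraction, and
    sample_energy_loss(E) draws a loss in eV from an energy-loss distribution.
    """
    E, path = e0_eV, 0.0
    while E > e_cut_eV:
        mfp = 1.0 / (n_density * sigma_total(E))   # mean free path [cm]
        path += rng.exponential(mfp)               # free flight to the next event
        if rng.random() < p_inelastic(E):          # inelastic event: lose energy
            E -= sample_energy_loss(E)
        # elastic events change direction only; ignored in this 1-D toy model
    return path

# Hypothetical inputs, for illustration only.
ranges = [toy_electron_range(
              1000.0,
              sigma_total=lambda E: 1e-16,                         # constant 1e-16 cm^2
              p_inelastic=lambda E: 0.5,
              sample_energy_loss=lambda E: rng.uniform(5.0, 50.0),
              n_density=3e21)                                      # molecules per cm^3
          for _ in range(2000)]
print(np.mean(ranges), np.percentile(ranges, 95))
```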

  5. Temperature dependence of muonium spin exchange with O2 in the range 88 K to 478 K

    International Nuclear Information System (INIS)

    Senba, M.; Garner, D.M.; Arseneau, D.J.; Fleming, D.G.

    1984-01-01

The authors have extended an earlier study of the spin exchange reactions of Mu with O{sub 2} in the range 295 K to 478 K to a low-temperature region down to 88 K. From 135 K to 296 K, the spin depolarization rate constant was found to vary according to the relative velocity of the colliding species, which indicates that the spin exchange cross section of Mu-O{sub 2} is temperature independent in this range. However, it was found that below 105 K and above 400 K the spin depolarization rate constant tends to show a stronger temperature dependence. (Auth.)

  6. Withdrawal of corticosteroids in inflammatory bowel disease patients after dependency periods ranging from 2 to 45 years: a proposed method.

    LENUS (Irish Health Repository)

    Murphy, S J

    2012-02-01

BACKGROUND: Even in the biologic era, corticosteroid dependency in IBD patients is common and causes a lot of morbidity, but methods of withdrawal are not well described. AIM: To assess the effectiveness of a corticosteroid withdrawal method. METHODS: Twelve patients (10 men, 2 women; 6 ulcerative colitis, 6 Crohn's disease), median age 53.5 years (range 29-75) were included. IBD patients with quiescent disease refractory to conventional weaning were transitioned to oral dexamethasone, educated about symptoms of the corticosteroid withdrawal syndrome (CWS) and weaned under the supervision of an endocrinologist. When patients failed to wean despite a slow weaning pace and their IBD remaining quiescent, low dose synthetic ACTH stimulation testing was performed to assess for adrenal insufficiency. Multivariate analysis was performed to assess predictors of a slow wean. RESULTS: Median durations for disease and corticosteroid dependency were 21 (range 3-45) and 14 (range 2-45) years respectively. Ten patients (83%) were successfully weaned after a median follow-up from final wean of 38 months (range 5-73). Disease flares occurred in two patients, CWS in five and ACTH testing was performed in 10. Multivariate analysis showed that longer duration of corticosteroid use appeared to be associated with a slower wean (P = 0.056). CONCLUSIONS: Corticosteroid withdrawal using this protocol had a high success rate and durable effect and was effective in patients with long-standing (up to 45 years) dependency. As symptoms of CWS mimic symptoms of IBD disease flares, gastroenterologists may have difficulty distinguishing them, which may be a contributory factor to the frequency of corticosteroid dependency in IBD patients.

  7. Scattering from extended targets in range-dependent fluctuating ocean-waveguides with clutter from theory and experiments.

    Science.gov (United States)

    Jagannathan, Srinivasan; Küsel, Elizabeth T; Ratilal, Purnima; Makris, Nicholas C

    2012-08-01

    Bistatic, long-range measurements of acoustic scattered returns from vertically extended, air-filled tubular targets were made during three distinct field experiments in fluctuating continental shelf waveguides. It is shown that Sonar Equation estimates of mean target-scattered intensity lead to large errors, differing by an order of magnitude from both the measurements and waveguide scattering theory. The use of the Ingenito scattering model is also shown to lead to significant errors in estimating mean target-scattered intensity in the field experiments because they were conducted in range-dependent ocean environments with large variations in sound speed structure over the depth of the targets, scenarios that violate basic assumptions of the Ingenito model. Green's theorem based full-field modeling that describes scattering from vertically extended tubular targets in range-dependent ocean waveguides by taking into account nonuniform sound speed structure over the target's depth extent is shown to accurately describe the statistics of the targets' scattered field in all three field experiments. Returns from the man-made targets are also shown to have a very different spectral dependence from the natural target-like clutter of the dominant fish schools observed, suggesting that judicious multi-frequency sensing may often provide a useful means of distinguishing fish from man-made targets.

  8. Temperature-dependent dielectric function of germanium in the UV–vis spectral range: A first-principles study

    International Nuclear Information System (INIS)

    Yang, J.Y.; Liu, L.H.; Tan, J.Y.

    2014-01-01

The temperature dependence of the dielectric function, a key thermophysical parameter, is central to understanding thermal radiative transfer in high-temperature environments. Limited by self-radiation and thermal oxidation, however, it is difficult to directly measure the high-temperature dielectric function of solids with present experimental technologies. In this work, we implement two first-principles methods, ab initio molecular dynamics (AIMD) and density functional perturbation theory (DFPT), to study the temperature dependence of the dielectric function of germanium (Ge) in the UV–vis spectral range, in order to provide high-temperature dielectric function data for radiative transfer studies in high-temperature environments. Both methods successfully predict the temperature dependence of the dielectric function of Ge. Moreover, the good agreement between the calculated results of the AIMD approach and experimental data at 825 K enables us to predict the high-temperature dielectric function of Ge with the AIMD method in the UV–vis spectral range. - Highlights: • The temperature dependence of the dielectric function of germanium (Ge) is investigated with two first-principles methods. • The temperature effect on the dielectric function of Ge is discussed. • The high-temperature dielectric function of Ge is predicted

  9. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  10. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  11. Comparison of three different concepts of high dynamic range and dependability optimised current measurement digitisers for beam loss systems

    CERN Document Server

    Viganò, W; Effinger, E; Venturini, G G; Zamantzas, C

    2012-01-01

Three different concepts of high-dynamic-range and dependability-optimised current measurement digitisers for beam loss systems are compared in this paper. The first concept is based on current-to-frequency conversion, enhanced with an ADC for extending the dynamic range and decreasing the response time. A summary of 3 years’ worth of operational experience with such a system for LHC beam loss monitoring is given. The second principle is based on an adaptive current-to-frequency converter implemented in an ASIC. The basic parameters of the circuit are discussed and compared with measurements. Several measures are taken to harden both circuits against single event effects and to make them tolerant of operation in radioactive environments. The third circuit is based on a fully differential integrator for enhanced dynamic range, for which laboratory and test installation measurements are presented. All circuits are designed to avoid any dead time in the acquisition and have reliability and fail safe...

  12. Dependence of wavelength of Xe ion-induced rippled structures on the fluence in the medium ion energy range

    Energy Technology Data Exchange (ETDEWEB)

    Hanisch, Antje; Grenzer, Joerg [Institute of Ion Beam Physics and Materials Research, Dresden (Germany); Biermanns, Andreas; Pietsch, Ullrich [Institute of Physics, University of Siegen (Germany)

    2010-07-01

Ion-beam eroded self-organized nanostructures on semiconductors offer new ways for the fabrication of high-density memory and optoelectronic devices. It is known that the wavelength and amplitude of noble gas ion-induced rippled structures tune with the ion energy and the fluence, depending on the energy range, ion type and substrate. The linear theory by Makeev predicts a linear dependence of the wavelength on the ion energy for low temperatures. For Ar{sup +} and O{sub 2}{sup +} it was observed by different groups that the wavelength grows with increasing fluence, after being constant up to an onset fluence and before saturation. In this coarsening regime a power-law or exponential behavior of the wavelength with the fluence was observed. So far, investigations for Xe ions on silicon surfaces have mainly concentrated on energies below 1 keV. We found a linear dependence of the wavelength and amplitude of the rippled structures on both the ion energy and the fluence over a wide range of Xe{sup +} ion energies between 5 and 70 keV. Moreover, we found the ratio of wavelength to amplitude to be constant, implying shape stability, once a threshold fluence of 2×10{sup 17} cm{sup -2} was exceeded.

  13. Low Probability of Intercept Laser Range Finder

    Science.gov (United States)

    2017-07-19

    time of arrival, and it may also include wavelength, pulse width, and pulse repetition frequency (PRF). Second photodetector 38 in conjunction with... conjunction with lens 32 and telescope 36 that can correct for turbulence along the free space path. [0024] In all embodiments, the time interval

  14. Study of the Wavelength Dependence in Laser Ablation of Advanced Ceramics and Glass-Ceramic Materials in the Nanosecond Range.

    Science.gov (United States)

    Sola, Daniel; Peña, Jose I

    2013-11-19

    In this work, geometrical dimensions and ablation yields as a function of the machining method and reference position were studied when advanced ceramics and glass-ceramic materials were machined with pulsed lasers in the nanosecond range. Two laser systems, emitting at 1064 and 532 nm, were used. It was shown that the features obtained depend on whether the substrate is processed by means of pulse bursts or by grooves. In particular, when the samples were processed by grooves, machined depth, removed volume and ablation yields reached their maximum, placing the sample out of focus. It was shown that these characteristics do not depend on the processing conditions, the wavelength or the optical configuration, and that this is intrinsic behavior of the processing method. Furthermore, the existence of a close relation between material hardness and ablation yields was demonstrated.

  15. On the long-range dependence properties of annual precipitation using a global network of instrumental measurements

    Science.gov (United States)

    Tyralis, Hristos; Dimitriadis, Panayiotis; Koutsoyiannis, Demetris; O'Connell, Patrick Enda; Tzouka, Katerina; Iliopoulou, Theano

    2018-01-01

Long-range dependence (LRD) is considered an inherent property of geophysical processes, and its presence increases uncertainty. Here we examine the spatial behaviour of LRD in precipitation by regressing the Hurst parameter estimates of mean annual precipitation from instrumental data, which span 1916-2015 and cover a large part of the earth's surface, on location characteristics of the instrumental stations. Furthermore, we apply the Mann-Kendall test under the LRD assumption (MKt-LRD) to reassess the significance of observed trends. To summarize the results, the LRD is spatially clustered and seems to depend mostly on the location of the stations, while the predictive value of the regression model is good. Thus, when investigating LRD properties, we recommend that local characteristics be considered. The application of the MKt-LRD suggests that no significant monotonic trend appears in global precipitation, excluding the climate type D (snow) regions, in which significant positive trends appear.
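As an illustration of how a Hurst parameter can be estimated from an annual series, a sketch of the textbook aggregated-variance method follows; it is not necessarily the estimator used by the authors, and the block sizes are illustrative choices.

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes=None):
    """Estimate the Hurst parameter H with the aggregated-variance method.

    For an LRD series, the variance of block means over blocks of size m scales
    as m**(2H - 2); H is recovered from the slope of a log-log regression.
    This is a simple textbook estimator, not the paper's own procedure.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if block_sizes is None:
        block_sizes = np.unique(np.logspace(0.5, np.log10(n // 10), 15).astype(int))
    variances = []
    for m in block_sizes:
        k = n // m
        means = x[:k * m].reshape(k, m).mean(axis=1)   # block means
        variances.append(means.var())
    slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
    return 1.0 + slope / 2.0

# Example with white noise: H should come out close to 0.5.
rng = np.random.default_rng(2)
print(hurst_aggregated_variance(rng.standard_normal(10000)))
```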

  16. A study of the angular momentum dependence of the phase shift for finite range and Coulomb potentials

    International Nuclear Information System (INIS)

    Valluri, S.R.; Romo, W.J.

    1989-01-01

    The dependence of the phase shift δ l (k) on the angular momentum l is investigated. An analytic expression for the derivative of the phase shift with respect to angular momentum is derived for a class of potentials that includes complex and real potentials. The potentials behave like the finite range potential for small r and like a Coulomb potential for large r. Specific examples like the square well, the pure point charge Coulomb and a combination of a square well and the Coulomb potential are analytically treated. Possible applications are briefly indicated. (orig.)

  17. Delay-Range-Dependent H∞ Control for Automatic Mooring Positioning System with Time-Varying Input Delay

    Directory of Open Access Journals (Sweden)

    Xiaoyu Su

    2014-01-01

Aiming at the economy and security of the positioning system of a semi-submersible platform, this paper presents a new scheme based on a mooring line switching strategy. Considering the input delay in the switching process, an H∞ controller with time-varying input delay is designed to calculate the control forces that resist the disturbing forces. In order to reduce conservativeness, information on the lower bound of the delay is taken into account, and a Lyapunov function that contains the range of the delay is constructed. In addition, the input constraint is considered to avoid breakage of the mooring lines. Sufficient conditions for delay-range-dependent stabilization are derived in terms of LMIs, and the controller is also obtained. The effectiveness of the proposed approach is illustrated by a realistic design example.

  18. Long-range correlations of different EEG derivations in rats: sleep stage-dependent generators may play a key role

    International Nuclear Information System (INIS)

    Fang, Guangzhan; Xia, Yang; Lai, Yongxiu; You, Zili; Yao, Dezhong

    2010-01-01

For the electroencephalogram (EEG), topographic differences in the long-range temporal correlations would imply that these signals are affected by specific mechanisms related to the generation of a given neuronal process. Thus, the properties of the generators of various EEG oscillations might be investigated through the spatial differences of their long-range temporal correlations. In the present study, these correlations were characterized with respect to their topography during different vigilance states by detrended fluctuation analysis (DFA). The results indicated that (1) most of the scaling exponents acquired from different EEG derivations for various oscillations were significantly different in each vigilance state; these differences might result from the different numbers and locations of sleep stage-dependent generators of the various neuronal processes; (2) there might be multiple generators of delta and theta over the brain, many of them sleep stage-dependent; and (3) the best site for the frontal electrode of a fronto-parietal bipolar pair for sleep staging might be above the anterior midline cortex. We suggest that DFA can be used to explore the properties of the generators of a given neuronal oscillation, and their localizations if more electrodes are involved
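A minimal numpy sketch of DFA-1, the method named in the abstract, is given below; the window sizes and linear detrending order are illustrative choices, not the settings used in the study.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis (DFA-1) scaling exponent.

    The series is integrated, cut into non-overlapping windows of length s,
    a linear trend is removed in each window, and the RMS fluctuation F(s)
    is computed; F(s) ~ s**alpha, with alpha near 0.5 for uncorrelated data
    and alpha > 0.5 indicating long-range temporal correlations.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                     # integrated (profile) series
    n = y.size
    if scales is None:
        scales = np.unique(np.logspace(np.log10(10), np.log10(n // 4), 15).astype(int))
    flucts = []
    for s in scales:
        k = n // s
        segments = y[:k * s].reshape(k, s)
        t = np.arange(s)
        sq_res = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)            # local linear detrending
            sq_res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(sq_res)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(3)
print(dfa_exponent(rng.standard_normal(5000)))      # expected to be near 0.5
```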

  19. On the Full-range β Dependence of Ion-scale Spectral Break in the Solar Wind Turbulence

    Science.gov (United States)

    Wang, Xin; Tu, Chuanyi; He, Jiansen; Wang, Linghua

    2018-04-01

The power spectrum of magnetic fluctuations has a break at the high-frequency end of the inertial range. Beyond this break, the spectrum becomes steeper than the Kolmogorov law f^(-5/3). The break frequency was found to be associated with the plasma beta (β). However, the full-range β dependence of the ion-scale spectral break has not been presented before in observational studies. Here we show the continuous variation of the break frequency over the full range of β in solar wind turbulence. Using measurements from the WIND and Ulysses spacecraft, we show the break frequency (f_b) normalized, respectively, by the frequencies corresponding to the ion inertial length (f_di), the ion gyroradius (f_ρi), and the cyclotron resonance scale (f_ri) as a function of β for 1306 intervals. Their β values spread from 0.005 to 20, which nearly covers the full β range of observed solar wind turbulence. It is found that f_b/f_di generally decreases with β while f_b/f_ρi increases with β, and f_b/f_ri is nearly constant. We perform a linear fit on the statistical result and obtain the empirical formulas f_b/f_di ~ β^(-1/4), f_b/f_ρi ~ β^(1/4), and f_b/f_ri ~ 0.90 to describe the relation between f_b and β. We also compare our observations with a numerical simulation and with the prediction of ion cyclotron resonance theory. Our result favors the idea that cyclotron resonance is an important mechanism for energy dissipation at the spectral break. When β ≪ 1 and β ≫ 1, the break at f_di and f_ρi may also be associated with other processes.
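The normalizing frequencies can be computed from bulk plasma parameters under Taylor's hypothesis; the sketch below assumes standard definitions (d_i = c/ω_pi, ρ_i = v_th/Ω_i, and d_i + ρ_i as one common proxy for the cyclotron resonance scale), which may differ in detail from the conventions used in the paper, and the input values are illustrative only.

```python
import numpy as np

# Physical constants (SI)
MU0 = 4e-7 * np.pi
KB = 1.380649e-23
MP = 1.67262192e-27
QE = 1.602176634e-19
C = 2.99792458e8
EPS0 = 8.8541878128e-12

def ion_scale_frequencies(n_cc, T_K, B_nT, V_sw_kms):
    """Rough spacecraft-frame frequencies of the ion kinetic scales.

    Uses Taylor's hypothesis f = V_sw * k / (2*pi) to map the ion inertial
    length d_i, the ion gyroradius rho_i, and (as one common convention)
    the cyclotron-resonance scale d_i + rho_i into frequencies.
    """
    n = n_cc * 1e6                                   # proton density [m^-3]
    B = B_nT * 1e-9                                  # magnetic field [T]
    V = V_sw_kms * 1e3                               # solar wind speed [m/s]
    omega_p = np.sqrt(n * QE**2 / (EPS0 * MP))       # proton plasma frequency
    omega_c = QE * B / MP                            # proton gyrofrequency
    v_th = np.sqrt(2 * KB * T_K / MP)                # proton thermal speed
    d_i = C / omega_p
    rho_i = v_th / omega_c
    beta = n * KB * T_K / (B**2 / (2 * MU0))         # proton beta
    to_f = lambda scale: V / (2 * np.pi * scale)     # Taylor hypothesis
    return beta, to_f(d_i), to_f(rho_i), to_f(d_i + rho_i)

# Typical slow solar wind values (illustrative numbers only).
print(ion_scale_frequencies(n_cc=10.0, T_K=5e4, B_nT=6.0, V_sw_kms=400.0))
```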

  20. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  1. Social, state-dependent and environmental modulation of faecal corticosteroid levels in free-ranging female spotted hyenas.

    Science.gov (United States)

    Goymann, W; East, M L; Wachter, B; Höner, O P; Möstl, E; Van't Hof, T J; Hofer, H

    2001-12-07

Little is known about the extent to which the sensitivity of the hypothalamic-pituitary-adrenal (HPA) axis may be state dependent and vary within the same species between environments. Here we tested whether the faecal corticosteroid concentrations of matrilineal adult female spotted hyenas are influenced by social and reproductive status in adjacent ecosystems and whether they vary between periods with and without social stress. Females in the Serengeti National Park frequently become socially subordinate intruders in other hyena territories by undertaking long-distance foraging trips to migratory herds, whereas in the Ngorongoro Crater they usually forage inside their own small territories on resident prey. The faecal corticosteroid concentrations in Serengeti females were significantly higher than in Ngorongoro females. Energy expenditure by lactation is exceptionally high in spotted hyenas and this may be reflected in their corticosteroid levels. The faecal corticosteroid levels in both populations were higher in lactating than in non-lactating females. During periods of social stability, faecal corticosteroid concentrations increased in non-lactating females, but not in lactating females, as social status declined. Lactating Serengeti females had significantly higher faecal corticosteroid concentrations during periods of acute severe social stress than during periods without, indicating that the HPA axis is sensitive to social stimuli even in lactating females. So far, few studies have used non-invasive monitoring methods for assessing social stress in free-ranging animals. This study demonstrates for the first time, to the authors' knowledge, that corticosteroid concentrations may differ between periods with and without social stress in a free-ranging female mammal and that the modulating effect of social status may depend on reproductive status.

  2. Statistical modeling of the long-range-dependent structure of barrier island framework geology and surface geomorphology

    Directory of Open Access Journals (Sweden)

    B. A. Weymer

    2018-06-01

Shorelines exhibit long-range dependence (LRD) and have been shown in some environments to be described in the wave number domain by a power law characteristic of scale independence. Recent evidence suggests that the geomorphology of barrier islands can, however, exhibit scale dependence as a result of systematic variations in the underlying framework geology. The LRD of framework geology, which influences island geomorphology and its response to storms and sea level rise, has not been previously examined. Electromagnetic induction (EMI) surveys conducted along Padre Island National Seashore (PAIS), Texas, United States, reveal that the EMI apparent conductivity (σa) signal and, by inference, the framework geology exhibit LRD at scales of up to 10^1 to 10^2 km. Our study demonstrates the utility of describing EMI σa and lidar spatial series by a fractional autoregressive integrated moving average (ARIMA) process that specifically models LRD. This method offers a robust and compact way of quantifying the geological variations along a barrier island shoreline using three statistical parameters (p, d, q). We discuss how ARIMA models that use a single parameter d provide a quantitative measure for determining free and forced barrier island evolutionary behavior across different scales. Statistical analyses at regional, intermediate, and local scales suggest that the geologic framework within an area of paleo-channels exerts a first-order control on dune height. The exchange of sediment amongst nearshore, beach, and dune in areas outside this region is scale independent, implying that barrier islands like PAIS exhibit a combination of free and forced behaviors that affect the response of the island to sea level rise.

  3. Statistical modeling of the long-range-dependent structure of barrier island framework geology and surface geomorphology

    Science.gov (United States)

    Weymer, Bradley A.; Wernette, Phillipe; Everett, Mark E.; Houser, Chris

    2018-06-01

Shorelines exhibit long-range dependence (LRD) and have been shown in some environments to be described in the wave number domain by a power law characteristic of scale independence. Recent evidence suggests that the geomorphology of barrier islands can, however, exhibit scale dependence as a result of systematic variations in the underlying framework geology. The LRD of framework geology, which influences island geomorphology and its response to storms and sea level rise, has not been previously examined. Electromagnetic induction (EMI) surveys conducted along Padre Island National Seashore (PAIS), Texas, United States, reveal that the EMI apparent conductivity (σa) signal and, by inference, the framework geology exhibit LRD at scales of up to 10^1 to 10^2 km. Our study demonstrates the utility of describing EMI σa and lidar spatial series by a fractional autoregressive integrated moving average (ARIMA) process that specifically models LRD. This method offers a robust and compact way of quantifying the geological variations along a barrier island shoreline using three statistical parameters (p, d, q). We discuss how ARIMA models that use a single parameter d provide a quantitative measure for determining free and forced barrier island evolutionary behavior across different scales. Statistical analyses at regional, intermediate, and local scales suggest that the geologic framework within an area of paleo-channels exerts a first-order control on dune height. The exchange of sediment amongst nearshore, beach, and dune in areas outside this region is scale independent, implying that barrier islands like PAIS exhibit a combination of free and forced behaviors that affect the response of the island to sea level rise.
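One common way to estimate the long-memory parameter d of a fractional ARIMA process is the Geweke-Porter-Hudak (GPH) log-periodogram regression, sketched below; it is given for illustration and is not necessarily the estimation method used in the study.

```python
import numpy as np

def gph_estimate_d(x, frac=0.5):
    """Geweke-Porter-Hudak (GPH) log-periodogram estimate of the ARFIMA d.

    Regresses the log periodogram on log(4*sin^2(lambda/2)) over the lowest
    m = n**frac Fourier frequencies; the negative of the slope estimates the
    fractional differencing (long-memory) parameter d.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    m = int(n ** frac)
    lam = 2 * np.pi * np.arange(1, m + 1) / n                        # low Fourier frequencies
    periodogram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(lam / 2) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return -slope

# White noise has no long memory, so the estimate of d should be near 0.
rng = np.random.default_rng(4)
print(gph_estimate_d(rng.standard_normal(4096)))
```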

  4. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  5. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  6. Experimental determination of the steady-state charging probabilities and particle size conservation in non-radioactive and radioactive bipolar aerosol chargers in the size range of 5–40 nm

    Energy Technology Data Exchange (ETDEWEB)

    Kallinger, Peter, E-mail: peter.kallinger@univie.ac.at; Szymanski, Wladyslaw W. [University of Vienna, Faculty of Physics (Austria)

    2015-04-15

Three bipolar aerosol chargers, an AC-corona (Electrical Ionizer 1090, MSP Corp.), a soft X-ray (Advanced Aerosol Neutralizer 3087, TSI Inc.), and an α-radiation-based {sup 241}Am charger (tapcon & analysesysteme), were investigated with respect to their performance in charging airborne nanoparticles. The charging probabilities for negatively and positively charged particles and the particle size conservation were measured in the diameter range of 5–40 nm using sucrose nanoparticles. The chargers were operated under various flow conditions in the range of 0.6–5.0 liters per minute. For particular experimental conditions, some deviations from the chosen theoretical model were found for all chargers. For very small particle sizes, the AC-corona charger showed particle losses at low flow rates and did not reach steady-state charge equilibrium at high flow rates. However, for all chargers, operating conditions were identified where the bipolar charge equilibrium was achieved. Practically, excellent particle size conservation was found for all three chargers.

  7. Energy dependence of the zero-range DWBA normalization of the {sup 58}Ni({sup 3}He,α){sup 57}Ni reaction. [15 to 205 MeV, finite-range and nonlocality corrections

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, J R; Zimmerman, W R; Kraushaar, J J [Colorado Univ., Boulder (USA). Dept. of Physics and Astrophysics

    1977-01-04

Strong transitions in the {sup 58}Ni({sup 3}He,α){sup 57}Ni reaction were analyzed using both the zero-range and exact finite-range DWBA. The data considered covered a range of bombarding energies from 15 to 205 MeV. The zero-range DWBA described all data well when finite-range and non-locality corrections were included in the local energy approximation. Comparison of zero-range and exact finite-range calculations showed the local energy approximation correction to be very accurate over the entire energy region. Empirically determined D{sub 0} values showed no energy dependence. A theoretical D{sub 0} value, calculated using an α wave function which reproduced the measured α rms charge radius and the elastic electron scattering form factor, agreed well with the empirical values. Comparison was made between these values and D{sub 0} values quoted previously in the literature.

  8. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  9. Overpotential-induced lability of the electronic overlap factor in long-range electrochemical electron transfer: charge and distance dependence

    DEFF Research Database (Denmark)

    Kornyshev, A. A.; Kuznetsov, A. M.; Nielsen, Jens Ulrik

    2000-01-01

    Long-distance electrochemical electron transfer exhibits approximately exponential dependence on the electron transfer distance. On the basis of a jellium model of the metal surface we show that the slope of the logarithm of the current vs. the transfer distance also depends strongly...

  10. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C. [Arizona State Univ., Tempe, AZ (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sallaberry, Cedric Jean-Marie. [Engineering Mechanics Corp. of Columbus, OH (United States)

    2018-02-01

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.
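A Monte Carlo sketch of PLOAS with delayed link failure is given below. It assumes the common weak-link/strong-link convention that assured safety is lost when every SL fails before the first WL failure; the exponential precursor-time distributions and delay values are hypothetical placeholders, not the report's definitions or data.

```python
import numpy as np

rng = np.random.default_rng(5)

def ploas_monte_carlo(wl_samplers, sl_samplers, wl_delay, sl_delay, n=50_000):
    """Monte Carlo sketch of PLOAS with delayed link failure.

    Assumes the usual WL/SL convention that assured safety is lost when every
    strong link (SL) has failed before the first weak link (WL) fails. Each
    sampler draws a precursor-occurrence time; a delay (constant or random,
    via the delay callables) is added to obtain the actual failure time.
    """
    wl_fail = np.array([[s(rng) + wl_delay(rng) for s in wl_samplers]
                        for _ in range(n)])
    sl_fail = np.array([[s(rng) + sl_delay(rng) for s in sl_samplers]
                        for _ in range(n)])
    loss = sl_fail.max(axis=1) < wl_fail.min(axis=1)   # all SLs before first WL
    return loss.mean()

# Hypothetical exponential precursor times (hours) and constant delays.
print(ploas_monte_carlo(
    wl_samplers=[lambda r: r.exponential(1.0)],        # one WL, mean 1 h
    sl_samplers=[lambda r: r.exponential(5.0)],        # one SL, mean 5 h
    wl_delay=lambda r: 0.1,                            # 0.1 h constant delay
    sl_delay=lambda r: 0.5))                           # 0.5 h constant delay
```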

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Scale dependence in habitat selection: The case of the endangered brown bear (Ursus arctos) in the Cantabrian Range (NW Spain)

    Science.gov (United States)

    Maria C. Mateo Sanchez; Samuel A. Cushman; Santiago Saura

    2013-01-01

    Animals select habitat resources at multiple spatial scales. Thus, explicit attention to scale dependency in species-habitat relationships is critical to understand the habitat suitability patterns as perceived by organisms in complex landscapes. Identification of the scales at which particular environmental variables influence habitat selection may be as important as...

  13. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes

  14. Wavelength dependence of the single pulse femtosecond laser ablation threshold of indium phosphide in the 400-2050 nm range

    International Nuclear Information System (INIS)

    Borowiec, A.; Tiedje, H.F.; Haugen, H.K.

    2005-01-01

We present single pulse femtosecond laser ablation threshold measurements of InP obtained by optical, scanning electron, and atomic force microscopy. The experiments were conducted with laser pulses 65-175 fs in duration, in the wavelength range from 400 to 2050 nm, covering the photon energy region above and below the bandgap of InP. The ablation thresholds determined from depth and volume measurements varied from 87 mJ/cm{sup 2} at 400 nm to 250 mJ/cm{sup 2} at 2050 nm. In addition, crater depths and volumes were measured over a range of laser fluences extending well above the ablation threshold

  15. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

This book reports the 7 presentations made at the third meeting 'Physics and Fundamental Questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, by way of cosmology and the management of very low risks. The notion of probability is the basis of quantum mechanics and is thus bound up with the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  16. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  17. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  18. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
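    As an illustration of the definition (assumed here, not taken from the paper), consider a normal-error regression model for a response that the evidence E says cannot be negative; any probability the fitted model places below zero is leakage. The fitted values in the sketch are hypothetical.

```python
# A minimal sketch of probability leakage, assuming a normal regression model
# for a response that evidence E says must be non-negative. The fitted mean
# and scale below are hypothetical.
from scipy.stats import norm

mu_hat, sigma_hat = 2.0, 1.5                          # hypothetical fitted values at some x
leakage = norm.cdf(0.0, loc=mu_hat, scale=sigma_hat)  # P(y < 0 | M), impossible given E
print(f"probability assigned to the impossible region: {leakage:.3f}")
```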

  19. True and apparent scaling: The proximity of the Markov-switching multifractal model to long-range dependence

    Science.gov (United States)

    Liu, Ruipeng; Di Matteo, T.; Lux, Thomas

    2007-09-01

    In this paper, we consider daily financial data of a collection of different stock market indices, exchange rates, and interest rates, and we analyze their multi-scaling properties by estimating a simple specification of the Markov-switching multifractal (MSM) model. In order to see how well the estimated model captures the temporal dependence of the data, we estimate and compare the scaling exponents H(q) (for q=1,2) for both empirical data and simulated data of the MSM model. In most cases the multifractal model appears to generate ‘apparent’ long memory in agreement with the empirical scaling laws.
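    A common way to obtain scaling exponents H(q) of the kind discussed above is to regress the log of the q-th order moments of absolute increments on the log of the time lag. The sketch below follows that generic recipe on an arbitrary series; it is not the authors' estimation code, and the MSM estimation itself is omitted.

```python
# Hedged sketch: estimate H(q) from E|X(t+tau) - X(t)|^q ~ tau^(q*H(q))
# by a log-log regression over a set of lags. Function and parameter names
# are illustrative, not from the paper.
import numpy as np

def generalized_hurst(x, q=2, lags=range(1, 20)):
    x = np.asarray(x, dtype=float)
    log_lags, log_moments = [], []
    for tau in lags:
        incr = np.abs(x[tau:] - x[:-tau]) ** q
        log_lags.append(np.log(tau))
        log_moments.append(np.log(incr.mean()))
    slope, _ = np.polyfit(log_lags, log_moments, 1)
    return slope / q                      # H(q)

# Sanity check on a plain random walk, where H(1) and H(2) should be near 0.5:
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(10_000))
print(generalized_hurst(walk, q=1), generalized_hurst(walk, q=2))
```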

  20. SMPLNORM: A simple model for obtaining the joint probabilities of two flows and the level that depends on them. SMPLNORM: Un modele simple pour obtenir les probabilites conjointes de deux debits et le niveau qui en depend

    Energy Technology Data Exchange (ETDEWEB)

    Bruneau, P. (Hydro-Quebec, Montreal, PQ (Canada)); Ashkar, F. (Moncton Univ., NB (Canada)); Bobee, B. (Inst. National de la Recherche Scientifique, Saint-Foy, PQ (Canada))

    1994-01-01

    Most bivariate models assume the same type of two-parameter marginal distribution for both variables. The disadvantage of these models for hydrologic flow studies is that it is often difficult to fit them to observed flows. A complete example is presented of the flexibility of the SMPLNORM program, which calculates the joint probability of two variables, Q1 and Q2, with marginal distributions that have three parameters. The program can also provide the probability of non-exceedance of a third variable H, mathematically related to the first two variables. Two applications are discussed, in which Q1 and Q2 are the flows of two rivers controlling the variable H, which is a water level in both cases. Theoretically, this model could also be applied to other types of variables. The proposed model is based on the hypothesis that a Box-Cox type of power transformation can reduce the marginal distributions of Q1 and Q2 to a normal distribution. One of the main conclusions of the study addresses the importance of taking into account the correlation between Q1 and Q2 to obtain a valid estimate of H. 11 refs., 14 figs., 7 tabs.
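    The core idea can be sketched as follows, under the stated hypothesis: Box-Cox transforms bring Q1 and Q2 to approximate normality, after which a bivariate normal CDF gives their joint non-exceedance probability. This is not the SMPLNORM code itself; the function name, thresholds, and data are illustrative.

```python
# Hedged sketch of the Box-Cox / bivariate-normal idea behind the model
# (not the SMPLNORM program). All names and inputs are illustrative.
import numpy as np
from scipy import stats

def joint_nonexceedance(q1, q2, thresh1, thresh2):
    z1, lam1 = stats.boxcox(q1)                      # transform flows to ~normal
    z2, lam2 = stats.boxcox(q2)
    rho = np.corrcoef(z1, z2)[0, 1]                  # correlation in normal space
    u1 = (stats.boxcox(np.array([thresh1]), lmbda=lam1)[0] - z1.mean()) / z1.std()
    u2 = (stats.boxcox(np.array([thresh2]), lmbda=lam2)[0] - z2.mean()) / z2.std()
    biv = stats.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    return biv.cdf([u1, u2])                         # P(Q1 <= thresh1, Q2 <= thresh2)

# Illustrative correlated synthetic flows:
rng = np.random.default_rng(3)
q1 = rng.lognormal(3.0, 0.4, 500)
q2 = np.exp(0.7 * np.log(q1) + rng.normal(0.9, 0.3, 500))
print(joint_nonexceedance(q1, q2, thresh1=40.0, thresh2=30.0))
```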

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
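    The article's three problems are not reproduced here, but a classic instance of the phenomenon is the derangement problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e. A quick simulation (illustrative only) confirms it.

```python
# Hedged illustration of the 1/e phenomenon via derangements; not necessarily
# one of the article's three problems.
import math
import random

def no_fixed_point_prob(n=20, trials=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        hits += all(perm[i] != i for i in range(n))
    return hits / trials

print(no_fixed_point_prob(), 1 / math.e)   # both close to 0.3679
```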

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. Temperature dependence of the Schottky-barrier heights of n-type semiconductors in the temperature range of 7 to 300 K

    International Nuclear Information System (INIS)

    Chen, T.P.; Lee, T.C.; Fung, S.; Beling, C.D.

    1994-01-01

    In this note we present the results of the temperature dependence of the SBH in Au/n-Si, Ag/n-GaAs, and Au/n-GaAs in the temperature range of 7 to 300 K from our internal photoemission measurements. (orig.)

  4. Measurement of angularly dependent spectra of betatron gamma-rays from a laser plasma accelerator with quadrant-sectored range filters

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Jong Ho, E-mail: jhjeon07@ibs.re.kr; Nakajima, Kazuhisa, E-mail: naka115@dia-net.ne.jp; Rhee, Yong Joo; Pathak, Vishwa Bandhu; Cho, Myung Hoon; Shin, Jung Hun; Yoo, Byung Ju; Jo, Sung Ha; Shin, Kang Woo [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Kim, Hyung Taek; Sung, Jae Hee; Lee, Seong Ku; Choi, Il Woo [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Advanced Photonics Research Institute, GIST, Gwangju 61005 (Korea, Republic of); Hojbota, Calin; Bae, Lee Jin; Jung, Jaehyung; Cho, Min Sang; Cho, Byoung Ick; Nam, Chang Hee [Center for Relativistic Laser Science, Institute for Basic Science (IBS), Gwangju 61005 (Korea, Republic of); Department of Physics and Photon Science, GIST, Gwangju 61005 (Korea, Republic of)

    2016-07-15

    Measurements of the angularly dependent spectra of betatron gamma-rays radiated by GeV electron beams from laser wakefield accelerators (LWFAs) are presented. The angle-resolved spectrum of betatron radiation was deconvolved from the position-dependent data measured for a single laser shot with a broadband gamma-ray spectrometer comprising four-quadrant sectored range filters and an unfolding algorithm based on the Monte Carlo code GEANT4. The unfolded gamma-ray spectra in the photon energy range of 0.1–10 MeV revealed an approximately isotropic angular dependence of the peak photon energy and the photon energy-integrated fluence. As expected from the analysis of betatron radiation from LWFAs, the results indicate that unpolarized gamma-rays are emitted by electrons undergoing betatron motion in isotropically distributed orbit planes.

  5. Mass Dependent Fractionation of Hg Isotopes in Source Rocks, Mineral Deposits and Spring Waters of the California Coast Ranges, USA

    Science.gov (United States)

    Smith, C. N.; Kesler, S. E.; Blum, J. D.; Rytuba, J. J.

    2007-12-01

    We present here the first study of the isotopic composition of Hg in rocks, ore deposits, and active hydrothermal systems from the California Coast Ranges, one of Earth's largest Hg-depositing systems. The Franciscan Complex and Great Valley Sequence, which form the bedrock in the California Coast Ranges, are intruded and overlain by Tertiary volcanic rocks including the Clear Lake Volcanic Sequence. These rocks contain two types of Hg deposits, hot-spring deposits that form at shallow depths (<300 m) and silica-carbonate deposits that extend to greater depths (200 to 1000 m), as well as active springs and geothermal systems that release Hg to the present surface. The Franciscan Complex and Great Valley Sequence contain clastic sedimentary rocks with higher concentrations of Hg than volcanic rocks of the Clear Lake Volcanic Field. Mean Hg isotope compositions for all three rock units are similar, although the range of values in Franciscan Complex rocks is greater than in either Great Valley or Clear Lake rocks. Hot spring and silica-carbonate Hg deposits have similar average isotopic compositions that are indistinguishable from averages for the three rock units, although δ202Hg values for the Hg deposits have a greater variance than the country rocks. Precipitates from dilute spring and saline thermal waters in the area have similarly large variance and a mean δ202Hg value that is significantly lower than the ore deposits and rocks. These observations indicate there is little or no isotopic fractionation during release of Hg from its source rocks into hydrothermal solutions. Isotopic fractionation does appear to take place during transport and concentration of Hg in deposits, especially in their uppermost parts. Boiling of hydrothermal fluids is likely the most important process causing the observed Hg isotope fractionation. This should result in the release of Hg with low δ202Hg values into the atmosphere from the top of these hydrothermal systems and a

  6. Self-produced Time Intervals Are Perceived as More Variable and/or Shorter Depending on Temporal Context in Subsecond and Suprasecond Ranges

    Directory of Open Access Journals (Sweden)

    Keita eMitani

    2016-06-01

    Full Text Available The processing of time intervals is fundamental for sensorimotor and cognitive functions. Perceptual and motor timing are often performed concurrently (e.g., playing a musical instrument). Although previous studies have shown the influence of body movements on time perception, how we perceive self-produced time intervals has remained unclear. Furthermore, it has been suggested that the timing mechanisms are distinct for the sub- and suprasecond ranges. Here, we compared perceptual performances for self-produced and passively presented time intervals in random contexts (i.e., multiple target intervals presented in a session) across the sub- and suprasecond ranges (Experiment 1) and within the sub- (Experiment 2) and suprasecond (Experiment 3) ranges, and in a constant context (i.e., a single target interval presented in a session) in the sub- and suprasecond ranges (Experiment 4). We show that self-produced time intervals were perceived as shorter and more variable across the sub- and suprasecond ranges and within the suprasecond range, but not within the subsecond range, in a random context. In a constant context, the self-produced time intervals were perceived as more variable in the suprasecond range but not in the subsecond range. The impairing effects indicate that motor timing interferes with perceptual timing. The dependence of impairment on temporal contexts suggests multiple timing mechanisms for the subsecond and suprasecond ranges. In addition, violation of the scalar property (i.e., a constant ratio of variability to target interval) was observed between the sub- and suprasecond ranges. The violation was clearer for motor timing than for perceptual timing. This suggests that the multiple timing mechanisms for the sub- and suprasecond ranges overlap more for perception than for motor timing. Moreover, the central tendency effect (i.e., where shorter base intervals are overestimated and longer base intervals are underestimated) disappeared with subsecond

  7. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  8. Altered Long- and Short-Range Functional Connectivity in Patients with Betel Quid Dependence: A Resting-State Functional MRI Study

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2016-12-01

    Full Text Available Objective: Addiction is a chronic relapsing brain disease. Brain structural abnormalities may constitute an abnormal neural network that underlies the risk of drug dependence. We hypothesized that individuals with Betel Quid Dependence (BQD) have functional connectivity alterations that can be described by long- and short-range functional connectivity density (FCD) maps. Methods: We tested this hypothesis using functional magnetic resonance imaging (fMRI) data from subjects of the Han ethnic group in Hainan, China. Here, we examined BQD individuals (n = 33) and age-, sex-, and education-matched healthy controls (HCs) (n = 32) in a rs-fMRI study to observe FCD alterations associated with the severity of BQD. Results: Compared with HCs, long-range FCD was decreased in the right anterior cingulate cortex (ACC) and increased in the left cerebellum posterior lobe (CPL) and bilateral inferior parietal lobule (IPL) in the BQD group. Short-range FCD was reduced in the right ACC and left dorsolateral prefrontal cortex (dlPFC), and increased in the left CPL. The short-range FCD alteration in the right ACC displayed a negative correlation with the Betel Quid Dependence Scale (BQDS) (r = -0.432, P = 0.012), and the long-range FCD alteration of the left IPL showed a positive correlation with the duration of BQD (r = 0.519, P = 0.002) in BQD individuals. Conclusions: fMRI revealed differences in long- and short-range FCD in BQD individuals, and these alterations might be due to BQ chewing, BQ dependency, or risk factors for developing BQD.

  9. Optimal packaging of FIV genomic RNA depends upon a conserved long-range interaction and a palindromic sequence within gag.

    Science.gov (United States)

    Rizvi, Tahir A; Kenyon, Julia C; Ali, Jahabar; Aktar, Suriya J; Phillip, Pretty S; Ghazawi, Akela; Mustafa, Farah; Lever, Andrew M L

    2010-10-15

    The feline immunodeficiency virus (FIV) is a lentivirus that is related to human immunodeficiency virus (HIV), causing a similar pathology in cats. It is a potential small-animal model for AIDS, and FIV-based vectors are also being pursued for human gene therapy. Previous studies have mapped the FIV packaging signal (ψ) to two or more discontinuous regions within the 5' 511 nt of the genomic RNA, and structural analyses have determined its secondary structure. The 5' and 3' sequences within the ψ region interact through extensive long-range interactions (LRIs), including a conserved heptanucleotide interaction between R/U5 and gag. Other secondary structural elements identified include a conserved 150 nt stem-loop (SL2) and a small palindromic stem-loop within the gag open reading frame that might act as a viral dimerization initiation site. We have performed extensive mutational analysis of these sequences and structures and ascertained their importance in FIV packaging using a trans-complementation assay. Disrupting the conserved heptanucleotide LRI to prevent base pairing between R/U5 and gag reduced packaging by 2.8- to 5.5-fold. Restoration of pairing using an alternative, non-wild-type (wt) LRI sequence restored RNA packaging and propagation to wt levels, suggesting that it is the structure of the LRI, rather than its sequence, that is important for FIV packaging. Disrupting the palindrome within gag reduced packaging by 1.5- to 3-fold, but substitution with a different palindromic sequence did not restore packaging completely, suggesting that the sequence of this region, as well as its palindromic nature, is important. Mutation of individual regions of SL2 did not have a pronounced effect on FIV packaging, suggesting that either it is the structure of SL2 as a whole that is necessary for optimal packaging, or that there is redundancy within this structure. The mutational analysis presented here has further validated the previously predicted RNA secondary structure of FIV

  10. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  11. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by a hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0, 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
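    As a concrete sketch of the two-moment idea (assumed here, not taken from the paper), a two-phase hypoexponential distribution can be matched to a target mean m and coefficient of variation c whenever 0.5 <= c^2 < 1; smaller coefficients of variation require more phases, which is where the multi-exponential generalizations come in.

```python
# Hedged sketch: match a two-phase hypoexponential (sum of two independent
# exponentials) to a given mean and coefficient of variation. Valid only for
# 0.5 <= c^2 < 1; the names are illustrative.
import numpy as np

def hypoexp_rates(m, c):
    if not (0.5 <= c * c < 1.0):
        raise ValueError("two-phase hypoexponential requires 0.5 <= c^2 < 1")
    v = (c * m) ** 2                        # target variance
    root = np.sqrt(2.0 * v - m * m)         # phase means solve x+y=m, x^2+y^2=v
    x, y = (m + root) / 2.0, (m - root) / 2.0
    return 1.0 / x, 1.0 / y                 # phase rates

# Sanity check by simulation:
rng = np.random.default_rng(0)
lam1, lam2 = hypoexp_rates(m=1.0, c=0.8)
s = rng.exponential(1 / lam1, 100_000) + rng.exponential(1 / lam2, 100_000)
print(s.mean(), s.std() / s.mean())         # approximately 1.0 and 0.8
```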

  12. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  13. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  14. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
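    A small Monte Carlo experiment, given here as an illustration consistent with the abstract rather than the paper's closed-form result, shows the effect: when the control threshold is set at the nominal 99% quantile of a log-normal fitted to only n observations, the realized failure frequency exceeds the nominal 1%.

```python
# Hedged illustration: parameter uncertainty inflates the realized failure
# frequency above the nominal level. Distribution and sample size are assumed.
import numpy as np

rng = np.random.default_rng(42)
n, reps = 30, 20_000
z_99 = 2.326                                        # standard normal 99% quantile
failures = 0
for _ in range(reps):
    log_losses = rng.normal(0.0, 1.0, n)            # logs of log-normal losses
    mu_hat, sigma_hat = log_losses.mean(), log_losses.std(ddof=1)
    threshold = mu_hat + z_99 * sigma_hat           # estimated 99% quantile
    failures += rng.normal(0.0, 1.0) > threshold    # does the next loss exceed it?
print(failures / reps)                              # noticeably above 0.01 for small n
```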

  15. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  16. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  17. Temperature dependence of the short-range order parameter and the concentration dependence of the order disorder temperature for Ni-Pt and Ni-Fe systems in the improved statistical pseudopotential approximation

    International Nuclear Information System (INIS)

    Khwaja, F.A.

    1980-08-01

    The calculations for the temperature dependence of the first shell short-range order (SRO) parameter for Ni₃Fe using the cubic approximation of Tahir Kheli, and the concentration dependence of order-disorder temperature Tsub(c) for Ni-Fe and Ni-Pt systems using the linear approximation, have been carried out in the framework of pseudopotential theory. It is shown that the cubic approximation yields a good agreement between the theoretical prediction of α₁ and the experimental data. Results for the concentration dependence of the Tsub(c) show that improvements in the statistical pseudo-potential approach are essential to achieve a good agreement with experiment. (author)

  18. Energy Dependence of Elliptic Flow over a Large Pseudorapidity Range in Au+Au Collisions at the BNL Relativistic Heavy Ion Collider

    Science.gov (United States)

    Back, B. B.; Baker, M. D.; Ballintijn, M.; Barton, D. S.; Betts, R. R.; Bickley, A. A.; Bindel, R.; Budzanowski, A.; Busza, W.; Carroll, A.; Chai, Z.; Decowski, M. P.; García, E.; Gburek, T.; George, N.; Gulbrandsen, K.; Gushue, S.; Halliwell, C.; Hamblen, J.; Hauer, M.; Heintzelman, G. A.; Henderson, C.; Hofman, D. J.; Hollis, R. S.; Hołyński, R.; Holzman, B.; Iordanova, A.; Johnson, E.; Kane, J. L.; Katzy, J.; Khan, N.; Kucewicz, W.; Kulinich, P.; Kuo, C. M.; Lin, W. T.; Manly, S.; McLeod, D.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Park, I. C.; Pernegger, H.; Reed, C.; Remsberg, L. P.; Reuter, M.; Roland, C.; Roland, G.; Rosenberg, L.; Sagerer, J.; Sarin, P.; Sawicki, P.; Seals, H.; Sedykh, I.; Skulski, W.; Smith, C. E.; Stankiewicz, M. A.; Steinberg, P.; Stephans, G. S.; Sukhanov, A.; Tang, J.-L.; Tonjes, M. B.; Trzupek, A.; Vale, C.; van Nieuwenhuizen, G. J.; Vaurynovich, S. S.; Verdier, R.; Veres, G. I.; Wenger, E.; Wolfs, F. L.; Wosiek, B.; Woźniak, K.; Wuosmaa, A. H.; Wysłouch, B.

    2005-04-01

    This Letter describes the measurement of the energy dependence of elliptic flow for charged particles in Au+Au collisions using the PHOBOS detector at the Relativistic Heavy Ion Collider. Data taken at collision energies of √(s_NN) = 19.6, 62.4, 130, and 200 GeV are shown over a wide range in pseudorapidity. These results, when plotted as a function of η' = |η| - y_beam, scale with approximate linearity throughout η', implying no sharp changes in the dynamics of particle production as a function of pseudorapidity or increasing beam energy.

  19. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  20. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  2. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended
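    A standard special case consistent with the abstract (assumed here as an illustration, not taken from the paper) is the multinomial logit ARUM, for which G(u) = log(sum_j exp(u_j)) acts as a CPGF: its gradient reproduces the logit choice probabilities.

```python
# Hedged illustration of the CPGF idea for the multinomial logit case:
# the gradient of G(u) = log(sum_j exp(u_j)) equals the choice probabilities.
import numpy as np

def logit_cpgf(u):
    return np.log(np.sum(np.exp(u)))

def probs_via_gradient(u, eps=1e-6):
    grad = np.zeros_like(u, dtype=float)            # numerical gradient of the CPGF
    for j in range(len(u)):
        step = np.zeros_like(u, dtype=float)
        step[j] = eps
        grad[j] = (logit_cpgf(u + step) - logit_cpgf(u - step)) / (2 * eps)
    return grad

u = np.array([1.0, 0.5, -0.2])                      # illustrative utilities
print(probs_via_gradient(u))                        # matches the softmax below
print(np.exp(u) / np.exp(u).sum())
```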

  4. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  5. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  6. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  7. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  8. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  9. Carbon dots with strong excitation-dependent fluorescence changes towards pH. Application as nanosensors for a broad range of pH

    Energy Technology Data Exchange (ETDEWEB)

    Barati, Ali [Faculty of Chemistry, Institute for Advanced Studies in Basic Sciences, Zanjan (Iran, Islamic Republic of); Department of Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Shamsipur, Mojtaba, E-mail: mshamsipur@yahoo.com [Department of Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Abdollahi, Hamid, E-mail: abd@iasbs.ac.ir [Faculty of Chemistry, Institute for Advanced Studies in Basic Sciences, Zanjan (Iran, Islamic Republic of)

    2016-08-10

    In this study, the preparation of novel pH-sensitive N-doped carbon dots (NCDs) using glucose and urea is reported. The prepared NCDs show strong excitation-dependent fluorescence changes towards pH, which is a new behavior for these nanomaterials. By taking advantage of this unique behavior, two separate ratiometric pH sensors using emission spectra of the NCDs for both the acidic (pH 2.0 to 8.0) and basic (pH 7.0 to 14.0) ranges of pH are constructed. Additionally, by considering the entire Excitation-Emission Matrix (EEM) of the NCDs as the analytical signal and using a suitable multivariate calibration method, a broad range of pH from 2.0 to 14.0 was well calibrated. The multivariate calibration method was independent of the concentration of NCDs and resulted in a very low average prediction error of 0.067 pH units. No changes in the predicted pH under UV irradiation (for 3 h) and at high ionic strength (up to 2 M NaCl) indicated the high stability of this pH nanosensor. The practicality of this pH nanosensor for pH determination in real water samples was validated with good accuracy and repeatability. - Highlights: • Novel pH-sensitive carbon dots with strong FL changes towards pH are reported. • Ratiometric FL pH-sensors for both acidic and basic ranges of pH are constructed. • Multivariate calibration methods were used to calibrate a broad range of pH. • Using the EEM of carbon dots and ANN, pH from 2.0 to 14.0 was well calibrated. • The pH prediction is stable even at high ionic strength up to 2 M NaCl.

  10. Carbon dots with strong excitation-dependent fluorescence changes towards pH. Application as nanosensors for a broad range of pH

    International Nuclear Information System (INIS)

    Barati, Ali; Shamsipur, Mojtaba; Abdollahi, Hamid

    2016-01-01

    In this study, the preparation of novel pH-sensitive N-doped carbon dots (NCDs) using glucose and urea is reported. The prepared NCDs show strong excitation-dependent fluorescence changes towards pH, which is a new behavior for these nanomaterials. By taking advantage of this unique behavior, two separate ratiometric pH sensors using emission spectra of the NCDs for both the acidic (pH 2.0 to 8.0) and basic (pH 7.0 to 14.0) ranges of pH are constructed. Additionally, by considering the entire Excitation-Emission Matrix (EEM) of the NCDs as the analytical signal and using a suitable multivariate calibration method, a broad range of pH from 2.0 to 14.0 was well calibrated. The multivariate calibration method was independent of the concentration of NCDs and resulted in a very low average prediction error of 0.067 pH units. No changes in the predicted pH under UV irradiation (for 3 h) and at high ionic strength (up to 2 M NaCl) indicated the high stability of this pH nanosensor. The practicality of this pH nanosensor for pH determination in real water samples was validated with good accuracy and repeatability. - Highlights: • Novel pH-sensitive carbon dots with strong FL changes towards pH are reported. • Ratiometric FL pH-sensors for both acidic and basic ranges of pH are constructed. • Multivariate calibration methods were used to calibrate a broad range of pH. • Using the EEM of carbon dots and ANN, pH from 2.0 to 14.0 was well calibrated. • The pH prediction is stable even at high ionic strength up to 2 M NaCl.

  11. Concentration and temperature dependence of short-range order in Ni-Ta solid solution using X-ray diffraction method

    International Nuclear Information System (INIS)

    Khwaja, F.A.; Alam, A.

    1980-09-01

    Diffuse X-ray scattering investigations of the existence of short-range order (SRO) have been carried out in the Ni-Ta system for different concentrations and annealing temperatures. It is observed that the values of the SRO parameters for the first co-ordination shell have anomalously large negative values for all the samples studied. These values of α₁ depend upon the annealing temperatures and the concentration of Ta atoms in the Ni-Ta system. The results of the theoretical predictions of the ordering potential, obtained using the formulae of the electronic theory of SRO, confirm the existence of a very strong attractive correlation between the atoms of the different species in this system. (author)

  12. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  13. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  14. Slotted rotatable target assembly and systematic error analysis for a search for long range spin dependent interactions from exotic vector boson exchange using neutron spin rotation

    Science.gov (United States)

    Haddock, C.; Crawford, B.; Fox, W.; Francis, I.; Holley, A.; Magers, S.; Sarsour, M.; Snow, W. M.; Vanderwerp, J.

    2018-03-01

    We discuss the design and construction of a novel target array of nonmagnetic test masses used in a neutron polarimetry measurement made in search of possible new exotic spin-dependent neutron-atom interactions of Nature at sub-mm length scales. This target was designed to accept and efficiently transmit a transversely polarized slow neutron beam through a series of long open parallel slots bounded by flat rectangular plates. These openings possessed equal atom density gradients normal to the slots from the flat test masses, with dimensions optimized to achieve maximum sensitivity to an exotic spin-dependent interaction from vector boson exchanges with ranges in the mm-μm regime. The parallel slots were oriented differently in four quadrants that can be rotated about the neutron beam axis in discrete 90° increments using a Geneva drive. The spin rotation signals from the 4 quadrants were measured using a segmented neutron ion chamber to suppress possible systematic errors from stray magnetic fields in the target region. We discuss the per-neutron sensitivity of the target to the exotic interaction, the design constraints, the potential sources of systematic errors which could be present in this design, and our estimate of the achievable sensitivity using this method.

  15. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  16. Maximizing probable oil field profit: uncertainties on well spacing

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1997-01-01

    The influence of uncertainties in field development costs, well costs, lifting costs, selling price, discount factor, and oil field reserves is evaluated for its impact on assessing probable ranges of uncertainty on present day worth (PDW), oil field lifetime τ_2/3, optimum number of wells (OWI), and the minimum (n-) and maximum (n+) number of wells to produce a PDW ≥ 0. The relative importance of different factors in contributing to the uncertainties in PDW, τ_2/3, OWI, n- and n+ is also analyzed. Numerical illustrations indicate how the maximum PDW depends on the ranges of parameter values, drawn from probability distributions using Monte Carlo simulations. In addition, the procedure illustrates the relative importance of the contributions of individual factors to the total uncertainty, so that one can assess where to place effort to improve the ranges of uncertainty, while the volatility of each estimate allows one to determine when such effort is needed. (author)
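    The kind of Monte Carlo procedure described can be sketched as follows; the cash-flow model, distributions, and numbers are all illustrative assumptions, not the authors' model.

```python
# Hedged Monte Carlo sketch: draw uncertain inputs from assumed distributions
# and examine the spread of present-day worth (PDW) for a fixed well count.
# The simple annuity cash-flow model below is illustrative only.
import numpy as np

rng = np.random.default_rng(7)

def simulate_pdw(n_wells, years=20, draws=10_000):
    price      = rng.normal(20.0, 3.0, draws)             # selling price, $/bbl
    reserves   = rng.lognormal(np.log(5e7), 0.3, draws)   # field reserves, bbl
    well_cost  = rng.normal(2e6, 3e5, draws)              # $/well
    lifting    = rng.normal(4.0, 0.8, draws)              # lifting cost, $/bbl
    field_cost = rng.normal(3e7, 5e6, draws)              # development cost, $
    discount   = rng.uniform(0.08, 0.12, draws)           # discount factor
    annual_prod = np.minimum(reserves / years, n_wells * 3e5)
    annual_cash = annual_prod * (price - lifting)
    present_value = annual_cash * (1 - (1 + discount) ** -years) / discount
    return present_value - field_cost - n_wells * well_cost

pdw = simulate_pdw(n_wells=25)
print(np.percentile(pdw, [10, 50, 90]))                   # probable range of PDW
```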

  17. Time-dependent non-probability reliability analysis of corroded pipeline in service%在役腐蚀管道动态非概率可靠性分析

    Institute of Scientific and Technical Information of China (English)

    魏宗平

    2014-01-01

    Corrosion is one of the primary failure modes of pressurized pipelines. Research on the reliability of corroded pipelines has important theoretical significance and application value. Adequate data are necessary for the probabilistic reliability model and the fuzzy reliability model used to analyze the reliability of corroded pipelines. In cases where insufficient information is available, uncertain parameters such as yield strength, pipeline diameter, defect depth and operating pressure were described as interval variables to make up for this gap by making good use of the uncertain information about the corroded pipelines. A time-dependent non-probability reliability model of the corroded pipeline in service was established based on the interval model. A simple method for predicting the remaining life of corroded pipelines was given, and a numerical example was used to illustrate the proposed method. The result shows that the proposed method is feasible, reasonable and practicable. Finally, a sensitivity analysis was carried out on the interval variables involved in the problem, evaluating the effects of the coefficients of variation of pipeline wall thickness, defect depth, operating pressure and corrosion velocity on the non-probability reliability index of the corroded pipelines. The results of the sensitivity analysis indicate that the non-probability reliability index is most sensitive to the coefficient of variation of the pipeline wall-thickness interval variable.%Corrosion failure is one of the main failure modes of pressure pipelines, and research on the reliability of corroded pipelines has important theoretical significance and application value. In reliability analysis of corroded pipelines, probabilistic and fuzzy reliability models place high demands on data. When very little uncertainty information is available, in order to make full use of the uncertainty information about the pipeline and compensate for the shortage of original data, uncertain parameters in the reliability analysis of corroded pipelines, such as material yield strength, pipeline diameter, defect depth and operating pressure, can be treated as interval variables,
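    A minimal sketch of the interval (non-probabilistic) reliability index used in this kind of analysis is given below; the limit-state function and interval bounds are illustrative assumptions, and the vertex enumeration shown is adequate only when the limit-state function is monotone in each input.

```python
# Hedged sketch of a non-probabilistic (interval) reliability index:
# eta = midpoint(g) / radius(g) over the interval of the limit-state margin g;
# eta > 1 indicates a reliable state. The burst-pressure margin below is a
# simplified illustration, not the paper's model.
from itertools import product

def interval_reliability(g, intervals):
    values = [g(*corner) for corner in product(*intervals)]  # vertex enumeration
    g_lo, g_hi = min(values), max(values)
    center, radius = (g_hi + g_lo) / 2.0, (g_hi - g_lo) / 2.0
    return center / radius

# Illustrative margin: burst-pressure estimate minus operating pressure, with
# yield strength s (Pa), wall thickness t (m), diameter D (m), defect depth d (m),
# operating pressure p (Pa).
g = lambda s, t, D, d, p: 2.0 * s * (t - d) / D - p
intervals = [(380e6, 420e6), (0.009, 0.011), (0.60, 0.62), (0.002, 0.003), (4e6, 6e6)]
print(interval_reliability(g, intervals))   # > 1 here, i.e. judged reliable
```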

  18. Age dependence of dielectric properties of bovine brain and ocular tissues in the frequency range of 400 MHz to 18 GHz

    International Nuclear Information System (INIS)

    Schmid, Gernot; Ueberbacher, Richard

    2005-01-01

    In order to identify possible age-dependent dielectric properties of brain and eye tissues in the frequency range of 400 MHz to 18 GHz, measurements on bovine grey and white matter as well as on cornea, lens (cortical) and the vitreous body were performed using a commercially available open-ended coaxial probe and a computer-controlled vector network analyser. Freshly excised tissues of 52 animals of two age groups (42 adult animals, i.e. 16-24 month old and 10 young animals, i.e. 4-6 month old calves) were examined within 8 min (brain tissue) and 15 min (eye tissue), respectively, of the animals' death. Tissue temperatures for the measurements were 32 ± 1 °C and 25 ± 1 °C for brain and eye tissues, respectively. Statistical analysis of the measured data revealed significant differences in the dielectric properties of white matter and cortical lens tissue between the adult and the young group. In the case of white matter the mean values of conductivity and permittivity of young tissue were 15%-22% and 12%-15%, respectively, higher compared to the adult tissue in the considered frequency range. Similarly, young cortical lens tissue was 25%-76% higher in conductivity and 27%-39% higher in permittivity than adult cortical lens tissue

  19. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  20. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  1. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  2. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  3. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  4. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  5. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  6. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  7. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  8. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  9. Saddlepoint Approximations for Various Statistics of Dependent, Non-Gaussian Random Variables: Applications to the Maximum Variate and the Range Variate

    National Research Council Canada - National Science Library

    Nuttall, Albert

    2001-01-01

    ... other. Although this assumption greatly simplifies the analysis, it can lead to very misleading probability measures, especially on the tails of the distributions, where the exact details of the particular...

  10. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  11. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  12. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  13. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  14. Analytical dependence of effective atomic number on the elemental composition of matter and radiation energy in the range 10-1000 keV

    Science.gov (United States)

    Eritenko, A. N.; Tsvetiansky, A. L.; Polev, A. A.

    2018-01-01

    In the present paper, a universal analytical dependence of the effective atomic number on the composition of matter and the radiation energy is proposed. This makes it possible to treat cases in which the elemental compositions differ strongly in atomic number, over a wide energy range. The contributions of photoelectric absorption and of incoherent and coherent scattering to the interaction between radiation and matter are considered. For energies above 40 keV, the contribution of coherent scattering does not exceed approximately 10% and can be neglected in further consideration. The effective atomic numbers calculated on the basis of the proposed relationships are compared to the results of calculations based on other methods considered by different authors on the basis of experimental and tabulated data on mass and atomic attenuation coefficients. The examination is carried out for both single-element (e.g., 6C, 14Si, 29Cu, 56Ba, and 82Pb) and multi-element materials. Calculations are performed for W1-xCux alloys (x = 0.35; x = 0.4), PbO, thermoluminescent dosimetry compounds (56Ba, 48Cd, 38Sr, 20Ca, 12Mg, and 11Na), and SO4 in a wide energy range. A case with radiation energy between the K- and L1-absorption edges is considered for 82Pb, 74W, 56Ba, 48Cd, and 38Sr. This substantially simplifies the calculation of the effective atomic number and will be useful in technical and scientific fields related to the interaction between X-ray/gamma radiation and matter.
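    The record does not give the authors' analytical expression, but the kind of calculation involved can be illustrated with a common textbook approximation for the effective atomic number of a mixture: a power-law weighting of the constituent atomic numbers, often used with an exponent near 2.94 in the photoelectric-dominated regime. The exponent, the helper names, and the example composition below are assumptions for illustration only, not the paper's universal formula.

```python
# Sketch: effective atomic number of a compound via a power-law weighting.
# Classical Mayneord-style approximation; exponent m is an assumption, and
# this is NOT the universal energy-dependent expression proposed in the paper.

def z_effective(fractions_by_electron, m=2.94):
    """fractions_by_electron: {Z: electron fraction}; returns Z_eff."""
    return sum(f * z**m for z, f in fractions_by_electron.items()) ** (1.0 / m)

def electron_fractions(composition):
    """composition: {Z: atoms per formula unit} -> electron fractions."""
    total_electrons = sum(n * z for z, n in composition.items())
    return {z: n * z / total_electrons for z, n in composition.items()}

# Example: PbO (Pb Z=82, O Z=8), one atom of each per formula unit.
pbo = electron_fractions({82: 1, 8: 1})
print(f"Z_eff(PbO) ~ {z_effective(pbo):.1f}")
```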

  15. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  16. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  17. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
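    As a concrete and well-known special case of the construction summarised above, the log-sum-exp function acts as a choice-probability generating function for the multinomial logit model; the block below is a standard textbook illustration of "probabilities from the gradient", not a reproduction of the paper's general results.

```latex
% Log-sum-exp as a CPGF for the multinomial logit model (illustrative case).
\[
  G(u_1,\dots,u_J) \;=\; \log \sum_{j=1}^{J} e^{u_j},
  \qquad
  P_i \;=\; \frac{\partial G}{\partial u_i}
        \;=\; \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}} .
\]
% The choice probabilities are obtained from the gradient of G, and the
% gradient components sum to one, as required of choice probabilities.
```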

  18. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic NotionsSample Space and EventsProbabilitiesCounting TechniquesIndependence and Conditional ProbabilityIndependenceConditioningThe Borel-Cantelli TheoremDiscrete Random VariablesRandom Variables and VectorsExpected ValueVariance and Other Moments. Inequalities for DeviationsSome Basic DistributionsConvergence of Random Variables. The Law of Large NumbersConditional ExpectationGenerating Functions. Branching Processes. Random Walk RevisitedBranching Processes Generating Functions Branching Processes Revisited More on Random WalkMarkov ChainsDefinitions and Examples. Probability Distributions of Markov ChainsThe First Step Analysis. Passage TimesVariables Defined on a Markov ChainErgodicity and Stationary DistributionsA Classification of States and ErgodicityContinuous Random VariablesContinuous DistributionsSome Basic Distributions Continuous Multivariate Distributions Sums of Independent Random Variables Conditional Distributions and ExpectationsDistributions in the General Case. SimulationDistribution F...

  19. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  20. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  1. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  2. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  3. Experimental study of energy dependence of proton induced fission cross sections for heavy nuclei in the energy range 200-1000 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Kotov, A.A.; Gavrikov, Yu.A.; Vaishnene, L.A.; Vovchenko, V.G.; Poliakov, V.V.; Fedorov, O.Ya.; Chestnov, Yu.A.; Shchetkovskiy, A.I [Petersburg Nuclear Physics Institute, Gatchina, Leningrad district, Orlova roscha 1, 188300 (Russian Federation); Fukahori, T. [Japan Atomic Energy Research Institute, Tokai-mura, Ibaraki 319-1195 (Japan)

    2005-07-01

    The results of total fission cross section measurements for {sup nat}Pb, {sup 209}Bi, {sup 232}Th, {sup 233}U, {sup 235}U, {sup 238}U, {sup 237}Np and {sup 239}Pu nuclei in the proton energy range 200-1000 MeV are presented. The experiments were carried out at the 1 GeV synchrocyclotron of the Petersburg Nuclear Physics Institute (Gatchina). The measurement method is based on the registration in coincidence of both complementary fission fragments by two gas parallel plate avalanche counters, located at a short distance on opposite sides of the investigated target. The insensitivity of the parallel plate avalanche counters to neutrons and light charged particles allowed us to place the counters, together with the target, directly in the proton beam, providing a large solid angle acceptance for fission fragment registration and reliable identification of fission events. The proton flux on the target was determined by direct counting of protons with a scintillation telescope. The measured energy dependence of the total fission cross sections is presented. The obtained results are compared with other experimental data as well as with calculations in the framework of the cascade evaporation model. (authors)

  4. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  5. Search for a spin-dependent short-range force between nucleons with a 3He/129Xe clock-comparison experiment

    International Nuclear Information System (INIS)

    Tullney, Kathlynne

    2014-01-01

    The standard model (SM) of particle physics describes all known particles and their interactions. However, the SM leaves many issues unresolved. For example, it only includes three of the four fundamental forces, and it does not clarify why the CP violation that is predicted in the strong interaction due to its non-trivial vacuum structure (Θ-term) is not observed experimentally. The latter is known as the strong CP problem of quantum chromodynamics (QCD) and is solved by the Peccei-Quinn-Weinberg-Wilczek theory. This theory predicts a new and almost massless boson which is known as the axion. The axion interacts feebly with matter and is therefore also a good candidate for cold dark matter. Axions are produced by the Primakoff effect, i.e. by conversion of photons which are scattered in an electromagnetic field, e.g. that of atoms. The inverse Primakoff effect, which converts axions back into photons, can be used for direct detection of galactic, solar, or laboratory axions. Cosmological and astrophysical observations constrain the mass of the axion to between a few μeV and some meV (''axion mass window''). If the axion exists, then it mediates a CP violating, spin-dependent, short-range interaction between a fermion and the spin of another fermion. By verifying this interaction, the axion can be detected indirectly. In the framework of the present thesis, an experiment to search for this spin-dependent short-range interaction was performed in the magnetically shielded room BMSR-2 of the Physikalisch-Technische Bundesanstalt Berlin. An ultra-sensitive low-field co-magnetometer was employed which is based on the detection of the free precession of 3He and 129Xe nuclear spins using SQUIDs as low-noise magnetic flux detectors. The two nuclear spin polarized gases are filled into a glass cell which is immersed in a low magnetic field of about B0 = 0.35 μT with absolute field gradients in the order of pT/cm. The spin precession frequencies of 3He and 129

  6. Precipitação esperada, em diferentes níveis de probabilidade, na região de Dourados, MS Dependable rainfall, using different levels of probability, in Dourados, MS, Brazil

    Directory of Open Access Journals (Sweden)

    Carlos Ricardo Fietz

    1998-03-01

    Full Text Available The objective of this work was to determine the expected (dependable) rainfall, at different probability levels, in the region of Dourados, Mato Grosso do Sul State, Brazil. The study, carried out for 10-day, 15-day, and monthly periods, was based on 17 years of daily rainfall data. The series were fitted to a mixed distribution and the goodness of fit was verified with the Kolmogorov-Smirnov test. The parameters of the gamma distribution were estimated by the maximum likelihood method. The data fitted the distribution for all periods. From the cumulative mixed distribution, expected rainfall values were generated for return periods of 2, 3, 4, 5, 8, 10, 12 and 17 years. The results allow the adoption of well-founded rainfall values for the design of irrigation systems in the Dourados region. The use of average rainfall for this purpose is not recommended, since it may result in undersized designs.
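    A minimal sketch of the kind of calculation described above: fitting a gamma distribution to the non-zero rainfall totals by maximum likelihood and reading off the dependable rainfall at a chosen exceedance probability, with the dry-period probability treated as a point mass at zero. The data values and function names below are invented for illustration.

```python
# Sketch: dependable rainfall at a given exceedance probability using a
# mixed (dry-probability + gamma) distribution. Data below are illustrative.
import numpy as np
from scipy import stats

decadal_rain_mm = np.array([0.0, 0.0, 12.5, 30.1, 8.4, 55.0, 0.0, 22.3,
                            70.2, 15.8, 41.7, 5.1, 0.0, 33.9, 61.4])

wet = decadal_rain_mm[decadal_rain_mm > 0]
p_dry = 1.0 - wet.size / decadal_rain_mm.size          # point mass at zero

# Maximum-likelihood fit of the gamma distribution to the wet-period totals.
shape, loc, scale = stats.gamma.fit(wet, floc=0)

def dependable_rainfall(p_exceed):
    """Rainfall equalled or exceeded with probability p_exceed (mixed dist.)."""
    # P(X >= x) = (1 - p_dry) * (1 - F_gamma(x))  for x > 0
    tail = p_exceed / (1.0 - p_dry)
    if tail >= 1.0:
        return 0.0                                     # any positive amount is rarer
    return stats.gamma.ppf(1.0 - tail, shape, loc=loc, scale=scale)

for p in (0.75, 0.80, 0.90):                           # typical design levels
    print(f"rainfall exceeded with prob {p:.2f}: {dependable_rainfall(p):.1f} mm")
```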

  7. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  8. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  9. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  10. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability when various components' failure probabilities are varied between 0 and 1, and when Japanese or American initiating event frequency data are used, employing a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
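    A toy sketch of the type of sensitivity scan described above: a drastically simplified core damage frequency model (one initiating event and a few basic events) in which one component's failure probability is swept from 0 to 1. The cut-set structure and all numbers are invented and have no relation to the plant models or data in the study.

```python
# Toy sensitivity of core damage frequency (CDF) to one component's failure
# probability, using an invented minimal cut-set model for illustration only.

def core_damage_frequency(ie_freq, p_pump, p_valve, p_diesel):
    # Invented minimal cut sets: CDF = IE * (pump AND valve  OR  diesel)
    cut1 = p_pump * p_valve
    cut2 = p_diesel
    # rare-event union approximation
    return ie_freq * (cut1 + cut2 - cut1 * cut2)

ie_freq = 1e-2          # assumed initiating event frequency [1/yr]
base = dict(p_pump=1e-3, p_valve=1e-3, p_diesel=1e-4)
cdf_base = core_damage_frequency(ie_freq, **base)

for p in (0.0, 1e-4, 1e-3, 1e-2, 1e-1, 1.0):   # sweep the pump failure probability
    cdf = core_damage_frequency(ie_freq, p, base["p_valve"], base["p_diesel"])
    print(f"p_pump={p:8.1e}  CDF={cdf:.3e}  delta={cdf - cdf_base:+.3e}")
```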

  11. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
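    A minimal sketch of the counting step described above (categorised events divided by the total number of fuel assembly movements), together with a simple Bayesian interval on the resulting rate. The event counts below are invented; the report's actual data come from the Framatome ANP source and are not reproduced here.

```python
# Sketch: point estimate and a simple Bayesian (Jeffreys) interval for the
# per-movement misload probability. Event counts below are invented.
from scipy import stats

misload_events = 3          # assumed number of categorised misload events
total_movements = 2_000_000 # assumed total fuel assembly movements

p_hat = misload_events / total_movements
lo, hi = stats.beta.interval(0.95, misload_events + 0.5,
                             total_movements - misload_events + 0.5)

print(f"point estimate: {p_hat:.2e} per movement")
print(f"95% Jeffreys interval: [{lo:.2e}, {hi:.2e}]")
```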

  14. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  15. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  16. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  17. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  18. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  19. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  20. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  1. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
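    The post-processing step described above, turning an ensemble of equally likely geostatistical realizations into a map of exceedance probabilities, amounts to counting, cell by cell, the fraction of realizations above the threshold. The sketch below assumes the realizations are already available as an array and uses invented dimensions and a stand-in random field rather than real simulations.

```python
# Sketch: exceedance-probability map from an ensemble of geostatistical
# realizations. The random field below stands in for real simulations.
import numpy as np

rng = np.random.default_rng(0)
n_real, ny, nx = 200, 50, 50                 # assumed ensemble size and grid
realizations = rng.lognormal(mean=3.0, sigma=1.0, size=(n_real, ny, nx))

threshold = 35.0                             # e.g. a clean-up action level

# Fraction of realizations exceeding the threshold in each grid cell.
prob_exceed = (realizations > threshold).mean(axis=0)

print("max exceedance probability on the map:", prob_exceed.max())
print("fraction of cells with P(exceed) > 0.5:", (prob_exceed > 0.5).mean())
```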

  2. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  3. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  4. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by the mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs. A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
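    The abstract describes a one-parameter generalization of the logarithm and its inverse, the generalized exponential. The sketch below implements the form these functions are usually written in (sometimes called the q-logarithm and q-exponential); the exact convention used by the authors is an assumption, and the function names are invented.

```python
# Sketch: one-parameter generalized logarithm and exponential (q-log / q-exp
# convention assumed), recovering the ordinary log/exp as the parameter -> 1.
import numpy as np

def gen_log(x, q):
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def gen_exp(y, q):
    y = np.asarray(y, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(y)
    base = np.maximum(1.0 + (1.0 - q) * y, 0.0)   # cutoff keeps the result real
    return base ** (1.0 / (1.0 - q))

x = np.array([0.5, 1.0, 2.0, 5.0])
for q in (0.5, 1.0, 1.5):
    assert np.allclose(gen_exp(gen_log(x, q), q), x)  # inverse pair
print("q-log(2) for q = 0.5, 1.0, 1.5:",
      [round(float(gen_log(2.0, q)), 4) for q in (0.5, 1.0, 1.5)])
```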

  5. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖_1) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1-96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294. Springer, Berlin (1996) 50. Ledoux

  6. Prediction ranges. Annual review

    Energy Technology Data Exchange (ETDEWEB)

    Parker, J.C.; Tharp, W.H.; Spiro, P.S.; Keng, K.; Angastiniotis, M.; Hachey, L.T.

    1988-01-01

    Prediction ranges equip the planner with one more tool for improved assessment of the outcome of a course of action. One of their major uses is in financial evaluations, where corporate policy requires the performance of uncertainty analysis for large projects. This report gives an overview of the uses of prediction ranges, with examples; and risks and uncertainties in growth, inflation, and interest and exchange rates. Prediction ranges and standard deviations of 80% and 50% probability are given for various economic indicators in Ontario, Canada, and the USA, as well as for foreign exchange rates and Ontario Hydro interest rates. An explanatory note on probability is also included. 23 tabs.

  7. Persistent current and transmission probability in the Aharonov-Bohm ring with an embedded quantum dot

    International Nuclear Information System (INIS)

    Wu Suzhi; Li Ning; Jin Guojun; Ma Yuqiang

    2008-01-01

    Persistent current and transmission probability in the Aharonov-Bohm (AB) ring with an embedded quantum dot (QD) are studied using the scattering matrix technique. For the first time, we find that a persistent current can arise in the absence of magnetic flux in a ring with an embedded QD. The persistent current and the transmission probability are sensitive to the lead-ring coupling and the short-range potential barrier. It is shown that increasing the lead-ring coupling or the short-range potential barrier suppresses the persistent current and increases the resonance width of the transmission probability. The effect of the potential barrier on the number of transmission peaks is also investigated. The dependence of the persistent current and the transmission probability on the magnetic flux is periodic, with a period of one flux quantum.

  8. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
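    The defining constraint of the collective probabilities algorithm, as summarised above, is that all trajectories share the same transition probability, chosen from the ensemble-averaged quantum populations so that the classical state populations track them with the fewest possible hops. The sketch below illustrates that bookkeeping for a two-state ensemble; it is a schematic reading of the abstract with invented numbers, not the authors' published equations.

```python
# Schematic sketch of the collective-probabilities idea for two states:
# hop just enough trajectories so the classical populations match the
# ensemble-averaged quantum populations. Not the authors' actual equations.
import numpy as np

rng = np.random.default_rng(1)
n_traj = 1000
state = np.zeros(n_traj, dtype=int)          # all trajectories start on state 0

def collective_hop(state, avg_quantum_pop1):
    """Move the minimum number of trajectories so that the fraction on
    state 1 equals the ensemble-averaged quantum population of state 1."""
    frac1 = np.mean(state == 1)
    deficit = avg_quantum_pop1 - frac1
    n_hops = int(round(abs(deficit) * state.size))
    if n_hops == 0:
        return state
    source = 0 if deficit > 0 else 1
    candidates = np.flatnonzero(state == source)
    chosen = rng.choice(candidates, size=min(n_hops, candidates.size),
                        replace=False)
    state[chosen] = 1 - source
    return state

# Invented ensemble-averaged quantum population of state 1 along a trajectory.
for pop1 in (0.05, 0.20, 0.35, 0.30):
    state = collective_hop(state, pop1)
    print(f"target pop1={pop1:.2f}  classical pop1={np.mean(state == 1):.2f}")
```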

  9. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  10. Temperature dependence of the dielectric tensor of monoclinic Ga2O3 single crystals in the spectral range 1.0-8.5 eV

    Science.gov (United States)

    Sturm, C.; Schmidt-Grund, R.; Zviagin, V.; Grundmann, M.

    2017-08-01

    The full dielectric tensor of monoclinic Ga2O3 (β-phase) was determined by generalized spectroscopic ellipsometry in the spectral range from 1.0 eV up to 8.5 eV and temperatures in the range from 10 K up to 300 K. By using the oriented dipole approach, the energies and broadenings of the excitonic transitions are determined as a function of the temperature, and the exciton-phonon coupling properties are deduced.

  11. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  12. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  13. Criterion 1: Conservation of biological diversity - Indicator 8: The number of forest dependent species that occupy a small portion of their former range

    Science.gov (United States)

    Curtis H. Flather; Carolyn Hull Sieg; Michael S. Knowles; Jason McNees

    2003-01-01

    This indicator measures the portion of a species' historical distribution that is currently occupied as a surrogate measure of genetic diversity. Based on data for 1,642 terrestrial animals associated with forests, most species (88 percent) were found to fully occupy their historic range - at least as measured by coarse state-level occurrence patterns. Of the 193...

  14. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
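    COVAL's numerical transformation is not reproduced here; the sketch below shows the same task (obtaining the distribution of a function of random variables from the distributions of the variables themselves) done by straightforward Monte Carlo sampling, with an invented load/resistance example in the spirit of the reliability application mentioned.

```python
# Sketch: distribution of a function of random variables by Monte Carlo,
# illustrated with an invented structural reliability margin M = R - L.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # kN
load = rng.gumbel(loc=180.0, scale=20.0, size=n)                    # kN

margin = resistance - load            # the function of the random variables

print("P(failure) = P(margin < 0) ~", np.mean(margin < 0.0))
print("5th / 50th / 95th percentile of the margin:",
      np.percentile(margin, [5, 50, 95]).round(1))
```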

  15. Time evolution of some quantum-mechanical systems. Wavefunction cloning in evolving rotating systems. Finite range boundary conditions for time dependent Schroedinger Equation

    International Nuclear Information System (INIS)

    Arvieu, R.; Carbonell, J.; Gignoux, C.; Mangin-Brinet, M.; Rozmej, P.

    1997-01-01

    The time evolution of coherent rotational wave packets associated with a diatomic molecule or a deformed nucleus has been studied. Assuming rigid-body dynamics, the J(J+1) law leads to a mechanism of cloning: the wave function is divided into wave packets identical to the initial one at specific times. Applications are studied for a nuclear wave packet formed by Coulomb excitation. Exact boundary conditions at finite distance for the solution of the time-dependent Schroedinger equation are derived. A numerical scheme based on the Crank-Nicolson method is proposed to illustrate its applicability in several examples. (authors)
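    The record mentions a Crank-Nicolson scheme for the time-dependent Schroedinger equation; the sketch below is a generic one-dimensional Crank-Nicolson propagation with simple hard-wall boundaries and an assumed harmonic potential, not the finite-range transparent boundary conditions derived in the paper.

```python
# Sketch: Crank-Nicolson propagation of the 1D time-dependent Schroedinger
# equation (hbar = m = 1, hard-wall boundaries). Generic scheme, not the
# finite-range boundary conditions of the paper.
import numpy as np

nx, dx, dt = 400, 0.05, 0.002
x = (np.arange(nx) - nx // 2) * dx
potential = 0.5 * x**2                           # assumed harmonic potential

# Hamiltonian with a second-order finite-difference Laplacian.
main = 1.0 / dx**2 + potential
off = -0.5 / dx**2 * np.ones(nx - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

I = np.eye(nx)
A = I + 0.5j * dt * H                            # (1 + i dt H / 2) psi_new
B = I - 0.5j * dt * H                            # (1 - i dt H / 2) psi_old

psi = np.exp(-((x - 1.0) ** 2) / 0.5).astype(complex)   # displaced Gaussian
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

for _ in range(100):                             # propagate 100 steps
    psi = np.linalg.solve(A, B @ psi)

print("norm after propagation:", np.sum(np.abs(psi) ** 2) * dx)  # ~1 (unitary)
```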

  16. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
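    The claim that seeing a head must increase the probability of heads (unless the coin's fairness is known with certainty) follows from taking expectations over the uncertain definitive number p. A short derivation in generic notation, where H_i denotes "heads on toss i"; this is standard exchangeability algebra, not text from the paper.

```latex
% With uncertainty over the definitive number p, and exchangeable tosses:
\[
  P(H_1) = \mathbb{E}[p], \qquad
  P(H_2 \mid H_1) = \frac{P(H_1 \cap H_2)}{P(H_1)}
                  = \frac{\mathbb{E}[p^2]}{\mathbb{E}[p]} .
\]
% Since E[p^2] = E[p]^2 + Var(p),
\[
  P(H_2 \mid H_1) - P(H_1) = \frac{\mathrm{Var}(p)}{\mathbb{E}[p]} \;\ge\; 0,
\]
% with equality only when Var(p) = 0, i.e. when the fairness is known exactly.
```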

  19. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ R with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ R twice and we give a sufficient condition for a positive probability of this event.

  20. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  1. Re-recognition of Age-dependent Reference Range for the Serum Creatinine Level in Teenagers - A Case of Slowly Progressive Tubulointerstitial Nephritis which Occurred in an Adolescent.

    Science.gov (United States)

    Ono, Hiroyuki; Nagai, Kojiro; Shibata, Eriko; Matsuura, Motokazu; Kishi, Seiji; Inagaki, Taizo; Minato, Masanori; Yoshimoto, Sakiya; Ueda, Sayo; Obata, Fumiaki; Nishimura, Kenji; Tamaki, Masanori; Kishi, Fumi; Murakami, Taichi; Abe, Hideharu; Kinoshita, Yukiko; Urushihara, Maki; Kagami, Shoji; Doi, Toshio

    2017-08-15

    For the first time, a 15-year-old boy was found to have a slight degree of proteinuria and microscopic hematuria during annual school urinalysis screening. His kidney function had already severely deteriorated. A kidney biopsy revealed tubulointerstitial nephritis (TIN) with diffuse inflammatory cell infiltration. His medical records showed his serum creatinine level to be 0.98 mg/dL two years ago, which was abnormally high considering his age. Although the etiology of slowly progressive TIN was unclear, glucocorticoid and immunosuppressant therapy improved his kidney function. This case report suggests that all doctors should recognize the reference range for the serum creatinine level in teenagers.

  2. Probability of brittle failure

    Science.gov (United States)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement controlled loading is used to observe multiple crack initiation and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces which excludes any inelastic deformation from consideration. Presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  3. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1993-01-01

    This paper will describe a methodology which has been developed to allow accident probabilities associated with one severity category scheme to be transferred to another severity category scheme, permitting some comparisons of different studies at the category level. In this methodology, the severity category schemes to be compared are mapped onto a common set of axes. The axes represent critical accident environments (e.g., impact, thermal, crush, puncture) and indicate the range of accident parameters from zero (no accident) to the most severe credible forces. The choice of critical accident environments for the axes depends on the package being transported and the mode of transportation. The accident probabilities associated with one scheme are then transferred to the other scheme. This transfer of category probabilities is based on the relationships of the critical accident parameters to probability of occurrence. The methodology can be employed to transfer any quantity between category schemes if the appropriate supporting information is available. (J.P.N.)

  4. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  5. Do Cloud Properties in a Puerto Rican Tropical Montane Cloud Forest Depend on Occurrence of Long-Range Transported African Dust?

    Science.gov (United States)

    Spiegel, Johanna K.; Buchmann, Nina; Mayol-Bracero, Olga L.; Cuadra-Rodriguez, Luis A.; Valle Díaz, Carlos J.; Prather, Kimberly A.; Mertes, Stephan; Eugster, Werner

    2014-09-01

    We investigated the properties of warm clouds in a tropical montane cloud forest at Pico del Este (1,051 m a.s.l.) in the northeastern part of Puerto Rico to address the question of whether cloud properties in the Caribbean could potentially be affected by African dust transported across the Atlantic Ocean. We analyzed data collected during 12 days in July 2011. Cloud droplet size spectra were measured with the FM-100 fog droplet spectrometer, which covers droplet diameters from 2 to 49 µm, primarily during fog events. The droplet size spectra revealed a bimodal structure, with the first peak (D < 6 µm) being more pronounced in terms of droplet number concentration, whereas the second peak (10 µm < D < 20 µm) was the one relevant for the total liquid water content (LWC) of the cloud. We identified three major clusters of characteristic droplet size spectra by means of hierarchical clustering. All clusters differed significantly from each other in droplet number concentration, effective diameter (ED), and median volume diameter (MVD). For the cluster comprising the largest droplets and the lowest droplet number concentrations, we found evidence of inhomogeneous mixing in the cloud. In contrast, the other two clusters showed microphysical behavior that would be expected under homogeneous mixing conditions. Under those conditions, an increase in cloud condensation nuclei (e.g., from processed African dust transported to the site) is expected to lead to an increased droplet concentration. In fact, one of these two clusters showed a clear shift of the cloud droplet size spectra towards smaller droplet diameters. Since this cluster occurred during periods with strong evidence for the presence of long-range transported African dust, we hypothesize a link between the observed dust episodes and cloud characteristics at our Caribbean site, similar to the anthropogenic aerosol indirect effect.
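
    The clustering step mentioned above can be sketched as follows (synthetic spectra, not the FM-100 data; the bin counts and the number of clusters are illustrative), using agglomerative hierarchical clustering:

      # Sketch: hierarchical clustering of (synthetic) droplet size spectra.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(0)
      spectra = rng.random((12, 24))                   # 12 spectra, 24 size bins
      spectra /= spectra.sum(axis=1, keepdims=True)    # normalize each spectrum

      Z = linkage(spectra, method="ward")              # agglomerative tree
      labels = fcluster(Z, t=3, criterion="maxclust")  # cut into 3 clusters
      print(labels)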

  6. Porous silicon-VO₂ based hybrids as possible optical temperature sensor: Wavelength-dependent optical switching from visible to near-infrared range

    Energy Technology Data Exchange (ETDEWEB)

    Antunez, E. E.; Salazar-Kuri, U.; Estevez, J. O.; Basurto, M. A.; Agarwal, V., E-mail: vagarwal@uaem.mx [Centro de Investigación en Ingeniería y Ciencias Aplicadas, Instituto de Investigación en Ciencias Básicas y Aplicadas, UAEM, Av. Universidad 1001, Col. Chamilpa, Cuernavaca, Mor. 62209 (Mexico); Campos, J. [Instituto de Energías Renovables, UNAM, Priv. Xochicalco S/N, Temixco, Mor. 62580 (Mexico); Jiménez Sandoval, S. [Laboratorio de Investigación en Materiales, Centro de Investigación y estudios Avanzados del Instituto Politécnico Nacional, Unidad Querétaro, Qro. 76001 (Mexico)

    2015-10-07

    Morphological characterization of thermochromic VO₂-porous silicon hybrids reveals the growth of well-crystallized nanometer-scale VO₂ features, in contrast to the typical submicron granular structure obtained in thin films deposited on flat substrates. Structural characterization performed as a function of temperature via grazing-incidence X-ray diffraction and micro-Raman spectroscopy demonstrates a reversible semiconductor-metal transition of the hybrid, changing from the low-temperature monoclinic VO₂(M) to the high-temperature tetragonal rutile VO₂(R) crystalline structure, together with a decrease in the phase transition temperature. The effective optical response, studied in terms of the red/blue shift of the reflectance spectra, results in wavelength-dependent optical switching with temperature. Compared to a VO₂ film on a crystalline silicon substrate, the hybrid structure demonstrates up to a 3-fold increase in the change of reflectivity with temperature, an enlarged hysteresis loop and a wider operational window for its potential application as an optical temperature sensor. Such silicon-based hybrids represent an exciting class of functional materials displaying thermally triggered optical switching that combines the characteristics of each of the constituent blocks with device compatibility with standard integrated-circuit technology.

  7. Mutation induction and neoplastic transformation in human and human-hamster hybrid cells: dependence on photon energy and modulation in the low-dose range

    Energy Technology Data Exchange (ETDEWEB)

    Frankenberg, D.; Frankenberg-Schwager, M.; Garg, I.; Pralle, E. [Abt. Klin. Strahlenbiologie und Klin. Strahlenphysik, Universitaet Goettingen, Goettingen (Germany); Uthe, D.; Greve, B.; Severin, E.; Goehde, W. [Institut fuer Strahlenbiologie, Universitaet Muenster, Munster (Germany)

    2002-09-01

    Mutation induction in the HPRT gene of human fibroblasts after irradiation with mammography-like 29 kVp or 200 kVp x-rays shows radiohypersensitivity for doses smaller than ~0.5 Gy. Similarly, mutation induction in the CD59 gene on human chromosome 11 in A_L cells shows radiohypersensitivity for doses smaller than ~0.5 Gy after exposure to 200 kVp x-rays, but not after irradiation with low-filtered 30 kVp x-rays. The RBE values of 29 and 30 kVp x-rays relative to 200 kVp x-rays are strongly dose dependent. For neoplastic transformation of human hybrid (CGL1) cells after irradiation with 29 or 200 kVp x-rays or ⁶⁰Co gamma rays, a linear-quadratic dose relationship was observed, with RBE values of approximately four and eight for mammography x-rays relative to 200 kVp x-rays and ⁶⁰Co gamma rays, respectively. (author)
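
    For reference, the linear-quadratic dose relationship and the RBE quoted above have the standard textbook forms (with \alpha and \beta as fit parameters and D the absorbed dose; no values are implied beyond those in the record):

      E(D) = \alpha D + \beta D^2,
      \qquad
      \mathrm{RBE} = D_{\mathrm{reference}} / D_{\mathrm{test}} \quad \text{at equal biological effect}.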

  8. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, several influencing factors are examined using statistical and econometric models. The main approach consists of applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. The probability of returning a loan is found to depend on the sum of the contract, the remoteness of the loan owner and the month of birth: it increases with the given sum, decreases with the proximity of the customer, and is higher for people born at the beginning of the year than for people born at the end of the year.
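
    A minimal sketch of the binary logit approach described above, with synthetic data and hypothetical field names standing in for the contract and borrower variables (this is not the authors' dataset or exact specification):

      # Sketch: logit model for P(loan returned | contract sum, remoteness, month of birth).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 500
      X = np.column_stack([
          rng.uniform(1_000, 20_000, n),   # given sum of the contract
          rng.uniform(0, 300, n),          # remoteness of the borrower (km)
          rng.integers(1, 13, n),          # month of birth
      ])
      true_logits = 0.0001 * X[:, 0] - 0.004 * X[:, 1] - 0.05 * X[:, 2]
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logits)))   # 1 = loan returned

      model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
      print(model.params)                               # fitted coefficients
      print(model.predict(sm.add_constant(X))[:5])      # predicted probabilities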

  9. Pseudorapidity dependence of long-range two-particle correlations in pPb collisions at $\sqrt{s_{\mathrm{NN}}} =$ 5.02 TeV

    CERN Document Server

    Khachatryan, Vardan; Tumasyan, Armen; Adam, Wolfgang; Aşılar, Ece; Bergauer, Thomas; Brandstetter, Johannes; Brondolin, Erica; Dragicevic, Marko; Erö, Janos; Flechl, Martin; Friedl, Markus; Fruehwirth, Rudolf; Ghete, Vasile Mihai; Hartl, Christian; Hörmann, Natascha; Hrubec, Josef; Jeitler, Manfred; Knünz, Valentin; König, Axel; Krammer, Manfred; Krätschmer, Ilse; Liko, Dietrich; Matsushita, Takashi; Mikulec, Ivan; Rabady, Dinyar; Rad, Navid; Rahbaran, Babak; Rohringer, Herbert; Schieck, Jochen; Schöfbeck, Robert; Strauss, Josef; Treberer-Treberspurg, Wolfgang; Waltenberger, Wolfgang; Wulz, Claudia-Elisabeth; Mossolov, Vladimir; Shumeiko, Nikolai; Suarez Gonzalez, Juan; Alderweireldt, Sara; Cornelis, Tom; De Wolf, Eddi A; Janssen, Xavier; Knutsson, Albert; Lauwers, Jasper; Luyckx, Sten; Van De Klundert, Merijn; Van Haevermaet, Hans; Van Mechelen, Pierre; Van Remortel, Nick; Van Spilbeeck, Alex; Abu Zeid, Shimaa; Blekman, Freya; D'Hondt, Jorgen; Daci, Nadir; De Bruyn, Isabelle; Deroover, Kevin; Heracleous, Natalie; Keaveney, James; Lowette, Steven; Moreels, Lieselotte; Olbrechts, Annik; Python, Quentin; Strom, Derek; Tavernier, Stefaan; Van Doninck, Walter; Van Mulders, Petra; Van Onsem, Gerrit Patrick; Van Parijs, Isis; Barria, Patrizia; Brun, Hugues; Caillol, Cécile; Clerbaux, Barbara; De Lentdecker, Gilles; Fasanella, Giuseppe; Favart, Laurent; Goldouzian, Reza; Grebenyuk, Anastasia; Karapostoli, Georgia; Lenzi, Thomas; Léonard, Alexandre; Maerschalk, Thierry; Marinov, Andrey; Perniè, Luca; Randle-conde, Aidan; Seva, Tomislav; Vander Velde, Catherine; Vanlaer, Pascal; Yonamine, Ryo; Zenoni, Florian; Zhang, Fengwangdong; Beernaert, Kelly; Benucci, Leonardo; Cimmino, Anna; Crucy, Shannon; Dobur, Didar; Fagot, Alexis; Garcia, Guillaume; Gul, Muhammad; Mccartin, Joseph; Ocampo Rios, Alberto Andres; Poyraz, Deniz; Ryckbosch, Dirk; Salva Diblen, Sinem; Sigamani, Michael; Tytgat, Michael; Van Driessche, Ward; Yazgan, Efe; Zaganidis, Nicolas; Basegmez, Suzan; Beluffi, Camille; Bondu, Olivier; Brochet, Sébastien; Bruno, Giacomo; Caudron, Adrien; Ceard, Ludivine; Delaere, Christophe; Favart, Denis; Forthomme, Laurent; Giammanco, Andrea; Jafari, Abideh; Jez, Pavel; Komm, Matthias; Lemaitre, Vincent; Mertens, Alexandre; Musich, Marco; Nuttens, Claude; Perrini, Lucia; Piotrzkowski, Krzysztof; Popov, Andrey; Quertenmont, Loic; Selvaggi, Michele; Vidal Marono, Miguel; Beliy, Nikita; Hammad, Gregory Habib; Aldá Júnior, Walter Luiz; Alves, Fábio Lúcio; Alves, Gilvan; Brito, Lucas; Correa Martins Junior, Marcos; Hamer, Matthias; Hensel, Carsten; Moraes, Arthur; Pol, Maria Elena; Rebello Teles, Patricia; Belchior Batista Das Chagas, Ewerton; Carvalho, Wagner; Chinellato, Jose; Custódio, Analu; Melo Da Costa, Eliza; De Jesus Damiao, Dilson; De Oliveira Martins, Carley; Fonseca De Souza, Sandro; Huertas Guativa, Lina Milena; Malbouisson, Helena; Matos Figueiredo, Diego; Mora Herrera, Clemencia; Mundim, Luiz; Nogima, Helio; Prado Da Silva, Wanda Lucia; Santoro, Alberto; Sznajder, Andre; Tonelli Manganote, Edmilson José; Vilela Pereira, Antonio; Ahuja, Sudha; Bernardes, Cesar Augusto; De Souza Santos, Angelo; Dogra, Sunil; Tomei, Thiago; De Moraes Gregores, Eduardo; Mercadante, Pedro G; Moon, Chang-Seong; Novaes, Sergio F; Padula, Sandra; Romero Abad, David; Ruiz Vargas, José Cupertino; Aleksandrov, Aleksandar; Hadjiiska, Roumyana; Iaydjiev, Plamen; Rodozov, Mircho; Stoykova, Stefka; Sultanov, Georgi; Vutova, Mariana; Dimitrov, Anton; Glushkov, Ivan; Litov, Leander; Pavlov, Borislav; Petkov, Peicho; Ahmad, 
Muhammad; Bian, Jian-Guo; Chen, Guo-Ming; Chen, He-Sheng; Chen, Mingshui; Cheng, Tongguang; Du, Ran; Jiang, Chun-Hua; Leggat, Duncan; Plestina, Roko; Romeo, Francesco; Shaheen, Sarmad Masood; Spiezia, Aniello; Tao, Junquan; Wang, Chunjie; Wang, Zheng; Zhang, Huaqiao; Asawatangtrakuldee, Chayanit; Ban, Yong; Li, Qiang; Liu, Shuai; Mao, Yajun; Qian, Si-Jin; Wang, Dayong; Xu, Zijun; Avila, Carlos; Cabrera, Andrés; Chaparro Sierra, Luisa Fernanda; Florez, Carlos; Gomez, Juan Pablo; Gomez Moreno, Bernardo; Sanabria, Juan Carlos; Godinovic, Nikola; Lelas, Damir; Puljak, Ivica; Ribeiro Cipriano, Pedro M; Antunovic, Zeljko; Kovac, Marko; Brigljevic, Vuko; Kadija, Kreso; Luetic, Jelena; Micanovic, Sasa; Sudic, Lucija; Attikis, Alexandros; Mavromanolakis, Georgios; Mousa, Jehad; Nicolaou, Charalambos; Ptochos, Fotios; Razis, Panos A; Rykaczewski, Hans; Bodlak, Martin; Finger, Miroslav; Finger Jr, Michael; Abdelalim, Ahmed Ali; Awad, Adel; Mahrous, Ayman; Radi, Amr; Calpas, Betty; Kadastik, Mario; Murumaa, Marion; Raidal, Martti; Tiko, Andres; Veelken, Christian; Eerola, Paula; Pekkanen, Juska; Voutilainen, Mikko; Härkönen, Jaakko; Karimäki, Veikko; Kinnunen, Ritva; Lampén, Tapio; Lassila-Perini, Kati; Lehti, Sami; Lindén, Tomas; Luukka, Panja-Riina; Peltola, Timo; Tuominiemi, Jorma; Tuovinen, Esa; Wendland, Lauri; Talvitie, Joonas; Tuuva, Tuure; Besancon, Marc; Couderc, Fabrice; Dejardin, Marc; Denegri, Daniel; Fabbro, Bernard; Faure, Jean-Louis; Favaro, Carlotta; Ferri, Federico; Ganjour, Serguei; Givernaud, Alain; Gras, Philippe; Hamel de Monchenault, Gautier; Jarry, Patrick; Locci, Elizabeth; Machet, Martina; Malcles, Julie; Rander, John; Rosowsky, André; Titov, Maksym; Zghiche, Amina; Antropov, Iurii; Baffioni, Stephanie; Beaudette, Florian; Busson, Philippe; Cadamuro, Luca; Chapon, Emilien; Charlot, Claude; Davignon, Olivier; Filipovic, Nicolas; Granier de Cassagnac, Raphael; Jo, Mihee; Lisniak, Stanislav; Mastrolorenzo, Luca; Miné, Philippe; Naranjo, Ivo Nicolas; Nguyen, Matthew; Ochando, Christophe; Ortona, Giacomo; Paganini, Pascal; Pigard, Philipp; Regnard, Simon; Salerno, Roberto; Sauvan, Jean-Baptiste; Sirois, Yves; Strebler, Thomas; Yilmaz, Yetkin; Zabi, Alexandre; Agram, Jean-Laurent; Andrea, Jeremy; Aubin, Alexandre; Bloch, Daniel; Brom, Jean-Marie; Buttignol, Michael; Chabert, Eric Christian; Chanon, Nicolas; Collard, Caroline; Conte, Eric; Coubez, Xavier; Fontaine, Jean-Charles; Gelé, Denis; Goerlach, Ulrich; Goetzmann, Christophe; Le Bihan, Anne-Catherine; Merlin, Jeremie Alexandre; Skovpen, Kirill; Van Hove, Pierre; Gadrat, Sébastien; Beauceron, Stephanie; Bernet, Colin; Boudoul, Gaelle; Bouvier, Elvire; Carrillo Montoya, Camilo Andres; Chierici, Roberto; Contardo, Didier; Courbon, Benoit; Depasse, Pierre; El Mamouni, Houmani; Fan, Jiawei; Fay, Jean; Gascon, Susan; Gouzevitch, Maxime; Ille, Bernard; Lagarde, Francois; Laktineh, Imad Baptiste; Lethuillier, Morgan; Mirabito, Laurent; Pequegnot, Anne-Laure; Perries, Stephane; Ruiz Alvarez, José David; Sabes, David; Sgandurra, Louis; Sordini, Viola; Vander Donckt, Muriel; Verdier, Patrice; Viret, Sébastien; Toriashvili, Tengizi; Tsamalaidze, Zviad; Autermann, Christian; Beranek, Sarah; Feld, Lutz; Heister, Arno; Kiesel, Maximilian Knut; Klein, Katja; Lipinski, Martin; Ostapchuk, Andrey; Preuten, Marius; Raupach, Frank; Schael, Stefan; Schulte, Jan-Frederik; Verlage, Tobias; Weber, Hendrik; Zhukov, Valery; Ata, Metin; Brodski, Michael; Dietz-Laursonn, Erik; Duchardt, Deborah; Endres, Matthias; Erdmann, Martin; Erdweg, Sören; Esch, 
Thomas; Fischer, Robert; Güth, Andreas; Hebbeker, Thomas; Heidemann, Carsten; Hoepfner, Kerstin; Knutzen, Simon; Kreuzer, Peter; Merschmeyer, Markus; Meyer, Arnd; Millet, Philipp; Mukherjee, Swagata; Olschewski, Mark; Padeken, Klaas; Papacz, Paul; Pook, Tobias; Radziej, Markus; Reithler, Hans; Rieger, Marcel; Scheuch, Florian; Sonnenschein, Lars; Teyssier, Daniel; Thüer, Sebastian; Cherepanov, Vladimir; Erdogan, Yusuf; Flügge, Günter; Geenen, Heiko; Geisler, Matthias; Hoehle, Felix; Kargoll, Bastian; Kress, Thomas; Künsken, Andreas; Lingemann, Joschka; Nehrkorn, Alexander; Nowack, Andreas; Nugent, Ian Michael; Pistone, Claudia; Pooth, Oliver; Stahl, Achim; Aldaya Martin, Maria; Asin, Ivan; Bartosik, Nazar; Behnke, Olaf; Behrens, Ulf; Borras, Kerstin; Burgmeier, Armin; Campbell, Alan; Contreras-Campana, Christian; Costanza, Francesco; Diez Pardos, Carmen; Dolinska, Ganna; Dooling, Samantha; Dorland, Tyler; Eckerlin, Guenter; Eckstein, Doris; Eichhorn, Thomas; Flucke, Gero; Gallo, Elisabetta; Garay Garcia, Jasone; Geiser, Achim; Gizhko, Andrii; Gunnellini, Paolo; Hauk, Johannes; Hempel, Maria; Jung, Hannes; Kalogeropoulos, Alexis; Karacheban, Olena; Kasemann, Matthias; Katsas, Panagiotis; Kieseler, Jan; Kleinwort, Claus; Korol, Ievgen; Lange, Wolfgang; Leonard, Jessica; Lipka, Katerina; Lobanov, Artur; Lohmann, Wolfgang; Mankel, Rainer; Melzer-Pellmann, Isabell-Alissandra; Meyer, Andreas Bernhard; Mittag, Gregor; Mnich, Joachim; Mussgiller, Andreas; Naumann-Emme, Sebastian; Nayak, Aruna; Ntomari, Eleni; Perrey, Hanno; Pitzl, Daniel; Placakyte, Ringaile; Raspereza, Alexei; Roland, Benoit; Sahin, Mehmet Özgür; Saxena, Pooja; Schoerner-Sadenius, Thomas; Seitz, Claudia; Spannagel, Simon; Trippkewitz, Karim Damun; Walsh, Roberval; Wissing, Christoph; Blobel, Volker; Centis Vignali, Matteo; Draeger, Arne-Rasmus; Erfle, Joachim; Garutti, Erika; Goebel, Kristin; Gonzalez, Daniel; Görner, Martin; Haller, Johannes; Hoffmann, Malte; Höing, Rebekka Sophie; Junkes, Alexandra; Klanner, Robert; Kogler, Roman; Kovalchuk, Nataliia; Lapsien, Tobias; Lenz, Teresa; Marchesini, Ivan; Marconi, Daniele; Meyer, Mareike; Nowatschin, Dominik; Ott, Jochen; Pantaleo, Felice; Peiffer, Thomas; Perieanu, Adrian; Pietsch, Niklas; Poehlsen, Jennifer; Rathjens, Denis; Sander, Christian; Scharf, Christian; Schleper, Peter; Schlieckau, Eike; Schmidt, Alexander; Schumann, Svenja; Schwandt, Joern; Sola, Valentina; Stadie, Hartmut; Steinbrück, Georg; Stober, Fred-Markus Helmut; Tholen, Heiner; Troendle, Daniel; Usai, Emanuele; Vanelderen, Lukas; Vanhoefer, Annika; Vormwald, Benedikt; Barth, Christian; Baus, Colin; Berger, Joram; Böser, Christian; Butz, Erik; Chwalek, Thorsten; Colombo, Fabio; De Boer, Wim; Descroix, Alexis; Dierlamm, Alexander; Fink, Simon; Frensch, Felix; Friese, Raphael; Giffels, Manuel; Gilbert, Andrew; Haitz, Dominik; Hartmann, Frank; Heindl, Stefan Michael; Husemann, Ulrich; Katkov, Igor; Kornmayer, Andreas; Lobelle Pardo, Patricia; Maier, Benedikt; Mildner, Hannes; Mozer, Matthias Ulrich; Müller, Thomas; Müller, Thomas; Plagge, Michael; Quast, Gunter; Rabbertz, Klaus; Röcker, Steffen; Roscher, Frank; Schröder, Matthias; Sieber, Georg; Simonis, Hans-Jürgen; Ulrich, Ralf; Wagner-Kuhr, Jeannine; Wayand, Stefan; Weber, Marc; Weiler, Thomas; Williamson, Shawn; Wöhrmann, Clemens; Wolf, Roger; Anagnostou, Georgios; Daskalakis, Georgios; Geralis, Theodoros; Giakoumopoulou, Viktoria Athina; Kyriakis, Aristotelis; Loukas, Demetrios; Psallidas, Andreas; Topsis-Giotis, Iasonas; Agapitos, Antonis; Kesisoglou, Stilianos; 
Panagiotou, Apostolos; Saoulidou, Niki; Tziaferi, Eirini; Evangelou, Ioannis; Flouris, Giannis; Foudas, Costas; Kokkas, Panagiotis; Loukas, Nikitas; Manthos, Nikolaos; Papadopoulos, Ioannis; Paradas, Evangelos; Strologas, John; Bencze, Gyorgy; Hajdu, Csaba; Hazi, Andras; Hidas, Pàl; Horvath, Dezso; Sikler, Ferenc; Veszpremi, Viktor; Vesztergombi, Gyorgy; Zsigmond, Anna Julia; Beni, Noemi; Czellar, Sandor; Karancsi, János; Molnar, Jozsef; Szillasi, Zoltan; Bartók, Márton; Makovec, Alajos; Raics, Peter; Trocsanyi, Zoltan Laszlo; Ujvari, Balazs; Choudhury, Somnath; Mal, Prolay; Mandal, Koushik; Sahoo, Deepak Kumar; Sahoo, Niladribihari; Swain, Sanjay Kumar; Bansal, Sunil; Beri, Suman Bala; Bhatnagar, Vipin; Chawla, Ridhi; Gupta, Ruchi; Bhawandeep, Bhawandeep; Kalsi, Amandeep Kaur; Kaur, Anterpreet; Kaur, Manjit; Kumar, Ramandeep; Mehta, Ankita; Mittal, Monika; Singh, Jasbir; Walia, Genius; Kumar, Ashok; Bhardwaj, Ashutosh; Choudhary, Brajesh C; Garg, Rocky Bala; Malhotra, Shivali; Naimuddin, Md; Nishu, Nishu; Ranjan, Kirti; Sharma, Ramkrishna; Sharma, Varun; Bhattacharya, Satyaki; Chatterjee, Kalyanmoy; Dey, Sourav; Dutta, Suchandra; Majumdar, Nayana; Modak, Atanu; Mondal, Kuntal; Mukhopadhyay, Supratik; Roy, Ashim; Roy, Debarati; Roy Chowdhury, Suvankar; Sarkar, Subir; Sharan, Manoj; Abdulsalam, Abdulla; Chudasama, Ruchi; Dutta, Dipanwita; Jha, Vishwajeet; Kumar, Vineet; Mohanty, Ajit Kumar; Pant, Lalit Mohan; Shukla, Prashant; Topkar, Anita; Aziz, Tariq; Banerjee, Sudeshna; Bhowmik, Sandeep; Chatterjee, Rajdeep Mohan; Dewanjee, Ram Krishna; Dugad, Shashikant; Ganguly, Sanmay; Ghosh, Saranya; Guchait, Monoranjan; Gurtu, Atul; Jain, Sandhya; Kole, Gouranga; Kumar, Sanjeev; Mahakud, Bibhuprasad; Maity, Manas; Majumder, Gobinda; Mazumdar, Kajari; Mitra, Soureek; Mohanty, Gagan Bihari; Parida, Bibhuti; Sarkar, Tanmay; Sur, Nairit; Sutar, Bajrang; Wickramage, Nadeesha; Chauhan, Shubhanshu; Dube, Sourabh; Kapoor, Anshul; Kothekar, Kunal; Sharma, Seema; Bakhshiansohi, Hamed; Behnamian, Hadi; Etesami, Seyed Mohsen; Fahim, Ali; Khakzad, Mohsen; Mohammadi Najafabadi, Mojtaba; Naseri, Mohsen; Paktinat Mehdiabadi, Saeid; Rezaei Hosseinabadi, Ferdos; Safarzadeh, Batool; Zeinali, Maryam; Felcini, Marta; Grunewald, Martin; Abbrescia, Marcello; Calabria, Cesare; Caputo, Claudio; Colaleo, Anna; Creanza, Donato; Cristella, Leonardo; De Filippis, Nicola; De Palma, Mauro; Fiore, Luigi; Iaselli, Giuseppe; Maggi, Giorgio; Maggi, Marcello; Miniello, Giorgia; My, Salvatore; Nuzzo, Salvatore; Pompili, Alexis; Pugliese, Gabriella; Radogna, Raffaella; Ranieri, Antonio; Selvaggi, Giovanna; Silvestris, Lucia; Venditti, Rosamaria; Abbiendi, Giovanni; Battilana, Carlo; Benvenuti, Alberto; Bonacorsi, Daniele; Braibant-Giacomelli, Sylvie; Brigliadori, Luca; Campanini, Renato; Capiluppi, Paolo; Castro, Andrea; Cavallo, Francesca Romana; Chhibra, Simranjit Singh; Codispoti, Giuseppe; Cuffiani, Marco; Dallavalle, Gaetano-Marco; Fabbri, Fabrizio; Fanfani, Alessandra; Fasanella, Daniele; Giacomelli, Paolo; Grandi, Claudio; Guiducci, Luigi; Marcellini, Stefano; Masetti, Gianni; Montanari, Alessandro; Navarria, Francesco; Perrotta, Andrea; Rossi, Antonio; Rovelli, Tiziano; Siroli, Gian Piero; Tosi, Nicolò; Cappello, Gigi; Chiorboli, Massimiliano; Costa, Salvatore; Di Mattia, Alessandro; Giordano, Ferdinando; Potenza, Renato; Tricomi, Alessia; Tuve, Cristina; Barbagli, Giuseppe; Ciulli, Vitaliano; Civinini, Carlo; D'Alessandro, Raffaello; Focardi, Ettore; Gori, Valentina; Lenzi, Piergiulio; Meschini, Marco; Paoletti, Simone; 
Sguazzoni, Giacomo; Viliani, Lorenzo; Benussi, Luigi; Bianco, Stefano; Fabbri, Franco; Piccolo, Davide; Primavera, Federica; Calvelli, Valerio; Ferro, Fabrizio; Lo Vetere, Maurizio; Monge, Maria Roberta; Robutti, Enrico; Tosi, Silvano; Brianza, Luca; Dinardo, Mauro Emanuele; Fiorendi, Sara; Gennai, Simone; Gerosa, Raffaele; Ghezzi, Alessio; Govoni, Pietro; Malvezzi, Sandra; Manzoni, Riccardo Andrea; Marzocchi, Badder; Menasce, Dario; Moroni, Luigi; Paganoni, Marco; Pedrini, Daniele; Ragazzi, Stefano; Redaelli, Nicola; Tabarelli de Fatis, Tommaso; Buontempo, Salvatore; Cavallo, Nicola; Di Guida, Salvatore; Esposito, Marco; Fabozzi, Francesco; Iorio, Alberto Orso Maria; Lanza, Giuseppe; Lista, Luca; Meola, Sabino; Merola, Mario; Paolucci, Pierluigi; Sciacca, Crisostomo; Thyssen, Filip; Azzi, Patrizia; Bacchetta, Nicola; Bellato, Marco; Benato, Lisa; Boletti, Alessio; Branca, Antonio; Dall'Osso, Martino; Dorigo, Tommaso; Fantinel, Sergio; Fanzago, Federica; Gonella, Franco; Gozzelino, Andrea; Kanishchev, Konstantin; Lacaprara, Stefano; Margoni, Martino; Meneguzzo, Anna Teresa; Montecassiano, Fabio; Passaseo, Marina; Pazzini, Jacopo; Pegoraro, Matteo; Pozzobon, Nicola; Ronchese, Paolo; Simonetto, Franco; Torassa, Ezio; Tosi, Mia; Ventura, Sandro; Zanetti, Marco; Zotto, Pierluigi; Zucchetta, Alberto; Braghieri, Alessandro; Magnani, Alice; Montagna, Paolo; Ratti, Sergio P; Re, Valerio; Riccardi, Cristina; Salvini, Paola; Vai, Ilaria; Vitulo, Paolo; Alunni Solestizi, Luisa; Bilei, Gian Mario; Ciangottini, Diego; Fanò, Livio; Lariccia, Paolo; Mantovani, Giancarlo; Menichelli, Mauro; Saha, Anirban; Santocchia, Attilio; Androsov, Konstantin; Azzurri, Paolo; Bagliesi, Giuseppe; Bernardini, Jacopo; Boccali, Tommaso; Castaldi, Rino; Ciocci, Maria Agnese; Dell'Orso, Roberto; Donato, Silvio; Fedi, Giacomo; Foà, Lorenzo; Giassi, Alessandro; Grippo, Maria Teresa; Ligabue, Franco; Lomtadze, Teimuraz; Martini, Luca; Messineo, Alberto; Palla, Fabrizio; Rizzi, Andrea; Savoy-Navarro, Aurore; Serban, Alin Titus; Spagnolo, Paolo; Tenchini, Roberto; Tonelli, Guido; Venturi, Andrea; Verdini, Piero Giorgio; Barone, Luciano; Cavallari, Francesca; D'imperio, Giulia; Del Re, Daniele; Diemoz, Marcella; Gelli, Simone; Jorda, Clara; Longo, Egidio; Margaroli, Fabrizio; Meridiani, Paolo; Organtini, Giovanni; Paramatti, Riccardo; Preiato, Federico; Rahatlou, Shahram; Rovelli, Chiara; Santanastasio, Francesco; Traczyk, Piotr; Amapane, Nicola; Arcidiacono, Roberta; Argiro, Stefano; Arneodo, Michele; Bellan, Riccardo; Biino, Cristina; Cartiglia, Nicolo; Costa, Marco; Covarelli, Roberto; Degano, Alessandro; Demaria, Natale; Finco, Linda; Kiani, Bilal; Mariotti, Chiara; Maselli, Silvia; Migliore, Ernesto; Monaco, Vincenzo; Monteil, Ennio; Obertino, Maria Margherita; Pacher, Luca; Pastrone, Nadia; Pelliccioni, Mario; Pinna Angioni, Gian Luca; Ravera, Fabio; Romero, Alessandra; Ruspa, Marta; Sacchi, Roberto; Solano, Ada; Staiano, Amedeo; Belforte, Stefano; Candelise, Vieri; Casarsa, Massimo; Cossutti, Fabio; Della Ricca, Giuseppe; Gobbo, Benigno; La Licata, Chiara; Marone, Matteo; Schizzi, Andrea; Zanetti, Anna; Kropivnitskaya, Anna; Nam, Soon-Kwon; Kim, Dong Hee; Kim, Gui Nyun; Kim, Min Suk; Kong, Dae Jung; Lee, Sangeun; Oh, Young Do; Sakharov, Alexandre; Son, Dong-Chul; Brochero Cifuentes, Javier Andres; Kim, Hyunsoo; Kim, Tae Jeong; Song, Sanghyeon; Cho, Sungwoong; Choi, Suyong; Go, Yeonju; Gyun, Dooyeon; Hong, Byung-Sik; Kim, Hyunchul; Kim, Yongsun; Lee, Byounghoon; Lee, Kisoo; Lee, Kyong Sei; Lee, Songkyo; Park, Sung Keun; 
Roh, Youn; Yoo, Hwi Dong; Choi, Minkyoo; Kim, Hyunyong; Kim, Ji Hyun; Lee, Jason Sang Hun; Park, Inkyu; Ryu, Geonmo; Ryu, Min Sang; Choi, Young-Il; Goh, Junghwan; Kim, Donghyun; Kwon, Eunhyang; Lee, Jongseok; Yu, Intae; Dudenas, Vytautas; Juodagalvis, Andrius; Vaitkus, Juozas; Ahmed, Ijaz; Ibrahim, Zainol Abidin; Komaragiri, Jyothsna Rani; Md Ali, Mohd Adli Bin; Mohamad Idris, Faridah; Wan Abdullah, Wan Ahmad Tajuddin; Yusli, Mohd Nizam; Zolkapli, Zukhaimira; Casimiro Linares, Edgar; Castilla-Valdez, Heriberto; De La Cruz-Burelo, Eduard; Heredia-De La Cruz, Ivan; Hernandez-Almada, Alberto; Lopez-Fernandez, Ricardo; Sánchez Hernández, Alberto; Carrillo Moreno, Salvador; Vazquez Valencia, Fabiola; Pedraza, Isabel; Salazar Ibarguen, Humberto Antonio; Morelos Pineda, Antonio; Krofcheck, David; Butler, Philip H; Ahmad, Ashfaq; Ahmad, Muhammad; Hassan, Qamar; Hoorani, Hafeez R; Khan, Wajid Ali; Khurshid, Taimoor; Shoaib, Muhammad; Bialkowska, Helena; Bluj, Michal; Boimska, Bożena; Frueboes, Tomasz; Górski, Maciej; Kazana, Malgorzata; Nawrocki, Krzysztof; Romanowska-Rybinska, Katarzyna; Szleper, Michal; Zalewski, Piotr; Brona, Grzegorz; Bunkowski, Karol; Byszuk, Adrian; Doroba, Krzysztof; Kalinowski, Artur; Konecki, Marcin; Krolikowski, Jan; Misiura, Maciej; Olszewski, Michal; Walczak, Marek; Bargassa, Pedrame; Beirão Da Cruz E Silva, Cristóvão; Di Francesco, Agostino; Faccioli, Pietro; Ferreira Parracho, Pedro Guilherme; Gallinaro, Michele; Hollar, Jonathan; Leonardo, Nuno; Lloret Iglesias, Lara; Nguyen, Federico; Rodrigues Antunes, Joao; Seixas, Joao; Toldaiev, Oleksii; Vadruccio, Daniele; Varela, Joao; Vischia, Pietro; Afanasiev, Serguei; Bunin, Pavel; Gavrilenko, Mikhail; Golutvin, Igor; Gorbunov, Ilya; Kamenev, Alexey; Karjavin, Vladimir; Lanev, Alexander; Malakhov, Alexander; Matveev, Viktor; Moisenz, Petr; Palichik, Vladimir; Perelygin, Victor; Shmatov, Sergey; Shulha, Siarhei; Skatchkov, Nikolai; Smirnov, Vitaly; Zarubin, Anatoli; Golovtsov, Victor; Ivanov, Yury; Kim, Victor; Kuznetsova, Ekaterina; Levchenko, Petr; Murzin, Victor; Oreshkin, Vadim; Smirnov, Igor; Sulimov, Valentin; Uvarov, Lev; Vavilov, Sergey; Vorobyev, Alexey; Andreev, Yuri; Dermenev, Alexander; Gninenko, Sergei; Golubev, Nikolai; Karneyeu, Anton; Kirsanov, Mikhail; Krasnikov, Nikolai; Pashenkov, Anatoli; Tlisov, Danila; Toropin, Alexander; Epshteyn, Vladimir; Gavrilov, Vladimir; Lychkovskaya, Natalia; Popov, Vladimir; Pozdnyakov, Ivan; Safronov, Grigory; Spiridonov, Alexander; Vlasov, Evgueni; Zhokin, Alexander; Bylinkin, Alexander; Chadeeva, Marina; Danilov, Mikhail; Andreev, Vladimir; Azarkin, Maksim; Dremin, Igor; Kirakosyan, Martin; Leonidov, Andrey; Mesyats, Gennady; Rusakov, Sergey V; Baskakov, Alexey; Belyaev, Andrey; Boos, Edouard; Ershov, Alexander; Gribushin, Andrey; Kaminskiy, Alexandre; Kodolova, Olga; Korotkikh, Vladimir; Lokhtin, Igor; Miagkov, Igor; Obraztsov, Stepan; Petrushanko, Sergey; Savrin, Viktor; Snigirev, Alexander; Vardanyan, Irina; Azhgirey, Igor; Bayshev, Igor; Bitioukov, Sergei; Kachanov, Vassili; Kalinin, Alexey; Konstantinov, Dmitri; Krychkine, Victor; Petrov, Vladimir; Ryutin, Roman; Sobol, Andrei; Tourtchanovitch, Leonid; Troshin, Sergey; Tyurin, Nikolay; Uzunian, Andrey; Volkov, Alexey; Adzic, Petar; Cirkovic, Predrag; Milosevic, Jovan; Rekovic, Vladimir; Alcaraz Maestre, Juan; Calvo, Enrique; Cerrada, Marcos; Chamizo Llatas, Maria; Colino, Nicanor; De La Cruz, Begona; Delgado Peris, Antonio; Escalante Del Valle, Alberto; Fernandez Bedoya, Cristina; Fernández Ramos, Juan Pablo; Flix, 
Jose; Fouz, Maria Cruz; Garcia-Abia, Pablo; Gonzalez Lopez, Oscar; Goy Lopez, Silvia; Hernandez, Jose M; Josa, Maria Isabel; Navarro De Martino, Eduardo; Pérez-Calero Yzquierdo, Antonio María; Puerta Pelayo, Jesus; Quintario Olmeda, Adrián; Redondo, Ignacio; Romero, Luciano; Santaolalla, Javier; Senghi Soares, Mara; Albajar, Carmen; de Trocóniz, Jorge F; Missiroli, Marino; Moran, Dermot; Cuevas, Javier; Fernandez Menendez, Javier; Folgueras, Santiago; Gonzalez Caballero, Isidro; Palencia Cortezon, Enrique; Vizan Garcia, Jesus Manuel; Cabrillo, Iban Jose; Calderon, Alicia; Castiñeiras De Saa, Juan Ramon; De Castro Manzano, Pablo; Fernandez, Marcos; Garcia-Ferrero, Juan; Gomez, Gervasio; Lopez Virto, Amparo; Marco, Jesus; Marco, Rafael; Martinez Rivero, Celso; Matorras, Francisco; Piedra Gomez, Jonatan; Rodrigo, Teresa; Rodríguez-Marrero, Ana Yaiza; Ruiz-Jimeno, Alberto; Scodellaro, Luca; Trevisani, Nicolò; Vila, Ivan; Vilar Cortabitarte, Rocio; Abbaneo, Duccio; Auffray, Etiennette; Auzinger, Georg; Bachtis, Michail; Baillon, Paul; Ball, Austin; Barney, David; Benaglia, Andrea; Bendavid, Joshua; Benhabib, Lamia; Berruti, Gaia Maria; Bloch, Philippe; Bocci, Andrea; Bonato, Alessio; Botta, Cristina; Breuker, Horst; Camporesi, Tiziano; Castello, Roberto; Cerminara, Gianluca; D'Alfonso, Mariarosaria; D'Enterria, David; Dabrowski, Anne; Daponte, Vincenzo; David Tinoco Mendes, Andre; De Gruttola, Michele; De Guio, Federico; De Roeck, Albert; De Visscher, Simon; Di Marco, Emanuele; Dobson, Marc; Dordevic, Milos; Dorney, Brian; Du Pree, Tristan; Duggan, Daniel; Dünser, Marc; Dupont, Niels; Elliott-Peisert, Anna; Franzoni, Giovanni; Fulcher, Jonathan; Funk, Wolfgang; Gigi, Dominique; Gill, Karl; Giordano, Domenico; Girone, Maria; Glege, Frank; Guida, Roberto; Gundacker, Stefan; Guthoff, Moritz; Hammer, Josef; Harris, Philip; Hegeman, Jeroen; Innocente, Vincenzo; Janot, Patrick; Kirschenmann, Henning; Kortelainen, Matti J; Kousouris, Konstantinos; Krajczar, Krisztian; Lecoq, Paul; Lourenco, Carlos; Lucchini, Marco Toliman; Magini, Nicolo; Malgeri, Luca; Mannelli, Marcello; Martelli, Arabella; Masetti, Lorenzo; Meijers, Frans; Mersi, Stefano; Meschi, Emilio; Moortgat, Filip; Morovic, Srecko; Mulders, Martijn; Nemallapudi, Mythra Varun; Neugebauer, Hannes; Orfanelli, Styliani; Orsini, Luciano; Pape, Luc; Perez, Emmanuelle; Peruzzi, Marco; Petrilli, Achille; Petrucciani, Giovanni; Pfeiffer, Andreas; Pierini, Maurizio; Piparo, Danilo; Racz, Attila; Reis, Thomas; Rolandi, Gigi; Rovere, Marco; Ruan, Manqi; Sakulin, Hannes; Schäfer, Christoph; Schwick, Christoph; Seidel, Markus; Sharma, Archana; Silva, Pedro; Simon, Michal; Sphicas, Paraskevas; Steggemann, Jan; Stieger, Benjamin; Stoye, Markus; Takahashi, Yuta; Treille, Daniel; Triossi, Andrea; Tsirou, Andromachi; Veres, Gabor Istvan; Wardle, Nicholas; Wöhri, Hermine Katharina; Zagoździńska, Agnieszka; Zeuner, Wolfram Dietrich; Bertl, Willi; Deiters, Konrad; Erdmann, Wolfram; Horisberger, Roland; Ingram, Quentin; Kaestli, Hans-Christian; Kotlinski, Danek; Langenegger, Urs; Rohe, Tilman; Bachmair, Felix; Bäni, Lukas; Bianchini, Lorenzo; Casal, Bruno; Dissertori, Günther; Dittmar, Michael; Donegà, Mauro; Eller, Philipp; Grab, Christoph; Heidegger, Constantin; Hits, Dmitry; Hoss, Jan; Kasieczka, Gregor; Lecomte, Pierre; Lustermann, Werner; Mangano, Boris; Marionneau, Matthieu; Martinez Ruiz del Arbol, Pablo; Masciovecchio, Mario; Meister, Daniel; Micheli, Francesco; Musella, Pasquale; Nessi-Tedaldi, Francesca; Pandolfi, Francesco; Pata, Joosep; Pauss, 
Felicitas; Perrozzi, Luca; Quittnat, Milena; Rossini, Marco; Schönenberger, Myriam; Starodumov, Andrei; Takahashi, Maiko; Tavolaro, Vittorio Raoul; Theofilatos, Konstantinos; Wallny, Rainer; Aarrestad, Thea Klaeboe; Amsler, Claude; Caminada, Lea; Canelli, Maria Florencia; Chiochia, Vincenzo; De Cosa, Annapaola; Galloni, Camilla; Hinzmann, Andreas; Hreus, Tomas; Kilminster, Benjamin; Lange, Clemens; Ngadiuba, Jennifer; Pinna, Deborah; Rauco, Giorgia; Robmann, Peter; Salerno, Daniel; Yang, Yong; Cardaci, Marco; Chen, Kuan-Hsin; Doan, Thi Hien; Jain, Shilpi; Khurana, Raman; Konyushikhin, Maxim; Kuo, Chia-Ming; Lin, Willis; Lu, Yun-Ju; Pozdnyakov, Andrey; Yu, Shin-Shan; Kumar, Arun; Chang, Paoti; Chang, You-Hao; Chang, Yu-Wei; Chao, Yuan; Chen, Kai-Feng; Chen, Po-Hsun; Dietz, Charles; Fiori, Francesco; Grundler, Ulysses; Hou, George Wei-Shu; Hsiung, Yee; Liu, Yueh-Feng; Lu, Rong-Shyang; Miñano Moya, Mercedes; Petrakou, Eleni; Tsai, Jui-fa; Tzeng, Yeng-Ming; Asavapibhop, Burin; Kovitanggoon, Kittikul; Singh, Gurpreet; Srimanobhas, Norraphat; Suwonjandee, Narumon; Adiguzel, Aytul; Cerci, Salim; Demiroglu, Zuhal Seyma; Dozen, Candan; Dumanoglu, Isa; Gecit, Fehime Hayal; Girgis, Semiray; Gokbulut, Gul; Guler, Yalcin; Gurpinar, Emine; Hos, Ilknur; Kangal, Evrim Ersin; Kayis Topaksu, Aysel; Onengut, Gulsen; Ozcan, Merve; Ozdemir, Kadri; Ozturk, Sertac; Tali, Bayram; Topakli, Huseyin; Zorbilmez, Caglar; Bilin, Bugra; Bilmis, Selcuk; Isildak, Bora; Karapinar, Guler; Yalvac, Metin; Zeyrek, Mehmet; Gülmez, Erhan; Kaya, Mithat; Kaya, Ozlem; Yetkin, Elif Asli; Yetkin, Taylan; Cakir, Altan; Cankocak, Kerem; Sen, Sercan; Vardarlı, Fuat Ilkehan; Grynyov, Boris; Levchuk, Leonid; Sorokin, Pavel; Aggleton, Robin; Ball, Fionn; Beck, Lana; Brooke, James John; Clement, Emyr; Cussans, David; Flacher, Henning; Goldstein, Joel; Grimes, Mark; Heath, Greg P; Heath, Helen F; Jacob, Jeson; Kreczko, Lukasz; Lucas, Chris; Meng, Zhaoxia; Newbold, Dave M; Paramesvaran, Sudarshan; Poll, Anthony; Sakuma, Tai; Seif El Nasr-storey, Sarah; Senkin, Sergey; Smith, Dominic; Smith, Vincent J; Belyaev, Alexander; Brew, Christopher; Brown, Robert M; Calligaris, Luigi; Cieri, Davide; Cockerill, David JA; Coughlan, John A; Harder, Kristian; Harper, Sam; Olaiya, Emmanuel; Petyt, David; Shepherd-Themistocleous, Claire; Thea, Alessandro; Tomalin, Ian R; Williams, Thomas; Worm, Steven; Baber, Mark; Bainbridge, Robert; Buchmuller, Oliver; Bundock, Aaron; Burton, Darren; Casasso, Stefano; Citron, Matthew; Colling, David; Corpe, Louie; Dauncey, Paul; Davies, Gavin; De Wit, Adinda; Della Negra, Michel; Dunne, Patrick; Elwood, Adam; Futyan, David; Hall, Geoffrey; Iles, Gregory; Lane, Rebecca; Lucas, Robyn; Lyons, Louis; Magnan, Anne-Marie; Malik, Sarah; Nash, Jordan; Nikitenko, Alexander; Pela, Joao; Pesaresi, Mark; Raymond, David Mark; Richards, Alexander; Rose, Andrew; Seez, Christopher; Tapper, Alexander; Uchida, Kirika; Vazquez Acosta, Monica; Virdee, Tejinder; Zenz, Seth Conrad; Cole, Joanne; Hobson, Peter R; Khan, Akram; Kyberd, Paul; Leslie, Dawn; Reid, Ivan; Symonds, Philip; Teodorescu, Liliana; Turner, Mark; Borzou, Ahmad; Call, Kenneth; Dittmann, Jay; Hatakeyama, Kenichi; Liu, Hongxuan; Pastika, Nathaniel; Charaf, Otman; Cooper, Seth; Henderson, Conor; Rumerio, Paolo; Arcaro, Daniel; Avetisyan, Aram; Bose, Tulika; Gastler, Daniel; Rankin, Dylan; Richardson, Clint; Rohlf, James; Sulak, Lawrence; Zou, David; Alimena, Juliette; Berry, Edmund; Cutts, David; Ferapontov, Alexey; Garabedian, Alex; Hakala, John; Heintz, Ulrich; Jesus, Orduna; 
Laird, Edward; Landsberg, Greg; Mao, Zaixing; Narain, Meenakshi; Piperov, Stefan; Sagir, Sinan; Syarif, Rizki; Breedon, Richard; Breto, Guillermo; Calderon De La Barca Sanchez, Manuel; Chauhan, Sushil; Chertok, Maxwell; Conway, John; Conway, Rylan; Cox, Peter Timothy; Erbacher, Robin; Funk, Garrett; Gardner, Michael; Ko, Winston; Lander, Richard; Mclean, Christine; Mulhearn, Michael; Pellett, Dave; Pilot, Justin; Ricci-Tam, Francesca; Shalhout, Shalhout; Smith, John; Squires, Michael; Stolp, Dustin; Tripathi, Mani; Wilbur, Scott; Yohay, Rachel; Cousins, Robert; Everaerts, Pieter; Florent, Alice; Hauser, Jay; Ignatenko, Mikhail; Saltzberg, David; Takasugi, Eric; Valuev, Vyacheslav; Weber, Matthias; Burt, Kira; Clare, Robert; Ellison, John Anthony; Gary, J William; Hanson, Gail; Heilman, Jesse; Paneva, Mirena Ivova; Jandir, Pawandeep; Kennedy, Elizabeth; Lacroix, Florent; Long, Owen Rosser; Malberti, Martina; Olmedo Negrete, Manuel; Shrinivas, Amithabh; Wei, Hua; Wimpenny, Stephen; Yates, Brent; Branson, James G; Cerati, Giuseppe Benedetto; Cittolin, Sergio; D'Agnolo, Raffaele Tito; Derdzinski, Mark; Holzner, André; Kelley, Ryan; Klein, Daniel; Letts, James; Macneill, Ian; Olivito, Dominick; Padhi, Sanjay; Pieri, Marco; Sani, Matteo; Sharma, Vivek; Simon, Sean; Tadel, Matevz; Vartak, Adish; Wasserbaech, Steven; Welke, Charles; Würthwein, Frank; Yagil, Avraham; Zevi Della Porta, Giovanni; Bradmiller-Feld, John; Campagnari, Claudio; Dishaw, Adam; Dutta, Valentina; Flowers, Kristen; Franco Sevilla, Manuel; Geffert, Paul; George, Christopher; Golf, Frank; Gouskos, Loukas; Gran, Jason; Incandela, Joe; Mccoll, Nickolas; Mullin, Sam Daniel; Richman, Jeffrey; Stuart, David; Suarez, Indara; West, Christopher; Yoo, Jaehyeok; Anderson, Dustin; Apresyan, Artur; Bornheim, Adolf; Bunn, Julian; Chen, Yi; Duarte, Javier; Mott, Alexander; Newman, Harvey B; Pena, Cristian; Spiropulu, Maria; Vlimant, Jean-Roch; Xie, Si; Zhu, Ren-Yuan; Andrews, Michael Benjamin; Azzolini, Virginia; Calamba, Aristotle; Carlson, Benjamin; Ferguson, Thomas; Paulini, Manfred; Russ, James; Sun, Menglei; Vogel, Helmut; Vorobiev, Igor; Cumalat, John Perry; Ford, William T; Gaz, Alessandro; Jensen, Frank; Johnson, Andrew; Krohn, Michael; Mulholland, Troy; Nauenberg, Uriel; Stenson, Kevin; Wagner, Stephen Robert; Alexander, James; Chatterjee, Avishek; Chaves, Jorge; Chu, Jennifer; Dittmer, Susan; Eggert, Nicholas; Mirman, Nathan; Nicolas Kaufman, Gala; Patterson, Juliet Ritchie; Rinkevicius, Aurelijus; Ryd, Anders; Skinnari, Louise; Soffi, Livia; Sun, Werner; Tan, Shao Min; Teo, Wee Don; Thom, Julia; Thompson, Joshua; Tucker, Jordan; Weng, Yao; Wittich, Peter; Abdullin, Salavat; Albrow, Michael; Apollinari, Giorgio; Banerjee, Sunanda; Bauerdick, Lothar AT; Beretvas, Andrew; Berryhill, Jeffrey; Bhat, Pushpalatha C; Bolla, Gino; Burkett, Kevin; Butler, Joel Nathan; Cheung, Harry; Chlebana, Frank; Cihangir, Selcuk; Elvira, Victor Daniel; Fisk, Ian; Freeman, Jim; Gottschalk, Erik; Gray, Lindsey; Green, Dan; Grünendahl, Stefan; Gutsche, Oliver; Hanlon, Jim; Hare, Daryl; Harris, Robert M; Hasegawa, Satoshi; Hirschauer, James; Hu, Zhen; Jayatilaka, Bodhitha; Jindariani, Sergo; Johnson, Marvin; Joshi, Umesh; Klima, Boaz; Kreis, Benjamin; Lammel, Stephan; Linacre, Jacob; Lincoln, Don; Lipton, Ron; Liu, Tiehui; Lopes De Sá, Rafael; Lykken, Joseph; Maeshima, Kaori; Marraffino, John Michael; Maruyama, Sho; Mason, David; McBride, Patricia; Merkel, Petra; Mrenna, Stephen; Nahn, Steve; Newman-Holmes, Catherine; O'Dell, Vivian; Pedro, Kevin; Prokofyev, 
Oleg; Rakness, Gregory; Sexton-Kennedy, Elizabeth; Soha, Aron; Spalding, William J; Spiegel, Leonard; Stoynev, Stoyan; Strobbe, Nadja; Taylor, Lucas; Tkaczyk, Slawek; Tran, Nhan Viet; Uplegger, Lorenzo; Vaandering, Eric Wayne; Vernieri, Caterina; Verzocchi, Marco; Vidal, Richard; Wang, Michael; Weber, Hannsjoerg Artur; Whitbeck, Andrew; Acosta, Darin; Avery, Paul; Bortignon, Pierluigi; Bourilkov, Dimitri; Carnes, Andrew; Carver, Matthew; Curry, David; Das, Souvik; Field, Richard D; Furic, Ivan-Kresimir; Gleyzer, Sergei V; Konigsberg, Jacobo; Korytov, Andrey; Kotov, Khristian; Ma, Peisen; Matchev, Konstantin; Mei, Hualin; Milenovic, Predrag; Mitselmakher, Guenakh; Rank, Douglas; Rossin, Roberto; Shchutska, Lesya; Snowball, Matthew; Sperka, David; Terentyev, Nikolay; Thomas, Laurent; Wang, Jian; Wang, Sean-Jiun; Yelton, John; Hewamanage, Samantha; Linn, Stephan; Markowitz, Pete; Martinez, German; Rodriguez, Jorge Luis; Ackert, Andrew; Adams, Jordon Rowe; Adams, Todd; Askew, Andrew; Bein, Samuel; Bochenek, Joseph; Diamond, Brendan; Haas, Jeff; Hagopian, Sharon; Hagopian, Vasken; Johnson, Kurtis F; Khatiwada, Ajeeta; Prosper, Harrison; Weinberg, Marc; Baarmand, Marc M; Bhopatkar, Vallary; Colafranceschi, Stefano; Hohlmann, Marcus; Kalakhety, Himali; Noonan, Daniel; Roy, Titas; Yumiceva, Francisco; Adams, Mark Raymond; Apanasevich, Leonard; Berry, Douglas; Betts, Russell Richard; Bucinskaite, Inga; Cavanaugh, Richard; Evdokimov, Olga; Gauthier, Lucie; Gerber, Cecilia Elena; Hofman, David Jonathan; Kurt, Pelin; O'Brien, Christine; Sandoval Gonzalez, Irving Daniel; Turner, Paul; Varelas, Nikos; Wu, Zhenbin; Zakaria, Mohammed; Bilki, Burak; Clarida, Warren; Dilsiz, Kamuran; Durgut, Süleyman; Gandrajula, Reddy Pratap; Haytmyradov, Maksat; Khristenko, Viktor; Merlo, Jean-Pierre; Mermerkaya, Hamit; Mestvirishvili, Alexi; Moeller, Anthony; Nachtman, Jane; Ogul, Hasan; Onel, Yasar; Ozok, Ferhat; Penzo, Aldo; Snyder, Christina; Tiras, Emrah; Wetzel, James; Yi, Kai; Anderson, Ian; Barnett, Bruce Arnold; Blumenfeld, Barry; Eminizer, Nicholas; Fehling, David; Feng, Lei; Gritsan, Andrei; Maksimovic, Petar; Martin, Christopher; Osherson, Marc; Roskes, Jeffrey; Cocoros, Alice; Sarica, Ulascan; Swartz, Morris; Xiao, Meng; Xin, Yongjie; You, Can; Baringer, Philip; Bean, Alice; Benelli, Gabriele; Bruner, Christopher; Kenny III, Raymond Patrick; Majumder, Devdatta; Malek, Magdalena; Mcbrayer, William; Murray, Michael; Sanders, Stephen; Stringer, Robert; Wang, Quan; Ivanov, Andrew; Kaadze, Ketino; Khalil, Sadia; Makouski, Mikhail; Maravin, Yurii; Mohammadi, Abdollah; Saini, Lovedeep Kaur; Skhirtladze, Nikoloz; Toda, Sachiko; Lange, David; Rebassoo, Finn; Wright, Douglas; Anelli, Christopher; Baden, Drew; Baron, Owen; Belloni, Alberto; Calvert, Brian; Eno, Sarah Catherine; Ferraioli, Charles; Gomez, Jaime; Hadley, Nicholas John; Jabeen, Shabnam; Kellogg, Richard G; Kolberg, Ted; Kunkle, Joshua; Lu, Ying; Mignerey, Alice; Shin, Young Ho; Skuja, Andris; Tonjes, Marguerite; Tonwar, Suresh C; Apyan, Aram; Barbieri, Richard; Baty, Austin; Bierwagen, Katharina; Brandt, Stephanie; Busza, Wit; Cali, Ivan Amos; Demiragli, Zeynep; Di Matteo, Leonardo; Gomez Ceballos, Guillelmo; Goncharov, Maxim; Gulhan, Doga; Iiyama, Yutaro; Innocenti, Gian Michele; Klute, Markus; Kovalskyi, Dmytro; Lai, Yue Shi; Lee, Yen-Jie; Levin, Andrew; Luckey, Paul David; Marini, Andrea Carlo; Mcginn, Christopher; Mironov, Camelia; Narayanan, Siddharth; Niu, Xinmei; Paus, Christoph; Roland, Christof; Roland, Gunther; Salfeld-Nebgen, Jakob; Stephans, 
George; Sumorok, Konstanty; Varma, Mukund; Velicanu, Dragos; Veverka, Jan; Wang, Jing; Wang, Ta-Wei; Wyslouch, Bolek; Yang, Mingming; Zhukova, Victoria; Dahmes, Bryan; Evans, Andrew; Finkel, Alexey; Gude, Alexander; Hansen, Peter; Kalafut, Sean; Kao, Shih-Chuan; Klapoetke, Kevin; Kubota, Yuichi; Lesko, Zachary; Mans, Jeremy; Nourbakhsh, Shervin; Ruckstuhl, Nicole; Rusack, Roger; Tambe, Norbert; Turkewitz, Jared; Acosta, John Gabriel; Oliveros, Sandra; Avdeeva, Ekaterina; Bartek, Rachel; Bloom, Kenneth; Bose, Suvadeep; Claes, Daniel R; Dominguez, Aaron; Fangmeier, Caleb; Gonzalez Suarez, Rebeca; Kamalieddin, Rami; Knowlton, Dan; Kravchenko, Ilya; Meier, Frank; Monroy, Jose; Ratnikov, Fedor; Siado, Joaquin Emilo; Snow, Gregory R; Alyari, Maral; Dolen, James; George, Jimin; Godshalk, Andrew; Harrington, Charles; Iashvili, Ia; Kaisen, Josh; Kharchilava, Avto; Kumar, Ashish; Rappoccio, Salvatore; Roozbahani, Bahareh; Alverson, George; Barberis, Emanuela; Baumgartel, Darin; Chasco, Matthew; Hortiangtham, Apichart; Massironi, Andrea; Morse, David Michael; Nash, David; Orimoto, Toyoko; Teixeira De Lima, Rafael; Trocino, Daniele; Wang, Ren-Jie; Wood, Darien; Zhang, Jinzhong; Bhattacharya, Saptaparna; Hahn, Kristan Allan; Kubik, Andrew; Low, Jia Fu; Mucia, Nicholas; Odell, Nathaniel; Pollack, Brian; Schmitt, Michael Henry; Sung, Kevin; Trovato, Marco; Velasco, Mayda; Brinkerhoff, Andrew; Dev, Nabarun; Hildreth, Michael; Jessop, Colin; Karmgard, Daniel John; Kellams, Nathan; Lannon, Kevin; Marinelli, Nancy; Meng, Fanbo; Mueller, Charles; Musienko, Yuri; Planer, Michael; Reinsvold, Allison; Ruchti, Randy; Smith, Geoffrey; Taroni, Silvia; Valls, Nil; Wayne, Mitchell; Wolf, Matthias; Woodard, Anna; Antonelli, Louis; Brinson, Jessica; Bylsma, Ben; Durkin, Lloyd Stanley; Flowers, Sean; Hart, Andrew; Hill, Christopher; Hughes, Richard; Ji, Weifeng; Ling, Ta-Yung; Liu, Bingxuan; Luo, Wuming; Puigh, Darren; Rodenburg, Marissa; Winer, Brian L; Wulsin, Howard Wells; Driga, Olga; Elmer, Peter; Hardenbrook, Joshua; Hebda, Philip; Koay, Sue Ann; Lujan, Paul; Marlow, Daniel; Medvedeva, Tatiana; Mooney, Michael; Olsen, James; Palmer, Christopher; Piroué, Pierre; Stickland, David; Tully, Christopher; Zuranski, Andrzej; Malik, Sudhir; Barker, Anthony; Barnes, Virgil E; Benedetti, Daniele; Bortoletto, Daniela; Gutay, Laszlo; Jha, Manoj; Jones, Matthew; Jung, Andreas Werner; Jung, Kurt; Kumar, Ajay; Miller, David Harry; Neumeister, Norbert; Radburn-Smith, Benjamin Charles; Shi, Xin; Shipsey, Ian; Silvers, David; Sun, Jian; Svyatkovskiy, Alexey; Wang, Fuqiang; Xie, Wei; Xu, Lingshan; Parashar, Neeti; Stupak, John; Adair, Antony; Akgun, Bora; Chen, Zhenyu; Ecklund, Karl Matthew; Geurts, Frank JM; Guilbaud, Maxime; Li, Wei; Michlin, Benjamin; Northup, Michael; Padley, Brian Paul; Redjimi, Radia; Roberts, Jay; Rorie, Jamal; Tu, Zhoudunming; Zabel, James; Betchart, Burton; Bodek, Arie; de Barbaro, Pawel; Demina, Regina; Eshaq, Yossof; Ferbel, Thomas; Galanti, Mario; Garcia-Bellido, Aran; Han, Jiyeon; Harel, Amnon; Hindrichs, Otto; Khukhunaishvili, Aleko; Lo, Kin Ho; Petrillo, Gianluca; Tan, Ping; Verzetti, Mauro; Chou, John Paul; Contreras-Campana, Emmanuel; Ferencek, Dinko; Gershtein, Yuri; Halkiadakis, Eva; Heindl, Maximilian; Hidas, Dean; Hughes, Elliot; Kaplan, Steven; Kunnawalkam Elayavalli, Raghav; Lath, Amitabh; Nash, Kevin; Saka, Halil; Salur, Sevil; Schnetzer, Steve; Sheffield, David; Somalwar, Sunil; Stone, Robert; Thomas, Scott; Thomassen, Peter; Walker, Matthew; Foerster, Mark; Riley, Grant; Rose, Keith; 
Spanier, Stefan; Thapa, Krishna; Bouhali, Othmane; Castaneda Hernandez, Alfredo; Celik, Ali; Dalchenko, Mykhailo; De Mattia, Marco; Delgado, Andrea; Dildick, Sven; Eusebi, Ricardo; Gilmore, Jason; Huang, Tao; Kamon, Teruki; Krutelyov, Vyacheslav; Mueller, Ryan; Osipenkov, Ilya; Pakhotin, Yuriy; Patel, Rishi; Perloff, Alexx; Rose, Anthony; Safonov, Alexei; Tatarinov, Aysen; Ulmer, Keith; Akchurin, Nural; Cowden, Christopher; Damgov, Jordan; Dragoiu, Cosmin; Dudero, Phillip Russell; Faulkner, James; Kunori, Shuichi; Lamichhane, Kamal; Lee, Sung Won; Libeiro, Terence; Undleeb, Sonaina; Volobouev, Igor; Appelt, Eric; Delannoy, Andrés G; Greene, Senta; Gurrola, Alfredo; Janjam, Ravi; Johns, Willard; Maguire, Charles; Mao, Yaxian; Melo, Andrew; Ni, Hong; Sheldon, Paul; Tuo, Shengquan; Velkovska, Julia; Xu, Qiao; Arenton, Michael Wayne; Cox, Bradley; Francis, Brian; Goodell, Joseph; Hirosky, Robert; Ledovskoy, Alexander; Li, Hengne; Lin, Chuanzhe; Neu, Christopher; Sinthuprasith, Tutanon; Sun, Xin; Wang, Yanchu; Wolfe, Evan; Wood, John; Xia, Fan; Clarke, Christopher; Harr, Robert; Karchin, Paul Edmund; Kottachchi Kankanamge Don, Chamath; Lamichhane, Pramod; Sturdy, Jared; Belknap, Donald; Carlsmith, Duncan; Cepeda, Maria; Dasu, Sridhara; Dodd, Laura; Duric, Senka; Gomber, Bhawna; Grothe, Monika; Herndon, Matthew; Hervé, Alain; Klabbers, Pamela; Lanaro, Armando; Levine, Aaron; Long, Kenneth; Loveless, Richard; Mohapatra, Ajit; Ojalvo, Isabel; Perry, Thomas; Pierro, Giuseppe Antonio; Polese, Giovanni; Ruggles, Tyler; Sarangi, Tapas; Savin, Alexander; Sharma, Archana; Smith, Nicholas; Smith, Wesley H; Taylor, Devin; Verwilligen, Piet; Woods, Nathaniel

    2017-07-31

    Two-particle correlations in pPb collisions at a nucleon-nucleon center-of-mass energy of 5.02 TeV are studied as a function of the pseudorapidity separation ($\Delta\eta$) of the particle pair at small relative azimuthal angle ($|\Delta\phi| < \pi/3$). The correlations are decomposed into a jet component that dominates the short-range correlations ($|\Delta\eta| <$ 1), and a component that persists at large $\Delta\eta$ and may originate from collective behavior of the produced system. The events are classified in terms of the multiplicity of the produced particles. Finite azimuthal anisotropies are observed in high-multiplicity events. The second and third Fourier components of the particle-pair azimuthal correlations, $V_2$ and $V_3$, are extracted after subtraction of the jet component. The single-particle anisotropy parameters $v_2$ and $v_3$ are normalized by their lab-frame mid-rapidity value and are studied as a function of $\eta_{\text{cm}}$. The normalized $v_2$ distribution is foun...
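
    The Fourier quantities referred to above follow the standard two-particle decomposition (generic definitions, not text extracted from the paper): the long-range pair yield is expanded as

      \frac{1}{N_{\mathrm{trig}}} \frac{dN^{\mathrm{pair}}}{d\Delta\phi}
        \propto 1 + 2 \sum_{n \ge 1} V_n \cos(n \Delta\phi),

    and, under the usual factorization assumption $V_n = v_n^{\mathrm{trig}} v_n^{\mathrm{assoc}}$ with trigger and associated particles drawn from the same selection, the single-particle anisotropy is $v_n = \sqrt{V_n}$.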

  10. The impacts of Phalaris arundinacea (reed canary grass) invasion on wetland plant richness in the Oregon Coast Range, USA, depend on beavers

    Science.gov (United States)

    Perkins, T.; Wilson, M.

    2005-01-01

    Invasive plants can threaten diversity and ecosystem function. We examined the relationship between the invasive Phalaris arundinacea (reed canarygrass) and species richness in beaver wetlands in Oregon, USA. Four basins (drainages) were chosen and three sites each of beaver impoundments, unimpounded areas and areas upstream of debris jams were randomly chosen in each basin for further study (n = 36). Analysis of covariance (ANCOVA) showed that the relationship between Phalaris and species richness differed significantly (p = 0.01) by site type. Dam sites (beaver impoundments) exhibited a strong inverse relationship between Phalaris and species richness (bD = −0.15), with one species lost for each 7% increase in Phalaris cover. In contrast, there was essentially no relationship between Phalaris cover and species richness in jam sites (debris jam impoundments formed by flooding; bJ = +0.01) and unimpounded sites (bU = −0.03). The cycle of beaver impoundment and abandonment both disrupts the native community and provides an ideal environment for Phalaris, which once established tends to exclude development of herbaceous communities and limits species richness. Because beaver wetlands are a dominant wetland type in the Coast Range, Phalaris invasion presents a real threat to landscape heterogeneity and ecosystem function in the region.

  11. On the nature of anomalies in temperature dependence of the 0Kh18N10T steel yield strength after thermal cycling in the low temperature range

    International Nuclear Information System (INIS)

    Medvedev, E.M.; Lavrent'ev, F.F.; Kurmanova, T.N.

    1978-01-01

    Structural transformations in 0Kh18N10T steel caused by heating and cooling and by deformation in the temperature range from 300 to 77 K were investigated, together with the quantitative relationships between these transformations and the variation of the yield limit with temperature. The studies were conducted by metallography and mechanical testing. It was shown that an increase in the number of heating and cooling cycles correlates with a loss of strength of the steel during deformation at 77 K. This anomaly in the temperature dependence of the yield limit is related to the appearance, in the course of deformation, of α-martensite with a BCC lattice. Deformation at 300 K increases the amount of epsilon-martensite, decreases the effective grain size and, in consequence, increases the yield limit. The relationship between the yield limit and the grain size at 300 K is described adequately by the Hall-Petch equation.
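
    For reference, the Hall-Petch equation cited at the end of the abstract has the standard form (\sigma_0 and k_y are material constants and d is the mean grain size; no numerical values are given in the record):

      \sigma_y = \sigma_0 + k_y d^{-1/2}.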

  12. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  13. Packaging of Mason-Pfizer monkey virus (MPMV) genomic RNA depends upon conserved long-range interactions (LRIs) between U5 and gag sequences.

    Science.gov (United States)

    Kalloush, Rawan M; Vivet-Boudou, Valérie; Ali, Lizna M; Mustafa, Farah; Marquet, Roland; Rizvi, Tahir A

    2016-06-01

    MPMV has great potential for development as a vector for gene therapy. In this respect, precisely defining the sequences and structural motifs that are important for dimerization and packaging of its genomic RNA (gRNA) is of utmost importance. A distinguishing feature of the MPMV gRNA packaging signal is two phylogenetically conserved long-range interactions (LRIs) between U5 and gag complementary sequences, LRI-I and LRI-II. To test their biological significance in the MPMV life cycle, we introduced mutations into these structural motifs and tested their effects on MPMV gRNA packaging and propagation. Furthermore, we probed the structure of key mutants using SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension). Disrupting base-pairing of the LRIs affected gRNA packaging and propagation, demonstrating their significance to the MPMV life cycle. A double mutant restoring a heterologous LRI-I was fully functional, whereas a similar LRI-II mutant failed to restore gRNA packaging and propagation. These results demonstrate that while LRI-I acts at the structural level, maintaining base-pairing is not sufficient for LRI-II function. In addition, in vitro RNA dimerization assays indicated that the loss of RNA packaging in LRI mutants could not be attributed to defects in dimerization. Our findings suggest that U5-gag LRIs play an important architectural role in maintaining the structure of the 5' region of the MPMV gRNA, expanding the crucial role of LRIs to the nonlentiviral group of retroviruses. © 2016 Kalloush et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  14. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
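
    A schematic sketch of the dynamic-bias idea described above (a toy accumulator with made-up parameters and function names, not the authors' fitted model): sensory evidence is accumulated to a bound while a bias proportional to the prior log odds grows with elapsed decision time.

      # Toy accumulator: evidence plus a prior-dependent bias that grows with time.
      import numpy as np

      def decide(coherence, prior_logodds, rng, k=0.3, bound=30.0,
                 noise=1.0, bias_rate=0.05, dt=1.0):
          x, t = 0.0, 0.0
          while True:
              t += dt
              x += k * coherence * dt + noise * np.sqrt(dt) * rng.standard_normal()
              total = x + bias_rate * t * prior_logodds   # bias grows with elapsed time
              if abs(total) >= bound:
                  return np.sign(total), t                # choice (+1/-1), decision time

      rng = np.random.default_rng(2)
      trials = [decide(0.1, 1.0, rng) for _ in range(200)]
      print("fraction choosing the prior-favored side:",
            np.mean([choice > 0 for choice, _ in trials]))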

  15. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  16. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  17. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  18. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  19. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
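
    The Monte Carlo estimate described above is straightforward to reproduce on a small test case. Below is a minimal Python sketch (not the authors' code): it simulates the Moran birth-death process on an undirected graph given as an adjacency list and estimates the fixation probability of a single mutant with relative fitness r; the complete graph on 10 vertices is used purely as a stand-in example, and the function and parameter names are introduced here for illustration.

        import random

        def fixation_probability(adj, r=1.1, trials=20000, rng=random):
            """Estimate the fixation probability of one mutant (fitness r) under the
            Moran birth-death process on a graph given as {node: [neighbours]}."""
            nodes = list(adj)
            fixed = 0
            for _ in range(trials):
                mutants = {rng.choice(nodes)}          # first mutant at a random vertex
                while 0 < len(mutants) < len(nodes):
                    # Reproducing node chosen with probability proportional to fitness.
                    weights = [r if v in mutants else 1.0 for v in nodes]
                    parent = rng.choices(nodes, weights=weights, k=1)[0]
                    # Offspring replaces a uniformly chosen neighbour of the parent.
                    child = rng.choice(adj[parent])
                    if parent in mutants:
                        mutants.add(child)
                    else:
                        mutants.discard(child)
                fixed += len(mutants) == len(nodes)
            return fixed / trials

        # Example: complete graph on 10 vertices (a stand-in, not one of the paper's graphs).
        K10 = {i: [j for j in range(10) if j != i] for i in range(10)}
        print(fixation_probability(K10, r=1.1))

    For the complete graph the estimate can be sanity-checked against the well-mixed Moran result (1 - 1/r) / (1 - 1/r**N).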

  20. Measurement of the Mis-identification Probability of τ Leptons from Hadronic Jets and from Electrons

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    Measurements of the mis-identification probability of QCD jets and electrons as hadronically decaying τ leptons using tag-and-probe methods are described. The analyses are based on 35 pb−1 of proton-proton collision data taken by the ATLAS experiment at a center-of-mass energy of √s = 7 TeV. The mis-identification probabilities range between 10% and 0.1% for QCD jets, and about (1 − 2)% for electrons. They depend on the identification algorithm chosen, the pT and the number of prongs of the τ candidate, and on the amount of pile-up present in the event.

  1. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  2. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  3. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We study the dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spreads. - Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
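
    For readers unfamiliar with the Farlie–Gumbel–Morgenstern (FGM) family, the toy sketch below illustrates how a joint survival probability is assembled from exponential marginals and an FGM copula. It is only an illustration of the basic FGM form, not the paper's truncated invariant construction or its shot-noise intensities; the rate parameters lam_x, lam_y and the dependence parameter theta are hypothetical.

        import math

        def fgm_joint_survival(x, y, lam_x, lam_y, theta):
            """P(X > x, Y > y) for exponential marginals coupled by an FGM copula.
            The FGM survival copula has the same form as the copula itself:
            C(u, v) = u*v*(1 + theta*(1-u)*(1-v)), with |theta| <= 1."""
            if not -1.0 <= theta <= 1.0:
                raise ValueError("FGM parameter theta must lie in [-1, 1]")
            sx = math.exp(-lam_x * x)   # marginal survival probability of X
            sy = math.exp(-lam_y * y)   # marginal survival probability of Y
            return sx * sy * (1.0 + theta * (1.0 - sx) * (1.0 - sy))

        # Independence (theta = 0) versus mild positive dependence (theta = 0.8).
        print(fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 0.0))
        print(fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 0.8))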

  4. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  5. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix; Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  6. Underwater Ranging

    OpenAIRE

    S. P. Gaba

    1984-01-01

    The paper deals with an underwater laser ranging system, its principle of operation, and its maximum depth capability. The sources of external noise and methods to improve the signal-to-noise ratio are also discussed.

  7. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
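
    The sample code referenced in the abstract is from R packages; the minimal sketch below shows the same idea in Python with scikit-learn (an assumption made here, not the authors' implementation), fitting a regression forest to a simulated 0/1 outcome so that its predictions can be read as individual probability estimates.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Simulated data with a known logistic relationship (purely illustrative).
        n = 5000
        X = rng.normal(size=(n, 3))
        true_p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))
        y = rng.binomial(1, true_p)

        # A regression forest on a 0/1 outcome returns conditional probability estimates.
        forest = RandomForestRegressor(n_estimators=500, min_samples_leaf=25, random_state=0)
        forest.fit(X, y)
        p_hat = np.clip(forest.predict(X), 0.0, 1.0)

        print("mean absolute error of probability estimates:",
              np.mean(np.abs(p_hat - true_p)).round(3))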

  8. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  9. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which
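
    The recommendation of two to six replicated seine hauls follows from the usual occupancy-style calculation that the probability of at least one detection in n independent hauls is 1 - (1 - p)^n. A small sketch of solving for the required effort, using illustrative per-haul detection probabilities rather than the paper's fitted values:

        import math

        def hauls_needed(p_detect, target=0.95):
            """Smallest n with 1 - (1 - p_detect)**n >= target, assuming independent hauls."""
            if not 0.0 < p_detect < 1.0:
                raise ValueError("per-haul detection probability must be in (0, 1)")
            return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

        # Hypothetical per-haul detection probabilities under good vs. poor conditions.
        for p in (0.6, 0.4, 0.25):
            print(f"p = {p:.2f}: {hauls_needed(p)} hauls for 95% cumulative detection")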

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  14. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  15. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  16. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  17. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  18. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA; August 30th, 2012. ... social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as ... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  19. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  20. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)

  1. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  2. Misclassification probability as obese or lean in hypercaloric and normocaloric diet

    Directory of Open Access Journals (Sweden)

    ANDRÉ F NASCIMENTO

    2008-01-01

    Full Text Available The aim of the present study was to determine the classification error probabilities, as lean or obese, in hypercaloric diet-induced obesity, which depend on the variable used to characterize animal obesity. In addition, the misclassification probabilities in animals submitted to a normocaloric diet were also evaluated. Male Wistar rats were randomly distributed into two groups: normal diet (ND; n=31; 3.5 kcal/g) and hypercaloric diet (HD; n=31; 4.6 kcal/g). The ND group received commercial Labina rat feed and the HD animals a cycle of five hypercaloric diets for a 14-week period. The variables analysed were body weight, body composition, body weight to length ratio, Lee index, body mass index and misclassification probability. A 5% significance level was used. The hypercaloric pellet-diet cycle promoted increases in body weight, carcass fat, body weight to length ratio and Lee index. The total misclassification probabilities ranged from 19.21% to 40.91%. In conclusion, the results of this experiment show that misclassification probabilities occur when dietary manipulation is used to promote obesity in animals. This misjudgement ranges from 19.49% to 40.52% in the hypercaloric diet and 18.94% to 41.30% in the normocaloric diet.

  3. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  4. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  5. Spin-selected velocity dependence of the associative ionization cross section in Na(3p)+Na(3p) collisions over the collision energy range from 2.4 to 290 meV

    International Nuclear Information System (INIS)

    Wang, M.; Keller, J.; Boulmer, J.; Weiner, J.

    1987-01-01

    We report new results on the direct measurement of the associative ionization (AI) cross section in collisions between velocity-selected and spin-oriented Na(3p) atoms. Improvements in the Doppler-shift velocity-selection technique permit measurement over an energy range spanning more than two orders of magnitude from subthermal to suprathermal regions. Spin orientations, parallel and antiparallel, enable determination of the excitation function (velocity dependence of the AI cross section) for the separate singlet and triplet manifolds of Na₂ states contributing to the AI process

  6. Human error recovery failure probability when using soft controls in computerized control rooms

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Wondea

    2014-01-01

    selection and delayed operation were mutually dependent. The other recovery failure probabilities when using soft controls ranged from 0.227 to 0.546. Since there is no recovery failure probability database for soft controls in advanced MCRs, and recovery failure probabilities in other HRA methods were obtained by expert judgment, the results of this study would be helpful for HRA experts deciding on recovery failure probabilities in an advanced MCR environment

  7. Energy dependence of relative abundances and periods of delayed neutron separate groups from neutron induced fission of 239Pu in the virgin neutron energy range 0.37-4.97 MeV

    International Nuclear Information System (INIS)

    Piksajkin, V.M.; Kazakov, L.E.; Isaev, S.T.; Korolev, G.G.; Roshchenko, V.A.; Tertychnyj, R.G.

    2002-01-01

    Relative yields and group periods of delayed neutrons from neutron-induced fission of 239Pu were measured in the 0.37-4.97 MeV range. A comparative analysis of the experimental data was carried out in terms of the mean half-life of the delayed-neutron precursor nuclei. The character and scale of the changes in the delayed-neutron group parameters with the excitation energy of the fissioning compound nucleus are demonstrated for the first time. A considerable energy dependence of the group parameters in neutron-induced fission of 239Pu was found, expressed as a decrease of about 10% in the mean precursor half-life over the 2.85 eV - 5 MeV range of primary neutrons [ru

  8. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
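
    Simultaneous 1-α intervals of this kind can be approximated by simulation when no closed form is at hand. The sketch below is a generic Monte Carlo envelope, not the specific construction of the paper: it calibrates per-order-statistic quantile bands for a standard normal sample so that a whole sample stays inside the band with roughly the target joint probability.

        import numpy as np

        def simultaneous_band(n, alpha=0.05, n_sim=20000, seed=1):
            """Monte Carlo band [lo_i, hi_i] for the i-th order statistic of a standard
            normal sample of size n, calibrated so that a whole sample lies inside the
            band with probability ~ 1 - alpha (simultaneous, not pointwise, coverage)."""
            rng = np.random.default_rng(seed)
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)

            # Shrink the pointwise level gamma until joint coverage reaches 1 - alpha.
            for gamma in np.linspace(alpha, alpha / 50, 200):
                lo = np.quantile(sims, gamma / 2, axis=0)
                hi = np.quantile(sims, 1 - gamma / 2, axis=0)
                inside = np.all((sims >= lo) & (sims <= hi), axis=1).mean()
                if inside >= 1 - alpha:
                    return lo, hi
            return lo, hi

        # Check a standardized sample of size 30 against the band.
        rng = np.random.default_rng(2)
        x = rng.standard_normal(30)
        z = np.sort((x - x.mean()) / x.std(ddof=1))
        lo, hi = simultaneous_band(30)
        print("all points inside band:", bool(np.all((z >= lo) & (z <= hi))))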

  9. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. This asymptotic form determines the relative probability of a random polygon of length n having prime knot type K rather than prime knot type L. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  10. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are determined so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
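
    Under the Gaussian assumption, the probability of satisfying a displacement criterion reduces to an error-function evaluation. A minimal sketch with made-up numbers (not the article's parameters or its VC-E/VC-D limits):

        import math

        def prob_within_criterion(sigma, limit, mean=0.0):
            """P(|displacement| <= limit) for a Gaussian displacement response
            with the given mean and standard deviation."""
            a = (limit - mean) / (sigma * math.sqrt(2.0))
            b = (-limit - mean) / (sigma * math.sqrt(2.0))
            return 0.5 * (math.erf(a) - math.erf(b))

        # Hypothetical: 0.4 um RMS relative displacement against a 1.0 um criterion.
        p_ok = prob_within_criterion(sigma=0.4, limit=1.0)
        print(f"P(within criterion) = {p_ok:.4f},  P(exceed) = {1 - p_ok:.4f}")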

  12. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
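
    The regressive effect of random noise on descriptive estimates is easy to see in simulation. The sketch below is an illustrative noisy-counting model in the spirit of the account described above, not the authors' exact formulation: each stored outcome is read with probability d of being flipped, so the mean estimate ends up at (1 - 2d)p + d, pulled toward the middle of the probability scale.

        import numpy as np

        def mean_noisy_estimate(p, d, n_items=100, n_people=100000, seed=0):
            """Mean estimated probability when each of n_items stored outcomes is
            read with probability d of being flipped (noisy frequency counting)."""
            rng = np.random.default_rng(seed)
            outcomes = rng.random((n_people, n_items)) < p   # true Bernoulli(p) events
            flips = rng.random((n_people, n_items)) < d      # random read errors
            read = outcomes ^ flips                          # XOR applies the flips
            return read.mean()

        for p in (0.1, 0.3, 0.7, 0.9):
            est = mean_noisy_estimate(p, d=0.15)
            print(f"true p = {p:.1f} -> mean estimate ~ {est:.3f} "
                  f"(theory: {(1 - 2*0.15)*p + 0.15:.3f})")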

  13. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  17. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Energy dependence of relative abundances and periods of separate groups of delayed neutrons at neutron-induced fission of 239Pu in the neutron energy range 0.37 - 5 MeV

    International Nuclear Information System (INIS)

    Roschenko, V.A.; Piksaikin, V.M.; Kazakov, L.E.; Isaev, S.G.; Korolev, G.G.; Tarasko, M.Z.; Tertychnyi, R.G.

    2001-01-01

    The fundamental role of delayed neutrons in the behavior, control, and safety of reactors is well known. Delayed neutron data are of great interest not only for reactor physics but also for nuclear fission physics and astrophysics. The purpose of the present work was to measure the energy dependence of the delayed neutron (DN) group parameters in fission of 239Pu over a primary neutron energy range from 0.37 to 5 MeV. The measurements were carried out on a setup based on the KG-2.5 electrostatic accelerator at SSC RF IPPE. The data were obtained in the 6-group representation. It is shown that there is a significant energy dependence of the DN group parameters over the range of primary neutron energies from thermal up to 5 MeV, expressed as a reduction of about 10% in the average half-life of the DN precursor nuclei. The data obtained in the present work can be used in constructing a set of group constants for reactors with an intermediate neutron spectrum. (authors)

  20. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  1. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution as well as one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability for software components failing dependently. In this report, software components include both general in-house software components, as well as pre-developed software components (e.g. COTS, SOUP, etc). (Author)
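
    As a concrete illustration of the single-component case, the sketch below shows a standard Beta-Binomial calculation: how many consecutive failure-free tests are needed before the posterior probability that the failure probability lies below a target bound reaches a required confidence. The Beta priors and the specific numbers are illustrative assumptions, not the priors examined in the report.

        from scipy.stats import beta

        def tests_needed(p_bound, confidence, a_prior=1.0, b_prior=1.0, max_tests=1_000_000):
            """Smallest number n of consecutive failure-free tests such that, with a
            Beta(a_prior, b_prior) prior on the failure probability p, the posterior
            Beta(a_prior, b_prior + n) satisfies P(p < p_bound) >= confidence."""
            for n in range(max_tests + 1):
                if beta.cdf(p_bound, a_prior, b_prior + n) >= confidence:
                    return n
            raise RuntimeError("bound not reached within max_tests")

        # Uniform prior vs. a more optimistic prior concentrated near zero.
        print(tests_needed(1e-3, 0.99))                         # Beta(1, 1) prior
        print(tests_needed(1e-3, 0.99, a_prior=1, b_prior=100))

    Comparing the two runs shows how a more confident prior reduces the required number of failure-free tests, which is the effect the report investigates.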

  2. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr; Rabia, Sherif; Mahmoud, Mohamed; Aly, Moustafa H.; Shihada, Basem

    2016-01-01

    by being adaptable to different offset-time and burst length distributions. We observed that by applying a limited range of wavelength conversion, the burst blocking probability is reduced by several orders of magnitude, yielding a better burst delivery ratio

  3. Absolute transition probabilities for 559 strong lines of neutral cerium

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J, E-mail: jjcurry@nist.go [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2009-07-07

    Absolute radiative transition probabilities are reported for 559 strong lines of neutral cerium covering the wavelength range 340-880 nm. These transition probabilities are obtained by scaling published relative line intensities (Meggers et al 1975 Tables of Spectral Line Intensities (National Bureau of Standards Monograph 145)) with a smaller set of published absolute transition probabilities (Bisson et al 1991 J. Opt. Soc. Am. B 8 1545). All 559 new values are for lines for which transition probabilities have not previously been available. The estimated relative random uncertainty of the new data is ±35% for nearly all lines.

  4. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
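
    The decay of the event probability with elapsed time since the flare can be written as a Bayesian update against the observed delay-time distribution. The sketch below assumes a hypothetical log-normal distribution of X-ray-peak-to-SEP-onset delays; the paper derives its algorithm from the NOAA event listing, not from these made-up parameters.

        import math

        def dynamic_sep_probability(p0, hours_elapsed, mu=math.log(6.0), sigma=0.8):
            """Update an initial SEP event probability p0 as time passes with no onset,
            assuming event onset delays (in hours) are log-normally distributed.
            P(event | no onset by t) = p0*S(t) / (p0*S(t) + (1 - p0))."""
            if hours_elapsed <= 0:
                return p0
            z = (math.log(hours_elapsed) - mu) / sigma
            survival = 0.5 * math.erfc(z / math.sqrt(2.0))   # 1 - lognormal CDF
            return p0 * survival / (p0 * survival + (1.0 - p0))

        for t in (0, 3, 6, 12, 24, 48):
            print(f"t = {t:2d} h: P = {dynamic_sep_probability(0.6, t):.3f}")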

  5. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  6. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  7. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  8. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  9. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from. PMID:24581306
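
    A minimal sketch of the counterfactual reading of effect sizes described here, using a random forest probability machine on simulated data (scikit-learn assumed; this is not the authors' implementation): the effect of a binary predictor is read off as the average difference between predictions with that predictor set to 1 versus 0.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(42)
        n = 4000
        exposure = rng.binomial(1, 0.5, size=n)
        covariate = rng.normal(size=n)
        p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.2 * exposure + 0.5 * covariate)))
        y = rng.binomial(1, p)

        X = np.column_stack([exposure, covariate])
        machine = RandomForestRegressor(n_estimators=400, min_samples_leaf=25, random_state=0)
        machine.fit(X, y)

        # Counterfactual predictions: everyone exposed vs. everyone unexposed.
        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0
        risk_difference = np.mean(machine.predict(X1) - machine.predict(X0))
        print(f"estimated risk difference: {risk_difference:.3f}")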

  10. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  11. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  12. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
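
    Two of the transformations commonly compared in this literature are easy to state. The sketch below implements the ratio-scale transformation and a Dubois-Prade style transformation for a discrete distribution; these are generic textbook forms, not necessarily the exact set of transformations studied in the paper.

        def ratio_transform(p):
            """pi_i = p_i / max_j p_j (ratio-scale probability-to-possibility transform)."""
            m = max(p)
            return [pi / m for pi in p]

        def dubois_prade_transform(p):
            """pi_i = sum of all p_j with p_j <= p_i (ties grouped together)."""
            return [sum(pj for pj in p if pj <= pi) for pi in p]

        p = [0.5, 0.3, 0.1, 0.1]
        print(ratio_transform(p))         # [1.0, 0.6, 0.2, 0.2]
        print(dubois_prade_transform(p))  # [1.0, 0.5, 0.2, 0.2]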

  13. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  14. Probability of Criticality for MOX SNF

    International Nuclear Information System (INIS)

    P. Gottlieb

    1999-01-01

    The purpose of this calculation is to provide a conservative (upper bound) estimate of the probability of criticality for mixed oxide (MOX) spent nuclear fuel (SNF) of the Westinghouse pressurized water reactor (PWR) design that has been proposed for use with the Plutonium Disposition Program (Ref. 1, p. 2). This calculation uses a Monte Carlo technique similar to that used for ordinary commercial SNF (Ref. 2, Sections 2 and 5.2). Several scenarios, covering a range of parameters, are evaluated for criticality. Parameters specifying the loss of fission products and iron oxide from the waste package are particularly important. This calculation is associated with disposal of MOX SNF.

  15. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  16. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results.Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence.The approach to establishing large deviation convergence uses novel com...

  17. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  18. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
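
    As a worked illustration of why maximizing is the superior strategy (the numbers are ours, not the authors'): if one outcome occurs on 70% of trials, always predicting it is correct 70% of the time, whereas matching one's prediction rates to the outcome rates is correct only about 58% of the time.

        p = 0.7                                          # probability of the more common outcome
        maximizing_accuracy = p                          # always predict the more common outcome
        matching_accuracy = p * p + (1 - p) * (1 - p)    # predict each outcome at its base rate
        print(maximizing_accuracy, matching_accuracy)    # 0.7 vs. ~0.58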

  19. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP” to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  20. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  1. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  2. Cueing spatial attention through timing and probability.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, to what extent unattended information affects attentional control is not fully understood. Here we provide evidence of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven modality within a single task. Using a spatial cueing paradigm we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more robust occurrence. Conversely, cuing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on the asynchronies' probability and not on their duration. Our findings clearly demonstrate that through fine-grained decision-making, performed trial by trial, the brain utilizes implicit information to decide whether or not to shift spatial attention voluntarily. As if according to a cost-planning strategy, the cognitive effort of shifting attention depending on the cue is performed only when the expected advantages are higher. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Description of plasmon-like band in silver clusters: the importance of the long-range Hartree-Fock exchange in time-dependent density-functional theory simulations.

    Science.gov (United States)

    Rabilloud, Franck

    2014-10-14

    Absorption spectra of Ag20 and Ag55(q) (q = +1, -3) nanoclusters are investigated in the framework of the time-dependent density functional theory in order to analyse the role of the d electrons in plasmon-like band of silver clusters. The description of the plasmon-like band from calculations using density functionals containing an amount of Hartree-Fock exchange at long range, namely, hybrid and range-separated hybrid (RSH) density functionals, is in good agreement with the classical interpretation of the plasmon-like structure as a collective excitation of valence s-electrons. In contrast, using local or semi-local exchange functionals (generalized gradient approximations (GGAs) or meta-GGAs) leads to a strong overestimation of the role of d electrons in the plasmon-like band. The semi-local asymptotically corrected model potentials also describe the plasmon as mainly associated to d electrons, though calculated spectra are in fairly good agreement with those calculated using the RSH scheme. Our analysis shows that a portion of non-local exchange modifies the description of the plasmon-like band.

  4. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The fundamentals of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  5. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  6. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  7. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  8. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
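
    A toy illustration of the pinching idea mentioned above (our own example, not the authors' method): replacing an interval-valued input with a point value and comparing the width of the output range indicates how much of the overall uncertainty that input contributes.

        def interval_mul(x, y):
            """Multiply two intervals (a, b) using naive interval arithmetic."""
            products = [xi * yi for xi in x for yi in y]
            return (min(products), max(products))

        a = (2.0, 5.0)                  # poorly known input
        b = (1.0, 1.5)                  # second input
        print("unpinched:", interval_mul(a, b))           # (2.0, 7.5)
        print("pinched  :", interval_mul((3.0, 3.0), b))  # (3.0, 4.5)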

  9. Ionization induced by strong electromagnetic field in low dimensional systems bound by short range forces

    Energy Technology Data Exchange (ETDEWEB)

    Eminov, P.A., E-mail: peminov@mail.ru [Moscow State University of Instrument Engineering and Computer Sciences, 20 Stromynka Street, Moscow 2107996 (Russian Federation); National Research University Higher School of Economics, 3/12 Bolshoy Trekhsvyatskiy pereulok, Moscow 109028 (Russian Federation)

    2013-10-01

    Ionization processes for a two dimensional quantum dot subjected to combined electrostatic and alternating electric fields of the same direction are studied using quantum mechanical methods. We derive analytical equations for the ionization probability as a function of the characteristic parameters of the system for both limiting cases of a constant electric field and of a linearly polarized electromagnetic wave. The ionization probabilities for a superposition of dc and low frequency ac electric fields of the same direction are calculated. The momentum distribution of the ionization probability for a system bound by short range forces is found for a superposition of constant and alternating fields. The total probability for this process per unit of time is derived within exponential accuracy. For the first time, the influence of an alternating electric field on the electron tunneling probability induced by an electrostatic field is studied, taking into account the pre-exponential term.

  10. Home range and travels

    Science.gov (United States)

    Stickel, L.F.; King, John A.

    1968-01-01

    The concept of home range was expressed by Seton (1909) in the term 'home region,' which Burt (1940, 1943) clarified with a definition of home range and exemplified in a definitive study of Peromyscus in the field. Burt pointed out the ever-changing characteristics of home-range area and the consequent absence of boundaries in the usual sense--a finding verified by investigators thereafter. In the studies summarized in this paper, sizes of home ranges of Peromyscus varied within two magnitudes, approximately from 0.1 acre to ten acres, in 34 studies conducted in a variety of habitats from the seaside dunes of Florida to the Alaskan forests. Variation in sizes of home ranges was correlated with both environmental and physiological factors; with habitat it was conspicuous, both in the same and different regions. Food supply also was related to size of home range, both seasonally and in relation to habitat. Home ranges generally were smallest in winter and largest in spring, at the onset of the breeding season. Activity and size also were affected by changes in weather. Activity was least when temperatures were low and nights were bright. Effects of rainfall were variable. Sizes varied according to sex and age; young mice remained in the parents' range until they approached maturity, when they began to travel more widely. Adult males commonly had larger home ranges than females, although there were a number of exceptions. An inverse relationship between population density and size of home range was shown in several studies and probably is the usual relationship. A basic need for activity and exploration also appeared to influence size of home range. Behavior within the home range was discussed in terms of travel patterns, travels in relation to home sites and refuges, territory, and stability of size of home range. Travels within the home range consisted of repeated use of well-worn trails to sites of food, shelter, and refuge, plus more random exploratory travels

  11. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating to the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation

  12. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
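
    The two-pass consistency argument can be made concrete with a small calculation (ours, under the stated assumption that the matching observer responds "A" with probability equal to the posterior q on every presentation of a trial): the probability of giving the same response on both passes is q² + (1 − q)², which approaches 0.5 as performance drops toward chance, whereas a deterministic maximum-a-posteriori observer is always consistent.

        def matching_consistency(q):
            """Probability that a posterior-probability-matching observer repeats
            its response across two passes of the same trial."""
            return q ** 2 + (1 - q) ** 2

        for q in (0.55, 0.75, 0.95):
            print(f"posterior {q:.2f}: matching {matching_consistency(q):.3f}, MAP observer 1.000")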

  13. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  14. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-01-01

    of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs

  15. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  16. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  17. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.

  18. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of a single probability or risk level on the choice between non-risky and risky options. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decision-making about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decision-making about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame in the tasks affects the change in the preference order only when the possibility of gain (expressed as a probability) is estimated to be sufficiently high.

  19. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  20. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  1. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (A_L) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of A_L and an exponential rollover for small values of A_L; the rollover (maximum probability) occurs at about A_L = 400 m². This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (N_L) selected for each triggered event iteration was chosen to have an average density of 1 landslide km⁻², i.e. N_L = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road 1 x 4000 cells (5 m x 20 km) in a 'T' formation with another road 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (A_BL) and number of road blockages (N_BL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km² region, the number of road blockages per iteration, N_BL, ranges from 0 to 7. The average blockage area for the 500 iterations (Ā_BL) is about 3000 m
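
    A schematic Monte Carlo iteration in the spirit of the model described above (the square landslide footprint, the heavy-tailed stand-in for the three-parameter inverse gamma distribution, and all parameter values are our simplifying assumptions, not the authors'): landslides are dropped at random over a grid containing one straight road, and the number of blocked road cells is recorded.

        import numpy as np

        rng = np.random.default_rng(1)
        cell = 5.0                      # metres per cell side
        grid = 4000                     # cells per side (20 km x 20 km)
        road_row = grid // 2            # one straight road, one cell wide

        blocked = set()
        for _ in range(400):            # ~1 landslide per km^2 over 400 km^2
            # Heavy-tailed stand-in for the inverse gamma area distribution,
            # with a typical value near 400 m^2.
            area = 400.0 * (1.0 / (1.0 - rng.random())) ** 0.7      # m^2
            radius = max(1, int(np.sqrt(area / np.pi) / cell))      # half-width in cells
            r, c = (int(v) for v in rng.integers(0, grid, size=2))  # landslide centre
            if abs(r - road_row) <= radius:                         # footprint reaches the road row
                for cc in range(c - radius, c + radius + 1):
                    if 0 <= cc < grid:
                        blocked.add(cc)

        print("road cells blocked in this iteration:", len(blocked))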

  2. A new variable interval schedule with constant hazard rate and finite time range.

    Science.gov (United States)

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2 T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2 T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
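
    A minimal simulation of the sampling scheme as we read it from the abstract (the specific linear reinforcement rule, reinforcing a trial with probability d/(2T), is our assumption): trial durations are drawn from Uniform(0, 2T), so every interval is bounded by 2T, and roughly half the trials end in reinforcement.

        import numpy as np

        rng = np.random.default_rng(42)
        T = 30.0                                    # illustrative schedule value, seconds
        n = 100_000

        d = rng.uniform(0.0, 2.0 * T, n)            # trial durations, all below 2T
        reinforced = rng.random(n) < d / (2.0 * T)  # linear reinforcement rule (assumed)

        print("proportion of reinforced trials:", round(reinforced.mean(), 3))        # ~0.5
        print("mean time between reinforcers (s):", round(d.sum() / reinforced.sum(), 1))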

  3. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  4. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10⁻¹²³ and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  5. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

    This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by probability of detection. The effects of real part geometries on sound propagation are accounted for and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed to be able to detect flaws with attributes ranging over an extensive class. The detection decision is considered to be a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude-thresholding detection criterion. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given

  6. Hospital of Diagnosis Influences the Probability of Receiving Curative Treatment for Esophageal Cancer.

    Science.gov (United States)

    van Putten, Margreet; Koëter, Marijn; van Laarhoven, Hanneke W M; Lemmens, Valery E P P; Siersema, Peter D; Hulshof, Maarten C C M; Verhoeven, Rob H A; Nieuwenhuijzen, Grard A P

    2018-02-01

    The aim of this article was to study the influence of hospital of diagnosis on the probability of receiving curative treatment and its impact on survival among patients with esophageal cancer (EC). Although EC surgery is centralized in the Netherlands, the disease is often diagnosed in hospitals that do not perform this procedure. Patients with potentially curable esophageal or gastroesophageal junction tumors diagnosed between 2005 and 2013 who were potentially curable (cT1-3,X, any N, M0,X) were selected from the Netherlands Cancer Registry. Multilevel logistic regression was performed to examine the probability to undergo curative treatment (resection with or without neoadjuvant treatment, definitive chemoradiotherapy, or local tumor excision) according to hospital of diagnosis. Effects of variation in probability of undergoing curative treatment among these hospitals on survival were investigated by Cox regression. All 13,017 patients with potentially curable EC, diagnosed in 91 hospitals, were included. The proportion of patients receiving curative treatment ranged from 37% to 83% and from 45% to 86% in the periods 2005-2009 and 2010-2013, respectively, depending on hospital of diagnosis. After adjustment for patient- and hospital-related characteristics these proportions ranged from 41% to 77% and from 50% to 82%, respectively (both P < 0.001). Multivariable survival analyses showed that patients diagnosed in hospitals with a low probability of undergoing curative treatment had a worse overall survival (hazard ratio = 1.13, 95% confidence interval 1.06-1.20; hazard ratio = 1.15, 95% confidence interval 1.07-1.24). The variation in probability of undergoing potentially curative treatment for EC between hospitals of diagnosis and its impact on survival indicates that treatment decision making in EC may be improved.

  7. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  8. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration

  9. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) with 0 < α < 1, which satisfies w(0) = 0, w(1/e) = 1/e and w(1) = 1, and which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
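
    A quick numerical check of the Prelec form quoted above (the value α = 0.65 is an arbitrary illustration): the function has a fixed point at p = 1/e for any α, overweights small probabilities, and underweights large ones.

        import math

        def prelec_w(p, alpha=0.65):
            """Prelec's one-parameter probability weighting function."""
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 1 / math.e, 0.5, 0.99):
            print(f"w({p:.3f}) = {prelec_w(p):.3f}")
        # w(1/e) = 1/e ~ 0.368 for any alpha; w(0.01) > 0.01 and w(0.99) < 0.99.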

  10. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  11. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, burst waiting time on pulse reactors, bursting time on pulse reactors, etc., is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space, based on probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolution of the dynamic probability for varying concentration was performed under different initial conditions. Results: For Highly Enriched Uranium (HEU) bare spheres, when the time is long enough the results of the dynamic calculation approach those of the static calculation; the maximum difference between the DSNP and Partisn results is less than 2%. For the Baker model, over the range of about 1 μs after the first criticality, the maximum difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases, and the dynamic evolution curve of the initiation probability lies within 5% of the static curve when k_eff is more than 1.2. The cumulative probability curve also indicates that the difference in the integral results between the dynamic and static calculations decreases from 35% to 5% as k_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important difference between the dynamic and static results near the first criticality. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of

  12. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index neither depending on which material the structure is constructed of nor on the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should not depend on the type of variable action. A probability based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted... the characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. ... the Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results. The increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger... The importance of hidden safeties in judging the reliability is discussed for wind actions on low-rise structures.

  13. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  14. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  15. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  16. Gravity and count probabilities in an expanding universe

    Science.gov (United States)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  17. Comparison of fission probabilities with emission of long range particles under the action of slow and fast neutrons on various materials; Probabilites comparees de fission avec emission de particules de long parcours pour divers materiaux sous l'action des neutrons lents et rapides

    Energy Technology Data Exchange (ETDEWEB)

    Netter, F; Faraggi, H; Garin-Bonnet, A; Julien, J; Corge, C [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Turkiewicz, J [Institut de Recherches Nucleaire de Varsovie (Poland)

    1958-07-01

    The authors describe relative measurements of the fission cross-section of the isotopes of uranium and plutonium (in particular ²³⁵U, ²³⁸U, ²³⁹Pu), with emission of long range particles, under the action of neutrons of various energies: thermal neutrons, pile neutrons, and neutrons produced with the Van de Graaff accelerator by the reaction of protons on tritium. The measurements are carried out: 1) with the aid of photographic plates, by submitting to the action of the neutrons a layer of fissile material coupled with an Ilford nuclear emulsion of 200 microns; a tin sheet lying between the plate and the layer stops the α particles and the fission fragments. With appropriate development, the tracks of the long range particles can be distinguished in the emulsion from the tracks of the recoil protons produced by fission neutrons or by the primary neutrons. For neutrons of energy below 1 MeV, the compared frequency of the tracks of long range particles and of the recoils caused by the fission neutrons gives a measurement of the fission cross-section with emission of long range particles relative to the product of the fission cross-section and the mean number of neutrons emitted per fission. For neutrons of higher energy, only the frequency of the tracks of long range particles is measured, relative to the flux of primary neutrons. Some precautions are taken to eliminate the action of thermal neutrons in the measurements with fast neutrons. 2) With the aid of a system consisting of an ionization chamber and a proportional counter, the coincidence rate between the pulses caused by the long range particles and the pulses provided by one of the fission fragments is measured relative to the counting rate of fission fragments. (author)

  18. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate are among the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of the climate changes at which the fire probability response (+, −) may diverge (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  19. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  20. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  1. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  2. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  3. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  4. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  5. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  6. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  7. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  8. Climate driven range divergence among host species affects range-wide patterns of parasitism

    Directory of Open Access Journals (Sweden)

    Richard E. Feldman

    2017-01-01

    Full Text Available Species interactions like parasitism influence the outcome of climate-driven shifts in species ranges. For some host species, parasitism can only occur in that part of its range that overlaps with a second host species. Thus, predicting future parasitism may depend on how the ranges of the two hosts change in relation to each other. In this study, we tested whether the climate driven species range shift of Odocoileus virginianus (white-tailed deer accounts for predicted changes in parasitism of two other species from the family Cervidae, Alces alces (moose and Rangifer tarandus (caribou, in North America. We used MaxEnt models to predict the recent (2000 and future (2050 ranges (probabilities of occurrence of the cervids and a parasite Parelaphostrongylus tenuis (brainworm taking into account range shifts of the parasite’s intermediate gastropod hosts. Our models predicted that range overlap between A. alces/R. tarandus and P. tenuis will decrease between 2000 and 2050, an outcome that reflects decreased overlap between A. alces/R. tarandus and O. virginianus and not the parasites, themselves. Geographically, our models predicted increasing potential occurrence of P. tenuis where A. alces/R. tarandus are likely to decline, but minimal spatial overlap where A. alces/R. tarandus are likely to increase. Thus, parasitism may exacerbate climate-mediated southern contraction of A. alces and R. tarandus ranges but will have limited influence on northward range expansion. Our results suggest that the spatial dynamics of one host species may be the driving force behind future rates of parasitism for another host species.

  9. Inferring wavelength dependence of AOD and Ångström exponent over a sub-tropical station in South Africa using AERONET data: Influence of meteorology, long-range transport and curvature effect

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, K. Raghavendra, E-mail: kanike.kumar@gmail.com [Discipline of Physics, School of Chemistry and Physics, Westville Campus, University of KwaZulu-Natal, Durban 4000 (South Africa); Sivakumar, V. [Discipline of Physics, School of Chemistry and Physics, Westville Campus, University of KwaZulu-Natal, Durban 4000 (South Africa); Reddy, R.R.; Gopal, K. Rama [Department of Physics, Aerosol and Atmospheric Research Laboratory, Sri Krishnadevaraya University, Anantapur 515 003, Andhra Pradesh (India); Adesina, A. Joseph [Discipline of Physics, School of Chemistry and Physics, Westville Campus, University of KwaZulu-Natal, Durban 4000 (South Africa)

    2013-09-01

    Aerosol optical properties over a southern sub-tropical site Skukuza, South Africa were studied to determine the variability of the aerosol characteristics using CIMEL Sunphotometer data as part of the AErosol RObotic NETwork (AERONET) from December 2005 to November 2006. Aerosol optical depth (AOD), Ångström exponent (α), and columnar water vapor (CWV) data were collected, analyzed, and compiled. Participating in this network provided a unique opportunity for understanding the sources of aerosols affecting the atmosphere of South Africa (SA) and the regional radiation budget. The meteorological patterns significantly (p < 0.05) influenced the amount and size distribution of the aerosols. Results showed that the seasonal variation of AOD at 500 nm (AOD500) over the observation site was characterized by low values (0.10–0.13) in autumn, moderate values (0.14–0.16) in the summer and winter seasons, and high to very high values (0.18–0.40) during the spring, with an overall mean value of 0.18 ± 0.12. The Ångström exponent α440–870 varied from 0.5 to 2.89, with significant (p < 0.0001) seasonal variability. CWV showed a strong annual cycle with maximum values in the summer and autumn seasons. The relationship between AOD, Ångström exponent (α), and CWV showed a strong dependence (p < 0.0001) of α on AOD and CWV, while there was no significant correlation between AOD and CWV. The adequacy of using the spectral AOD and Ångström exponent data to derive the curvature (a2) was investigated in order to obtain information for determining the aerosol particle size. Negative a2 values indicate aerosol sizes dominated by the fine mode (0.1–1 μm), while positive curvatures indicate an abundance of coarse particles (> 1 μm). Trajectory cluster analyses revealed that the air masses during the autumn and winter seasons have longer advection pathways, passing over the ocean and continent. This is reflected in the
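
    The curvature a2 mentioned in the abstract is commonly obtained from a second-order polynomial fit of ln(AOD) against ln(wavelength), with the classical Ångström exponent computed from a wavelength pair. The sketch below assumes this convention (sign conventions for a2 differ between studies) and uses made-up AOD values at AERONET-like wavelengths.

        import numpy as np

        # Illustrative AERONET-like wavelengths (nm) and AODs; the AOD values are made up.
        wavelengths = np.array([440.0, 500.0, 675.0, 870.0])
        aod = np.array([0.32, 0.28, 0.19, 0.14])

        # Second-order fit: ln(AOD) = a0 + a1*ln(lambda) + a2*ln(lambda)^2.
        x = np.log(wavelengths)
        a2, a1, a0 = np.polyfit(x, np.log(aod), deg=2)

        # Classical Angstrom exponent over the 440-870 nm pair.
        alpha_440_870 = -np.log(aod[0] / aod[3]) / np.log(wavelengths[0] / wavelengths[3])

        print(f"curvature a2 = {a2:+.3f}, Angstrom exponent (440-870) = {alpha_440_870:.2f}")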

  10. Anticipating Central Asian Water Stress: Variation in River Flow Dependency on Melt Waters from Alpine to Plains in the Remote Tien Shan Range, Kyrgyzstan Using a Rapid Hydro Assessment Methodology

    Science.gov (United States)

    Hill, A. F.; Wilson, A. M.; Williams, M. W.

    2016-12-01

    The future of mountain water resources in High Asia is of high interest to water managers, development organizations and policy makers, given the large populations downstream reliant on snow- and ice-sourced river flow. Together with historical and cultural divides among ex-Soviet republics, a lack of central water management following the Soviet break-up has led to water stress as trans-boundary waters weave through and along borders. New upstream hydropower development, a thirsty downstream agricultural sector and a shrinking Aral Sea have led to increasing tension in the region. Despite these pressures and in contrast to eastern High Asia's Himalayan basins (Ganges, Brahmaputra), little attention has been given to western High Asia draining the Pamir and Tien Shan ranges (Syr Darya and Amu Darya basins) to better understand the hydrology of this vast and remote area. Difficult access and challenging terrain exacerbate the challenges of working in this remote mountain region. As part of the Contributions to High Asia Runoff from Ice and Snow (CHARIS) project, we asked how river flow source-water composition changes over an alpine-to-plains domain of Kyrgyzstan's Naryn River in the Syr Darya basin. In addition, what may the future hold for river flow in Central Asia, given the differing responses of snow and ice to climate change? Utilizing a Rapid Hydrologic Assessment methodology, including a suite of pre-field mapping techniques, we collected in situ water chemistry data at targeted, remote mountain sites over 450 km of the Naryn River, spanning an elevation gradient from glacial headwaters to the lower-lying areas - places where people, hydropower and agriculture utilize water. Chemical and isotope tracers were used to separate stream flow to understand relative dependency on melt waters as the river moves downstream from glaciers and snow covered areas. This case study demonstrates a technique to acquire field data over large scales in remote regions that facilitates

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  12. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  13. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  14. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  15. Evaluation of DNA match probability in criminal case.

    Science.gov (United States)

    Lee, J W; Lee, H S; Park, M; Hwang, J J

    2001-02-15

    The new emphasis on quantification of evidence has led to perplexing courtroom decisions, and it has been difficult for forensic scientists to pursue logical arguments. In particular, when evaluating DNA evidence, both the genetic relationship between the two compared persons and the locus system examined should be considered, yet this has not drawn much attention. In this paper, we suggest calculating the match probability using the coancestry coefficient when the family relationship is considered, and we compare the performance of the identification values, which depend on how the match probability is calculated, under various situations.
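
    A hedged sketch of the kind of coancestry-corrected match probability the abstract refers to, using the widely cited Balding-Nichols (NRC II-style) single-locus formulas; the allele frequencies and theta value below are hypothetical, and the exact formulas used by the authors may differ.

        def match_probability(p_a, p_b, theta, homozygous):
            """Single-locus genotype match probability with a coancestry correction
            theta, in the style of the Balding-Nichols / NRC II formulas (an
            illustrative assumption, not necessarily the authors' exact method)."""
            denom = (1.0 + theta) * (1.0 + 2.0 * theta)
            if homozygous:
                return (2*theta + (1 - theta)*p_a) * (3*theta + (1 - theta)*p_a) / denom
            return 2.0 * (theta + (1 - theta)*p_a) * (theta + (1 - theta)*p_b) / denom

        # Hypothetical allele frequencies at one locus.
        print(match_probability(0.10, 0.10, theta=0.0,  homozygous=True))   # reduces to p^2 = 0.01
        print(match_probability(0.10, 0.10, theta=0.03, homozygous=True))   # theta-corrected value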

  16. Theoretical analysis on the probability of initiating persistent fission chain

    International Nuclear Information System (INIS)

    Liu Jianjun; Wang Zhe; Zhang Ben'ai

    2005-01-01

    For a finite multiplying system of fissile material in the presence of a weak neutron source, the authors analyse the probability of initiating a persistent fission chain using the stochastic theory of neutron multiplication. In the theoretical treatment, the conventional point-reactor model is developed into an improved form with position x and velocity v dependence. Estimates of the probability mentioned above, together with its distribution, are obtained by means of a diffusion approximation and compared with those from the previous point-reactor model. The two are basically consistent; however, the present model can provide details of the distribution. (authors)
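
    For background, a toy point model (with no position or velocity dependence, unlike the paper's improved model) estimates the probability of initiating a persistent chain as one minus the extinction probability of a branching process, found as the smallest fixed point of the offspring generating function. The fission probability and nu distribution below are hypothetical.

        import numpy as np

        def initiation_probability(p_fission, nu_pmf, tol=1e-12, max_iter=10_000):
            """Toy point-model estimate of the probability that a single source
            neutron starts a persistent (diverging) fission chain.  Each neutron
            causes fission with probability p_fission, producing nu secondaries
            drawn from nu_pmf; otherwise it is lost.  The extinction probability q
            is the smallest fixed point of the offspring pgf g(q)."""
            nu_pmf = np.asarray(nu_pmf, dtype=float)

            def g(z):
                # pgf of the offspring distribution of one neutron
                return (1.0 - p_fission) + p_fission * sum(
                    p * z**k for k, p in enumerate(nu_pmf))

            q = 0.0
            for _ in range(max_iter):
                q_new = g(q)
                if abs(q_new - q) < tol:
                    break
                q = q_new
            return 1.0 - q

        # Hypothetical numbers: fission probability 0.45, nu distributed around 2.5.
        nu_dist = [0.03, 0.11, 0.33, 0.36, 0.14, 0.03]   # P(nu = 0..5), sums to 1
        print(f"initiation probability ~ {initiation_probability(0.45, nu_dist):.3f}")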

  17. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
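
    A minimal sketch of fitting a pairwise maximum-entropy model to binary data by exact enumeration (feasible only for a handful of units), matching means and pairwise correlations by gradient ascent. The simulated independent "spike" data and learning parameters are illustrative; this is not the simulated cortical network data or the approximate inference methods studied in the paper.

        import numpy as np
        from itertools import product

        def fit_pairwise_model(data, lr=0.1, n_iter=2000):
            """Fit a pairwise maximum-entropy (Ising-like) model
            P(s) ~ exp(sum_i h_i s_i + 0.5 * sum_ij J_ij s_i s_j)
            to binary data (samples x units) by moment matching, using exact
            enumeration of all 2^n states (small n only)."""
            n = data.shape[1]
            states = np.array(list(product([0, 1], repeat=n)), dtype=float)

            emp_mean = data.mean(axis=0)
            emp_corr = data.T @ data / data.shape[0]

            h = np.zeros(n)
            J = np.zeros((n, n))
            for _ in range(n_iter):
                energies = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
                p = np.exp(energies - energies.max())
                p /= p.sum()
                model_mean = p @ states
                model_corr = states.T @ (states * p[:, None])
                # Gradient ascent on the log-likelihood is exactly moment matching.
                h += lr * (emp_mean - model_mean)
                dJ = emp_corr - model_corr
                np.fill_diagonal(dJ, 0.0)
                J += lr * dJ
            return h, J

        # Illustrative: 5 binary "neurons" from hypothetical binned data.
        rng = np.random.default_rng(1)
        spikes = (rng.random((5000, 5)) < 0.2).astype(float)
        h, J = fit_pairwise_model(spikes)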

  18. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  19. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  20. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  1. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  2. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions makes it possible to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
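
    A small sketch of the maximum entropy assignment step described above: given discrete outcomes and a prescribed mean as the constraint, the maximum-entropy probabilities take a Gibbs form whose Lagrange multiplier can be found numerically. The outcome values and target mean below are illustrative.

        import numpy as np
        from scipy.optimize import brentq

        def max_entropy_dist(values, target_mean):
            """Maximum-entropy probabilities p_i over discrete outcomes `values`,
            subject to sum p_i = 1 and sum p_i * values_i = target_mean.
            The solution has the Gibbs form p_i ~ exp(-beta * values_i)."""
            values = np.asarray(values, dtype=float)

            def mean_gap(beta):
                w = np.exp(-beta * values)
                return np.sum(values * w) / np.sum(w) - target_mean

            beta = brentq(mean_gap, -50.0, 50.0)   # bracket chosen for these small values
            p = np.exp(-beta * values)
            return p / p.sum()

        # Illustrative: outcomes 0..4 with a prescribed average of 1.2.
        p = max_entropy_dist([0, 1, 2, 3, 4], target_mean=1.2)
        print(np.round(p, 4), "mean =", np.dot(p, [0, 1, 2, 3, 4]))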

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  4. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  5. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  6. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming and if they were near the high probability hotspot (probability cuing. In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of targets locations and features.

  7. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  8. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities with an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
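
    In the same cross-checking spirit, the toy sketch below compares a closed-form oscillation probability against a direct matrix-exponential evolution, but only for the two-flavor vacuum case; it is not the 3-flavor matter-effect calculation checked in the note. The mixing angle, mass splitting, and roughly NOvA-like baseline and energy are illustrative.

        import numpy as np
        from scipy.linalg import expm

        # Two-flavor vacuum oscillation, used as a simplified stand-in.
        theta = 0.15          # mixing angle (rad), illustrative
        dm2 = 2.5e-3          # mass-squared difference (eV^2), illustrative
        L, E = 810.0, 2.0     # baseline (km) and energy (GeV), roughly NOvA-like

        # Method 1: closed-form probability.
        p_formula = np.sin(2*theta)**2 * np.sin(1.267 * dm2 * L / E)**2

        # Method 2: evolve one flavor state with the Hamiltonian exponential.
        U = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])
        H = U @ np.diag([0.0, 2.534 * dm2 / E]) @ U.T     # 2.534 = 2 x 1.267 (phase convention)
        amp = expm(-1j * H * L) @ np.array([1.0, 0.0])    # start in one flavor eigenstate
        p_numeric = abs(amp[1])**2

        print(f"{p_formula:.6f}  vs  {p_numeric:.6f}")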

  9. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  10. The transmission probability method in one-dimensional cylindrical geometry

    International Nuclear Information System (INIS)

    Rubin, I.E.

    1983-01-01

    The collision probability method widely used in solving neutron transport problems in a reactor cell is reliable for simple cells with a small number of zones. Increasing the number of zones, and also taking into account the anisotropy of scattering, greatly increases the scope of the calculations. In order to reduce the calculation time, the transmission probability method is suggested for flux calculations in one-dimensional cylindrical geometry taking into account the scattering anisotropy. The efficiency of the suggested method is verified using one-group calculations for cylindrical cells. The use of the transmission probability method allows the angular and spatial dependences of the neutron distributions to be represented completely without increasing the scope of the calculations. The method is especially effective in solving multi-group problems.

  11. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
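
    For reference, the classic fixation probability of a mutant with constant relative fitness r in a well-mixed Moran population of size N (the unstructured baseline against which the paper's age-structured results can be contrasted) is sketched below.

        def moran_fixation_probability(r, N):
            """Fixation probability of a single mutant with constant relative
            fitness r in a well-mixed Moran population of size N (classic result;
            the paper itself treats age-structured populations)."""
            if r == 1.0:
                return 1.0 / N                       # neutral mutant
            return (1.0 - 1.0 / r) / (1.0 - r ** (-N))

        print(moran_fixation_probability(1.0, 100))   # neutral: 0.01
        print(moran_fixation_probability(1.05, 100))  # slightly advantageous mutant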

  12. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  13. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges as well as the reliability and the plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  14. Eliciting conditional and unconditional rank correlations from conditional probabilities

    International Nuclear Information System (INIS)

    Morales, O.; Kurowicka, D.; Roelen, A.

    2008-01-01

    Causes of uncertainties may be interrelated and may introduce dependencies. Ignoring these dependencies may lead to large errors. A number of graphical models in probability theory such as dependence trees, vines and (continuous) Bayesian belief nets [Cooke RM. Markov and entropy properties of tree and vine-dependent variables. In: Proceedings of the ASA section on Bayesian statistical science, 1997; Kurowicka D, Cooke RM. Distribution-free continuous Bayesian belief nets. In: Proceedings of mathematical methods in reliability conference, 2004; Bedford TJ, Cooke RM. Vines-a new graphical model for dependent random variables. Ann Stat 2002; 30(4):1031-68; Kurowicka D, Cooke RM. Uncertainty analysis with high dimensional dependence modelling. New York: Wiley; 2006; Hanea AM, et al. Hybrid methods for quantifying and analyzing Bayesian belief nets. In: Proceedings of the 2005 ENBIS5 conference, 2005; Shachter RD, Kenley CR. Gaussian influence diagrams. Manage Sci 1998; 35(5) .] have been developed to capture dependencies between random variables. The input for these models are various marginal distributions and dependence information, usually in the form of conditional rank correlations. Often expert elicitation is required. This paper focuses on dependence representation, and dependence elicitation. The techniques presented are illustrated with an application from aviation safety

  15. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
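
    A small sketch contrasting the quantities mentioned above for zero observed events in n trials: the MLE (zero), the classical "rule of three" 95% upper bound 3/n, and the 1/(2.5n) approximation quoted in the abstract for the proposed minimax estimator.

        def zero_failure_estimates(n):
            """Point/interval summaries for an event never observed in n trials:
            the MLE, the classical 'rule of three' 95% upper bound, and the
            1/(2.5 n) approximation quoted for the minimax procedure."""
            return {
                "mle": 0.0,
                "rule_of_three_upper_95": 3.0 / n,
                "minimax_approx": 1.0 / (2.5 * n),
            }

        print(zero_failure_estimates(200))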

  16. Attributes of seasonal home range influence choice of migratory strategy in white-tailed deer

    Science.gov (United States)

    Henderson, Charles R.; Mitchell, Michael S.; Myers, Woodrow L.; Lukacs, Paul M.; Nelson, Gerald P.

    2018-01-01

    Partial migration is a common life-history strategy among ungulates living in seasonal environments. The decision to migrate or remain on a seasonal range may be influenced strongly by access to high-quality habitat. We evaluated the influence of access to winter habitat of high quality on the probability of a female white-tailed deer (Odocoileus virginianus) migrating to a separate summer range and the effects of this decision on survival. We hypothesized that deer with home ranges of low quality in winter would have a high probability of migrating, and that survival of an individual in winter would be influenced by the quality of their home range in winter. We radiocollared 67 female white-tailed deer in 2012 and 2013 in eastern Washington, United States. We estimated home range size in winter using a kernel density estimator; we assumed the size of the home range was inversely proportional to its quality and the proportion of crop land within the home range was proportional to its quality. Odds of migrating from winter ranges increased by 3.1 per unit increase in home range size and decreased by 0.29 per unit increase in the proportion of crop land within a home range. Annual survival rate for migrants was 0.85 (SD = 0.05) and 0.84 (SD = 0.09) for residents. Our finding that an individual with a low-quality home range in winter is likely to migrate to a separate summer range accords with the hypothesis that competition for a limited amount of home ranges of high quality should result in residents having home ranges of higher quality than migrants in populations experiencing density dependence. We hypothesize that density-dependent competition for high-quality home ranges in winter may play a leading role in the selection of migration strategy by female white-tailed deer.

  17. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase with t linearly but more rapidly as the time of earthquake approaches.

  18. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
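
    Under "the Equation", the probability of "if p then q" is the conditional probability P(q|p), which can be read off the frequencies of the four truth-table cases. A minimal sketch with hypothetical frequencies:

        def conditional_probability(counts):
            """P(q | p) computed from frequencies of the four truth-table cases,
            given as a dict with keys 'pq', 'p_not_q', 'not_p_q', 'not_p_not_q'.
            Under 'the Equation', this is the probability of 'if p then q'."""
            n_p = counts["pq"] + counts["p_not_q"]
            return counts["pq"] / n_p if n_p else float("nan")

        # Hypothetical frequencies for a conditional such as "if it rains, the match is cancelled".
        print(conditional_probability({"pq": 30, "p_not_q": 10, "not_p_q": 20, "not_p_not_q": 40}))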

  19. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors in the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate whether factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  20. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
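
    The probability generating function approach for a composite compliance count amounts to multiplying the per-item PGFs (1 - p) + p z, i.e. convolving their coefficient vectors; the sketch below does this for a hypothetical four-item bundle (it is not the authors' code).

        import numpy as np

        def exact_count_distribution(probs):
            """Exact distribution of the number of compliant items among
            independent Bernoulli components, obtained by multiplying their
            probability generating functions (1 - p) + p*z."""
            dist = np.array([1.0])                      # PGF coefficients of the constant 0
            for p in probs:
                dist = np.convolve(dist, [1.0 - p, p])  # multiply by (1 - p) + p*z
            return dist                                 # dist[k] = P(exactly k compliant)

        # Hypothetical per-element compliance rates of a 4-item care bundle.
        pmf = exact_count_distribution([0.95, 0.90, 0.85, 0.99])
        print(pmf, "P(all 4 compliant) =", pmf[-1])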

  1. Calculation of probabilities of rotational transitions of two-atom molecules in the collision with heavy particles

    International Nuclear Information System (INIS)

    Vargin, A.N.; Ganina, N.A.; Konyukhov, V.K.; Selyakov, V.I.

    1975-01-01

    This paper solves the problem of calculating the collisional probabilities of rotational transitions (CPRT) in molecule-molecule and molecule-atom interactions in three-dimensional space, using a quasiclassical approach. The calculation of the collisional probabilities of rotational transitions was carried out in the following way. The particle motion trajectory was calculated by a classical method and the time dependence of the perturbation operator was obtained; averaging it over the wave functions of the initial and final states yields the CPRT. The classical calculation of the molecular motion trajectory is justified by the smallness of the de Broglie wavelength compared with characteristic atomic distances, and by the smallness of the transferred rotational quantum compared with the energy of translational motion of the particles. The results of the calculation depend on the chosen interaction potential of the colliding particles. It follows from the Massey criterion that the region of non-adiabaticity of the interaction may be comparable with the internuclear distances of the molecule. Therefore, a short-range potential is required to describe the interaction. Analytical expressions suitable for practical calculations were obtained for one- and two-quantum rotational transitions of diatomic molecules. The CPRT was averaged over the Maxwell velocity distribution and analytical dependences on the gas temperature were obtained. The results of numerical calculations of probabilities for the HCl-HCl, HCl-He, and CO-CO interactions are presented to illustrate the method

  2. Short-range fundamental forces

    International Nuclear Information System (INIS)

    Antoniadis, I.; Baessler, S.; Buchner, M.; Fedorov, V.V.; Hoedl, S.; Nesvizhevsky, V.V.; Pignol, G.; Protasov, K.V.; Lambrecht, A.; Reynaud, S.; Sobolev, Y.

    2010-01-01

    We consider theoretical motivations to search for extra short-range fundamental forces as well as experiments constraining their parameters. The forces could be of two types: 1) spin-independent forces; 2) spin-dependent axion-like forces. Different experimental techniques are sensitive in respective ranges of characteristic distances. The techniques include measurements of gravity at short distances, searches for extra interactions on top of the Casimir force, precision atomic and neutron experiments. We focus on neutron constraints, thus the range of characteristic distances considered here corresponds to the range accessible for neutron experiments

  3. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  4. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  5. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  6. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  7. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  8. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  9. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  10. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  11. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  12. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  13. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (2010), Telecommunication: FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...

  14. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  15. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  16. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  17. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  18. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  19. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  20. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  1. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  2. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  3. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  4. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-18231 and the associated mh18232POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, are the targets of detection with these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
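
    The 29-flaw demonstration described above can be examined with elementary binomial arithmetic. The sketch below is illustrative only; the "29 detections out of 29 attempts" acceptance criterion and the candidate POD values are assumptions, not figures taken from the paper. It shows how the probability of passing the demonstration (PPD) varies with the true POD:

      # Hedged sketch: probability of passing a hit/miss POD demonstration,
      # assuming a "29 of 29 detections" acceptance criterion.
      from math import comb

      def prob_pass(true_pod: float, n: int = 29, min_hits: int = 29) -> float:
          """Binomial probability of at least `min_hits` detections in `n` trials."""
          return sum(comb(n, k) * true_pod**k * (1 - true_pod)**(n - k)
                     for k in range(min_hits, n + 1))

      for pod in (0.80, 0.90, 0.95, 0.99):
          print(f"true POD = {pod:.2f} -> PPD = {prob_pass(pod):.3f}")
      # e.g. a true POD of 0.90 passes a 29/29 demonstration only ~4.7% of the time.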

  5. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  6. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenging in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have received comparatively little study. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  7. Levy's zero-one law in game-theoretic probability

    OpenAIRE

    Shafer, Glenn; Vovk, Vladimir; Takemura, Akimichi

    2009-01-01

    We prove a game-theoretic version of Levy's zero-one law, and deduce several corollaries from it, including non-stochastic versions of Kolmogorov's zero-one law, the ergodicity of Bernoulli shifts, and a zero-one law for dependent trials. Our secondary goal is to explore the basic definitions of game-theoretic probability theory, with Levy's zero-one law serving a useful role.

  8. Research advances in probability of causation calculation of radiogenic neoplasms

    International Nuclear Information System (INIS)

    Ning Jing; Yuan Yong; Xie Xiangdong; Yang Guoshan

    2009-01-01

    Probability of causation (PC) was used to facilitate the adjudication of compensation claims for cancers diagnosed following exposure to ionizing radiation. In this article, the excess cancer risk assessment models used for PC calculation are reviewed. Cancer risk transfer models between different populations, dependence of cancer risk on dose and dose rate, modification by epidemiological risk factors and application of PC are also discussed in brief. (authors)
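
    For reference, PC is commonly computed from the excess relative risk (ERR) estimated by the reviewed models; the standard form below is a general convention assumed here, not a formula quoted from the article:

      # Standard probability-of-causation form (an assumption here; the article's
      # specific models are not reproduced): PC = ERR(D) / (1 + ERR(D)),
      # where ERR(D) is the excess relative risk at the received dose D.
      def probability_of_causation(excess_relative_risk: float) -> float:
          return excess_relative_risk / (1.0 + excess_relative_risk)

      print(round(probability_of_causation(0.25), 3))  # ERR of 0.25 -> PC = 0.2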

  9. Effect of energy level sequences and neutron–proton interaction on α-particle preformation probability

    International Nuclear Information System (INIS)

    Ismail, M.; Adel, A.

    2013-01-01

    A realistic density-dependent nucleon–nucleon (NN) interaction with finite-range exchange part which produces the nuclear matter saturation curve and the energy dependence of the nucleon–nucleus optical model potential is used to calculate the preformation probability, Sα, of α-decay from different isotones with neutron numbers N = 124, 126, 128, 130 and 132. We studied the variation of Sα with the proton number, Z, for each isotone and found the effect of the neutron and proton energy levels of the parent nuclei on the behavior of the α-particle preformation probability. We found that Sα increases regularly with the proton number when the proton pair in the α-particle is emitted from the same level and the neutron level sequence is not changed during the Z-variation. In this case the neutron–proton (n–p) interaction of the two levels contributing to the emission process is too small. On the contrary, if the proton or neutron level sequence is changed during the emission process, Sα behaves irregularly; the irregular behavior increases if both proton and neutron levels are changed. This behavior is accompanied by a change or rapid increase in the strength of the n–p interaction

  10. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability used to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and the presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify predictors associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
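
    The pretest/post-test relationship mentioned above is Bayes' rule in odds form. The sketch below is a generic illustration; the pretest probability and likelihood ratio values are hypothetical and are not taken from any of the studies cited in the record:

      # Generic Bayes-in-odds sketch: pretest probability + likelihood ratio
      # of a diagnostic test -> post-test (posterior) probability.
      def post_test_probability(pretest_p: float, likelihood_ratio: float) -> float:
          pretest_odds = pretest_p / (1.0 - pretest_p)
          post_odds = pretest_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # Hypothetical numbers: 10% clinical (pretest) probability of PE and a
      # positive test with likelihood ratio 18.
      print(round(post_test_probability(0.10, 18.0), 2))  # -> 0.67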

  11. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  12. The rate of beneficial mutations surfing on the wave of a range expansion.

    Directory of Open Access Journals (Sweden)

    Rémi Lehe

    Full Text Available Many theoretical and experimental studies suggest that range expansions can have severe consequences for the gene pool of the expanding population. Due to strongly enhanced genetic drift at the advancing frontier, neutral and weakly deleterious mutations can reach large frequencies in the newly colonized regions, as if they were surfing the front of the range expansion. These findings raise the question of how frequently beneficial mutations successfully surf at shifting range margins, thereby promoting adaptation towards a range-expansion phenotype. Here, we use individual-based simulations to study the surfing statistics of recurrent beneficial mutations on wave-like range expansions in linear habitats. We show that the rate of surfing depends on two strongly antagonistic factors, the probability of surfing given the spatial location of a novel mutation and the rate of occurrence of mutations at that location. The surfing probability strongly increases towards the tip of the wave. Novel mutations are unlikely to surf unless they enjoy a spatial head start compared to the bulk of the population. The needed head start is shown to be proportional to the inverse fitness of the mutant type, and only weakly dependent on the carrying capacity. The precise location dependence of surfing probabilities is derived from the non-extinction probability of a branching process within a moving field of growth rates. The second factor is the mutation occurrence, which strongly decreases towards the tip of the wave. Thus, most successful mutations arise at an intermediate position in the front of the wave. We present an analytic theory for the tradeoff between these factors that allows us to predict how frequently substitutions by beneficial mutations occur at invasion fronts. We find that small amounts of genetic drift increase the fixation rate of beneficial mutations at the advancing front, and thus could be important for adaptation during species invasions.
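
    The non-extinction probability invoked above can be illustrated in a much-simplified homogeneous setting (not the moving field of growth rates used in the paper) with the classic fixed-point equation for a branching process with Poisson offspring numbers; the mean offspring value below is purely illustrative:

      # Simplified sketch: survival (non-extinction) probability of a branching
      # process with Poisson(m) offspring, m > 1. This ignores the spatially
      # varying growth rates of the paper and only illustrates the concept.
      from math import exp

      def survival_probability(m: float, tol: float = 1e-12) -> float:
          """Solve q = exp(m*(q - 1)) for the extinction probability q by iteration."""
          q = 0.0
          while True:
              q_new = exp(m * (q - 1.0))
              if abs(q_new - q) < tol:
                  return 1.0 - q_new
              q = q_new

      print(round(survival_probability(1.2), 4))  # weakly beneficial lineage, m = 1.2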

  13. Failure-probability driven dose painting

    International Nuclear Information System (INIS)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.

    2013-01-01

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET positive volume) and out. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose–response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity

  14. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
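
    The Łukasiewicz operations mentioned above have simple closed forms on [0, 1]-valued events. The sketch below is my own illustration (not code from the paper); it shows the truncated sum and the corresponding t-norm, which reduce to ordinary Boolean union and intersection on crisp {0, 1}-valued events:

      # Hedged sketch of Łukasiewicz operations on [0, 1]-valued ("fractional") events.
      def luk_oplus(a: float, b: float) -> float:
          """Łukasiewicz (truncated) sum: generalizes union of disjoint events."""
          return min(1.0, a + b)

      def luk_otimes(a: float, b: float) -> float:
          """Łukasiewicz t-norm, dual to the truncated sum: generalizes intersection."""
          return max(0.0, a + b - 1.0)

      # On crisp indicator values the operations agree with Boolean logic:
      assert luk_oplus(0.0, 1.0) == 1.0 and luk_otimes(1.0, 1.0) == 1.0
      # On genuine fractions of events they interpolate between 0 and 1:
      print(luk_oplus(0.4, 0.5), round(luk_otimes(0.7, 0.6), 2))  # -> 0.9 0.3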

  15. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied with respect to application failure probability. In an optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, achieving a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in the optical grid.
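
    As a minimal illustration of a task-based failure analysis (not the MDSA algorithm itself, and with entirely hypothetical failure rates), the probability that an application fails can be built up from per-task failure probabilities when task failures are assumed independent:

      # Hedged sketch: failure probability of an application made of tasks whose
      # failures are assumed independent. Task failure probabilities are hypothetical.
      from math import prod

      def application_failure_probability(task_fail_probs: list[float]) -> float:
          """The application fails if any one of its tasks fails."""
          return 1.0 - prod(1.0 - p for p in task_fail_probs)

      # Without backup: each of four tasks fails with probability 0.02.
      print(round(application_failure_probability([0.02] * 4), 4))      # ~0.0776
      # With one backup per task (a task fails only if both copies fail):
      print(round(application_failure_probability([0.02**2] * 4), 6))   # ~0.0016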

  16. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr

    2016-08-01

    In this paper, we propose and derive a slotted-time model for analyzing the burst blocking probability in Optical Burst Switched (OBS) networks. We evaluated the immediate and delayed signaling reservation schemes. The proposed model compares the performance of the just-in-time (JIT) and just-enough-time (JET) signaling protocols associated with void/non-void filling link scheduling schemes. It also considers scenarios with no wavelength conversion and with limited-range wavelength conversion. Our model is distinguished by being adaptable to different offset-time and burst length distributions. We observed that when a limited range of wavelength conversion is applied, the burst blocking probability is reduced by several orders of magnitude and yields a better burst delivery ratio compared with full wavelength conversion.
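
    The slotted-time model itself is not reproduced in this record. As a generic point of reference only (assuming Poisson burst arrivals, which is not stated in the record), the classical Erlang B recursion is often used as a baseline for the blocking probability on a link with m wavelengths under full wavelength conversion:

      # Generic Erlang B sketch (not the paper's slotted-time model): blocking
      # probability for Poisson burst arrivals offered to m wavelengths at a given load.
      def erlang_b(offered_load: float, m: int) -> float:
          b = 1.0
          for k in range(1, m + 1):
              b = (offered_load * b) / (k + offered_load * b)
          return b

      # Illustrative values: 8 wavelengths, offered load of 4 Erlangs.
      print(round(erlang_b(4.0, 8), 4))  # -> ~0.030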

  17. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  18. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  19. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  20. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  1. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
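
    A minimal numerical sketch of the approach described above (my own illustration, not the authors' code): for Im(z) > 0, the complex probability (Faddeeva) function w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt can be approximated with an n-point Gauss-Hermite rule and compared against scipy's reference implementation:

      # Hedged sketch: Gauss-Hermite approximation of the complex probability
      # (Faddeeva) function w(z) for Im(z) > 0, compared with scipy.special.wofz.
      import numpy as np
      from scipy.special import wofz

      def faddeeva_gauss_hermite(z: complex, n: int = 30) -> complex:
          nodes, weights = np.polynomial.hermite.hermgauss(n)   # roots/weights of H_n
          return (1j / np.pi) * np.sum(weights / (z - nodes))

      z = 1.5 + 0.8j
      print(faddeeva_gauss_hermite(z))   # quadrature approximation
      print(wofz(z))                     # reference value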

  2. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  3. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate......This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  4. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  5. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  6. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    Full Text Available In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  7. BENTON RANGE ROADLESS AREA, CALIFORNIA.

    Science.gov (United States)

    McKee, Edwin H.; Rains, Richard L.

    1984-01-01

    On the basis of a mineral survey, two parts of the Benton Range Roadless Area, California are considered to have mineral-resource potential. The central and southern part of the roadless area, near several nonoperating mines, has a probable potential for tungsten and gold-silver mineralization in tactite zones. The central part of the area has a substantiated resource potential for gold and silver in quartz veins. Detailed mapping and geochemical sampling for tungsten, gold, and silver in the central and southern part of the roadless area might indicate targets for shallow drilling exploration.

  8. Depth- and range-dependent variation in the performance of aquatic telemetry systems: understanding and predicting the susceptibility of acoustic tag–receiver pairs to close proximity detection interference

    Directory of Open Access Journals (Sweden)

    Stephen R. Scherrer

    2018-01-01

    Full Text Available Background: Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition where receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm transmission multipaths reflected off the environment arriving at a receiver with sufficient delay relative to the direct signal cause CPDI. We propose a ray-propagation based model to estimate the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. Methods: A series of experiments were designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm multipaths arriving after the receiver's blanking interval cause CPDI effects. Results: Analysis of empirical data estimated the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations. From these results, CPDI was

  9. Depth- and range-dependent variation in the performance of aquatic telemetry systems: understanding and predicting the susceptibility of acoustic tag-receiver pairs to close proximity detection interference.

    Science.gov (United States)

    Scherrer, Stephen R; Rideout, Brendan P; Giorli, Giacomo; Nosal, Eva-Marie; Weng, Kevin C

    2018-01-01

    Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition where receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm transmission multipaths reflected off the environment arriving at a receiver with sufficient delay relative to the direct signal cause CPDI. We propose a ray-propagation based model to estimate the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. A series of experiments were designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm multipaths arriving after the receiver's blanking interval cause CPDI effects. Analysis of empirical data estimated the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations. From these results, CPDI was estimated within a 276.5 m radius of the
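
    A simplified geometric sketch of the mechanism described in the two records above (illustrative only; the sound speed, blanking interval, and depth/range values are my assumptions, and the authors' ray-propagation model is more detailed): the surface-reflected path exceeds the direct path by an amount that grows as tag and receiver get deeper and closer together, and, following the records, a multipath arriving after the blanking interval can cause CPDI.

      # Assumed-value sketch: compare the extra travel time of a surface-reflected
      # multipath with the receiver's blanking interval. A multipath arriving after
      # the blanking interval is treated as a new pulse and can cause CPDI.
      from math import hypot

      SOUND_SPEED = 1540.0        # m/s, assumed seawater value
      BLANKING_INTERVAL = 0.260   # s, assumed receiver blanking interval

      def multipath_delay(horiz_range: float, tag_depth: float, rx_depth: float) -> float:
          direct = hypot(horiz_range, tag_depth - rx_depth)
          surface_bounce = hypot(horiz_range, tag_depth + rx_depth)  # image-source path
          return (surface_bounce - direct) / SOUND_SPEED

      for rng in (50.0, 200.0, 600.0):
          delay = multipath_delay(horiz_range=rng, tag_depth=290.0, rx_depth=295.0)
          print(f"range {rng:5.0f} m: extra delay {delay*1000:6.1f} ms, "
                f"CPDI expected: {delay > BLANKING_INTERVAL}")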

  10. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
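
    A much-reduced sketch of the Monte Carlo idea described above (not the Landlab component itself; the cohesionless infinite-slope form and all parameter distributions below are assumptions): sample uncertain soil and wetness parameters, evaluate the factor of safety, and report the fraction of samples with FS < 1 as a probability of initiation.

      # Reduced Monte Carlo sketch (not the Landlab LandslideProbability component):
      # cohesionless infinite-slope factor of safety
      #   FS = (1 - w * rho_w/rho_s) * tan(phi) / tan(theta),
      # with relative wetness w and friction angle phi treated as uncertain.
      # All distributions and parameter values below are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 10_000

      slope_deg = 35.0                                       # hillslope angle theta
      phi_deg = rng.uniform(30.0, 40.0, n)                   # internal friction angle
      wetness = np.clip(rng.normal(0.6, 0.2, n), 0.0, 1.0)   # relative wetness h/z
      density_ratio = 1000.0 / 1800.0                        # rho_water / rho_soil

      fs = (1.0 - wetness * density_ratio) * np.tan(np.radians(phi_deg)) \
           / np.tan(np.radians(slope_deg))

      print("P(FS < 1) =", round(float(np.mean(fs < 1.0)), 3))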

  11. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  12. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    Science.gov (United States)

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or by a non-random lack of some information in a subgroup of the population. The aim is to provide an overview of IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and the IQ of 7-year-old children. The methodology allows the analysis to be corrected by weighting the observations according to their probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: first, we consider the entire population at study and calculate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. The analysis is then performed only on the non-missing observations, using a weighted model. IPW is a technique that allows the selection process to be embedded in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on the verbal area of children's intelligence quotient is stronger than the effect shown by the analysis performed without regard to the selection processes.
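
    A compact sketch of the two-step procedure described above, on simulated data rather than the study's cohort (the data-generating model and numbers are assumptions): fit a logistic model for the probability of being observed, weight each complete case by the inverse of its predicted probability, and compute the quantity of interest with those weights.

      # Hedged IPW sketch on simulated data (not the NO2/IQ study itself): recover a
      # population mean when the chance of being observed depends on a covariate.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 20_000
      x = rng.normal(size=n)                                   # predictor of selection
      y = 2.0 + 1.5 * x + rng.normal(size=n)                   # outcome, true mean = 2.0
      p_select = 1.0 / (1.0 + np.exp(-(0.5 - 1.5 * x)))        # selection depends on x
      observed = rng.random(n) < p_select

      # Step 1: logistic model for the probability of non-missingness.
      sel_model = sm.Logit(observed.astype(float), sm.add_constant(x)).fit(disp=0)
      p_hat = np.asarray(sel_model.predict(sm.add_constant(x)))

      # Step 2: weight each observed unit by the inverse of its predicted probability.
      w = 1.0 / p_hat[observed]
      print("naive mean of observed y:", round(float(y[observed].mean()), 3))          # biased low
      print("IPW-weighted mean of y  :", round(float(np.average(y[observed], weights=w)), 3))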

  13. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant-specific parameter (number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets

  14. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.

  15. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
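
    The encounter probability referred to above has a simple closed form for independent annual maxima, and the expected maximum individual wave height in a sea state of N Rayleigh-distributed waves has a standard approximation. The sketch below is a generic illustration with assumed numbers, not the paper's method:

      # Generic sketch (assumed numbers, not the paper's method).
      from math import log, sqrt

      def encounter_probability(return_period_years: float, lifetime_years: float) -> float:
          """Probability that a T-year event is exceeded at least once in L years."""
          return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

      def expected_max_wave_height(hs: float, n_waves: int) -> float:
          """Mode of the maximum of n Rayleigh-distributed heights: ~ Hs * sqrt(ln(n)/2)."""
          return hs * sqrt(0.5 * log(n_waves))

      print(round(encounter_probability(100.0, 50.0), 3))        # 100-year event, 50-year life -> ~0.395
      print(round(expected_max_wave_height(8.0, 3000), 1), "m")  # ~2.0 * Hs for a 3000-wave storm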

  16. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  17. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow-cover monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  18. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr

  19. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distribution than those obtained by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining it with the steepest descent method, and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
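
    A condensed, generic sketch of the idea summarized above (not the authors' implementation; the target log-probability, the expected-improvement acquisition, and all settings are illustrative assumptions): fit a Gaussian process to the evaluations made so far, maximize the acquisition over a candidate grid, and evaluate the expensive target only at the proposed point.

      # Generic Bayesian-optimization sketch: GP surrogate + expected improvement.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def log_prob(x):                       # stand-in for an expensive posterior
          return -0.5 * ((x - 1.3) / 0.4) ** 2

      rng = np.random.default_rng(1)
      X = rng.uniform(-3, 3, size=(5, 1))    # a few initial evaluations
      y = log_prob(X).ravel()
      grid = np.linspace(-3, 3, 400).reshape(-1, 1)

      for _ in range(10):
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True).fit(X, y)
          mu, sigma = gp.predict(grid, return_std=True)
          improvement = mu - y.max()
          zscore = improvement / np.maximum(sigma, 1e-9)
          ei = improvement * norm.cdf(zscore) + sigma * norm.pdf(zscore)  # expected improvement
          x_next = grid[np.argmax(ei)].reshape(1, 1)
          X = np.vstack([X, x_next])
          y = np.append(y, log_prob(x_next).ravel())

      print("best x found:", float(X[np.argmax(y), 0]))   # should approach 1.3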

  20. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
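
    As an illustration of how a characteristic length could be extracted from simulation output, the sketch below fits a pure exponential decay to synthetic data; both the data and the simplified fitting form are my own assumptions, not the authors' estimator:

      # Illustrative sketch: estimate N_K by fitting the exponentially decaying part
      # of a knotting probability P_K(N). Synthetic data; simplified fitting form.
      import numpy as np
      from scipy.optimize import curve_fit

      def decay(N, amplitude, N_K):
          return amplitude * np.exp(-N / N_K)

      N = np.array([200, 400, 600, 800, 1000, 1500, 2000], dtype=float)
      true_NK = 2500.0
      noise = 1 + 0.02 * np.random.default_rng(3).normal(size=N.size)
      P_K = 0.05 * np.exp(-N / true_NK) * noise

      params, _ = curve_fit(decay, N, P_K, p0=[0.05, 1000.0])
      print("estimated N_K:", round(params[1], 1))   # close to the assumed value of 2500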