WorldWideScience

Sample records for range dependent probability

  1. Dependence in Probability and Statistics

    CERN Document Server

    Doukhan, Paul; Surgailis, Donatas; Teyssiere, Gilles

    2010-01-01

    This volume collects recent works on weakly dependent, long-memory and multifractal processes and introduces new dependence measures for studying complex stochastic systems. Other topics include the statistical theory for bootstrap and permutation statistics for infinite variance processes, the dependence structure of max-stable processes, and the statistical properties of spectral estimators of the long memory parameter. The asymptotic behavior of Fejer graph integrals and their use for proving central limit theorems for tapered estimators are investigated. New multifractal processes are introduced …

  2. Ladar range image denoising by a nonlocal probability statistics algorithm

    Science.gov (United States)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    According to the characteristics of range images from coherent ladar, and building on nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF), whereas NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and grouped together. Pixels in the group are analyzed by probability statistics, and the gray value with maximum probability is used as the estimated value of the current pixel. Simulated range images of coherent ladar with different carrier-to-noise ratios and a real 8-gray-scale range image of coherent ladar are denoised by this algorithm, and the results are compared with those of the median filter, multi-template order mean filter, NLM, median nonlocal mean filter and its incorporation of anatomical side information, and the unsupervised information-theoretic adaptive filter. The range-abnormality noise and Gaussian noise in coherent-ladar range images are effectively suppressed by NLPS.
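
    A minimal NumPy sketch of the patch-grouping idea described above (block matching followed by taking the most probable gray value of the matched group). The window sizes, number of similar blocks and histogram binning are illustrative assumptions, not the authors' parameters, and the loops are left unoptimized for clarity.

```python
import numpy as np

def nlps_denoise(img, patch=3, search=7, n_similar=16, n_bins=8):
    """Toy nonlocal probability-statistics filter: for each pixel, find the most
    similar patches in a search window (block matching), pool their centre
    pixels, and return the gray value of maximum marginal-histogram probability."""
    ph, pad = patch // 2, search // 2
    padded = np.pad(img.astype(float), pad + ph, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad + ph, j + pad + ph
            ref = padded[ci - ph:ci + ph + 1, cj - ph:cj + ph + 1]
            dists, centres = [], []
            for di in range(-pad, pad + 1):
                for dj in range(-pad, pad + 1):
                    blk = padded[ci + di - ph:ci + di + ph + 1,
                                 cj + dj - ph:cj + dj + ph + 1]
                    dists.append(np.sum((blk - ref) ** 2))
                    centres.append(padded[ci + di, cj + dj])
            group = np.asarray(centres)[np.argsort(dists)[:n_similar]]
            hist, edges = np.histogram(group, bins=n_bins)      # marginal PDF of the group
            k = int(np.argmax(hist))
            out[i, j] = 0.5 * (edges[k] + edges[k + 1])          # most probable gray value
    return out

# Tiny demo on a synthetic noisy "range image".
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 64), (64, 1))
noisy = clean + rng.normal(0, 15, clean.shape)
print("RMS error before:", np.sqrt(np.mean((noisy - clean) ** 2)).round(1),
      " after:", np.sqrt(np.mean((nlps_denoise(noisy) - clean) ** 2)).round(1))
```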

  3. Probability as a theory dependent concept

    NARCIS (Netherlands)

    Atkinson, D; Peijnenburg, J

    1999-01-01

    It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori …

  4. Energy dependence of gap survival probability and antishadowing

    OpenAIRE

    Troshin, S M; Tyurin, N. E.

    2004-01-01

    We discuss the energy dependence of the gap survival probability that follows from the rational form of amplitude unitarization. In contrast to the eikonal form of unitarization, which leads to a decreasing energy dependence of the gap survival probability, we predict a non-monotonic form for this dependence.

  5. Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.
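
    The following is a brute-force stand-in for the Bernstein-expansion optimization described above: for a hypothetical polynomial requirement and a simple parametric p-box (a normal distribution with interval-valued mean and standard deviation), it sweeps the parameter box and reports the resulting failure-probability interval. The polynomial, the intervals and the grid resolution are all made-up illustrations, not the authors' formulation.

```python
import itertools
import numpy as np
from scipy import stats

# Hypothetical polynomial requirement: failure when g(x) >= 0.
g = np.polynomial.Polynomial([-1.0, 0.3, 0.5])      # g(x) = 0.5 x^2 + 0.3 x - 1

# Simple parametric p-box: x ~ Normal(mu, sigma) with interval-valued parameters.
mu_box, sigma_box = (-0.5, 0.5), (0.8, 1.2)

def failure_probability(mu, sigma, n=20_001):
    """P[g(X) >= 0] for X ~ Normal(mu, sigma), by quadrature."""
    xs = np.linspace(mu - 8 * sigma, mu + 8 * sigma, n)
    return np.trapz(stats.norm.pdf(xs, mu, sigma) * (g(xs) >= 0), xs)

# Sweep the parameter box; the min and max bound the failure-probability range.
grid = itertools.product(np.linspace(*mu_box, 11), np.linspace(*sigma_box, 11))
probs = [failure_probability(m, s) for m, s in grid]
print(f"failure-probability interval: [{min(probs):.4f}, {max(probs):.4f}]")
```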

  6. Ruin probabilities in models with a Markov chain dependence structure

    OpenAIRE

    Constantinescu, Corina; Kortschak, Dominik; Maume-Deschamps, Véronique

    2013-01-01

    In this paper we derive explicit expressions for the probability of ruin in a renewal risk model with dependence incorporated through the real-valued random variable Z_k = −c·τ_k + X_k, namely the loss between the (k − 1)-th and the k-th claim. Here c represents the constant premium rate, τ_k the inter-arrival time between the (k − 1)-th and the k-th claim, and X_k is the size of the k-th claim. The dependence structure among (Z_k)_{k>0} is driven by a Markov chain…

  7. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed that discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making.
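
    For readers unfamiliar with how discounting steepness is quantified, the sketch below fits the standard hyperbolic probability-discounting model V = A / (1 + h·θ), where θ is the odds against the large reinforcer; the indifference points used here are hypothetical, not data from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hyperbolic probability-discounting model: V = A / (1 + h * theta),
# where theta = (1 - p) / p is the odds against the large reinforcer.
def hyperbolic(theta, h, A=1.0):
    return A / (1.0 + h * theta)

p_large = np.array([1.0, 0.75, 0.5, 0.25, 0.125])       # reward probabilities
theta = (1.0 - p_large) / p_large                        # odds against
indiff = np.array([1.00, 0.80, 0.55, 0.30, 0.18])        # hypothetical indifference points

h_fit, _ = curve_fit(hyperbolic, theta, indiff, p0=[1.0])
print(f"discounting parameter h = {h_fit[0]:.3f}  (larger h = steeper discounting)")
```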

  8. Voltage dependency of transmission probability of aperiodic DNA molecule

    Science.gov (United States)

    Wiliyanti, V.; Yudiarsah, E.

    2017-07-01

    The characteristics of electron transport in aperiodic DNA molecules have been studied. A double-stranded DNA model with the base sequence GCTAGTACGTGACGTAGCTAGGATATGCCTGA on one strand and its complement on the other strand has been used. A tight-binding Hamiltonian is used to model the DNA molecule; in the model, the on-site energy of each base depends linearly on the applied electric field. The Slater-Koster scheme is used to model the electron hopping constant between bases. The transmission probability of an electron from one electrode to the other is calculated using a transfer matrix technique and the scattering matrix method simultaneously. The results show that, in general, a higher voltage gives a slightly larger transmission probability. The applied voltage appears to shift extended states to lower energy. Meanwhile, the transmission increases as the twisting-motion frequency increases.
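
    The sketch below computes the transmission probability of a 1D tight-binding chain using a Green's-function (NEGF) formula rather than the transfer/scattering-matrix route used in the paper; the on-site energies, hoppings and lead parameters are illustrative assumptions, not the authors' model.

```python
import numpy as np

def transmission(onsite, t_hop, E, t_lead=1.0, t_couple=1.0):
    """Transmission through a 1D tight-binding chain coupled to two semi-infinite
    leads, via the Green's-function formula T = Gamma_L * Gamma_R * |G_1N|^2."""
    n = len(onsite)
    H = np.diag(onsite) + t_hop * (np.eye(n, k=1) + np.eye(n, k=-1))
    # Retarded surface Green's function of a semi-infinite lead (valid for |E| < 2*t_lead):
    g_surf = (E - 1j * np.sqrt(4 * t_lead**2 - E**2)) / (2 * t_lead**2)
    sigma = t_couple**2 * g_surf                     # lead self-energy
    Sigma = np.zeros((n, n), dtype=complex)
    Sigma[0, 0] = Sigma[-1, -1] = sigma
    G = np.linalg.inv(E * np.eye(n) - H - Sigma)     # device Green's function
    gamma = -2.0 * sigma.imag                        # broadening (identical leads)
    return float(gamma * gamma * abs(G[0, -1])**2)

# Illustrative on-site energies of the order of base ionization potentials (eV);
# these numbers, the hopping and the lead parameters are NOT those of the paper.
eps0 = {"G": 7.75, "A": 8.24, "C": 8.87, "T": 9.14}
seq = "GCTAGTACGTGACGTAGCTAGGATATGCCTGA"
onsite = np.array([eps0[b] for b in seq]) - 8.5      # shift near the lead band centre

for E in (-0.5, 0.0, 0.5):
    print(f"E = {E:+.1f}: T = {transmission(onsite, t_hop=0.4, E=E):.3e}")
```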

  9. Time-dependent probability density function in cubic stochastic processes

    Science.gov (United States)

    Kim, Eun-jin; Hollerbach, Rainer

    2016-11-01

    We report time-dependent probability density functions (PDFs) for a nonlinear stochastic process with a cubic force using analytical and computational studies. Analytically, a transition probability is formulated by using a path integral and is computed by the saddle-point solution (instanton method) and a new nonlinear transformation of time. The predicted PDF p (x ,t ) in general involves a time integral, and useful PDFs with explicit dependence on x and t are presented in certain limits (e.g., in the short and long time limits). Numerical simulations of the Fokker-Planck equation provide exact time evolution of the PDFs and confirm analytical predictions in the limit of weak noise. In particular, we show that transient PDFs behave drastically differently from the stationary PDFs in regard to the asymmetry (skewness) and kurtosis. Specifically, while stationary PDFs are symmetric with the kurtosis smaller than 3, transient PDFs are skewed with the kurtosis larger than 3; transient PDFs are much broader than stationary PDFs. We elucidate the effect of nonlinear interaction on the strong fluctuations and intermittency in the relaxation process.
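
    A quick Euler-Maruyama check of the qualitative claim above: simulate dx = -μx³ dt + √(2D) dW from a sharply peaked initial condition and track the skewness and kurtosis of the transient versus late-time PDFs. All parameter values below are illustrative, not those of the paper.

```python
import numpy as np

# Euler-Maruyama simulation of dx = -mu * x^3 dt + sqrt(2 D) dW.
rng = np.random.default_rng(0)
mu, D, dt, n_paths = 1.0, 0.05, 1e-3, 20_000
times = (0.1, 1.0, 20.0)                        # transient ... near-stationary
x = np.full(n_paths, 1.5)                       # initial PDF ~ delta(x - 1.5)

snapshots, t = {}, 0.0
for _ in range(int(max(times) / dt)):
    x += -mu * x**3 * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_paths)
    t += dt
    for ts in times:
        if abs(t - ts) < dt / 2:
            snapshots[ts] = x.copy()

for ts in times:
    xs = snapshots[ts]
    m, s = xs.mean(), xs.std()
    skew = np.mean((xs - m) ** 3) / s**3
    kurt = np.mean((xs - m) ** 4) / s**4
    print(f"t = {ts:5.1f}: mean = {m:+.2f}, skewness = {skew:+.2f}, kurtosis = {kurt:.2f}")
```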

  10. Improved cumulative probabilities and range accuracy of a pulsed Geiger-mode avalanche photodiode laser ranging system with turbulence effects.

    Science.gov (United States)

    Luo, Hanjun; Ouyang, Zhengbiao; Liu, Qiang; Lu, Zhenli; Li, Bin

    2017-10-10

    There exists a performance limitation in a pulsed Geiger-mode avalanche photodiode laser ranging system because of the random fluctuation of the echo intensity caused by turbulence effects. To suppress the influence of turbulence effects, we present a cumulative pulse detection technique with the ability to achieve improved cumulative probabilities and range accuracy. Based on the modulated Poisson model, the cumulative probabilities, range accuracy, and their influencing factors are investigated for a cumulative Q-switched laser pulse train. The results show that improved cumulative probabilities and range accuracy can be obtained by utilizing cumulative pulse detection: under the condition that the echo intensity is 10, the echo pulse width is 10 ns, and the turbulence degree is 3, the target detection probability increases by 0.4, the false alarm probability decreases by 0.08, and the accuracy and precision improve by 46 cm and 27 cm, respectively.
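
    As a simplified illustration of cumulative pulse detection (ignoring the turbulence-modulated Poisson statistics analysed in the paper), the sketch below treats each pulse as an independent Poisson-driven firing event and declares a target when at least k of N accumulated pulses fire; all numbers are made up.

```python
from math import exp
from scipy.stats import binom

# Per-pulse firing probability of a Geiger-mode APD under simple Poisson statistics,
# then a k-out-of-N accumulation rule over the pulse train.
n_signal, n_noise = 1.2, 0.05      # mean primary electrons per pulse (signal bin / noise bin)
N, k = 20, 4                       # pulses accumulated, detection threshold

p_hit = 1.0 - exp(-(n_signal + n_noise))    # firing probability in the target range bin
p_fa = 1.0 - exp(-n_noise)                  # firing probability in a noise-only bin

P_detect = binom.sf(k - 1, N, p_hit)        # P[at least k firings out of N pulses]
P_false = binom.sf(k - 1, N, p_fa)
print(f"cumulative detection prob. = {P_detect:.3f}, false alarm prob. = {P_false:.2e}")
```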

  11. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
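
    For reference, the traditional 2D Pc is the integral of the projected relative-position uncertainty over a circle of combined hard-body radius in the encounter plane. A minimal quadrature sketch is shown below; the covariance, miss-distance and hard-body-radius values are made up for illustration.

```python
import numpy as np
from scipy.integrate import dblquad

miss = np.array([120.0, 40.0])                         # m, miss distance in the encounter plane
cov = np.array([[8000.0, 1500.0], [1500.0, 3000.0]])   # m^2, combined position covariance
hbr = 20.0                                             # m, combined hard-body radius

cov_inv = np.linalg.inv(cov)
norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))

def pdf(x, y):
    """Bivariate normal density of the relative position, centred at the miss vector."""
    d = np.array([x, y]) - miss
    return norm * np.exp(-0.5 * d @ cov_inv @ d)

# Integrate the density over the hard-body circle (polar coordinates, Jacobian r).
pc, _ = dblquad(lambda r, th: pdf(r * np.cos(th), r * np.sin(th)) * r,
                0.0, 2.0 * np.pi, 0.0, hbr)
print(f"2D collision probability Pc = {pc:.3e}")
```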

  12. Stochastic processes and long range dependence

    CERN Document Server

    Samorodnitsky, Gennady

    2016-01-01

    This monograph is a gateway for researchers and graduate students to explore the profound, yet subtle, world of long-range dependence (also known as long memory). The text is organized around the probabilistic properties of stationary processes that are important for determining the presence or absence of long memory. The first few chapters serve as an overview of the general theory of stochastic processes which gives the reader sufficient background, language, and models for the subsequent discussion of long memory. The later chapters devoted to long memory begin with an introduction to the subject along with a brief history of its development, followed by a presentation of what is currently the best known approach, applicable to stationary processes with a finite second moment. The book concludes with a chapter devoted to the author’s own, less standard, point of view of long memory as a phase transition, and even includes some novel results. Most of the material in the book has not previously been publis...

  13. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  14. Resource Selection Probability Functions for Gopher Tortoise: Providing a Management Tool Applicable Across the Species' Range

    Science.gov (United States)

    Kowal, Virginia A.; Schmolke, Amelie; Kanagaraj, Rajapandian; Bruggeman, Douglas

    2014-03-01

    The gopher tortoise (Gopherus polyphemus) is protected by conservation policy throughout its range. Efforts to protect the species from further decline demand detailed understanding of its habitat requirements, which have not yet been rigorously defined. Current methods of identifying gopher tortoise habitat typically rely on coarse soil and vegetation classifications, and are prone to over-prediction of suitable habitat. We used a logistic resource selection probability function in an information-theoretic framework to understand the relative importance of various environmental factors to gopher tortoise habitat selection, drawing on nationwide environmental datasets, and an existing tortoise survey of the Ft. Benning military base. We applied the normalized difference vegetation index (NDVI) as an index of vegetation density, and found that NDVI was strongly negatively associated with active burrow locations. Our results showed that the most parsimonious model included variables from all candidate model types (landscape features, topography, soil, vegetation), and the model groups describing soil or vegetation alone performed poorly. These results demonstrate with a rigorous quantitative approach that although soil and vegetation are important to the gopher tortoise, they are not sufficient to describe suitable habitat. More widely, our results highlight the feasibility of constructing highly accurate habitat suitability models from data that are widely available throughout the species' range. Our study shows that the widespread availability of national environmental datasets describing important components of gopher tortoise habitat, combined with existing tortoise surveys on public lands, can be leveraged to inform knowledge of habitat suitability and target recovery efforts range-wide.

  15. A compact result for the time-dependent probability of fixation at a neutral locus.

    Science.gov (United States)

    Waxman, D

    2011-04-07

    A result is derived, in the form of a sum, for the time-dependent probability of fixation of an unlinked neutral locus. The result captures many of the key features of the probability of fixation in a highly compact form. For 'small' times (t ≲ 4N_e) a single term of the sum accurately determines the time-dependent probability of fixation. This is in contrast to the well-known result of Kimura, which requires the contribution of many terms in a different sum for 'small' times. Going beyond small times, an approximation is derived for the time-dependent probability of fixation which applies for all times when the initial relative allele frequency is small.
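
    A simple Monte Carlo cross-check of the quantity discussed above: under neutral Wright-Fisher drift, the probability of fixation by generation t rises toward the initial allele frequency p0 as t grows. Population size, initial frequency and replicate counts below are illustrative choices.

```python
import numpy as np

# Neutral Wright-Fisher simulation: fraction of replicates fixed by generation t.
rng = np.random.default_rng(1)
N_e, p0, reps, generations = 100, 0.05, 20_000, 800     # 2*N_e gene copies per generation

freq = np.full(reps, p0)
fixed_at = np.full(reps, np.inf)
for t in range(1, generations + 1):
    segregating = (freq > 0.0) & (freq < 1.0)
    if segregating.any():
        freq[segregating] = rng.binomial(2 * N_e, freq[segregating]) / (2 * N_e)
    newly_fixed = (freq == 1.0) & np.isinf(fixed_at)
    fixed_at[newly_fixed] = t

for t in (N_e // 2, 2 * N_e, 4 * N_e, generations):
    print(f"P(fixed by generation {t:4d}) = {(fixed_at <= t).mean():.4f}")
print(f"t -> infinity limit (initial frequency p0) = {p0}")
```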

  16. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY); Hajagos, Janos (Applied Biomathematics, Setauket, NY); Nelsen, Roger B. (Lewis & Clark College, Portland, OR)

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  17. A method for acquiring random range uncertainty probability distributions in proton therapy

    Science.gov (United States)

    Holloway, S. M.; Holloway, M. D.; Thomas, S. J.

    2018-01-01

    In treatment planning we depend upon accurate knowledge of geometric and range uncertainties. If the uncertainty model is inaccurate then the plan will produce under-dosing of the target and/or over-dosing of organs at risk (OARs). We aim to provide a method by which centre- and site-specific population range uncertainty due to inter-fraction motion can be quantified to improve the uncertainty model in proton treatment planning. Daily volumetric MVCT data from previously treated radiotherapy patients have been used to investigate inter-fraction changes to water equivalent path-length (WEPL). Daily image-guidance scans were carried out for each patient and corrected for changes in CTV position (using rigid transformations). An effective depth algorithm was used to determine residual range changes, after corrections had been applied, throughout the treatment by comparing WEPL within the CTV at each fraction for several beam angles. As a proof of principle this method was used to quantify inter-fraction range-change uncertainties for a sample of head and neck patients (Σ = 3.39 mm, σ = 4.72 mm, overall mean = -1.82 mm) and for prostate (Σ = 5.64 mm, σ = 5.91 mm, overall mean = 0.98 mm). The choice of beam angle for head and neck did not affect the inter-fraction range error significantly; however, this was not the case for prostate. Greater range changes were seen using a lateral beam compared to an anterior beam for prostate due to relative motion of the prostate and femoral heads. A method has been developed to quantify population range changes due to inter-fraction motion that can be adapted for the clinic. The results of this work highlight the importance of robust planning and analysis in proton therapy. Such information could be used in robust optimisation algorithms or treatment plan robustness analysis. Such knowledge will aid in establishing beam start conditions at planning and for establishing adaptive planning protocols.
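
    Below is a sketch of one common way to summarise such data into systematic (Σ) and random (σ) components: Σ as the spread of per-patient mean WEPL changes and σ as the RMS of the within-patient spreads. The array is synthetic and only stands in for the measured WEPL differences; this is not necessarily the exact estimator used in the paper.

```python
import numpy as np

# wepl_change[i, j] = WEPL difference (mm) for patient i at fraction j (synthetic data).
rng = np.random.default_rng(2)
wepl_change = (rng.normal(loc=-1.8, scale=4.5, size=(20, 30))   # day-to-day (random) changes
               + rng.normal(scale=3.4, size=(20, 1)))            # per-patient systematic offsets

patient_means = wepl_change.mean(axis=1)
patient_sds = wepl_change.std(axis=1, ddof=1)

overall_mean = patient_means.mean()
Sigma = patient_means.std(ddof=1)                 # systematic component
sigma = np.sqrt(np.mean(patient_sds**2))          # random component (RMS of per-patient SDs)
print(f"overall mean = {overall_mean:.2f} mm, Sigma = {Sigma:.2f} mm, sigma = {sigma:.2f} mm")
```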

  18. Ruin Probabilities with Dependence on the Number of Claims within a Fixed Time Window

    Directory of Open Access Journals (Sweden)

    Corina Constantinescu

    2016-06-01

    Full Text Available We analyse the ruin probabilities for a renewal insurance risk process with inter-arrival times depending on the claims that arrive within a fixed (past) time window. This dependence could be explained through a regenerative structure. The main inspiration of the model comes from the bonus-malus (BM) feature of pricing car insurance. We first discuss asymptotic results for the ruin probabilities under different regimes of claim distributions. For numerical results, we recognise an embedded Markov additive process and, via an appropriate change of measure, ruin probabilities can be computed in closed form. Additionally, we employ importance sampling simulations to derive ruin probabilities, which further permit an in-depth analysis of a few concrete cases.

  19. Evolution of density-dependent movement during experimental range expansions.

    Science.gov (United States)

    Fronhofer, E A; Gut, S; Altermatt, F

    2017-12-01

    Range expansions and biological invasions are prime examples of transient processes that are likely impacted by rapid evolutionary changes. As a spatial process, range expansions are driven by dispersal and movement behaviour. Although it is widely accepted that dispersal and movement may be context-dependent, for instance density-dependent, and best represented by reaction norms, the evolution of density-dependent movement during range expansions has received little experimental attention. We therefore tested current theory predicting the evolution of increased movement at low densities at range margins using highly replicated and controlled range expansion experiments across multiple genotypes of the protist model system Tetrahymena thermophila. Although rare, we found evolutionary changes during range expansions even in the absence of initial standing genetic variation. Range expansions led to the evolution of negatively density-dependent movement at range margins. In addition, we report the evolution of increased intrastrain competitive ability and concurrently decreased population growth rates in range cores. Our findings highlight the importance of understanding movement and dispersal as evolving reaction norms and plastic life-history traits of central relevance for range expansions, biological invasions and the dynamics of spatially structured systems in general.

  20. Feature context-dependency and complexity-reduction in probability landscapes for integrative genomics.

    Science.gov (United States)

    Lesne, Annick; Benecke, Arndt

    2008-09-10

    The question of how to integrate heterogeneous sources of biological information into a coherent framework that allows the gene regulatory code in eukaryotes to be systematically investigated is one of the major challenges faced by systems biology. Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, have been proposed as a possible approach to the systematic discovery and analysis of correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Much of the available experimental sequence and genome activity information is de facto, but not necessarily obviously, context dependent. Furthermore, the context dependency of the relevant information is itself dependent on the biological question addressed. It is hence necessary to develop a systematic way of discovering the context-dependency of functional genomics information in a flexible, question-dependent manner. We demonstrate here how feature context-dependency can be systematically investigated using probability landscapes. Furthermore, we show how different feature probability profiles can be conditionally collapsed to reduce the computational and formal, mathematical complexity of probability landscapes. Interestingly, the possibility of complexity reduction can be linked directly to the analysis of context-dependency. These two advances in our understanding of the properties of probability landscapes not only simplify subsequent cross-correlation analysis in hypothesis-driven model building and testing, but also provide additional insights into the biological gene regulatory problems studied. Furthermore, insights into the nature of individual features and a classification of features according to their minimal context-dependency are achieved. The formal structure proposed contributes to a concrete and tangible basis for attempting to formulate novel mathematical structures for describing gene regulation in …

  1. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
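
    A minimal scikit-learn sketch of the model-comparison step described above: fit a classification tree and a random forest as probability estimators and compare them with a threshold-independent measure (ROC AUC). The two synthetic predictors merely stand in for water table depth and aridity index; the real GDE data are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Imbalanced synthetic data: ~10% "groundwater dependent" locations, two predictors.
X, y = make_classification(n_samples=3000, n_features=2, n_informative=2,
                           n_redundant=0, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [("CT", DecisionTreeClassifier(min_samples_leaf=25, random_state=0)),
                    ("RF", RandomForestClassifier(n_estimators=300, random_state=0))]:
    prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]   # P(groundwater dependent)
    print(f"{name}: AUC = {roc_auc_score(y_te, prob):.3f}")
```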

  2. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz Boos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  3. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  4. Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law

    Science.gov (United States)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro

    2013-07-01

    There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.

  5. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Science.gov (United States)

    Dawson, Michael R W; Gupta, Maya

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
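
    A minimal demonstration of the probability-matching result described above: a single logistic output unit trained with the delta rule on stochastic rewards converges to output activities that approximate the per-cue reward probabilities. The cue probabilities and training settings are arbitrary choices for the demo, not the authors' simulation parameters.

```python
import numpy as np

rng = np.random.default_rng(3)
reward_p = np.array([0.2, 0.5, 0.8])           # P(reward | cue i), arbitrary demo values
n_cues = len(reward_p)
w, b, lr = np.zeros(n_cues), 0.0, 0.05

for _ in range(50_000):
    cue = rng.integers(n_cues)                  # one cue presented in isolation
    x = np.eye(n_cues)[cue]
    y = 1.0 / (1.0 + np.exp(-(w @ x + b)))      # logistic output unit
    r = float(rng.random() < reward_p[cue])     # stochastic reward
    w += lr * (r - y) * x                       # delta rule
    b += lr * (r - y)

outputs = 1.0 / (1.0 + np.exp(-(w + b)))        # activity when each cue is presented alone
print("output activity:", np.round(outputs, 2), " reward probabilities:", reward_p)
```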

  6. Dependence of shake probability on nuclear charge in Li-, Na- and K-like ions

    Energy Technology Data Exchange (ETDEWEB)

    Kupliauskiene, A. [Vilnius University Institute of Theoretical Physics and Astronomy, A. Gostauto 12, LT-01108 Vilnius (Lithuania)]. E-mail: akupl@itpa.lt; Glemza, K. [Vilnius University, Saulėtekio 9, LT-10222 Vilnius (Lithuania)

    2005-07-01

    In sudden perturbation approximation, the probability of the shake-up process accompanying inner-shell ionization is calculated for the isoelectronic sequences of Li-, Na- and K-like ions in the ground and excited np and nd states. Numerical solutions of Hartree-Fock equations and hydrogen-like radial orbitals are used. Very large differences between the results of both approximations for all ions and strong dependences on ion charge are obtained at the beginning of the isoelectronic sequences.

  7. Learning a Flexible K-Dependence Bayesian Classifier from the Chain Rule of Joint Probability Distribution

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions once proposed and widely used for a Bayesian network are not appropriate for a Bayesian classifier, in which class variable C is considered as a distinguished one. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of joint probability distribution. By establishing the mapping relationship between conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of the Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically, rather than rigidly. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.

  8. Profiting from Probability; Combining Low and High Probability Isotopes as a Tool Extending the Dynamic Range of an Assay Measuring Amphetamine and Methamphetamine in Urine.

    Science.gov (United States)

    Miller, Anna M; Goggin, Melissa M; Nguyen, An; Gozum, Stephanie D; Janis, Gregory C

    2017-06-01

    A wide range of concentrations is frequently observed when measuring drugs of abuse in urine toxicology samples; this is especially true for amphetamine and methamphetamine. Routine liquid chromatography-tandem mass spectrometry confirmatory methods, commonly anchored at a 50 ng/mL lower limit of quantitation, can span approximately a 100-fold concentration range before reaching regions of non-linearity that degrade accurate quantitation and qualitative assessment. In our experience, approximately a quarter of amphetamine- and methamphetamine-positive samples are above a 5,000 ng/mL upper limit of quantitation and thus require reanalysis with dilution for accurate quantitative and acceptable qualitative results. We present here the development of an analytical method capable of accurately quantifying samples with concentrations spanning several orders of magnitude without the need for sample dilution and reanalysis. For each analyte, the major isotopes were monitored for analysis through the lower concentration range (50-5,000 ng/mL), and the naturally occurring, low-probability 13C2 isotopes were monitored for the analysis of the high-concentration samples (5,000-100,000 ng/mL amphetamine and 5,000-200,000 ng/mL methamphetamine). The method simultaneously monitors transitions for the molecules containing only 12C and the 13C2 isotopologues, eliminating the need for re-extraction and reanalysis of high-concentration samples.
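
    A back-of-the-envelope check of why the 13C2 isotopologue extends the dynamic range: with a natural 13C abundance of roughly 1.07%, the fraction of molecules carrying exactly two 13C atoms follows a binomial law over the molecule's carbon count (9 for amphetamine, 10 for methamphetamine), giving signals a few hundred times weaker than the monoisotopic species.

```python
from math import comb

p13 = 0.0107                                    # natural abundance of 13C (about 1.07%)

def isotopologue_fraction(n_carbons, n_heavy=2):
    """Binomial probability that exactly n_heavy of the carbons are 13C."""
    return comb(n_carbons, n_heavy) * p13**n_heavy * (1 - p13)**(n_carbons - n_heavy)

for name, n_c in (("amphetamine (C9H13N)", 9), ("methamphetamine (C10H15N)", 10)):
    frac = isotopologue_fraction(n_c)
    print(f"{name}: 13C2 fraction = {frac:.4%} (about {1 / frac:.0f}x weaker signal)")
```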

  9. Freeze/thaw-induced embolism: probability of critical bubble formation depends on speed of ice formation

    Directory of Open Access Journals (Sweden)

    Sanna Sevanto

    2012-06-01

    Full Text Available Bubble formation in the conduits of woody plants sets a challenge for uninterrupted water transportation from the soil up to the canopy. Freezing and thawing of stems has been shown to increase the number of air-filled (embolized) conduits, especially in trees with large conduit diameters. Despite numerous experimental studies, the mechanisms leading to bubble formation during freezing have not been addressed theoretically. We used classical nucleation theory and fluid mechanics to show which mechanisms are most likely to be responsible for bubble formation during freezing and what parameters determine the likelihood of the process. Our results confirm the common assumption that bubble formation during freezing is most likely due to gas segregation by ice. If xylem conduit walls are not permeable to the salts expelled by ice during the freezing process, osmotic pressures high enough for air seeding could be created. The build-up rate of segregated solutes in front of the ice-water interface depends equally on conduit diameter and freezing velocity. Therefore, bubble formation probability depends on these variables. The dependence of bubble formation probability on freezing velocity means that the experimental results obtained for cavitation threshold conduit diameters during freeze/thaw cycles depend on the experimental setup; namely sample size and cooling rate. The velocity dependence also suggests that to avoid bubble formation during freezing trees should have narrow conduits where freezing is likely to be fast (e.g., branches or the outermost layer of the xylem). Avoidance of bubble formation during freezing could thus be one piece of the explanation why xylem conduit size of temperate and boreal zone trees varies quite systematically.

  10. Freeze/Thaw-Induced Embolism: Probability of Critical Bubble Formation Depends on Speed of Ice Formation

    Science.gov (United States)

    Sevanto, Sanna; Holbrook, N. Michele; Ball, Marilyn C.

    2012-01-01

    Bubble formation in the conduits of woody plants sets a challenge for uninterrupted water transportation from the soil up to the canopy. Freezing and thawing of stems has been shown to increase the number of air-filled (embolized) conduits, especially in trees with large conduit diameters. Despite numerous experimental studies, the mechanisms leading to bubble formation during freezing have not been addressed theoretically. We used classical nucleation theory and fluid mechanics to show which mechanisms are most likely to be responsible for bubble formation during freezing and what parameters determine the likelihood of the process. Our results confirm the common assumption that bubble formation during freezing is most likely due to gas segregation by ice. If xylem conduit walls are not permeable to the salts expelled by ice during the freezing process, osmotic pressures high enough for air seeding could be created. The build-up rate of segregated solutes in front of the ice-water interface depends equally on conduit diameter and freezing velocity. Therefore, bubble formation probability depends on these variables. The dependence of bubble formation probability on freezing velocity means that the experimental results obtained for cavitation threshold conduit diameters during freeze/thaw cycles depend on the experimental setup; namely sample size and cooling rate. The velocity dependence also suggests that to avoid bubble formation during freezing trees should have narrow conduits where freezing is likely to be fast (e.g., branches or outermost layer of the xylem). Avoidance of bubble formation during freezing could thus be one piece of the explanation why xylem conduit size of temperate and boreal zone trees varies quite systematically. PMID:22685446

  11. Dependence of shake probability on nuclear charge in Li-, Na- and K-like ions

    OpenAIRE

    Kupliauskiene, A.; Glemza, K.

    2005-01-01

    In sudden perturbation approximation, the probability of the shake-up process accompanying inner-shell ionization is calculated for the isoelectronic sequences of Li-, Na- and K-like ions in the ground and excited $n$p and $n$d states. Numerical solutions of Hartree-Fock equations and hydrogen-like radial orbitals are used. Very large differences between the results of both approximations for all ions and strong dependences on ion charge are obtained at the beginning of the isoelectronic sequ...

  12. Energy dependence of polymer gels in the orthovoltage energy range

    Directory of Open Access Journals (Sweden)

    Yvonne Roed

    2014-03-01

    Full Text Available Purpose: Orthovoltage energies are often used for treatment of patients’ superficial lesions, and also for small-animal irradiations. Polymer-gel dosimeters such as MAGAT (Methacrylic acid Gel and THPC) are finding increasing use for 3-dimensional verification of radiation doses in a given treatment geometry. For mega-voltage beams, the energy dependence of MAGAT has been quoted as nearly energy-independent. In the kilo-voltage range, there is hardly any literature to shed light on its energy dependence. Methods: MAGAT was used to measure depth-dose for a 250 kVp beam. Comparison with ion-chamber data showed a discrepancy increasing significantly with depth. An over-response of as much as 25% was observed at a depth of 6 cm. Results and Conclusion: The investigation concluded that 6 cm of water in the beam resulted in a half-value-layer (HVL) change from 1.05 to 1.32 mm Cu. This amounts to an effective-energy change from 81.3 to 89.5 keV. Response measurements of MAGAT at these two energies explained the observed discrepancy in depth-dose measurements. Dose-calibration curves of MAGAT for (i) a 250 kVp beam and (ii) a 250 kVp beam through a 6 cm water column are presented, showing significant energy dependence. Cite this article as: Roed Y, Tailor R, Pinksy L, Ibbott G. Energy dependence of polymer gels in the orthovoltage energy range. Int J Cancer Ther Oncol 2014; 2(2):020232. DOI: 10.14319/ijcto.0202.32

  13. Long-range dependence and sea level forecasting

    CERN Document Server

    Ercan, Ali; Abbasov, Rovshan K

    2013-01-01

    This study shows that the Caspian Sea level time series possess long range dependence even after removing linear trends, based on analyses of the Hurst statistic, the sample autocorrelation functions, and the periodogram of the series. Forecasting performance of ARMA, ARIMA, ARFIMA and Trend Line-ARFIMA (TL-ARFIMA) combination models are investigated. The forecast confidence bands and the forecast updating methodology, provided for ARIMA models in the literature, are modified for the ARFIMA models. Sample autocorrelation functions are utilized to estimate the differencing lengths of the ARFIMA …

  14. Measurement of 240Pu Angular Momentum Dependent Fission Probabilities Using the (α ,α') Reaction

    Science.gov (United States)

    Koglin, Johnathon; Burke, Jason; Fisher, Scott; Jovanovic, Igor

    2017-09-01

    The surrogate reaction method often lacks the theoretical framework and the experimental data needed to constrain models, especially when rectifying differences in angular momentum states between the desired and surrogate reactions. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(α ,α' f) reaction - a surrogate for 239Pu(n, f) - and fission fragment angular distributions. Fission probability measurements were performed at a beam energy of 35.9(2) MeV at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Fission fragment angular distributions were measured in six bins from 4.5 MeV to 8.0 MeV and fit to expected distributions dependent on the vibrational and rotational excitations at the saddle point. In this way, the contributions to the total fission probability from specific states of K angular momentum projection on the symmetry axis are extracted. A sizable data collection is presented to be considered when constraining microscopic cross section calculations.

  15. Probable causes of increasing brucellosis in free-ranging elk of the Greater Yellowstone Ecosystem

    Science.gov (United States)

    Cross, P.C.; Cole, E.K.; Dobson, A.P.; Edwards, W.H.; Hamlin, K.L.; Luikart, G.; Middleton, A.D.; Scurlock, B.M.; White, P.J.

    2010-01-01

    While many wildlife species are threatened, some populations have recovered from previous overexploitation, and data linking these population increases with disease dynamics are limited. We present data suggesting that free-ranging elk (Cervus elaphus) are a maintenance host for Brucella abortus in new areas of the Greater Yellowstone Ecosystem (GYE). Brucellosis seroprevalence in free-ranging elk increased from 0-7% in 1991-1992 to 8-20% in 2006-2007 in four of six herd units around the GYE. These levels of brucellosis are comparable to some herd units where elk are artificially aggregated on supplemental feeding grounds. There are several possible mechanisms for this increase that we evaluated using statistical and population modeling approaches. Simulations of an age-structured population model suggest that the observed levels of seroprevalence are unlikely to be sustained by dispersal from supplemental feeding areas with relatively high seroprevalence or an older age structure. Increases in brucellosis seroprevalence and the total elk population size in areas with feeding grounds have not been statistically detectable. Meanwhile, the rate of seroprevalence increase outside the feeding grounds was related to the population size and density of each herd unit. Therefore, the data suggest that enhanced elk-to-elk transmission in free-ranging populations may be occurring due to larger winter elk aggregations. Elk populations inside and outside of the GYE that traditionally did not maintain brucellosis may now be at risk due to recent population increases. In particular, some neighboring populations of Montana elk were 5-9 times larger in 2007 than in the 1970s, with some aggregations comparable to the Wyoming feeding-ground populations. Addressing the unintended consequences of these increasing populations is complicated by limited hunter access to private lands, which places many ungulate populations out of administrative control. Agency-landowner hunting access …

  16. Fluorosis as a probable cause of chronic lameness in free ranging eastern grey kangaroos (Macropus giganteus).

    Science.gov (United States)

    Clarke, Emily; Beveridge, Ian; Slocombe, Ron; Coulson, Graeme

    2006-12-01

    A population of eastern grey kangaroos (Macropus giganteus) inhabiting heathland and farmland surrounding an aluminum smelter at Portland, Victoria, Australia, exhibited clinical signs of lameness. An investigation was undertaken to determine the cause of this lameness. Hematology, necropsy, histopathology, fecal egg count, total worm count, reproductive status, and the population age range were examined and failed to reveal any additional underlying disease state. The specific problem of lameness was addressed with bone histopathology, radiography, quantitative ultrasonography, microradiography, and multielement analysis of bone ash samples. The significant lesions observed were: osteophytosis of the distal tibia and fibula, tarsal bones, metatarsus IV, and proximal coccygeal vertebrae; osteopenia of the femur, tibia, and metatarsus IV; incisor enamel hypoplasia; stained, uneven, and abnormal teeth wear; abnormal bone matrix mineralization and mottling; increased bone density; and elevated bone fluoride levels. Microradiography of affected kangaroos exhibited "black osteons," which are a known manifestation of fluorosis. Collectively, these lesions were consistent with a diagnosis of fluorosis.

  17. A Hidden Semi-Markov Model with Duration-Dependent State Transition Probabilities for Prognostics

    Directory of Open Access Journals (Sweden)

    Ning Wang

    2014-01-01

    Full Text Available Realistic prognostic tools are essential for effective condition-based maintenance systems. In this paper, a Duration-Dependent Hidden Semi-Markov Model (DD-HSMM) is proposed, which overcomes the shortcomings of traditional Hidden Markov Models (HMM), including the Hidden Semi-Markov Model (HSMM): (1) it allows explicit modeling of state transition probabilities between the states; (2) it relaxes the observations’ independence assumption by accommodating a connection between consecutive observations; and (3) it does not follow the unrealistic Markov chain’s memoryless assumption and therefore provides a more powerful modeling and analysis capability for real-world problems. To facilitate the computation of the proposed DD-HSMM methodology, a new forward-backward algorithm is developed. The demonstration and evaluation of the proposed methodology is carried out through a case study. The experimental results show that the DD-HSMM methodology is effective for equipment health monitoring and management.

  18. Record length requirement of long-range dependent teletraffic

    Science.gov (United States)

    Li, Ming

    2017-04-01

    This article's contributions are mainly twofold. On the one hand, it presents a formula to compute the upper bound of the variance of the correlation periodogram measurement of teletraffic (traffic for short) with long-range dependence (LRD) for a given record length T and a given value of the Hurst parameter H (Theorems 1 and 2). On the other hand, it proposes two formulas for the computation of the variance upper bound of the correlation periodogram measurement of traffic of the fractional Gaussian noise (fGn) type and the generalized Cauchy (GC) type, respectively (Corollaries 1 and 2). They may constitute a reference guideline for the record length requirement of traffic with LRD. In addition, the record length requirement for the correlation periodogram measurement of traffic with either the Schuster-type or the Bartlett-type periodogram is studied, and the present results show that both types of periodograms may be used for the correlation measurement of traffic with a pre-desired variance bound of the correlation estimation. Moreover, real traffic in the Internet Archive of the Special Interest Group on Data Communication of the Association for Computing Machinery (ACM SIGCOMM) is analyzed in a case study.

  19. Time-dependent probability density functions and information geometry in stochastic logistic and Gompertz models

    Science.gov (United States)

    Tenkès, Lucille-Marie; Hollerbach, Rainer; Kim, Eun-jin

    2017-12-01

    A probabilistic description is essential for understanding growth processes in non-stationary states. In this paper, we compute time-dependent probability density functions (PDFs) in order to investigate stochastic logistic and Gompertz models, which are two of the most popular growth models. We consider different types of short-correlated multiplicative and additive noise sources and compare the time-dependent PDFs in the two models, elucidating the effects of the additive and multiplicative noises on the form of PDFs. We demonstrate an interesting transition from a unimodal to a bimodal PDF as the multiplicative noise increases for a fixed value of the additive noise. A much weaker (leaky) attractor in the Gompertz model leads to a significant (singular) growth of the population of a very small size. We point out the limitation of using stationary PDFs, mean value and variance in understanding statistical properties of the growth in non-stationary states, highlighting the importance of time-dependent PDFs. We further compare these two models from the perspective of information change that occurs during the growth process. Specifically, we define an infinitesimal distance at any time by comparing two PDFs at times infinitesimally apart and sum these distances in time. The total distance along the trajectory quantifies the total number of different states that the system undergoes in time, and is called the information length. We show that the time-evolution of the two models becomes more similar when measured in units of the information length and point out the merit of using the information length in unifying and understanding the dynamic evolution of different growth processes.
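
    A small numerical sketch of the information-length calculation described above, L(t) = ∫ sqrt(E) dt with E(t) = ∫ dx (∂_t p)² / p, applied here to an assumed family of relaxing Gaussian PDFs rather than the logistic/Gompertz PDFs of the paper.

```python
import numpy as np

# E(t) = Int dx (d_t p)^2 / p ;  L = Int_0^T sqrt(E(t)) dt
x = np.linspace(-10.0, 10.0, 4001)
t = np.linspace(0.0, 5.0, 501)
mean = 4.0 * np.exp(-t)                         # assumed relaxing mean
std = 1.0 - 0.5 * np.exp(-2.0 * t)              # assumed relaxing width

p = np.array([np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
              for m, s in zip(mean, std)])      # p[k] = p(x, t_k)

dp_dt = np.gradient(p, t, axis=0)
E = np.trapz(dp_dt**2 / np.maximum(p, 1e-300), x, axis=1)
L = np.trapz(np.sqrt(E), t)                     # total information length over [0, 5]
print(f"information length L = {L:.3f}")
```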

  20. Probability Model of Center-of-mass Calibration of Satellites' Retro-reflectors Used for Laser Ranging

    Directory of Open Access Journals (Sweden)

    ZHAO Qunhe

    2015-04-01

    Full Text Available A satellite laser ranging system calculates the distance from ground-based observatories to satellites using the round-trip travel time of a laser pulse. The position of the retro-reflectors on the satellite needs to be corrected, which helps to improve the measuring precision of satellite laser ranging. The correction errors of the center-of-mass (CoM) are mainly caused by the distribution effects of retro-reflectors on satellites. The CoM is related to the incident angle, the structural alignment of the retro-reflectors, and the ground-based position. Based on the fact that the reflection probability of photons for retro-reflectors is proportional to their cross sections, the cross-section area of the corner reflectors is fitted and a probabilistic model is established using the incident angle as the random variable. The CoM corrections of spherical satellites such as LAGEOS-1/2 are calculated, and different CoM values are applied for SLR precise orbit determination using long-term full-rate observation data, with the different WRMS results analyzed. Finally, for planar-array retro-reflectors, the CoMs of BeiDou navigation satellites such as BeiDou-M3 are also calculated and analyzed using one month of SLR full-rate data. The results show that the CoMs calculated on the basis of probability theory achieve comparable precision in SLR precise orbit determination.

  1. Probability of loss of assured safety in systems with multiple time-dependent failure modes.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Pilch, Martin.; Sallaberry, Cedric Jean-Marie.

    2012-09-01

    Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as probability of loss of assured safety (PLOAS). Representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent are derived and numerically evaluated for a variety of WL/SL configurations, including PLOAS defined by (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS are considered.
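
    The four PLOAS definitions listed above are simple statements about order statistics of link failure times, which makes a Monte Carlo cross-check straightforward. The sketch below is only an illustration with made-up lognormal failure-time distributions; it does not reproduce the report's representations, uncertainty treatment, or numerical methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical failure-time distributions (minutes into an accident transient):
# two weak links (WL) and two strong links (SL) with lognormal failure times.
n_samples = 1_000_000
wl = rng.lognormal(mean=np.log(40.0), sigma=0.25, size=(n_samples, 2))
sl = rng.lognormal(mean=np.log(55.0), sigma=0.25, size=(n_samples, 2))

first_wl, last_wl = wl.min(axis=1), wl.max(axis=1)
first_sl, last_sl = sl.min(axis=1), sl.max(axis=1)

ploas = {
    "all SLs fail before any WL":  np.mean(last_sl < first_wl),
    "any SL fails before any WL":  np.mean(first_sl < first_wl),
    "all SLs fail before all WLs": np.mean(last_sl < last_wl),
    "any SL fails before all WLs": np.mean(first_sl < last_wl),
}
for name, p in ploas.items():
    print(f"P({name}) ~= {p:.4f}")
```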

  2. Further Evidence That the Effects of Repetition on Subjective Time Depend on Repetition Probability.

    Science.gov (United States)

    Skylark, William J; Gheorghiu, Ana I

    2017-01-01

    Repeated stimuli typically have shorter apparent duration than novel stimuli. Most explanations for this effect have attributed it to the repeated stimuli being more expected or predictable than the novel items, but an emerging body of work suggests that repetition and expectation exert distinct effects on time perception. The present experiment replicated a recent study in which the probability of repetition was varied between blocks of trials. As in the previous work, the repetition effect was smaller when repeats were common (and therefore more expected) than when they were rare. These results add to growing evidence that, contrary to traditional accounts, expectation increases apparent duration whereas repetition compresses subjective time, perhaps via a low-level process like adaptation. These opposing processes can be seen as instances of a more general "processing principle," according to which subjective time is a function of the perceptual strength of the stimulus representation, and therefore depends on a confluence of "bottom-up" and "top-down" variables.

  3. Effective-range dependence of two-dimensional Fermi gases

    Science.gov (United States)

    Schonenberg, L. M.; Verpoort, P. C.; Conduit, G. J.

    2017-08-01

    The Feshbach resonance provides precise control over the scattering length and effective range of interactions between ultracold atoms. We propose the ultratransferable pseudopotential to model effective interaction ranges -1.5 ≤ kF²Reff² ≤ 0, where Reff is the effective range and kF is the Fermi wave vector, describing narrow to broad Feshbach resonances. We develop a mean-field treatment and exploit the pseudopotential to perform a variational and diffusion Monte Carlo study of the ground state of the two-dimensional Fermi gas, reporting on the ground-state energy, contact, condensate fraction, momentum distribution, and pair-correlation functions as a function of the effective interaction range across the BEC-BCS crossover. The limit kF²Reff² → -∞ is a gas of bosons with zero binding energy, whereas ln(kFa) → -∞ corresponds to noninteracting bosons with infinite binding energy.

  4. Effective-range dependence of resonant Fermi gases

    Science.gov (United States)

    Schonenberg, L. M.; Conduit, G. J.

    2017-01-01

    A Fermi gas of cold atoms allows precise control over the dimensionless effective range, kFReff, of the Feshbach resonance. Our pseudopotential formalism allows us to create smooth potentials with effective range -2 ≤ kFReff ≤ 2, which we use for a variational and diffusion Monte Carlo study of the ground state of a unitary Fermi gas. We report values of ξ = 0.388(1) and ζ = 0.087(1) for the universal constants, and compute the condensate fraction, momentum distribution, and pair-correlation functions. Finally, we show that a gas with kFReff ≳ 1.9 is thermodynamically unstable.

  5. The temperature dependence of the BK channel activity - kinetics, thermodynamics, and long-range correlations.

    Science.gov (United States)

    Wawrzkiewicz-Jałowiecka, Agata; Dworakowska, Beata; Grzywna, Zbigniew J

    2017-10-01

    Large-conductance, voltage-dependent, Ca2+-activated potassium (BK) channels are transmembrane proteins that regulate many biological processes by controlling potassium flow across cell membranes. Here, we investigate to what extent temperature (in the range of 17-37°C, in steps of ΔT=5°C) regulates the kinetic properties of channel gating and the memory effect in the dwell-time series of successive channel states, at membrane depolarization and hyperpolarization. The obtained results indicate that temperature strongly affects BK channel gating but, counterintuitively, exerts no effect on the long-range correlations, as measured by the Hurst coefficient. Quantitative differences in the temperature dependence of the relevant channel characteristics are evident between the different voltage regimes. Examining the characteristics of BK channel activity as a function of temperature allows us to estimate the net activation energy (Eact) and the changes of the thermodynamic parameters (ΔH, ΔS, ΔG) upon channel opening. A larger Eact corresponds to channel activity at membrane hyperpolarization. The analysis of the entropy and enthalpy changes of the closed-to-open channel transition suggests an entropy-driven nature of the increase of open-state probability during voltage activation and supports the hypothesis of a voltage-dependent geometry of the channel vestibule. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Measurement of Angular-Momentum-Dependent Fission Probabilities of 240Pu

    Science.gov (United States)

    Koglin, Johnathon; Burke, Jason; Jovanovic, Igor

    2016-09-01

    An experimental technique using the surrogate reaction method has been developed to measure fission probabilities of actinides as a function of the angular momentum state of the fissioning nucleus near the fission barrier. In this work, the 240Pu(α, α'f) reaction was used as a surrogate for 239Pu(n, f). An array of 12 silicon telescopes positioned at 10-degree intervals from 40 to 140 degrees detects the outgoing reaction particle for identification and measurement of the excitation energy. The angular momentum state is determined by measuring the angular distribution of fission fragments. The expected distributions are predicted from the Wigner d function. An array of 50 photovoltaic (solar) cells detects fission fragments with 10-degree granularity. The solar cells are sensitive to fission fragments but have no response to light ions. Relative contributions from different angular momentum states are extracted from the measured distributions and compared across all α particle scattering angles to determine the fission probability at a specific angular momentum state. The first experiment using this technique was recently completed using 37 MeV α particles incident on 240Pu. First results will be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. Department of Homeland Security under Grant Award Nu.

  7. Fuzzy-logic detection and probability of hail exploiting short-range X-band weather radar

    Science.gov (United States)

    Capozzi, Vincenzo; Picciotti, Errico; Mazzarella, Vincenzo; Marzano, Frank Silvio; Budillon, Giorgio

    2018-03-01

    This work proposes a new method for hail precipitation detection and probability, based on single-polarization X-band radar measurements. Using a dataset consisting of reflectivity volumes, ground truth observations and atmospheric sounding data, a probability of hail index, which provides a simple estimate of the hail potential, has been trained and adapted to the Naples metropolitan study area. The probability of hail has been calculated starting from four different hail detection methods. The first two, based on (1) reflectivity data and temperature measurements and (2) on the vertically-integrated liquid density product, respectively, have been selected from the available literature. The other two techniques are based on combined criteria of the above-mentioned methods: the first (3) is based on linear discriminant analysis, whereas the second (4) relies on a fuzzy-logic approach. The latter is an innovative criterion based on a fuzzification step performed through ramp membership functions. The performances of the four methods have been tested using an independent dataset: the results highlight that the fuzzy-oriented combined method performs slightly better in terms of false alarm ratio, critical success index and area under the relative operating characteristic. An example of application of the proposed hail detection and probability products is also presented for a relevant hail event that occurred on 21 July 2014.
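
    As a schematic of the fuzzification step described here, the sketch below combines three made-up radar and sounding predictors through ramp membership functions and a weighted aggregation. The predictors, thresholds, and weights are placeholders for illustration only, not the calibrated values of the study.

```python
import numpy as np

def ramp(x, lo, hi):
    """Ramp membership function: 0 below lo, 1 above hi, linear in between."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def hail_probability(max_reflectivity_dbz, vil_density_g_m3, freezing_level_km):
    """Toy fuzzy combination of three predictors; thresholds and weights are illustrative."""
    mu_z   = ramp(max_reflectivity_dbz, 45.0, 60.0)   # strong reflectivity core
    mu_vil = ramp(vil_density_g_m3, 2.5, 4.0)         # vertically integrated liquid density
    mu_frz = ramp(6.0 - freezing_level_km, 0.0, 2.0)  # lower freezing level favours hail
    weights = np.array([0.5, 0.35, 0.15])
    return float(np.dot(weights, [mu_z, mu_vil, mu_frz]))

print(hail_probability(57.0, 3.2, 3.8))   # roughly 0.7 for these invented inputs
```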

  8. Survival probability and first-passage-time statistics of a Wiener process driven by an exponential time-dependent drift

    Science.gov (United States)

    Urdapilleta, Eugenio

    2011-02-01

    The survival probability and the first-passage-time statistics are important quantities in different fields. The Wiener process is the simplest stochastic process with continuous variables, and important results can be explicitly found from it. The presence of a constant drift does not modify its simplicity; however, when the process has a time-dependent component the analysis becomes difficult. In this work we analyze the statistical properties of the Wiener process with an absorbing boundary, under the effect of an exponential time-dependent drift. Based on the backward Fokker-Planck formalism we set up the time-inhomogeneous equation and the conditions that govern the diffusion of the corresponding survival probability. We propose as the solution an expansion series in terms of the intensity of the exponential drift, resulting in a set of recurrence equations. We explicitly solve the expansion up to second order and comment on higher-order solutions. The first-passage-time density function arises naturally from the survival probability and preserves the proposed expansion. Explicit results, related properties, and limit behaviors are analyzed and extensively compared to numerical simulations.
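
    For readers who want a quick numerical reference of the kind the authors compare against, the sketch below estimates the survival probability of a Wiener process with an absorbing boundary and an exponentially decaying drift by direct Euler simulation. All parameter values are arbitrary, and the time discretization introduces a small first-passage bias.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: start at x0 > 0, absorbing boundary at 0,
# drift mu(t) = -mu0 * exp(-t / tau) pulling the process towards the boundary.
x0, mu0, tau, sigma = 1.0, 0.5, 2.0, 1.0
dt, t_max, n_paths = 1e-2, 10.0, 100_000
n_steps = int(t_max / dt)

x = np.full(n_paths, x0)
alive = np.ones(n_paths, dtype=bool)
survival = np.empty(n_steps)

for i in range(n_steps):
    t = i * dt
    drift = -mu0 * np.exp(-t / tau)
    x[alive] += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum())
    alive &= x > 0.0                     # absorb paths that have crossed the boundary
    survival[i] = alive.mean()

print(f"S(t=1)  ~= {survival[int(1.0/dt) - 1]:.3f}")
print(f"S(t=10) ~= {survival[-1]:.3f}")
# The first-passage-time density follows as f(t) = -dS/dt.
```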

  10. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  11. A better understanding of long-range temporal dependence of traffic flow time series

    Science.gov (United States)

    Feng, Shuo; Wang, Xingmin; Sun, Haowei; Zhang, Yi; Li, Li

    2018-02-01

    Long-range temporal dependence is an important research perspective for modelling traffic flow time series. Various methods have been proposed to depict the long-range temporal dependence, including autocorrelation function analysis, spectral analysis and fractal analysis. However, few studies have examined the daily temporal dependence (i.e. the similarity between different daily traffic flow time series), which can help us better understand the long-range temporal dependence, such as the origin of the crossover phenomenon. Moreover, considering both types of dependence contributes to establishing more accurate models and depicting the properties of traffic flow time series. In this paper, we study the properties of daily temporal dependence by a simple averaging method and a Principal Component Analysis (PCA) based method. Meanwhile, we also study the long-range temporal dependence by Detrended Fluctuation Analysis (DFA) and Multifractal Detrended Fluctuation Analysis (MFDFA). The results show that both the daily and the long-range temporal dependence exert considerable influence on the traffic flow series. The DFA results reveal that the daily temporal dependence creates a crossover phenomenon when estimating the Hurst exponent, which depicts the long-range temporal dependence. Furthermore, comparison of the DFA results shows that the PCA-based method is better at extracting the daily temporal dependence, especially when the difference between days is significant.
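
    A minimal DFA sketch of the kind used for such crossover diagnostics is given below. The white-noise input is just a stand-in for detrended traffic counts; on real data, a crossover would show up as a kink in the log-log plot of F(s) against s.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis; returns the fluctuation F(s) for each scale s."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)      # local polynomial trend
            rms.append(np.mean((seg - np.polyval(coef, t))**2))
        F.append(np.sqrt(np.mean(rms)))
    return np.asarray(F)

rng = np.random.default_rng(3)
x = rng.standard_normal(20_000)                   # placeholder for a traffic flow series
scales = np.unique(np.logspace(1.2, 3.2, 15).astype(int))
F = dfa(x, scales)
slope = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA scaling exponent ~= {slope:.2f}  (about 0.5 for uncorrelated data)")
```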

  12. A Modified Generalized Fisher Method for Combining Probabilities from Dependent Tests

    Directory of Open Access Journals (Sweden)

    Hongying (Daisy eDai

    2014-02-01

    Full Text Available Rapid developments in molecular technology have yielded a large amount of high-throughput genetic data to understand the mechanism for complex traits. The increasing number of genetic variants requires hundreds of thousands of statistical tests to be performed simultaneously in analysis, which poses a challenge to controlling the overall Type I error rate. Combining p-values from multiple hypothesis testing has shown promise for aggregating effects in high-dimensional genetic data analysis. Several p-value combining methods have been developed and applied to genetic data; see [Dai et al., 2012b] for a comprehensive review. However, there is a lack of investigations conducted for dependent genetic data, especially for weighted p-value combining methods. Single nucleotide polymorphisms (SNPs) are often correlated due to linkage disequilibrium. Other genetic data, including variants from next generation sequencing, gene expression levels measured by microarray, protein and DNA methylation data, etc., also contain complex correlation structures. Ignoring correlation structures among genetic variants may lead to severe inflation of Type I error rates for omnibus testing of p-values. In this work, we propose modifications to the Lancaster procedure by taking the correlation structure among p-values into account. The weight function in the Lancaster procedure allows meaningful biological information to be incorporated into the statistical analysis, which can increase the power of the statistical testing and/or remove the bias in the process. Extensive empirical assessments demonstrate that the modified Lancaster procedure largely reduces the Type I error rates due to correlation among p-values, and retains considerable power to detect signals among p-values. We applied our method to reassess published renal transplant data, and identified a novel association between B cell pathways and allograft tolerance.
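
    The modified Lancaster procedure itself is not spelled out in this record, but the general idea of adjusting a combined chi-square statistic for correlation among p-values can be illustrated with Brown's method, a simpler unweighted relative. The sketch below is that illustration only; the correlation matrix, the p-values, and the Kost-McDermott-style covariance polynomial are stand-ins, not the article's method.

```python
import numpy as np
from scipy import stats

def brown_combined_p(pvals, corr):
    """Fisher's statistic referred to a scaled chi-square null that accounts for
    correlation among the tests (Brown-style approximation)."""
    pvals = np.asarray(pvals, dtype=float)
    k = len(pvals)
    T = -2.0 * np.sum(np.log(pvals))                    # Fisher's combination statistic
    mean_T = 2.0 * k
    # Polynomial approximation to cov(-2 ln p_i, -2 ln p_j) as a function of rho.
    rho = corr[np.triu_indices(k, 1)]
    cov_sum = np.sum(3.263 * rho + 0.710 * rho**2 + 0.027 * rho**3)
    var_T = 4.0 * k + 2.0 * cov_sum
    scale = var_T / (2.0 * mean_T)                      # T / scale ~ chi2(df) under the null
    df = 2.0 * mean_T**2 / var_T
    return stats.chi2.sf(T / scale, df)

corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
print(brown_combined_p([0.01, 0.04, 0.20], corr))
```

    Ignoring the correlation (setting cov_sum to zero) recovers Fisher's method and gives a smaller, anti-conservative p-value, which is the Type I error inflation the article targets.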

  13. Probable Posttraumatic Stress Disorder and Women’s Use of Aggression in Intimate Relationships: The Moderating Role of Alcohol Dependence

    Science.gov (United States)

    Weiss, Nicole H.; Duke, Aaron A.; Sullivan, Tami P.

    2015-01-01

    Posttraumatic stress disorder (PTSD) is highly prevalent among individuals who experience intimate partner violence (IPV) and is associated with aggression in intimate relationships. The present study examined whether alcohol dependence (AD) attenuates the relation between PTSD and IPV-victimized women’s use of physical, psychological, and sexual aggression. Participants were recruited from the community and included 147 women who engage in substance use and experience IPV [80.3% Black; M age = 38.2 years (SD = 10.6); M income = $14,323 (SD = $12,832)]. Women with (vs. without) AD reported using significantly more physical and psychological aggression (ηp² = .12 and .03, respectively). The probable PTSD × AD interaction emerged as a significant correlate of physical and sexual aggression (ηp²s = .03). Post-hoc analyses revealed higher levels of physical aggression among women with probable PTSD and AD and no-PTSD and AD compared to women with probable PTSD and no-AD (Cohen’s ds = 1.09 and 0.63, respectively) and women with no-PTSD and no-AD (Cohen’s ds = 0.92 and 0.60, respectively). Further, women with PTSD and AD reported higher levels of sexual aggression than women with no-PTSD and AD (Cohen’s d = 0.80). Findings suggest the utility of identifying and treating PTSD-AD among IPV-victimized women. PMID:25322884

  14. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  15. Multi-configuration time-dependent density-functional theory based on range separation

    DEFF Research Database (Denmark)

    Fromager, E.; Knecht, S.; Jensen, Hans Jørgen Aagaard

    2013-01-01

    Multi-configuration range-separated density-functional theory is extended to the time-dependent regime. An exact variational formulation is derived. The approximation, which consists in combining a long-range Multi-Configuration Self-Consistent Field (MCSCF) treatment with an adiabatic short-range density-functional (DFT) description, is then considered. The resulting time-dependent multi-configuration short-range DFT (TD-MC-srDFT) model is applied to the calculation of singlet excitation energies in H2, Be, and ferrocene, considering both short-range local density (srLDA) and generalized gradient (srGGA) approximations.

  16. Probability Model of Center-of-mass Calibration of Satellites' Retro-reflectors Used for Laser Ranging

    OpenAIRE

    ZHAO Qunhe; Wang, Xiaoya; He, Bing; ZHANG Zhongping; Wanzhen CHEN; Chen, Hongyu; Jiang, Hu; HU Xiaogong

    2015-01-01

    A satellite laser ranging (SLR) system calculates the distance from ground-based observatories to satellites using the round-trip travel time of a laser pulse. The positions of the retro-reflectors on satellites need to be corrected, which helps to improve the measuring precision of satellite laser ranging. The correction errors of the center-of-mass (CoM) are mainly caused by the distribution effects of the retro-reflectors on the satellites. The CoM is related to the incident angle, the structural alignment of the retro-reflector...

  17. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  18. Probable posttraumatic stress disorder and women's use of aggression in intimate relationships: the moderating role of alcohol dependence.

    Science.gov (United States)

    Weiss, Nicole H; Duke, Aaron A; Sullivan, Tami P

    2014-10-01

    Posttraumatic stress disorder (PTSD) is highly prevalent among individuals who experience intimate partner violence (IPV) and is associated with aggression in intimate relationships. The present study examined whether alcohol dependence (AD) attenuates the relation between PTSD and IPV-victimized women's use of physical, psychological, and sexual aggression. Participants were recruited from the community and included 147 women who engaged in substance use and experienced IPV (80.3% Black; M age = 38.24 years, SD = 10.62; M income = $14,323, SD = $12,832). Women with (vs. without) AD reported using significantly more physical and psychological aggression (ηp² = .12 and .03, respectively). The probable PTSD × AD interaction emerged as a significant correlate of physical and sexual aggression (ηp² = .03). Post hoc analyses revealed higher levels of physical aggression among women with probable PTSD and AD and no-PTSD and AD compared to women with probable PTSD and no-AD (Cohen's ds = 1.09 and 0.63, respectively) and women without PTSD and no-AD (Cohen's ds = 0.92 and 0.60, respectively). Further, women with PTSD and AD reported higher levels of sexual aggression than women without PTSD and AD (Cohen's d = 0.80). Findings suggest the utility of identifying and treating PTSD-AD among IPV-victimized women. Copyright © 2014 International Society for Traumatic Stress Studies.

  19. System Estimation of Panel Data Models under Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    A general dynamic panel data model is considered that incorporates individual and interactive fixed effects allowing for contemporaneous correlation in model innovations. The model accommodates general stationary or nonstationary long-range dependence through interactive fixed effects and innovat...

  20. Improving Delay-Range-Dependent Stability Condition for Systems with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Wei Qian

    2013-01-01

    Full Text Available This paper discusses delay-range-dependent stability for systems with interval time-varying delay. By defining a new Lyapunov-Krasovskii functional (LKF), estimating its derivative with newly introduced vectors, and using free matrices and the reciprocally convex approach, new delay-range-dependent stability conditions are obtained. Two well-known examples are given to illustrate the reduced conservatism of the proposed theoretical results.

  1. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor

    2015-01-01

    SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Because it is a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I on the angle of incidence, wavelength and polarization of the incident light was not known. The targeted application area of this sensor is next-generation PET detector modules, where it will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of the PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm wide wavelength interval around the characteristic emission peak (λ=420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of the PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced and begins only from 50°.

  2. A novel nuclear dependence of nucleon–nucleon short-range correlations

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Hongkai [College of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070 (China); Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Wang, Rong, E-mail: rwang@impcas.ac.cn [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Lanzhou University, Lanzhou 730000 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Huang, Yin [Lanzhou University, Lanzhou 730000 (China); Chen, Xurong [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2017-06-10

    A linear correlation is found between the magnitude of nucleon–nucleon short-range correlations and the nuclear binding energy per nucleon with pairing energy removed. By using this relation, the strengths of nucleon–nucleon short-range correlations of some unmeasured nuclei are predicted. Discussions of nucleon–nucleon pairing energy and nucleon–nucleon short-range correlations are presented. The observed nuclear dependence of nucleon–nucleon short-range correlations may shed some light on the short-range structure of nuclei.

  3. Trait mindfulness, reasons for living and general symptom severity as predictors of suicide probability in males with substance abuse or dependence.

    Directory of Open Access Journals (Sweden)

    Parvaneh Mohammadkhani

    2015-03-01

    Full Text Available The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate the predictors of suicide probability based on trait mindfulness, reasons for living and severity of general psychiatric symptoms. Participants were 324 individuals with substance abuse or dependence in an outpatient setting and prison. The Reasons for Living Questionnaire, the Mindfulness Attention Awareness Scale and the Suicide Probability Scale were used as instruments. The sample was selected based on a convenience sampling method. Data were analyzed using SPSS and AMOS. The life-time prevalence of suicide attempts in the outpatient setting was 35% and it was 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms correlated strongly with suicide probability. Trait mindfulness, not reasons-for-living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs and child-related concerns significantly predicted suicide probability (p<0.001). It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability.

  4. Robustness of Estimators of Long-Range Dependence and Self-Similarity under non-Gaussianity

    CERN Document Server

    Franzke, Christian L E; Watkins, Nicholas W; Gramacy, Robert B; Hughes, Cecilia

    2011-01-01

    Long-range dependence and non-Gaussianity are ubiquitous in many natural systems like ecosystems, biological systems and climate. However, it is not always appreciated that both phenomena usually occur together in natural systems and that the superposition of both phenomena constitute the self-similarity of a system. These features, which are common in complex systems, impact the attribution of trends and the occurrence and clustering of extremes. The risk assessment of systems with these properties will lead to different outcomes (e.g. return periods) than the more common assumption of independence of extremes. Two paradigmatic models are discussed which can simultaneously account for long-range dependence and non-Gaussianity: Autoregressive Fractional Integrated Moving Average (ARFIMA) and Linear Fractional Stable Motion (LFSM). Statistical properties of estimators for long-range dependence and self-similarity are critically assessed. It is found that the most popular estimators are not robust. In particula...

  5. Fractality Evidence and Long-Range Dependence on Capital Markets: a Hurst Exponent Evaluation

    Science.gov (United States)

    Oprean, Camelia; Tănăsescu, Cristina

    2014-07-01

    Since the existence of market memory would imply rejection of the efficient market hypothesis, the aim of this paper is to find evidence of whether selected emergent capital markets (eight European and BRIC markets, namely Hungary, Romania, Estonia, the Czech Republic, Brazil, Russia, India and China) exhibit long-range dependence or follow a random walk. In this paper, the Hurst exponent, as calculated by R/S fractal analysis and Detrended Fluctuation Analysis, is our measure of long-range dependence in the series. The results reinforce our previous findings and suggest that if stock returns exhibit long-range dependence, the random walk hypothesis is no longer valid and neither is the market efficiency hypothesis.
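
    A bare-bones rescaled-range (R/S) estimator of the kind used in such studies is sketched below; it is fed i.i.d. noise purely as a placeholder for market returns, for which the fitted exponent should sit near 0.5.

```python
import numpy as np

def rs_hurst(returns, min_chunk=16):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
    n = len(returns)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 4:
        rs = []
        for i in range(0, n - size + 1, size):      # non-overlapping windows
            c = returns[i:i + size]
            dev = np.cumsum(c - c.mean())
            r = dev.max() - dev.min()               # range of cumulative deviations
            s = c.std(ddof=1)
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    return np.polyfit(np.log(sizes), np.log(rs_means), 1)[0]

rng = np.random.default_rng(4)
fake_returns = rng.standard_normal(4096)            # i.i.d. stand-in for index returns
print(f"R/S Hurst exponent ~= {rs_hurst(fake_returns):.2f}  (~0.5 under the random walk hypothesis)")
```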

  6. Delay-range-dependent chaos synchronization approach under varying time-lags and delayed nonlinear coupling.

    Science.gov (United States)

    Zaheer, Muhammad Hamad; Rehan, Muhammad; Mustafa, Ghulam; Ashraf, Muhammad

    2014-11-01

    This paper proposes a novel state feedback delay-range-dependent control approach for chaos synchronization in coupled nonlinear time-delay systems. The coupling between the two systems is assumed to be nonlinear and subject to time-lags. The time-varying nature of both the intrinsic and the coupling delays is incorporated to broaden the scope of the present study and to synthesize a better-quality synchronization controller. A Lyapunov-Krasovskii (LK) functional is employed to derive delay-range-dependent conditions that can be solved by means of conventional linear matrix inequality (LMI) tools. The resultant control approach for chaos synchronization of the master-slave time-delay systems considers a non-zero lower bound of the intrinsic as well as the coupling time-delays. Further, the delay-dependent synchronization condition has been established as a special case of the proposed LK functional treatment. Furthermore, a delay-range-dependent condition, independent of the delay-rate, has been provided to address the situation when the upper bound of the delay-derivative is unknown. A robust state feedback control methodology is formulated for synchronization of the time-delay chaotic networks against L2-norm-bounded perturbations by minimizing the L2 gain from the disturbance to the synchronization error. Numerical simulation results are provided for the time-delay chaotic networks to show the effectiveness of the proposed delay-range-dependent chaos synchronization methodologies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Constraints on spin-dependent short-range interactions using gravitational quantum levels of ultracold neutrons

    CERN Document Server

    Baeßler, S; Pignol, G; Protasov, K V; Voronin, A Yu

    2009-01-01

    In this paper, we discuss a possibility to improve constraints on spin-dependent short-range interactions in the range of 1 - 200 micrometer significantly. For such interactions, our constraints are without competition at the moment. They were obtained through the observation of gravitationally bound states of ultracold neutrons. We are going to improve these constraints by about three orders of magnitude in a dedicated experiment with polarized neutrons using the next-generation spectrometer GRANIT.

  8. Functional framework and hardware platform for dependability study in short range wireless embedded systems

    NARCIS (Netherlands)

    Senouci, B.; Annema, Anne J.; Bentum, Marinus Jan; Kerkhoff, Hans G.

    2011-01-01

    A new direction in short-range wireless applications has appeared in the form of high-speed data communication devices for distances of a few meters. Behind these embedded applications, a complex Hardware/Software architecture is built. Dependability is one of the major challenges in these systems.

  9. Analytic model utilizing the complex ABCD method for range dependency of a monostatic coherent lidar

    DEFF Research Database (Denmark)

    Olesen, Anders Sig; Pedersen, Anders Tegtmeier; Hanson, Steen Grüner

    2014-01-01

    In this work, we present an analytic model for analyzing the range and frequency dependency of a monostatic coherent lidar measuring velocities of a diffuse target. The model of the signal power spectrum includes both the contribution from the optical system as well as the contribution from the t...

  10. Autonomic imbalance induced breakdown of long-range dependence in healthy heart rate.

    Science.gov (United States)

    Aoyagi, N; Struzik, Z R; Kiyono, K; Yamamoto, Y

    2007-01-01

    We investigate the relation between the long-range correlation property of heart rate and autonomic balance. An investigation of the fractal scaling properties of heart rate variability was carried out by using detrended fluctuation analysis (DFA). Eleven healthy subjects were examined for two consecutive days, which included usual daily activity, strenuous prolonged experimental exercise, and sleep. We also considered two patient groups with autonomic dysfunction characterized by selective sympathetic and parasympathetic dominance. Robust long-range dependence in heart rate is observed only in the state of usual daily activity, characterized by normal heart rate typical of balanced autonomic sympathetic and parasympathetic regulation. This confirms the previously postulated behavioral independence of heart rate regulation, but reveals that the occurrence of 1/f long-range dependence is restricted to the state of autonomic balance. Both the sympathetic dominant high heart rate state, realized during strenuous experimental exercise, and the parasympathetic dominant low heart rate state, prevalent in (deep) sleep, are characterized by uncorrelated, near white-noise-like scaling, lacking long-range dependence. Remarkably, the breakdown of the long-range correlations observed in healthy heart rate in the states of sympathetic and parasympathetic dominance is in stark contrast to the increased correlations which have previously been observed in neurogenic parasympathetic and sympathetic dominance in patients suffering from primary autonomic failure and congestive heart failure, respectively. Our findings further reveal the diagnostic capabilities of heart rate dynamics, by differentiating physiological healthy states from pathology.

  11. Stable renal function in insulin-dependent diabetes mellitus 10 years after nephrotic range proteinuria.

    Science.gov (United States)

    Gault, M H; Fernandez, D

    1996-01-01

    It has been considered unlikely that patients with insulin-dependent diabetes and diabetic nephropathy with nephrotic range proteinuria can substantially reduce proteinuria and continue for many years without further loss of renal function. We present a patient who had the diagnosis of insulin-dependent diabetes made at age 15, had his first of 6 laser treatments for proliferative and hemorrhagic retinopathy at age 27 and was found to have nephrotic range proteinuria and edema with hypertension at age 29, when results of a renal biopsy were typical of diabetic nephropathy. Ten years later, with the last 5.5 years on ACE inhibitors, proteinuria has remained below the nephrotic range with stable renal function, a course that is very rare in reports, if not unique.

  12. Multi-configuration time-dependent density-functional theory based on range separation

    CERN Document Server

    Fromager, Emmanuel; Jensen, Hans Jørgen Aa

    2012-01-01

    Multi-configuration range-separated density-functional theory is extended to the time-dependent regime. An exact variational formulation is derived. The approximation, which consists in combining a long-range Multi-Configuration-Self-Consistent Field (MCSCF) treatment with an adiabatic short-range density-functional (DFT) description, is then considered. The resulting time-dependent multi-configuration short-range DFT (TD-MC-srDFT) model is applied to the calculation of singlet excitation energies in H2, Be and ferrocene, considering both short-range local density (srLDA) and generalized gradient (srGGA) approximations. In contrast to regular TD-DFT, TD-MC-srDFT can describe double excitations. As expected, when modeling long-range interactions with the MCSCF model instead of the adiabatic Buijse-Baerends density-matrix functional as recently proposed by Pernal [K. Pernal, J. Chem. Phys. 136, 184105 (2012)], the description of both the 1^1D doubly-excited state in Be and the 1^1\\Sigma^+_u state in the stretch...

  13. Scale dependence of multiplier distributions for particle concentration, enstrophy, and dissipation in the inertial range of homogeneous turbulence

    Science.gov (United States)

    Hartlep, Thomas; Cuzzi, Jeffrey N.; Weston, Brian

    2017-03-01

    Turbulent flows preferentially concentrate inertial particles depending on their stopping time or Stokes number, which can lead to significant spatial variations in the particle concentration. Cascade models are one way to describe this process in statistical terms. Here, we use a direct numerical simulation (DNS) dataset of homogeneous, isotropic turbulence to determine probability distribution functions (PDFs) for cascade multipliers, which determine the ratio by which a property is partitioned into subvolumes as an eddy is envisioned to decay into smaller eddies. We present a technique for correcting effects of small particle numbers in the statistics. We determine multiplier PDFs for particle number, flow dissipation, and enstrophy, all of which are shown to be scale dependent. However, the particle multiplier PDFs collapse when scaled with an appropriately defined local Stokes number. As anticipated from earlier works, dissipation and enstrophy multiplier PDFs reach an asymptote for sufficiently small spatial scales. From the DNS measurements, we derive a cascade model and use it to make predictions for the radial distribution function (RDF) for arbitrarily high Reynolds numbers, Re, finding good agreement with the asymptotic, infinite Re inertial range theory of Zaichik and Alipchenkov [New J. Phys. 11, 103018 (2009), 10.1088/1367-2630/11/10/103018]. We discuss implications of these results for the statistical modeling of the turbulent clustering process in the inertial range for high Reynolds numbers inaccessible to numerical simulations.
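
    The multiplier idea can be made concrete with a toy one-dimensional binary cascade in which each cell's content is split between its two children according to a symmetric Beta multiplier whose width is allowed to vary with cascade level. The level dependence below is an invented stand-in for the local-Stokes-number scaling reported here, not the measured multiplier PDFs.

```python
import numpy as np

rng = np.random.default_rng(5)

def cascade(n_levels, beta_param):
    """1-D binary multiplicative cascade. At each level, every cell's content is
    split between its two children with a symmetric-Beta multiplier; beta_param
    is a function of the level, mimicking scale-dependent multiplier PDFs."""
    field = np.array([1.0])                         # start from a uniform density
    for level in range(n_levels):
        b = beta_param(level)
        m = rng.beta(b, b, size=field.size)         # multiplier in (0, 1), mean 1/2
        field = np.column_stack([field * 2*m, field * 2*(1 - m)]).ravel()
    return field                                    # mean density stays 1 by construction

# Toy scale dependence: multipliers are broadest (smallest beta) at intermediate levels.
conc = cascade(12, lambda lvl: 2.0 + 4.0 * abs(lvl - 6) / 6.0)
print(conc.size, conc.mean(), conc.std())           # 4096 cells, mean ~1, intermittent spread
```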

  14. Short-Range Correlations and Their Implications for Isospin-Dependent Modification of Nuclear Quark Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Arrington, John

    2016-03-25

    The past decade has provided a much clearer picture of the structure of high-momentum components in nucleons, associated with hard, short-distance interactions between pairs of nucleons. Recent Jefferson Lab data on light nuclei suggest a connection between these so-called 'short-range correlations' and the modification of the quark structure of nucleons in the nuclear environment. In light of this discovery that the detailed nuclear structure is important in describing the nuclear quark distributions, we examine the potential impact of the isospin-dependent structure of nuclei to see at what level this might yield flavor-dependent effects in nuclear quark distributions.

  15. Common long-range dependence in a panel of hourly Nord Pool electricity prices and loads

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre; Haldrup, Niels; Rodríguez-Caballero, Carlos Vladimir

    Equilibrium electricity spot prices and loads are often determined simultaneously in a day-ahead auction market for each hour of the subsequent day. Hence daily observations of hourly prices take the form of a periodic panel rather than a time series of hourly observations. We consider novel panel...... data approaches to analyse the time series and the cross-sectional dependence of hourly Nord Pool electricity spot prices and loads for the period 2000-2013. Hourly electricity prices and loads data are characterized by strong serial long-range dependence in the time series dimension in addition...... of the underlying production technology and because the demand is more volatile than the supply, equilibrium prices and loads are argued to identify the periodic power supply curve. The estimated supply elasticities are estimated from fractionally co-integrated relations and range between 0.5 and 1...

  16. Troponin levels within the normal range and probability of inducible myocardial ischemia and coronary events in patients with acute chest pain.

    Science.gov (United States)

    Bouzas-Mosquera, Alberto; Peteiro, Jesús; Broullón, Francisco J; Constanso, Ignacio P; Rodríguez-Garrido, Jorge L; Martínez, Dolores; Yáñez, Juan C; Bescos, Hildegart; Álvarez-García, Nemesio; Vázquez-Rodríguez, José Manuel

    2016-03-01

    Patients with suspected acute coronary syndromes and negative cardiac troponin (cTn) levels are deemed at low risk. Our aim was to assess the effect of cTn levels on the frequency of inducible myocardial ischemia and subsequent coronary events in patients with acute chest pain and cTn levels within the normal range. We evaluated 4474 patients with suspected acute coronary syndromes, nondiagnostic electrocardiograms and serial cTnI levels below the diagnostic threshold for myocardial necrosis using a conventional or a sensitive cTnI assay. The end points were the probability of inducible myocardial ischemia and coronary events (i.e., coronary death, myocardial infarction or coronary revascularization within 3 months). The probability of inducible myocardial ischemia was significantly higher in patients with detectable peak cTnI levels (25%) than in those with undetectable concentrations (14.6%, p < ...) ... acute coronary syndromes and seemingly negative cTnI. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  17. Density-dependent home-range size revealed by spatially explicit capture–recapture

    Science.gov (United States)

    Efford, M.G.; Dawson, Deanna K.; Jhala, Y.V.; Qureshi, Q.

    2016-01-01

    The size of animal home ranges often varies inversely with population density among populations of a species. This fact has implications for population monitoring using spatially explicit capture–recapture (SECR) models, in which both the scale of home-range movements σ and population density D usually appear as parameters, and both may vary among populations. It will often be appropriate to model a structural relationship between population-specific values of these parameters, rather than to assume independence. We suggest re-parameterizing the SECR model using kp = σp √Dp, where kp relates to the degree of overlap between home ranges and the subscript p distinguishes populations. We observe that kp is often nearly constant for populations spanning a range of densities. This justifies fitting a model in which the separate kp are replaced by the single parameter k and σp is a density-dependent derived parameter. Continuous density-dependent spatial variation in σ may also be modelled, using a scaled non-Euclidean distance between detectors and the locations of animals. We illustrate these methods with data from automatic photography of tigers (Panthera tigris) across India, in which the variation is among populations, from mist-netting of ovenbirds (Seiurus aurocapilla) in Maryland, USA, in which the variation is within a single population over time, and from live-trapping of brushtail possums (Trichosurus vulpecula) in New Zealand, modelling spatial variation within one population. Possible applications and limitations of the methods are discussed. A model in which kp is constant, while density varies, provides a parsimonious null model for SECR. The parameter k of the null model is a concise summary of the empirical relationship between home-range size and density that is useful in comparative studies. We expect deviations from this model, particularly the dependence of kp on covariates, to be biologically interesting.
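
    A tiny numerical illustration of the proposed re-parameterisation is given below: holding k fixed turns the movement scale sigma of a half-normal SECR detection function into a derived, density-dependent quantity. The value of k, the densities, and the baseline detection probability are all hypothetical.

```python
import numpy as np

k = 60.0                                            # hypothetical k = sigma * sqrt(D)
densities = np.array([0.25, 0.5, 1.0, 2.0, 4.0])    # animals per hectare
sigmas = k / np.sqrt(densities)                     # derived, density-dependent movement scale (m)

def halfnormal_detection(d, sigma, g0=0.2):
    """Half-normal SECR detection function with spatial scale sigma."""
    return g0 * np.exp(-d**2 / (2.0 * sigma**2))

for D, s in zip(densities, sigmas):
    print(f"D = {D:4.2f}/ha -> sigma = {s:6.1f} m, detection prob. at 100 m = {halfnormal_detection(100.0, s):.3f}")
```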

  18. Space- and Time-Dependent Probabilities for Earthquake Fault Systems from Numerical Simulations: Feasibility Study and First Results

    Science.gov (United States)

    van Aalsburg, Jordan; Rundle, John B.; Grant, Lisa B.; Rundle, Paul B.; Yakovlev, Gleb; Turcotte, Donald L.; Donnellan, Andrea; Tiampo, Kristy F.; Fernandez, Jose

    2010-08-01

    In weather forecasting, current and past observational data are routinely assimilated into numerical simulations to produce ensemble forecasts of future events in a process termed "model steering". Here we describe a similar approach that is motivated by analyses of previous forecasts of the Working Group on California Earthquake Probabilities (WGCEP). Our approach is adapted to the problem of earthquake forecasting using topologically realistic numerical simulations for the strike-slip fault system in California. By systematically comparing simulation data to observed paleoseismic data, a series of spatial probability density functions (PDFs) can be computed that describe the probable locations of future large earthquakes. We develop this approach and show examples of PDFs associated with magnitude M > 6.5 and M > 7.0 earthquakes in California.

  19. Modeling the dependence between number of trials and success probability in beta-binomial-Poisson mixture distributions.

    Science.gov (United States)

    Zhu, Jun; Eickhoff, Jens C; Kaiser, Mark S

    2003-12-01

    Beta-binomial models are widely used for overdispersed binomial data, with the binomial success probability modeled as following a beta distribution. The number of binary trials in each binomial is assumed to be nonrandom and unrelated to the success probability. In many behavioral studies, however, binomial observations demonstrate more complex structures. In this article, a general beta-binomial-Poisson mixture model is developed, to allow for a relation between the number of trials and the success probability for overdispersed binomial data. An EM algorithm is implemented to compute both the maximum likelihood estimates of the model parameters and the corresponding standard errors. For illustration, the methodology is applied to study the feeding behavior of green-backed herons in two southeastern Missouri streams.
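
    A generative sketch of the kind of dependence this model allows is shown below: the number of trials is Poisson, the Beta mean of the success probability is linked to that count through a logistic function, and the response is binomial. The link form and every parameter value are invented for illustration; the article's EM estimation step is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_bouts(n_bouts, lam=8.0, phi=10.0, b0=0.5, b1=-0.15):
    """Simulate feeding bouts: Poisson trial counts, Beta success probabilities whose
    mean depends on the trial count, and binomial successes."""
    n_trials = rng.poisson(lam, size=n_bouts)
    mu = 1.0 / (1.0 + np.exp(-(b0 + b1 * (n_trials - lam))))   # logit link on the Beta mean
    p = rng.beta(mu * phi, (1.0 - mu) * phi)                   # overdispersed success probability
    successes = rng.binomial(n_trials, p)
    return n_trials, successes

n, y = simulate_bouts(5000)
mask = n > 0
print("empirical corr(N, Y/N):", np.corrcoef(n[mask], y[mask] / n[mask])[0, 1])
```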

  20. Generalized Efficient Inference on Factor Models with Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre

    A dynamic factor model is considered that contains stochastic time trends allowing for stationary and nonstationary long-range dependence. The model nests standard I(0) and I(1) behaviour smoothly in common factors and residuals, removing the necessity of a priori unit-root and stationarity testing. Short-memory dynamics are allowed in the common factor structure and possibly heteroskedastic error term. In the estimation, a generalized version of the principal components (PC) approach is proposed to achieve efficiency. Asymptotics for efficient common factor and factor loading as well as long...

  1. Strong dependence of shake probability on valence electron state for the inner-shell ionization of atoms

    Energy Technology Data Exchange (ETDEWEB)

    Kupliauskiene, A [Institute of Theoretical Physics and Astronomy of Vilnius University, A Gostauto 12, 2600 Vilnius (Lithuania); Glemza, K [Vilnius University, Sauletekio 9, Vilnius (Lithuania)

    2002-11-28

    The probability of the shaking process accompanying inner-shell ionization (expressed as the square of the overlap integrals of valence electron radial orbitals in the initial and final states) is calculated for a number of second- and third-row atoms and singly and doubly charged ions (in the excited states n0l0^N nl, with 3 ≤ n ≤ 9 and 0 ≤ l ≤ 3). Enormous differences are found for the low-excited ns and np states between shake probabilities that are calculated using numerical solutions of Hartree-Fock equations and hydrogenic radial orbitals (with an effective nuclear charge and with an effective principal quantum number obtained from experimental binding energies). The results can be a useful guide in the search for strong relaxation effects in the Auger decay and inner-shell ionization of excited atoms and ions by photons and electrons as well as in sudden-perturbation approximation calculations.

  2. Transfer probabilities for the reactions 14,20O+20O in terms of multiple time-dependent Hartree-Fock-Bogoliubov trajectories

    Science.gov (United States)

    Scamps, Guillaume; Hashimoto, Yukio

    2017-09-01

    The transfer reaction between two nuclei in the superfluid phase is studied with the time-dependent Hartree-Fock-Bogoliubov (TDHFB) theory. To restore the symmetry of the relative gauge angle, a set of independent TDHFB trajectories is taken into account. Then, the transfer probability is computed using a triple projection method. This method is first tested to determine the transfer probabilities on a toy model and compared to the exact solution. It is then applied to the reactions 20O+20O and 14O+20O in a realistic framework with a Gogny interaction.

  3. A long range dependent model with nonlinear innovations for simulating daily river flows

    Directory of Open Access Journals (Sweden)

    P. Elek

    2004-01-01

    Full Text Available We present an analysis aimed at estimating the flood risks of the Tisza River in Hungary on the basis of daily river discharge data registered over the last 100 years. The deseasonalised series has a skewed and leptokurtic distribution, and various methods suggest that it possesses substantial long memory. This motivates the attempt to fit a fractional ARIMA model with non-Gaussian innovations as a first step. Synthetic streamflow series can then be generated from the bootstrapped innovations. However, there remains a significant difference between the empirical and the synthetic density functions as well as the quantiles. This brings attention to the fact that the innovations are not independent: both their squares and absolute values are autocorrelated. Furthermore, the innovations display non-seasonal periods of high and low variances. This behaviour is characteristic of generalised autoregressive conditional heteroscedastic (GARCH) models. However, when innovations are simulated as GARCH processes, the quantiles and extremes of the discharge series are heavily overestimated. Therefore we suggest fitting a smooth-transition GARCH process to the innovations. In a standard GARCH model the dependence of the variance on the lagged innovation is quadratic, whereas in our proposed model it is a bounded function. While preserving long memory and eliminating the correlation from both the generating noise and from its square, the new model is superior to the previously mentioned ones in approximating the probability density, the high quantiles and the extremal behaviour of the empirical river flows.
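
    The two model ingredients named here, fractional integration and a GARCH-type variance recursion in which the impact of the lagged innovation is bounded, can be sketched as follows. This is a stylized stand-in (tanh-bounded news impact, arbitrary parameters, truncated MA(inf) filter), not the fitted smooth-transition GARCH of the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def bounded_garch_innovations(n, omega=0.05, alpha=0.3, beta=0.6, cap=4.0):
    """GARCH(1,1)-like innovations in which the impact of the lagged shock is a
    bounded (smooth-transition-like) function instead of a pure quadratic."""
    eps = np.empty(n)
    sigma2, last = 1.0, 0.0
    for t in range(n):
        sigma2 = omega + alpha * cap * np.tanh(last**2 / cap) + beta * sigma2
        last = np.sqrt(sigma2) * rng.standard_normal()
        eps[t] = last
    return eps

def fractional_integration(eps, d, n_lags=1000):
    """Apply (1-L)^{-d} to the innovations via its truncated MA(inf) expansion."""
    psi = np.empty(n_lags)
    psi[0] = 1.0
    for j in range(1, n_lags):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return np.convolve(eps, psi)[:len(eps)]

eps = bounded_garch_innovations(20_000)
flows = fractional_integration(eps, d=0.35)          # long memory for 0 < d < 0.5
print(flows.std(), np.corrcoef(flows[:-500], flows[500:])[0, 1])
```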

  4. Long Range Dependence Prognostics for Bearing Vibration Intensity Chaotic Time Series

    Directory of Open Access Journals (Sweden)

    Qing Li

    2016-01-01

    Full Text Available Motivated by the chaotic features and typical fractional-order characteristics of bearing vibration intensity time series, a forecasting approach based on long-range dependence (LRD) is proposed. In order to reveal the internal chaotic properties, the vibration intensity time series are reconstructed in phase space based on chaos theory: the delay time is computed with the C-C method, and the optimal embedding dimension and saturated correlation dimension are calculated via the Grassberger–Procaccia (G-P) method, so that the chaotic characteristics of the vibration intensity time series can be jointly determined by the largest Lyapunov exponent and the phase-plane trajectory. The largest Lyapunov exponent is calculated by the Wolf method and the phase-plane trajectory is illustrated using the Duffing-Holmes Oscillator (DHO). The Hurst exponent and an LRD prediction method are used to verify the typical fractional-order features and to improve the prediction accuracy of the bearing vibration intensity time series, respectively. Experiments show that the vibration intensity time series have chaotic properties and that the LRD prediction method outperforms the other prediction methods (largest Lyapunov exponent, autoregressive moving average (ARMA) and BP neural network (BPNN) models) in prediction accuracy and prediction performance, which provides a new approach to running-tendency prediction for rotating machinery and offers some guidance for engineering practice.

  5. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich

    1995-01-01

    Full Text Available A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps that are determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. Rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.

  6. Interaction Models for Common Long-Range Dependence in Asset Prices Volatility

    Science.gov (United States)

    Teyssière, G.

    We consider a class of microeconomic models with interacting agents which replicate the main properties of asset price time series: non-linearities in levels and a common degree of long memory in the volatilities and co-volatilities of multivariate time series. For these models, long-range dependence in asset price volatility is the consequence of swings in opinions and herding behavior of market participants, which generate switches in the heteroskedastic structure of asset prices. Thus, the observed long memory in asset price volatility might be the outcome of a change-point in the conditional variance process, a conclusion supported by a wavelet analysis of the volatility series. This explains why volatility processes share only the properties of the second moments of long-memory processes, but not the properties of the first moments.

  7. Genetic association between APOE*4 and neuropsychiatric symptoms in patients with probable Alzheimer's disease is dependent on the psychosis phenotype

    Directory of Open Access Journals (Sweden)

    Christie Drew

    2012-12-01

    Full Text Available Abstract Background Neuropsychiatric symptoms such as psychosis are prevalent in patients with probable Alzheimer's disease (AD) and are associated with increased morbidity and mortality. Because these disabling symptoms are generally not well tolerated by caregivers, patients with these symptoms tend to be institutionalized earlier than patients without them. The identification of protective and risk factors for neuropsychiatric symptoms in AD would facilitate the development of more specific treatments for these symptoms and thereby decrease morbidity and mortality in AD. The E4 allele of the apolipoprotein E (APOE) gene is a well-documented risk factor for the development of AD. However, genetic association studies of the APOE*4 allele and neuropsychiatric symptoms in AD have produced conflicting findings. Methods This study investigates the association between APOE and neuropsychiatric symptoms in a large sample of clinically well-characterized subjects with probable AD (n=790) who were systematically evaluated using the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) Behavioral Rating Scale for Dementia (BRSD). Results Our study found that hallucinations were significantly more likely to occur in subjects with no APOE*4 alleles than in subjects with two E4 alleles (15% of subjects and 5% of subjects, respectively; p=.0066), whereas there was no association between the occurrence of delusions, aberrant motor behavior, or agitation and the number of E4 alleles. However, 94% of the subjects with hallucinations also had delusions (D+H). Conclusion These findings suggest that in AD the E4 allele is differentially associated with D+H but not delusions alone. This is consistent with the hypothesis that distinct psychotic subphenotypes may be associated with the APOE*4 allele.

  8. The temperature dependence of intermediate range oxygen-oxygen correlations in liquid water.

    Science.gov (United States)

    Schlesinger, Daniel; Wikfeldt, K Thor; Skinner, Lawrie B; Benmore, Chris J; Nilsson, Anders; Pettersson, Lars G M

    2016-08-28

    We analyze the recent temperature dependent oxygen-oxygen pair-distribution functions from experimental high-precision x-ray diffraction data of bulk water by Skinner et al. [J. Chem. Phys. 141, 214507 (2014)] with particular focus on the intermediate range where small, but significant, correlations are found out to 17 Å. The second peak in the pair-distribution function at 4.5 Å is connected to tetrahedral coordination and was shown by Skinner et al. to change behavior with temperature below the temperature of minimum isothermal compressibility. Here we show that this is associated also with a peak growing at 11 Å which strongly indicates a collective character of fluctuations leading to the enhanced compressibility at lower temperatures. We note that the peak at ∼13.2 Å exhibits a temperature dependence similar to that of the density with a maximum close to 277 K or 4 °C. We analyze simulations of the TIP4P/2005 water model in the same manner and find excellent agreement between simulations and experiment albeit with a temperature shift of ∼20 K.

  9. Temperature dependence of Henry's law constant in an extended temperature range.

    Science.gov (United States)

    Görgényi, Miklós; Dewulf, Jo; Van Langenhove, Herman

    2002-08-01

    The Henry's law constants H for chloroform, 1,1-dichloroethane, 1,2-dichloropropane, trichloroethene, chlorobenzene, benzene and toluene were determined by the EPICS-SPME technique (equilibrium partitioning in closed systems, solid-phase microextraction) in the temperature range 275-343 K. The curvature observed in the ln H vs. 1/T plot is due to the temperature dependence of the enthalpy change ΔH° accompanying the transfer of 1 mol of solute from the aqueous solution to the gas phase. The nonlinearity of the plot is explained by a thermodynamic model that accounts for the temperature dependence of ΔH° and the thermal expansion of water through the three-parameter equation ln(H·ρ_T·T) = A2/T + B·T_B + C2, where ρ_T is the density of water at temperature T, T_B = ln(T/298) + (298 − T)/T, A2 = −ΔH°298/R with ΔH°298 the value of ΔH° at 298 K, B = ΔC°p/R with ΔC°p the molar heat capacity change of volatilization from the aqueous solution, and C2 is a constant. A statistical comparison of the two models demonstrates the superiority of the three-parameter equation over the two-parameter one (ln H vs. 1/T). The new three-parameter equation allows a more accurate description of the temperature dependence of H, and of the solubility of volatile organic compounds in water at higher temperatures.
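
    As a worked illustration of fitting such a three-parameter temperature dependence, the sketch below performs an ordinary least-squares fit of synthetic ln H data on the regressors 1/T and T_B; the coefficient values are invented for the demonstration and are not the ones reported in this record.

```python
import numpy as np

# Three-parameter form from the record: ln(H*rho_T*T) = A2/T + B*T_B + C2,
# with T_B = ln(T/298) + (298 - T)/T.  Here we fit synthetic y data on the
# same two regressors, which is the analogous linear least-squares problem.
def t_b(T):
    return np.log(T / 298.0) + (298.0 - T) / T

T = np.linspace(275.0, 343.0, 15)                      # temperature grid, K
A2_true, B_true, C2_true = -4500.0, 8.0, 12.0          # hypothetical coefficients
rng = np.random.default_rng(3)
y = A2_true / T + B_true * t_b(T) + C2_true + 0.01 * rng.standard_normal(T.size)

# Design matrix [1/T, T_B, 1] and ordinary least-squares fit
X = np.column_stack([1.0 / T, t_b(T), np.ones_like(T)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted A2, B, C2:", np.round(coef, 3))
```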

  10. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    Energy Technology Data Exchange (ETDEWEB)

    Lehua Pan; G.S. Bodvarsson

    2001-10-22

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions.

  11. IMPLEMENTATION OF ZOOM-DEPENDENT CAMERA CALIBRATION IN CLOSE-RANGE PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    C. S. Fraser

    2012-07-01

    Full Text Available The application of consumer-grade cameras for photogrammetric measurement has traditionally been subject to the requirement that imagery is recorded at fixed zoom and focus settings. The camera is then metrically calibrated, usually via self-calibration, for the lens setting employed. This requirement arises because camera parameters, especially the principal distance and lens distortion coefficients, vary significantly with the zoom/focus setting. A recently developed process, termed zoom-dependent (Z-D) calibration, removes the need for the zoom setting to be fixed during image capture. Implementing Z-D calibration requires that the camera be pre-calibrated at four or more focal settings within the zoom range, nominally at the shortest and longest focal lengths and at two mid-zoom settings. This requirement, coupled with the data-management burden of carrying different focal settings for potentially every image within a bundle adjustment, largely explains why Z-D calibration has not previously been implemented within COTS software for close-range photogrammetry. The objective of this paper is to describe the practical implementation of Z-D calibration within software, along with its associated workflow, and to discuss issues that impact upon the accuracy, reliability and appropriateness of the technique. Experimental testing is used to highlight the merits and shortcomings of Z-D calibration.
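
    A common way to realize zoom-dependent calibration is to fit smooth curves to the pre-calibrated parameters as functions of the focal setting and evaluate them for any zoom value encountered later. The quadratic fits and numerical values below are illustrative assumptions, not the specific model or data used in this work.

```python
import numpy as np

# Pre-calibrated values at four zoom settings (hypothetical numbers):
# focal length f (mm), principal distance c (mm), radial distortion k1 (mm^-2)
f_cal  = np.array([ 5.0, 12.0, 24.0, 35.0])
c_cal  = np.array([ 5.1, 12.3, 24.6, 35.9])
k1_cal = np.array([-4.2e-3, -1.1e-3, -2.5e-4, -9.0e-5])

# Fit low-order polynomials of parameter versus zoom, then evaluate at any zoom setting
c_poly  = np.polyfit(f_cal, c_cal, 2)
k1_poly = np.polyfit(f_cal, k1_cal, 2)

def zd_parameters(f):
    """Return interpolated (principal distance, k1) for an arbitrary zoom setting f."""
    return np.polyval(c_poly, f), np.polyval(k1_poly, f)

c_est, k1_est = zd_parameters(18.0)
print(f"zoom 18 mm -> principal distance {c_est:.2f} mm, k1 {k1_est:.2e}")
```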

  12. Context-dependent JPEG backward-compatible high-dynamic range image compression

    Science.gov (United States)

    Korshunov, Pavel; Ebrahimi, Touradj

    2013-10-01

    High-dynamic range (HDR) imaging is expected, together with ultrahigh definition and high-frame rate video, to become a technology that may change the photo, TV, and film industries. Many cameras and displays capable of capturing and rendering both HDR images and video are already available in the market. The popularity and full public adoption of HDR content is, however, hindered by the lack of standards for quality evaluation, file formats, and compression, as well as by the large legacy base of low-dynamic range (LDR) displays that are unable to render HDR. To facilitate the widespread adoption of HDR, backward compatibility of HDR with commonly used legacy technologies for storage, rendering, and compression of video and images is necessary. Although many tone-mapping algorithms have been developed for generating viewable LDR content from HDR, there is no consensus on which algorithm to use and under which conditions. We, via a series of subjective evaluations, demonstrate the dependency of the perceptual quality of tone-mapped LDR images on the context: environmental factors, display parameters, and the image content itself. Based on the results of these subjective tests, we propose to extend the JPEG file format, the most popular image format, in a backward-compatible manner so that it can also handle HDR images. An architecture to achieve such backward compatibility with JPEG is proposed. A simple implementation of lossy compression demonstrates the efficiency of the proposed architecture compared with state-of-the-art HDR image compression.
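
    One generic route to JPEG backward compatibility is to store a tone-mapped LDR base layer that any legacy decoder can display and to carry the HDR reconstruction data alongside it. The sketch below, which assumes a simple global Reinhard tone mapping and a log-ratio residual, is only an illustration of that idea, not the architecture proposed in this record.

```python
import numpy as np

def tonemap_reinhard(hdr_lum):
    """Simple global Reinhard tone mapping: L/(1+L), scaled to 8 bit."""
    ldr = hdr_lum / (1.0 + hdr_lum)
    return np.clip(np.round(ldr * 255.0), 0, 255).astype(np.uint8)

def encode_backward_compatible(hdr_lum):
    """Split an HDR luminance map into an LDR base layer plus a log-ratio residual."""
    base = tonemap_reinhard(hdr_lum)                          # what a legacy JPEG decoder would show
    base_lin = np.maximum(base.astype(float) / 255.0, 1e-4)
    residual = np.log2(np.maximum(hdr_lum, 1e-4) / base_lin)  # side data for HDR-aware decoders
    return base, residual

def decode_hdr(base, residual):
    base_lin = np.maximum(base.astype(float) / 255.0, 1e-4)
    return base_lin * np.exp2(residual)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    hdr = np.exp(rng.normal(0.0, 2.0, size=(4, 4)))           # toy HDR luminance spanning several decades
    base, res = encode_backward_compatible(hdr)
    err = np.max(np.abs(decode_hdr(base, res) - hdr) / hdr)
    print("max relative reconstruction error:", round(float(err), 6))
```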

  13. Coherent reverberation model based on adiabatic normal mode theory in a range dependent shallow water environment

    Science.gov (United States)

    Li, Zhenglin; Zhang, Renhe; Li, Fenghua

    2010-09-01

    Ocean reverberation in shallow water is often the predominant background interference in active sonar applications. It is still an open problem in underwater acoustics. In recent years, an oscillation phenomenon of the reverberation intensity, due to the interference of the normal modes, has been observed in many experiments. A coherent reverberation theory has been developed and used to explain this oscillation phenomenon [F. Li et al., Journal of Sound and Vibration, 252(3), 457-468, 2002]. However, the published coherent reverberation theory is for the range independent environment. Following the derivations by F. Li and Ellis [D. D. Ellis, J. Acoust. Soc. Am., 97(5), 2804-2814, 1995], a general reverberation model based on the adiabatic normal mode theory in a range dependent shallow water environment is presented. From this theory the coherent or incoherent reverberation field caused by sediment inhomogeneity and surface roughness can be predicted. Observations of reverberation from the 2001 Asian Sea International Acoustic Experiment (ASIAEX) in the East China Sea are used to test the model. Model/data comparison shows that the coherent reverberation model can predict the experimental oscillation phenomenon of reverberation intensity and the vertical correlation of reverberation very well.

  14. NEUTRON-PROTON EFFECTIVE RANGE PARAMETERS AND ZERO-ENERGY SHAPE DEPENDENCE.

    Energy Technology Data Exchange (ETDEWEB)

    HACKENBURG, R.W.

    2005-06-01

    A completely model-independent effective-range-theory fit to the available unpolarized np scattering data below 3 MeV determines the zero-energy free-proton cross section σ₀ = 20.4287 ± 0.0078 b, the singlet apparent effective range r_s = 2.754 ± 0.018(stat) ± 0.056(syst) fm, and slightly improves the error on the parahydrogen coherent scattering length, a_c = −3.7406 ± 0.0010 fm. The triplet and singlet scattering lengths and the triplet mixed effective range are calculated to be a_t = 5.4114 ± 0.0015 fm, a_s = −23.7153 ± 0.0043 fm, and ρ_t(0, −ε_t) = 1.7468 ± 0.0019 fm. The model-independent analysis also determines the zero-energy effective ranges by treating them as separate fit parameters without the constraint from the deuteron binding energy ε_t. These are determined to be ρ_t(0,0) = 1.705 ± 0.023 fm and ρ_s(0,0) = 2.665 ± 0.056 fm. This determination of ρ_t(0,0) and ρ_s(0,0) is most sensitive to the sparse data between about 20 and 600 keV, where the correlation between the determined values of ρ_t(0,0) and ρ_s(0,0) is at a minimum. This correlation is responsible for the large systematic error in r_s. More precise data in this range are needed. The present data do not even determine (with confidence) that ρ_t(0,0) ≠ ρ_t(0, −ε_t), referred to here as "zero-energy shape dependence". The widely used measurement of σ₀ = 20.491 ± 0.014 b from W. Dilg, Phys. Rev. C 11, 103 (1975), is argued to be in error.

  15. Mood-dependent retrieval in visual long-term memory: dissociable effects on retrieval probability and mnemonic precision.

    Science.gov (United States)

    Xie, Weizhen; Zhang, Weiwei

    2017-06-18

    Although memories are more retrievable if observers' emotional states are consistent between encoding and retrieval, it is unclear whether the consistency of emotional states increases the likelihood of successful memory retrieval, the precision of retrieved memories, or both. The present study tested visual long-term memory for everyday objects while consistent or inconsistent emotional contexts between encoding and retrieval were induced using background grey-scale images from the International Affective Picture System (IAPS). In the study phase, participants remembered colours of sequentially presented objects in a negative (Experiment 1a) or positive (Experiment 2a) context. In the test phase, participants estimated the colours of previously studied objects in either negative versus neutral (Experiment 1a) or positive versus neutral (Experiment 2a) contexts. Note, IAPS images in the test phase were always visually different from those initially paired with the studied objects. We found that reinstating negative context and positive context at retrieval resulted in better mnemonic precision and a higher probability of successful retrieval, respectively. Critically, these effects could not be attributed to a negative or positive context at retrieval alone (Experiments 1b and 2b). Together, these findings demonstrated dissociable effects of emotion on the quantitative and qualitative aspects of visual long-term memory retrieval.

  16. Quantifying the range of cross-correlated fluctuations using a q-L dependent AHXA coefficient

    Science.gov (United States)

    Wang, Fang; Wang, Lin; Chen, Yuming

    2018-03-01

    Recently, based on analogous height cross-correlation analysis (AHXA), a cross-correlation coefficient ρ×(L) has been proposed to quantify the levels of cross-correlation on different temporal scales for bivariate series. A limitation of this coefficient is that it cannot capture the full information of cross-correlations across fluctuation amplitudes. In fact, it only detects the cross-correlation at a specific fluctuation order, which might neglect important information carried by other fluctuation orders. To overcome this disadvantage, in this work, based on the scaling of the qth-order covariance with the time delay L, we define a two-parameter-dependent cross-correlation coefficient ρq(L) to detect and quantify the range and level of cross-correlations. The new ρq(L) coefficient gives rise to a ρq(L) surface, which not only quantifies the level of cross-correlations but also allows us to identify the range of fluctuation amplitudes that are correlated in two given signals. Applications to the classical ARFIMA models and the binomial multifractal series illustrate the feasibility of the new coefficient ρq(L). In addition, a statistical test is proposed to quantify the existence of cross-correlations between two given series. Applying our method to real-life empirical data from the 1999-2000 California electricity market, we find that the California power crisis in 2000 destroys the cross-correlation between the price and the load series but does not affect the correlation of the load series during and before the crisis.

  17. Time-dependent fracture probability of bilayer, lithium-disilicate-based, glass-ceramic, molar crowns as a function of core/veneer thickness ratio and load orientation.

    Science.gov (United States)

    Anusavice, Kenneth J; Jadaan, Osama M; Esquivel-Upshaw, Josephine F

    2013-11-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. The aims of this study were to test the hypotheses that: (1) an increase in the core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2) and ceramic veneer (Empress 2 Veneer Ceramic). Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). The CARES/Life results support the proposed crown design and load orientation hypotheses. The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an effective approach for optimizing fixed dental prosthesis designs produced from dental ceramics and for predicting time-dependent fracture probabilities of ceramic-based fixed dental prostheses, which can minimize the risk of clinical failures. Copyright © 2013 Academy of Dental Materials. All rights reserved.

  18. Ethanolic extract of Aconiti Brachypodi Radix attenuates nociceptive pain probably via inhibition of voltage-dependent Na⁺ channel.

    Science.gov (United States)

    Ren, Wei; Yuan, Lin; Li, Jun; Huang, Xian-Ju; Chen, Su; Zou, Da-Jiang; Liu, Xiangming; Yang, Xin-Zhou

    2012-01-01

    Aconiti Brachypodi Radix, belonging to the genus Aconitum (family Ranunculaceae), is used clinically as an anti-rheumatic, anti-inflammatory and anti-nociceptive agent in traditional Chinese medicine. However, its mechanism and influence on the nociceptive threshold are unknown and need further investigation. The analgesic effects of the ethanolic extract of Aconiti Brachypodi Radix (EABR) were therefore studied in vivo and in vitro. Three pain models in mice were used to assess the effect of EABR on the nociceptive threshold. An in vitro study was conducted to clarify the modulation by the extract of tetrodotoxin-sensitive (TTX-S) sodium currents in rat dorsal root ganglion (DRG) neurons using the whole-cell patch-clamp technique. The results showed that EABR (5-20 mg/kg, i.g.) produced a dose-dependent analgesic effect in the hot-plate test as well as on the writhing response induced by acetic acid. In addition, administration of 2.5-10 mg/kg EABR (i.g.) caused a significant decrease in pain responses in the first and second phases of the formalin test without altering PGE₂ production in the hind paw of the mice. Moreover, EABR (10 µg/ml to 1 mg/ml) suppressed TTX-S voltage-gated sodium currents in a dose-dependent manner, indicating an underlying electrophysiological mechanism of the analgesic effect of this folk plant medicine. Collectively, our results indicate that EABR has analgesic properties in three pain models and a modulatory influence on TTX-S sodium currents in DRG neurons, suggesting that interference with pain messages caused by the modulation of TTX-S sodium currents in DRG neurons by EABR may explain some of its analgesic effect.

  19. Strongly angle-dependent magnetoresistance in Weyl semimetals with long-range disorder

    Science.gov (United States)

    Behrends, Jan; Bardarson, Jens H.

    2017-08-01

    The chiral anomaly in Weyl semimetals states that the left- and right-handed Weyl fermions, constituting the low energy description, are not individually conserved, resulting, for example, in a negative magnetoresistance in such materials. Recent experiments see strong indications of such an anomalous resistance response; however, with a response that at strong fields is more sharply peaked for parallel magnetic and electric fields than expected from simple theoretical considerations. Here, we uncover a mechanism, arising from the interplay between the angle-dependent Landau-level structure and long-range scalar disorder, that has the same phenomenology. In particular, we analytically show, and numerically confirm, that the internode scattering time decreases exponentially with the angle between the magnetic field and the Weyl node separation in the large field limit, while it is insensitive to this angle at weak magnetic fields. Since, in the simplest approximation, the internode scattering time is proportional to the anomaly-related conductivity, this feature may be related to the experimental observations of a sharply peaked magnetoresistance.

  20. Long-range dependence in returns and volatility of global gold market amid financial crises

    Science.gov (United States)

    Omane-Adjepong, Maurice; Boako, Gideon

    2017-04-01

    Using sampled historical daily gold market data from 07-03-1985 to 06-01-2015, and building on related work by Bentes (2016), this paper examines the presence of long-range dependence (LRD) in the world's gold market returns and volatility, accounting for structural breaks. The sampled gold market data were divided into subsamples based on four global crises: the September 1992 collapse of the European Exchange Rate Mechanism (ERM), the Asian financial crisis of mid-1997, the Subprime meltdown of 2007, and the recent European sovereign debt crisis, which hit the world's markets with varying effects. LRD tests were carried out on the full-sample and subsample periods using three semiparametric methods, both before and after adjusting for structural breaks. The results show insignificant evidence of LRD in gold returns. However, very weak evidence is found for periods characterized by financial/economic shocks, with no significant detections for post-shock periods. Collectively, this indicates that the gold market is less speculative and hence may be less risky for hedging and portfolio diversification.
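
    A typical semiparametric estimator of the kind referred to here is the Geweke-Porter-Hudak (GPH) log-periodogram regression for the long-memory parameter d. The minimal sketch below runs it on synthetic white-noise returns rather than the gold data of this record; the bandwidth choice is an assumption.

```python
import numpy as np

def gph_estimate(x, bandwidth_power=0.5):
    """Geweke-Porter-Hudak estimate of the long-memory parameter d of a 1-D series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    m = int(n ** bandwidth_power)                      # number of low frequencies used
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope                                       # d close to 0 means no long-range dependence

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    returns = rng.standard_normal(4096)                # white noise: expect d near 0
    print("GPH d estimate:", round(gph_estimate(returns), 3))
```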

  1. Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems

    Energy Technology Data Exchange (ETDEWEB)

    Thakur, Gautam S [ORNL]; Helmy, Ahmed [University of Florida, Gainesville]; Hui, Pan [Hong Kong University of Science & Technology]

    2015-01-01

    Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmarking process and are also critical for realizing realistic traffic modeling scenarios. In this paper, we model and characterize the traffic density distributions of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. Goodness-of-fit tests show that the traffic density distributions follow heavy-tailed models such as the log-gamma, log-logistic, and Weibull distributions in over 90% of the analyzed locations. Moreover, heavy tails give rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis, based on seven different Hurst estimators, strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 < H < 1.0). We believe this is an important finding that will influence the design and development of next-generation traffic simulation techniques and also aid in accurately modeling the traffic engineering of urban systems. In addition, it will provide much-needed input for the development of smart cities.

  2. A time-dependent wave packet approach to atom-diatom reactive collision probabilities - Theory and application to the H + H2(J = 0) system

    Science.gov (United States)

    Neuhauser, Daniel; Baer, Michael; Judson, Richard S.; Kouri, Donald J.

    1990-01-01

    This paper describes a new approach to the study of atom-diatom reactive collisions in three dimensions employing wave packets and the time-dependent Schroedinger equation. The method uses a projection operator approach to couple the inelastic and reactive portions of the total wave function and optical potentials to circumvent the necessity of using product arrangement coordinates. Reactive transition probabilities are calculated from the state resolved flux of the wave packet as it leaves the interaction region in the direction of the reactive arrangement channel. The present approach is used to obtain such vibrationally resolved probabilities for the three-dimensional H + H2 (J = 0) hydrogen exchange reaction, using a body-fixed system of coordinates.
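
    The flux/projection idea is easiest to see in one dimension. The sketch below propagates a Gaussian wave packet across a barrier with a split-operator FFT scheme and reads off an approximate transmission probability from the population beyond the barrier; it is a generic illustration with made-up parameters, not the three-dimensional H + H2 machinery of this record.

```python
import numpy as np

# 1-D grid, Gaussian barrier and initial wave packet (all values illustrative; hbar = m = 1)
n, L = 4096, 400.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
V = 0.10 * np.exp(-x ** 2 / 2.0)                    # barrier of height 0.10 at the origin

x0, k0, sigma = -100.0, 0.5, 10.0                   # packet energy ~ k0**2/2 = 0.125 > barrier height
psi = np.exp(-(x - x0) ** 2 / (4 * sigma ** 2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

# Split-operator propagation: exp(-i V dt/2) * exp(-i k^2 dt/2) * exp(-i V dt/2)
dt, steps = 1.0, 400
half_v = np.exp(-0.5j * V * dt)
kinetic = np.exp(-0.5j * k ** 2 * dt)
for _ in range(steps):
    psi = half_v * psi
    psi = np.fft.ifft(kinetic * np.fft.fft(psi))
    psi = half_v * psi

# Rough transmission probability: population well past the barrier once the packet has cleared it
transmission = np.sum(np.abs(psi[x > 10.0]) ** 2) * dx
print("approximate transmission probability:", round(float(transmission), 4))
```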

  3. Range-separated time-dependent density-functional theory with a frequency-dependent second-order Bethe-Salpeter correlation kernel

    CERN Document Server

    Rebolini, Elisa

    2015-01-01

    We present a range-separated linear-response time-dependent density-functional theory (TDDFT) which combines a density-functional approximation for the short-range response kernel and a frequency-dependent second-order Bethe-Salpeter approximation for the long-range response kernel. This approach goes beyond the adiabatic approximation usually used in linear-response TDDFT and aims at improving the accuracy of calculations of electronic excitation energies of molecular systems. A detailed derivation of the frequency-dependent second-order Bethe-Salpeter correlation kernel is given using many-body Green-function theory. Preliminary tests of this range-separated TDDFT method are presented for the calculation of excitation energies of four small molecules: N2, CO2, H2CO, and C2H4. The results suggest that the addition of the long-range second-order Bethe-Salpeter correlation kernel overall slightly improves the excitation energies.

  4. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time-dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.
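
    For intuition, the boundary-crossing probability of a Brownian particle can also be estimated directly by Monte Carlo rather than through the integral equation. The diffusion coefficient, time step and moving-boundary law below are arbitrary assumptions chosen for illustration, not values from this record.

```python
import numpy as np

def crossing_probability(n_particles=100_000, t_max=1.0, dt=1e-3,
                         diffusivity=1.0, boundary=lambda t: 1.0 + 0.5 * t, seed=0):
    """Monte Carlo estimate of P(particle reaches a time-dependent boundary b(t) before t_max)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)                    # all particles start at the origin
    alive = np.ones(n_particles, dtype=bool)     # not yet absorbed by the boundary
    sigma = np.sqrt(2.0 * diffusivity * dt)
    t = 0.0
    while t < t_max:
        t += dt
        x[alive] += sigma * rng.standard_normal(alive.sum())
        alive &= ~(x >= boundary(t))             # absorb particles that crossed b(t)
    return 1.0 - alive.mean()

if __name__ == "__main__":
    print("crossing probability by t = 1:", round(crossing_probability(), 4))
```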

  5. Correction of stress-dependent changes in glycoprotein platelet receptor activity by electromagnetic radiation of the terahertz range

    Directory of Open Access Journals (Sweden)

    V.F. Kirichuk

    2010-09-01

    Full Text Available The research goal is the correction of stress-dependent changes in glycoprotein (Gp) platelet receptor activity by electromagnetic radiation of the terahertz range. The influence of terahertz-range electromagnetic waves, at the frequencies of the molecular emission and absorption spectrum of nitric oxide, on lectin-induced platelet aggregation in white rats under stress was investigated.

  6. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  7. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  8. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are the subject of intense research nowadays. To understand their relevance one just needs to think ... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.

  9. Correction of a phase dependent error in a time-of-flight range sensor

    Science.gov (United States)

    Seiter, Johannes; Hofbauer, Michael; Davidovic, Milos; Zimmermann, Horst

    2013-04-01

    Time-of-Flight (TOF) 3D cameras determine the distance information by means of a propagation delay measurement. The delay value is acquired by correlating the sent and received continuous-wave signals at discrete phase delay steps. To reduce the measurement time as well as the resources required for signal processing, the number of phase steps can be decreased. However, such a change gives rise to a significant systematic, distance-dependent distance error. In the present publication we investigate this phase-dependent error systematically by means of a fiber-based measurement setup. Furthermore, the phase shift is varied with an electrical delay-line device rather than by moving an object in front of the camera. This procedure allows the phase-dependent error to be investigated in isolation from other error sources such as the amplitude-dependent error. In other publications this error is corrected by means of a look-up table stored in a memory device. In our paper we demonstrate an analytical correction method that drastically reduces the required memory size. For four phase steps, this approach reduces the error by 89.4% to 13.5 mm at a modulation frequency of 12.5 MHz. For 20.0 MHz, a reduction of 86.8% to 11.5 mm could be achieved.
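
    With four equally spaced phase steps, the textbook four-bucket demodulation recovers the phase, and hence the distance, from the correlation samples. The sketch below shows this standard computation with made-up sample values; it is background for the error being discussed, not the correction method of this record.

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s
F_MOD = 12.5e6             # modulation frequency, Hz (one of the values used in the record)

def tof_distance(a0, a1, a2, a3):
    """Distance from four correlation samples taken at phase offsets 0, 90, 180 and 270 degrees."""
    phase = np.arctan2(a3 - a1, a0 - a2)          # four-bucket phase estimate
    phase = np.mod(phase, 2.0 * np.pi)            # fold into [0, 2*pi)
    return C * phase / (4.0 * np.pi * F_MOD)      # unambiguous range is c / (2 * f_mod)

if __name__ == "__main__":
    # Hypothetical correlation samples for a target at 3 m, assuming a_k = cos(phi + k*pi/2)
    true_d = 3.0
    phi = 4.0 * np.pi * F_MOD * true_d / C
    samples = [np.cos(phi + k * np.pi / 2.0) for k in range(4)]
    print("reconstructed distance (m):", round(float(tof_distance(*samples)), 3))
```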

  10. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Directory of Open Access Journals (Sweden)

    Gian Paolo Beretta

    2008-08-01

    Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
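
    The equilibrium target of such a rate equation, the maximum-entropy discrete distribution compatible with linear constraints, can be computed with standard Lagrange-multiplier machinery. The example below imposes a single, arbitrary mean-value constraint and is only an illustration of that end point, not of the steepest-entropy-ascent dynamics itself.

```python
import numpy as np
from scipy.optimize import brentq

# Maximize S = -sum_i p_i ln p_i subject to sum_i p_i = 1 and sum_i p_i * e_i = E_MEAN.
# The maximizer has the exponential form p_i = exp(-beta * e_i) / Z, with beta fixed by the constraint.
e = np.array([0.0, 1.0, 2.0, 3.0])       # "levels" entering the linear constraint (arbitrary)
E_MEAN = 1.2                              # assumed constraint value

def constrained_mean(beta):
    w = -beta * e
    w -= w.max()                          # shift exponents for numerical stability
    p = np.exp(w)
    p /= p.sum()
    return p @ e

beta = brentq(lambda b: constrained_mean(b) - E_MEAN, -50.0, 50.0)
w = -beta * e; w -= w.max()
p = np.exp(w); p /= p.sum()
print("beta =", round(beta, 4))
print("maximum-entropy p =", np.round(p, 4), " entropy =", round(float(-np.sum(p * np.log(p))), 4))
```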

  11. Sensitivity of physiological emotional measures to odors depends on the product and the pleasantness ranges used

    Directory of Open Access Journals (Sweden)

    Aline Marie Pichon

    2015-12-01

    Full Text Available Emotions are characterized by synchronized changes in several components of an organism. Among them, physiological variations provide energy support for the expression of approach/avoid action tendencies induced by relevant stimuli, while self-reported subjective pleasantness feelings integrate all other emotional components and are plastic. Consequently, emotional responses evoked by odors should be highly differentiated when they are linked to different functions of olfaction (e.g., avoiding environmental hazards). As this differentiation has been observed for contrasted odors (very pleasant or unpleasant), we questioned whether subjective and physiological emotional response indicators could still disentangle subtle affective variations when no clear functional distinction is made (mildly pleasant or unpleasant fragrances). Here, we compared the sensitivity of behavioral and physiological (respiration, skin conductance, facial electromyography (EMG), and heart rate) indicators in differentiating odor-elicited emotions in two situations: when a wide range of odor families was presented (e.g., fruity, animal), covering different functional meanings; or in response to a restricted range of products in one particular family (fragrances). Results show clear differences in physiological responses to odors that span a wide range of reported pleasantness, but these differences almost entirely vanish when fragrances are used, even though their subjective pleasantness still differs. Taken together, these results provide valuable information concerning the ability of classic verbal and psychophysiological measures to investigate subtle differences in emotional reactions to a restricted range of similar olfactory stimuli.

  12. Evidence of long range dependence in Asian equity markets: the role of liquidity and market restrictions

    Science.gov (United States)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2004-11-01

    In this paper, the efficient market hypothesis is tested for China, Hong Kong and Singapore by means of the long memory dependence approach. We find evidence suggesting that Hong Kong is the most efficient market followed by Chinese A type shares and Singapore and finally by Chinese B type shares, which suggests that liquidity and capital restrictions may play a role in explaining results of market efficiency tests.

  13. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  14. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables are covered.

  15. Lithology- versus base level-dependent morphogenesis of the Hausruck - Kobernaußerwald range

    Science.gov (United States)

    Baumann, Sebastian; Robl, Jörg; Salcher, Bernhard; Prasicek, Günther; Keil, Melanie

    2016-04-01

    The Hausruck - Kobernaußerwald range has the highest relief in the Northern Molasse Basin in front of the Eastern Alps. The highest peaks of the range exceed an elevation of 800 m and are characterized by a local relief of about 400 m relative to the adjacent lowlands. The Hausruck - Kobernaußerwald range has never been glaciated and erosion is solely driven by fluvial incision and corresponding hillslope processes since the inversion of the Molasse Basin. Landslides are frequently observed at hillslopes in the Hausruck domain in the west but are completely missing in the Kobernaußerwald domain in the east. Recent tectonic activity like faulting has not been reported for that region and the stratigraphic record shows no evidence for tectonically induced discontinuities. The morphological expression of the western Kobernaußerwald and the eastern Hausruck apparently differ in their degree of erosional landscape decay with a gently incised western and deeply incised eastern domain. These domains correspond with two different lithological units of the Upper Freshwater Molasse: The simultaneously deposited western Kobernaußerwald Formation (Kobernaußerwald domain) and the eastern Ampfelwang Formation (Hausruck domain) are interpreted as sedimentary deposits of a fluvial fan in proximal and distal position, respectively, and show fining of the sedimentary record from west to east. The stratigraphic highest unit of the study region, the Hausruck Fm., consists of well consolidated fluvial gravels uniformly covering the hill tops of both domains. We used a high resolution LiDAR digital elevation model and performed a series of morphometric analyses to investigate the effects of different base levels and contrasting lithology on the topographic evolution of the Hausruck - Kobernaußerwald range. The analysis of longitudinal river profiles reveals that all channels independent from base level, bed rock and overall morphological expression are well graded with steep

  16. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  17. Low probability of intercept-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems

    Science.gov (United States)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-12-01

    In this paper, we investigate the problem of low probability of intercept (LPI)-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems, where the radar system optimizes the transmitted waveform such that the interference caused to the cellular communication systems is strictly controlled. Assuming that the precise knowledge of the target spectra, the power spectral densities (PSDs) of signal-dependent clutters, the propagation losses of corresponding channels and the communication signals is known by the radar, three different LPI based criteria for radar waveform optimization are proposed to minimize the total transmitted power of the radar system by optimizing the multicarrier radar waveform with a predefined signal-to-interference-plus-noise ratio (SINR) constraint and a minimum required capacity for the cellular communication systems. These criteria differ in the way the communication signals scattered off the target are considered in the radar waveform design: (1) as useful energy, (2) as interference or (3) ignored altogether. The resulting problems are solved analytically and their solutions represent the optimum power allocation for each subcarrier in the multicarrier radar waveform. We show with numerical results that the LPI performance of the radar system can be significantly improved by exploiting the scattered echoes off the target due to cellular communication signals received at the radar receiver.
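
    As a toy version of the optimization described here, one can minimize the total transmitted power over subcarriers subject to a radar SINR floor and a communication capacity floor. The sketch below uses randomly generated channel gains, a simplified additive SINR model and a generic SLSQP solver; it is a stand-in for, not a reproduction of, the analytical solutions derived in this record.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
K = 8                                      # number of radar subcarriers
g_radar = rng.uniform(0.5, 2.0, K)         # target/channel gains seen by the radar (assumed)
g_int = rng.uniform(0.05, 0.2, K)          # interference coupling into the comm system (assumed)
comm_snr = rng.uniform(3.0, 10.0, K)       # comm signal-to-noise per subcarrier without radar (assumed)
SINR_MIN, CAP_MIN = 20.0, 18.0             # radar SINR floor and comm capacity floor (bits/use)

def total_power(p):
    return p.sum()

def radar_sinr(p):                         # simplified additive SINR model
    return p @ g_radar - SINR_MIN

def comm_capacity(p):                      # radar power appears as interference to the comm links
    return np.sum(np.log2(1.0 + comm_snr / (1.0 + g_int * p))) - CAP_MIN

res = minimize(total_power, x0=np.full(K, 1.0), method="SLSQP",
               bounds=[(0.0, None)] * K,
               constraints=[{"type": "ineq", "fun": radar_sinr},
                            {"type": "ineq", "fun": comm_capacity}])
print("optimal per-subcarrier power:", np.round(res.x, 3))
print("total power:", round(float(res.fun), 3))
```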

  19. Large range localized surface plasmon resonance of Ag nanoparticles films dependent of surface morphology

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Lijuan; Yan, Yaning; Xu, Leilei; Ma, Rongrong; Jiang, Fengxian; Xu, Xiaohong, E-mail: xuxh@dns.sxnu.edu.cn

    2016-03-30

    Graphical abstract: - Highlights: • Localized surface plasmon resonance of Ag nanoparticle films tuned over a large range. • The noble metal Ag has the strongest localized surface plasmon resonance and low optical loss; it is also cheaper than other noble metals. • Nanoparticle films fabricated by physical methods interact with substrates more strongly than those made by chemical methods and are not easily exfoliated. - Abstract: Noble metal nanoparticles (NPs) have received enormous attention since they display unique optical and electronic properties. In this work, we study the localized surface plasmon resonance (LSPR) of Ag NP films grown by Laser Molecular Beam Epitaxy (LMBE) at different thicknesses and substrate temperatures. The LSPR wavelength can be tuned over a large range of the visible spectrum, from 470 nm to 770 nm. The surface morphology is characterized by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The average size of the Ag NPs increases with increasing thickness, leading to broadening of the LSPR band and a red-shift of its wavelength. As the substrate temperature is increased from room temperature (RT) to 200 °C, the Ag NP size distribution becomes homogeneous and the particle shape changes from oblate spheroid to sphere; the LSPR band becomes sharp, blue-shifted and significantly more symmetric. The morphology of the Ag NP films is therefore important for tuning the absorption position. X-ray diffraction (XRD) spectra show that the Ag NPs have a cubic crystal structure with a (1 1 1) main diffraction peak. High-resolution TEM (HR-TEM) and selected-area electron diffraction (SAED) prove that the Ag NPs are polycrystalline. Ag NP films with absorption tunable over a large range of the visible region can be combined with semiconductors for application in various optical or optoelectronic devices.

  20. Dependence of simulations of long range transport on meteorology, model and dust size

    Science.gov (United States)

    Mahowald, N. M.; Albani, S.; Smith, M.; Losno, R.; Marticorena, B.; Ridley, D. A.; Heald, C. L.; Qu, Z.

    2015-12-01

    Mineral aerosols interact with radiation directly, as well as modifying climate, and provide important micronutrients to ocean and land ecosystems. Mineral aerosols are transported long distances from the source regions to remote regions, but the rates at which this occurs can be difficult to deduce from either observations or models. Here we consider interactions between the details of the simulation of dust size and long-range transport. In addition, we compare simulations of dust using multiple reanalysis datasets, as well as different model basis to understand how robust the mean, seasonality and interannual variability are in models. Models can provide insight into how long observations are required in order to characterize the atmospheric concentration and deposition to remote regions.

  1. Communication: Anomalous temperature dependence of the intermediate range order in phosphonium ionic liquids

    Energy Technology Data Exchange (ETDEWEB)

    Hettige, Jeevapani J.; Kashyap, Hemant K.; Margulis, Claudio J., E-mail: claudio-margulis@uiowa.edu [Department of Chemistry, University of Iowa, Iowa City, Iowa 52242 (United States)]

    2014-03-21

    In a recent article by the Castner and Margulis groups [Faraday Discuss. 154, 133 (2012)], we described in detail the structure of the tetradecyltrihexylphosphonium bis(trifluoromethylsulfonyl)-amide ionic liquid as a function of temperature using X-ray scattering and theoretical partitions of the computationally derived structure function. Interestingly, and as opposed to the case in most other ionic liquids, the first sharp diffraction peak or prepeak appears to increase in intensity as temperature is increased. This phenomenon is counterintuitive, as one would expect intermediate range order to fade as temperature increases. This Communication shows that a loss of hydrophobic tail organization at higher temperatures is counterbalanced by better organization of polar components, giving rise to the increase in intensity of the prepeak.

  2. Incident particle range dependence of radiation damage in a power bipolar junction transistor

    Science.gov (United States)

    Liu, Chao-Ming; Li, Xing-Ji; Geng, Hong-Bin; Rui, Er-Ming; Guo, Li-Xin; Yang, Jian-Qun

    2012-10-01

    The characteristic degradations in silicon NPN bipolar junction transistors (BJTs) of type 3DD155 are examined under the irradiations of 25-MeV carbon (C), 40-MeV silicon (Si), and 40-MeV chlorine (Cl) ions respectively. Different electrical parameters are measured in-situ during the exposure of heavy ions. The experimental data shows that the changes in the reciprocal of the gain variation (Δ(1/β)) of 3DD155 transistors irradiated respectively by 25-MeV C, 40-MeV Si, and 40-MeV Cl ions each present a nonlinear behaviour at a low fluence and a linear response at a high fluence. The Δ(1/β) of 3DD155 BJT irradiated by 25-MeV C ions is greatest at a given fluence, a little smaller when the device is irradiated by 40-MeV Si ions, and smallest in the case of the 40-MeV Cl ions irradiation. The measured and calculated results clearly show that the range of heavy ions in the base region of BJT affects the level of radiation damage.

  3. Scale-dependent habitat use by a large free-ranging predator, the Mediterranean fin whale

    Science.gov (United States)

    Cotté, Cédric; Guinet, Christophe; Taupier-Letage, Isabelle; Mate, Bruce; Petiau, Estelle

    2009-05-01

    Since the heterogeneity of oceanographic conditions drives abundance, distribution, and availability of prey, it is essential to understand how foraging predators interact with their dynamic environment at various spatial and temporal scales. We examined the spatio-temporal relationships between oceanographic features and abundance of fin whales ( Balaenoptera physalus), the largest free-ranging predator in the Western Mediterranean Sea (WM), through two independent approaches. First, spatial modeling was used to estimate whale density, using waiting distance (the distance between detections) for fin whales along ferry routes across the WM, in relation to remotely sensed oceanographic parameters. At a large scale (basin and year), fin whales exhibited fidelity to the northern WM with a summer-aggregated and winter-dispersed pattern. At mesoscale (20-100 km), whales were found in colder, saltier (from an on-board system) and dynamic areas defined by steep altimetric and temperature gradients. Second, using an independent fin whale satellite tracking dataset, we showed that tracked whales were effectively preferentially located in favorable habitats, i.e. in areas of high predicted densities as identified by our previous model using oceanographic data contemporaneous to the tracking period. We suggest that the large-scale fidelity corresponds to temporally and spatially predictable habitat of whale favorite prey, the northern krill ( Meganyctiphanes norvegica), while mesoscale relationships are likely to identify areas of high prey concentration and availability.

  4. Metabolomic unveiling of a diverse range of green tea (Camellia sinensis) metabolites dependent on geography.

    Science.gov (United States)

    Lee, Jang-Eun; Lee, Bum-Jin; Chung, Jin-Oh; Kim, Hak-Nam; Kim, Eun-Hee; Jung, Sungheuk; Lee, Hyosang; Lee, Sang-Jun; Hong, Young-Shick

    2015-05-01

    Numerous factors such as geographical origin, cultivar, climate, cultural practices, and manufacturing processes influence the chemical compositions of tea, in the same way as growing conditions and grape variety affect wine quality. However, the relationships between these factors and tea chemical compositions are not well understood. In this study, a new approach for non-targeted or global analysis, i.e., metabolomics, which is highly reproducible and statistically effective in analysing a diverse range of compounds, was used to better understand the metabolome of Camellia sinensis and determine the influence of environmental factors, including geography, climate, and cultural practices, on tea-making. We found a strong correlation between environmental factors and the metabolome of green, white, and oolong teas from China, Japan, and South Korea. In particular, multivariate statistical analysis revealed strong inter-country and inter-city relationships in the levels of theanine and catechin derivatives found in green and white teas. This information might be useful for assessing tea quality or producing distinct tea products across different locations, and highlights simultaneous identification of diverse tea metabolites through an NMR-based metabolomics approach. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. High seroprevalence of Toxoplasma gondii and probability of detecting tissue cysts in backyard laying hens compared with hens from large free-range farms.

    Science.gov (United States)

    Schares, G; Bangoura, B; Randau, F; Goroll, T; Ludewig, M; Maksimov, P; Matzkeit, B; Sens, M; Bärwald, A; Conraths, F J; Opsteegh, M; Van der Giessen, J

    2017-10-01

    Serological assays are commonly used to determine the prevalence of Toxoplasma gondii infection in livestock, but the predictive value of seropositivity with respect to the presence of infective tissue cysts is less clear. The present study aimed at the identification of seropositive and seronegative free-range laying hens from organic and backyard farms, and the relationship with the presence of viable tissue cysts. In addition, potential risk and protective factors on the selected farms were investigated. An in-house T. gondii surface antigen (TgSAG1, p30, SRS29B) ELISA was validated with sera from experimentally infected chickens and used to examine 470 serum samples collected from laying hens from large organic and small backyard farms at the end of their laying period. A total of 11.7% (55/470) of all chickens tested positive, and another 18.9% (89/470) of test results were inconclusive. The highest seroprevalences were observed on small backyard farms with 47.7% (41/86) of chickens being seropositive while another 20.9% (18/86) of test results were inconclusive. Twenty-nine seropositive, 20 seronegative and 12 laying hens which yielded inconclusive ELISA results, were selected for further examination. Hearts and limb muscles of these hens were examined for T. gondii tissue cysts in a bioassay with IFNɣ-knockout or IFNɣ-receptor-knockout mice. Viable T. gondii was isolated from 75.9% (22/29) of the seropositive, 25.0% (3/12) of the inconclusive, and 5.0% (1/20) of the seronegative chickens. All 26 chickens tested positive in heart samples, while drumstick muscles (i.e. limb muscles) tested positive only in three. Data on putative risk and protective factors were collected on the farms using a standard questionnaire. Generalised multilevel modelling revealed farm size, cat related factors ('cats on the premise', 'cats used for rodent control'), hen house/hall related factors ('size category of hen house/hall', 'frequency category of cleaning hen house

  6. Combining Density Functional Theory and Green's Function Theory: Range-Separated, Nonlocal, Dynamic, and Orbital-Dependent Hybrid Functional.

    Science.gov (United States)

    Kananenka, Alexei A; Zgid, Dominika

    2017-11-14

    We present a rigorous framework which combines single-particle Green's function theory with density functional theory based on a separation of electron-electron interactions into short- and long-range components. Short-range contribution to the total energy and exchange-correlation potential is provided by a density functional approximation, while the long-range contribution is calculated using an explicit many-body Green's function method. Such a hybrid results in a nonlocal, dynamic, and orbital-dependent exchange-correlation functional of a single-particle Green's function. In particular, we present a range-separated hybrid functional called srSVWN5-lrGF2 which combines the local-density approximation and the second-order Green's function theory. We illustrate that similarly to density functional approximations, the new functional is weakly basis-set dependent. Furthermore, it offers an improved description of the short-range dynamic correlation. The many-body contribution to the functional mitigates the many-electron self-interaction error present in many density functional approximations and provides a better description of molecular properties. Additionally, we illustrate that the new functional can be used to scale down the self-energy and, therefore, introduce an additional sparsity to the self-energy matrix that in the future can be exploited in calculations for large molecules or periodic systems.

  7. Multivariable normal tissue complication probability model-based treatment plan optimization for grade 2-4 dysphagia and tube feeding dependence in head and neck radiotherapy.

    Science.gov (United States)

    Kierkels, Roel G J; Wopken, Kim; Visser, Ruurd; Korevaar, Erik W; van der Schaaf, Arjen; Bijl, Hendrik P; Langendijk, Johannes A

    2016-12-01

    Radiotherapy of the head and neck is challenged by the relatively large number of organs-at-risk close to the tumor. Biologically-oriented objective functions (OF) could optimally distribute the dose among the organs-at-risk. We aimed to explore OFs based on multivariable normal tissue complication probability (NTCP) models for grade 2-4 dysphagia (DYS) and tube feeding dependence (TFD). One hundred head and neck cancer patients were studied. In addition to the clinical plan, two more plans (an OFDYS-plan and an OFTFD-plan) were optimized per patient. The NTCP models included up to four dose-volume parameters and other non-dosimetric factors. A fully automatic plan optimization framework was used to optimize the OFNTCP-based plans. All OFNTCP-based plans were reviewed and classified as clinically acceptable. On average, the Δdose and ΔNTCP were small when comparing the OFDYS-plan, the OFTFD-plan, and the clinical plan. For 5% of patients, NTCPTFD was reduced by >5% using OFTFD-based planning compared with the OFDYS-plans. Plan optimization using NTCPDYS- and NTCPTFD-based objective functions resulted in clinically acceptable plans. For patients with considerable risk factors of TFD, the OFTFD steered the optimizer to dose distributions which directly led to slightly lower predicted NTCPTFD values as compared to the other studied plans. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Withdrawal of corticosteroids in inflammatory bowel disease patients after dependency periods ranging from 2 to 45 years: a proposed method.

    LENUS (Irish Health Repository)

    Murphy, S J

    2012-02-01

    BACKGROUND: Even in the biologic era, corticosteroid dependency in IBD patients is common and causes a lot of morbidity, but methods of withdrawal are not well described. AIM: To assess the effectiveness of a corticosteroid withdrawal method. METHODS: Twelve patients (10 men, 2 women; 6 ulcerative colitis, 6 Crohn's disease), median age 53.5 years (range 29-75) were included. IBD patients with quiescent disease refractory to conventional weaning were transitioned to oral dexamethasone, educated about symptoms of the corticosteroid withdrawal syndrome (CWS) and weaned under the supervision of an endocrinologist. When patients failed to wean despite a slow weaning pace and their IBD remaining quiescent, low dose synthetic ACTH stimulation testing was performed to assess for adrenal insufficiency. Multivariate analysis was performed to assess predictors of a slow wean. RESULTS: Median durations for disease and corticosteroid dependency were 21 (range 3-45) and 14 (range 2-45) years respectively. Ten patients (83%) were successfully weaned after a median follow-up from final wean of 38 months (range 5-73). Disease flares occurred in two patients, CWS in five and ACTH testing was performed in 10. Multivariate analysis showed that longer duration of corticosteroid use appeared to be associated with a slower wean (P = 0.056). CONCLUSIONS: Corticosteroid withdrawal using this protocol had a high success rate and durable effect and was effective in patients with long-standing (up to 45 years) dependency. As symptoms of CWS mimic symptoms of IBD disease flares, gastroenterologists may have difficulty distinguishing them, which may be a contributory factor to the frequency of corticosteroid dependency in IBD patients.

  9. The theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ..... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  10. Low Probability of Intercept Laser Range Finder

    Science.gov (United States)

    2017-07-19

    ...enhances the data security of the proposed system. The remainder of the multiplexer output is collimated by a lens and transmitted through free space, and the collected return is converted into the digital domain to facilitate comparison of the transmitted and received signals.

  11. Task probability and report of feature information: what you know about what you 'see' depends on what you expect to need.

    Science.gov (United States)

    Pilling, Michael; Gellatly, Angus

    2013-07-01

    We investigated the influence of dimensional set on report of object feature information using an immediate memory probe task. Participants viewed displays containing up to 36 coloured geometric shapes which were presented for several hundred milliseconds before one item was abruptly occluded by a probe. A cue presented simultaneously with the probe instructed participants to report either about the colour or shape of the probe item. A dimensional set towards the colour or shape of the presented items was induced by manipulating task probability - the relative probability with which the two feature dimensions required report. This was done across two participant groups: One group was given trials where there was a higher report probability of colour, the other a higher report probability of shape. Two experiments showed that features were reported most accurately when they were of high task probability, though in both cases the effect was largely driven by the colour dimension. Importantly the task probability effect did not interact with display set size. This is interpreted as tentative evidence that this manipulation influences feature processing in a global manner and at a stage prior to visual short term memory. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  13. Yeast silent mating type loci form heterochromatic clusters through silencer protein-dependent long-range interactions.

    Directory of Open Access Journals (Sweden)

    Adriana Miele

    2009-05-01

    The organization of eukaryotic genomes is characterized by the presence of distinct euchromatic and heterochromatic sub-nuclear compartments. In Saccharomyces cerevisiae heterochromatic loci, including telomeres and silent mating type loci, form clusters at the nuclear periphery. We have employed live cell 3-D imaging and chromosome conformation capture (3C) to determine the contribution of nuclear positioning and heterochromatic factors in mediating associations of the silent mating type loci. We identify specific long-range interactions between HML and HMR that are dependent upon silencing proteins Sir2p, Sir3p, and Sir4p as well as Sir1p and Esc2p, two proteins involved in establishment of silencing. Although clustering of these loci frequently occurs near the nuclear periphery, colocalization can occur equally at more internal positions and is not affected in strains deleted for membrane anchoring proteins yKu70p and Esc1p. In addition, appropriate nucleosome assembly plays a role, as deletion of ASF1 or combined disruption of the CAF-1 and HIR complexes abolishes the HML-HMR interaction. Further, silencer proteins are required for clustering, but complete loss of clustering in asf1 and esc2 mutants had only minor effects on silencing. Our results indicate that formation of heterochromatic clusters depends on correctly assembled heterochromatin at the silent loci and, in addition, identify an Asf1p-, Esc2p-, and Sir1p-dependent step in heterochromatin formation that is not essential for gene silencing but is required for long-range interactions.

  14. Site-specific and time-dependent activation of the endocannabinoid system after transection of long-range projections.

    Directory of Open Access Journals (Sweden)

    Sonja Kallendrusch

    BACKGROUND: After focal neuronal injury the endocannabinoid system becomes activated and protects or harms neurons depending on cannabinoid derivates and receptor subtypes. Endocannabinoids (eCBs) play a central role in controlling local responses and influencing neural plasticity and survival. However, little is known about the functional relevance of eCBs in long-range projection damage as observed in stroke or spinal cord injury (SCI). METHODS: In rat organotypic entorhino-hippocampal slice cultures (OHSC) as a relevant and suitable model for investigating projection fibers in the CNS we performed perforant pathway transection (PPT) and subsequently analyzed the spatial and temporal dynamics of eCB levels. This approach allows proper distinction of responses in originating neurons (entorhinal cortex), areas of deafferentiation/anterograde axonal degeneration (dentate gyrus) and putative changes in more distant but synaptically connected subfields (cornu ammonis (CA) 1 region). RESULTS: Using LC-MS/MS, we measured a strong increase in arachidonoylethanolamide (AEA), oleoylethanolamide (OEA) and palmitoylethanolamide (PEA) levels in the denervation zone (dentate gyrus) 24 hours post lesion (hpl), whereas entorhinal cortex and CA1 region exhibited little if any changes. NAPE-PLD, responsible for biosynthesis of eCBs, was increased early, whereas FAAH, a catabolizing enzyme, was up-regulated 48 hpl. CONCLUSION: Neuronal damage as assessed by transection of long-range projections apparently provides a strong time-dependent and area-confined signal for de novo synthesis of eCB, presumably to restrict neuronal damage. The present data underlines the importance of activation of the eCB system in CNS pathologies and identifies a novel site-specific intrinsic regulation of eCBs after long-range projection damage.

  15. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
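
    The model itself is not reproduced here, but the task it addresses can be illustrated with a much simpler, generic estimator. The sketch below (Python; all parameter values are illustrative and it is not the authors' change-point model) tracks the hidden parameter of a stepwise nonstationary Bernoulli process with a discounted Beta-Bernoulli update, showing why trial-by-trial estimates lag behind abrupt changes in the true probability.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate a stepwise nonstationary Bernoulli process (hidden p changes twice).
    true_p = np.concatenate([np.full(300, 0.2), np.full(300, 0.7), np.full(300, 0.4)])
    outcomes = rng.random(true_p.size) < true_p

    def track_probability(outcomes, discount=0.98, a0=1.0, b0=1.0):
        """Discounted Beta-Bernoulli tracker; discount < 1 forgets old outcomes."""
        a, b = a0, b0
        estimates = []
        for x in outcomes:
            a = discount * a + x           # discounted success count
            b = discount * b + (1 - x)     # discounted failure count
            estimates.append(a / (a + b))  # posterior mean of Beta(a, b)
        return np.array(estimates)

    est = track_probability(outcomes)
    print(est[280:320].round(2))  # the estimate lags, then moves toward the new p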

  16. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  17. Context-dependent functional dispersion across similar ranges of trait space covered by intertidal rocky shore communities.

    Science.gov (United States)

    Valdivia, Nelson; Segovia-Rivera, Viviana; Fica, Eliseo; Bonta, César C; Aguilera, Moisés A; Broitman, Bernardo R

    2017-03-01

    Functional diversity is intimately linked with community assembly processes, but its large-scale patterns of variation are often not well understood. Here, we investigated the spatiotemporal changes in multiple trait dimensions ("trait space") along vertical intertidal environmental stress gradients and across a landscape scale. We predicted that the range of the trait space covered by local assemblages (i.e., functional richness) and the dispersion in trait abundances (i.e., functional dispersion) should increase from high- to low-intertidal elevations, due to the decreasing influence of environmental filtering. The abundance of macrobenthic algae and invertebrates was estimated at four rocky shores spanning ca. 200 km of the coast over a 36-month period. Functional richness and dispersion were contrasted against matrix-swap models to remove any confounding effect of species richness on functional diversity. Random-slope models showed that functional richness and dispersion significantly increased from high- to low-intertidal heights, demonstrating that under harsh environmental conditions, the assemblages comprised similar abundances of functionally similar species (i.e., trait convergence), while that under milder conditions, the assemblages encompassed differing abundances of functionally dissimilar species (i.e., trait divergence). According to the Akaike information criteria, the relationship between local environmental stress and functional richness was persistent across sites and sampling times, while functional dispersion varied significantly. Environmental filtering therefore has persistent effects on the range of trait space covered by these assemblages, but context-dependent effects on the abundances of trait combinations within such range. Our results further suggest that natural and/or anthropogenic factors might have significant effects on the relative abundance of functional traits, despite that no trait addition or extinction is detected.
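
    As a concrete reference point for the dispersion measure discussed above, the sketch below computes an abundance-weighted functional dispersion in the spirit of FDis (mean distance of species to the abundance-weighted centroid in trait space). The trait values and abundances are invented for illustration, and the metric is not necessarily the exact index used in the study.

    import numpy as np

    def functional_dispersion(traits, abundances):
        """Abundance-weighted mean distance to the weighted centroid in trait space."""
        w = np.asarray(abundances, dtype=float)
        w = w / w.sum()
        traits = np.asarray(traits, dtype=float)
        centroid = w @ traits                        # abundance-weighted centroid
        dist = np.linalg.norm(traits - centroid, axis=1)
        return float(w @ dist)                       # weighted mean distance

    traits = [[0.2, 1.0], [0.3, 1.1], [0.9, 0.2], [0.8, 0.3]]
    # Trait convergence: abundance concentrated on functionally similar species.
    print(functional_dispersion(traits, [10, 9, 1, 1]))
    # Trait divergence: abundance spread over functionally dissimilar species.
    print(functional_dispersion(traits, [5, 5, 5, 5]))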

  18. Elevation-Dependent Temperature Trends in the Rocky Mountain Front Range: Changes over a 56- and 20-Year Record

    Science.gov (United States)

    McGuire, Chris R.; Nufio, César R.; Bowers, M. Deane; Guralnick, Robert P.

    2012-01-01

    Determining the magnitude of climate change patterns across elevational gradients is essential for an improved understanding of broader climate change patterns and for predicting hydrologic and ecosystem changes. We present temperature trends from five long-term weather stations along a 2077-meter elevational transect in the Rocky Mountain Front Range of Colorado, USA. These trends were measured over two time periods: a full 56-year record (1953–2008) and a shorter 20-year (1989–2008) record representing a period of widely reported accelerating change. The rate of change of biological indicators, season length and accumulated growing-degree days, were also measured over the 56 and 20-year records. Finally, we compared how well interpolated Parameter-elevation Regression on Independent Slopes Model (PRISM) datasets match the quality-controlled weather data from each station. Our results show that warming signals were strongest at mid-elevations over both temporal scales. Over the 56-year record, most sites show warming occurring largely through increases in maximum temperatures, while the 20-year record documents warming associated with increases in maximum temperatures at lower elevations and increases in minimum temperatures at higher elevations. Recent decades have also shown a shift from warming during springtime to warming in July and November. Warming along the gradient has contributed to increases in growing-degree days, although to differing degrees, over both temporal scales. However, the length of the growing season has remained unchanged. Finally, the actual and the PRISM interpolated yearly rates rarely showed strong correlations and suggest different warming and cooling trends at most sites. Interpretation of climate trends and their seasonal biases in the Rocky Mountain Front Range is dependent on both elevation and the temporal scale of analysis. Given mismatches between interpolated data and the directly measured station data, we caution

  19. Elevation-dependent temperature trends in the Rocky Mountain Front Range: changes over a 56- and 20-year record.

    Directory of Open Access Journals (Sweden)

    Chris R McGuire

    Determining the magnitude of climate change patterns across elevational gradients is essential for an improved understanding of broader climate change patterns and for predicting hydrologic and ecosystem changes. We present temperature trends from five long-term weather stations along a 2077-meter elevational transect in the Rocky Mountain Front Range of Colorado, USA. These trends were measured over two time periods: a full 56-year record (1953-2008) and a shorter 20-year (1989-2008) record representing a period of widely reported accelerating change. The rate of change of biological indicators, season length and accumulated growing-degree days, were also measured over the 56 and 20-year records. Finally, we compared how well interpolated Parameter-elevation Regression on Independent Slopes Model (PRISM) datasets match the quality-controlled weather data from each station. Our results show that warming signals were strongest at mid-elevations over both temporal scales. Over the 56-year record, most sites show warming occurring largely through increases in maximum temperatures, while the 20-year record documents warming associated with increases in maximum temperatures at lower elevations and increases in minimum temperatures at higher elevations. Recent decades have also shown a shift from warming during springtime to warming in July and November. Warming along the gradient has contributed to increases in growing-degree days, although to differing degrees, over both temporal scales. However, the length of the growing season has remained unchanged. Finally, the actual and the PRISM interpolated yearly rates rarely showed strong correlations and suggest different warming and cooling trends at most sites. Interpretation of climate trends and their seasonal biases in the Rocky Mountain Front Range is dependent on both elevation and the temporal scale of analysis. Given mismatches between interpolated data and the directly measured station data
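
    The kind of station-level trend reported above can be summarized with an ordinary least-squares slope expressed per decade. The sketch below uses synthetic annual means (not the study's quality-controlled records) to show how the full-record and recent-record trends are obtained and why they can differ.

    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1953, 2009)
    # Synthetic annual mean temperatures with a weak warming signal plus noise.
    annual_mean = 8.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.4, years.size)

    slope_per_year, _ = np.polyfit(years, annual_mean, 1)
    print(f"1953-2008 trend: {10 * slope_per_year:.2f} C per decade")

    # The shorter, recent record can give a noticeably different trend.
    recent = years >= 1989
    slope_recent, _ = np.polyfit(years[recent], annual_mean[recent], 1)
    print(f"1989-2008 trend: {10 * slope_recent:.2f} C per decade")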

  20. Dynamical phase transitions in long-range Hamiltonian systems and Tsallis distributions with a time-dependent index.

    Science.gov (United States)

    Campa, Alessandro; Chavanis, Pierre-Henri; Giansanti, Andrea; Morelli, Gianluca

    2008-10-01

    We study dynamical phase transitions in systems with long-range interactions, using the Hamiltonian mean field model as a simple example. These systems generically undergo a violent relaxation to a quasistationary state (QSS) before relaxing towards Boltzmann equilibrium. In the collisional regime, the out-of-equilibrium one-particle distribution function (DF) is a quasistationary solution of the Vlasov equation, slowly evolving in time due to finite-N effects. For subcritical energy densities, we exhibit cases where the DF is well fitted by a Tsallis q distribution with an index q(t) slowly decreasing in time from q ≈ 3 (semiellipse) to q = 1 (Boltzmann). When the index q(t) reaches an energy-dependent critical value q_crit, the nonmagnetized (homogeneous) phase becomes Vlasov unstable and a dynamical phase transition is triggered, leading to a magnetized (inhomogeneous) state. While Tsallis distributions play an important role in our study, we explain this dynamical phase transition by using only conventional statistical mechanics. For supercritical energy densities, we report the existence of a magnetized QSS with a very long lifetime.

  1. Systematic inference of the long-range dependence and heavy-tail distribution parameters of ARFIMA models

    Science.gov (United States)

    Watkins, Nick; Graves, Timothy; Franzke, Christian; Gramacy, Robert; Tindale, Elizabeth

    2017-04-01

    Long-Range Dependence (LRD) and heavy-tailed distributions are ubiquitous in natural and socio-economic data. Such data can be self-similar whereby both LRD and heavy-tailed distributions contribute to the self-similarity as measured by the Hurst exponent. Some methods widely used in the physical sciences separately estimate these two parameters, which can lead to estimation bias. Those which do simultaneous estimation are based on frequentist methods such as Whittle's approximate maximum likelihood estimator. Here we present a new and systematic Bayesian framework for the simultaneous inference of the LRD and heavy-tailed distribution parameters of a parametric ARFIMA model with non-Gaussian innovations. As innovations we use the alpha-stable and t-distributions which have power law tails. Our algorithm also provides parameter uncertainty estimates. We test our algorithm using synthetic data, and also data from the Geostationary Operational Environmental Satellite system (GOES) solar X-ray time series. These tests show that our algorithm is able to accurately and robustly estimate the LRD and heavy-tailed distribution parameters. See Physica A: Statistical Mechanics and its Applications, (January 2017), DOI: 10.1016/j.physa.2017.01.028

  2. Systematic inference of the long-range dependence and heavy-tail distribution parameters of ARFIMA models

    Science.gov (United States)

    Graves, Timothy; Franzke, Christian L. E.; Watkins, Nicholas W.; Gramacy, Robert B.; Tindale, Elizabeth

    2017-05-01

    Long-Range Dependence (LRD) and heavy-tailed distributions are ubiquitous in natural and socio-economic data. Such data can be self-similar whereby both LRD and heavy-tailed distributions contribute to the self-similarity as measured by the Hurst exponent. Some methods widely used in the physical sciences separately estimate these two parameters, which can lead to estimation bias. Those which do simultaneous estimation are based on frequentist methods such as Whittle's approximate maximum likelihood estimator. Here we present a new and systematic Bayesian framework for the simultaneous inference of the LRD and heavy-tailed distribution parameters of a parametric ARFIMA model with non-Gaussian innovations. As innovations we use the α-stable and t-distributions which have power law tails. Our algorithm also provides parameter uncertainty estimates. We test our algorithm using synthetic data, and also data from the Geostationary Operational Environmental Satellite system (GOES) solar X-ray time series. These tests show that our algorithm is able to accurately and robustly estimate the LRD and heavy-tailed distribution parameters.
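
    As background to the model class discussed above, the sketch below simulates an ARFIMA(0, d, 0) series with heavy-tailed Student-t innovations by truncating the fractional-integration filter. It is a data-generating illustration only, not the authors' Bayesian inference algorithm, and the parameter values are arbitrary.

    import numpy as np

    def arfima_0d0(n, d, df, n_weights=1000, seed=0):
        """ARFIMA(0, d, 0) with Student-t(df) innovations, truncated MA filter."""
        rng = np.random.default_rng(seed)
        psi = np.empty(n_weights)
        psi[0] = 1.0
        for k in range(1, n_weights):
            psi[k] = psi[k - 1] * (k - 1 + d) / k     # MA weights of (1 - B)^(-d)
        eps = rng.standard_t(df, size=n + n_weights)  # power-law-tailed innovations
        x = np.convolve(eps, psi)[:eps.size]          # x_t = sum_k psi_k * eps_{t-k}
        return x[n_weights:]                          # drop the burn-in segment

    x = arfima_0d0(n=2000, d=0.3, df=3)
    print(x.mean(), x.std())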

  3. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  4. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  5. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : x_1 + … + x_n ≤ 1} carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  6. Two-step probability plot for parameter estimation of lifetime distribution affected by defect clustering in time-dependent dielectric breakdown

    Science.gov (United States)

    Yokogawa, Shinji

    2017-07-01

    In this study, a simple method of statistical parameter estimation is proposed for lifetime distributions that have three parameters due to defect clustering in the middle-of-line and back-end-of-line. A two-step procedure provides effective estimates of the distribution parameters for time-dependent dielectric breakdown. In the first step, a clustering parameter of the distribution, which is one of the shape parameters, is estimated by a linearization treatment of the plotted data on the proposed chart. Then, in the second step, the shape and scale parameters are estimated by calculating a slope and an intercept, respectively. The statistical accuracy of the estimates is evaluated using the Monte-Carlo simulation technique and the mean squared error of the estimates.
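
    A hedged illustration of the two-step idea: assuming a commonly used clustered-Weibull lifetime model F(t) = 1 - [1 + (t/eta)^beta / alpha]^(-alpha) (this functional form is an assumption here, not a statement of the paper's exact chart), step 1 selects the clustering parameter alpha that makes the probability plot most linear, and step 2 reads the shape and scale parameters off the slope and intercept of that line.

    import numpy as np

    def two_step_fit(t, F, alphas=np.logspace(-1, 2, 200)):
        """Step 1: grid over alpha for best linearity; step 2: slope/intercept fit."""
        x = np.log(t)
        best = None
        for a in alphas:
            y = np.log(a * ((1.0 - F) ** (-1.0 / a) - 1.0))  # linear in log t if a is right
            slope, intercept = np.polyfit(x, y, 1)
            score = np.sum((y - (slope * x + intercept)) ** 2)
            if best is None or score < best[0]:
                best = (score, a, slope, intercept)
        _, alpha, beta, intercept = best
        eta = np.exp(-intercept / beta)                      # intercept = -beta * ln(eta)
        return alpha, beta, eta

    # Synthetic lifetimes drawn from the assumed clustered-Weibull model.
    rng = np.random.default_rng(0)
    alpha0, beta0, eta0 = 2.0, 1.5, 100.0
    u = rng.uniform(0.001, 0.999, 500)
    t = np.sort(eta0 * (alpha0 * (u ** (-1.0 / alpha0) - 1.0)) ** (1.0 / beta0))
    F = (np.arange(1, t.size + 1) - 0.3) / (t.size + 0.4)    # median-rank plotting positions
    print(two_step_fit(t, F))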

  7. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  8. Assessment of charge-transfer excitations with time-dependent, range-separated density functional theory based on long-range MP2 and multiconfigurational self- consistent field wave functions

    DEFF Research Database (Denmark)

    Hedegård, Erik D.; Jensen, Hans Jørgen Aagaard; Knecht, Stefan

    2013-01-01

    Charge transfer excitations can be described within Time-Dependent Density Functional Theory (TD-DFT), not only by means of the Coulomb Attenuated Method (CAM) but also with a combination of wave function theory and TD-DFT based on range separation. The latter approach enables a rigorous formulation of multi-determinantal TD-DFT schemes where excitation classes, which are absent in conventional TD-DFT spectra (like for example double excitations), can be addressed. This paper investigates the combination of both the long-range Multi-Configuration Self-Consistent Field (MCSCF) and Second Order Polarization Propagator Approximation (SOPPA) ansätze with a short-range DFT (srDFT) description. We find that the combinations of SOPPA or MCSCF with TD-DFT yield better results than could be expected from the pure wave function schemes. For the Time-Dependent MCSCF short-range DFT ansatz (TD...

  9. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  10. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
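
    One classic example of this kind: the probability that a random permutation of n items has no fixed point (a derangement) tends to 1/e as n grows. A quick Monte Carlo check in Python:

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 52, 100_000
    no_fixed_point = 0
    for _ in range(trials):
        perm = rng.permutation(n)
        if not np.any(perm == np.arange(n)):   # no element stays in its place
            no_fixed_point += 1

    print(no_fixed_point / trials, 1 / math.e)  # both close to 0.3679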

  11. Scale dependence in habitat selection: The case of the endangered brown bear (Ursus arctos) in the Cantabrian Range (NW Spain)

    Science.gov (United States)

    Maria C. Mateo Sanchez; Samuel A. Cushman; Santiago Saura

    2013-01-01

    Animals select habitat resources at multiple spatial scales. Thus, explicit attention to scale dependency in species-habitat relationships is critical to understand the habitat suitability patterns as perceived by organisms in complex landscapes. Identification of the scales at which particular environmental variables influence habitat selection may be as important as...

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  14. Overpotential-induced lability of the electronic overlap factor in long-range electrochemical electron transfer: charge and distance dependence

    DEFF Research Database (Denmark)

    Kornyshev, A. A.; Kuznetsov, A. M.; Nielsen, Jens Ulrik

    2000-01-01

    Long-distance electrochemical electron transfer exhibits approximately exponential dependence on the electron transfer distance. On the basis of a jellium model of the metal surface we show that the slope of the logarithm of the current vs. the transfer distance also depends strongly on the electrode charge. The slope is smaller the more negative the charge density due to enhanced extension of the surface electronic density profile on the solution side, and thereby better electronic overlap with the reacting molecule. The effect is sensitive to the bulk electron density of the metal and the localization of the electronic state at the molecular reactant site. Effects similar to these have been observed experimentally and could be common for electronically light metals.

  15. Position-dependent and millimetre-range photodetection in phototransistors with micrometre-scale graphene on SiC

    Science.gov (United States)

    Sarker, Biddut K.; Cazalas, Edward; Chung, Ting-Fung; Childres, Isaac; Jovanovic, Igor; Chen, Yong P.

    2017-07-01

    The extraordinary optical and electronic properties of graphene make it a promising component of high-performance photodetectors. However, in typical graphene-based photodetectors demonstrated to date, the photoresponse only comes from specific locations near graphene over an area much smaller than the device size. For many optoelectronic device applications, it is desirable to obtain the photoresponse and positional sensitivity over a much larger area. Here, we report the spatial dependence of the photoresponse in backgated graphene field-effect transistors (GFET) on silicon carbide (SiC) substrates by scanning a focused laser beam across the GFET. The GFET shows a nonlocal photoresponse even when the SiC substrate is illuminated at distances greater than 500 µm from the graphene. The photoresponsivity and photocurrent can be varied by more than one order of magnitude depending on the illumination position. Our observations are explained with a numerical model based on charge transport of photoexcited carriers in the substrate.

  16. Simple illustrations of range-dependence and 3-D effects by normal-mode sound propagation modelling

    CERN Document Server

    Ivansson, Sven

    2016-01-01

    As is well known, the sound-speed profile has significant effects on underwater acoustic sound propagation. These effects can be quantified by normal-mode models, for example. The basic case is a laterally homogeneous medium, for which the sound speed and the density depend on depth only and not on horizontal position. Effects of horizontal medium-parameter variation can be quantified by coupled-mode models, with coupling between mode expansions for laterally homogeneous parts of the medium. In the present paper, these effects are illustrated for media with a particularly simple horizontal parameter variation such that mode shapes do not vary with horizontal position. The modal wavenumbers depend on horizontal position, however. At a vertical interface between regions with laterally homogeneous medium parameters, each mode is reflected as well as transmitted. For the media considered, reflection and transmission coefficients can be computed separately for each mode without mode coupling, and this is done recu...
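
    The statement that modal wavenumbers depend on horizontal position can be made concrete with the textbook ideal waveguide (isovelocity water, pressure-release surface, rigid bottom), which is an assumption here rather than the specific media of the paper: the horizontal wavenumber of mode n is k_n = sqrt((omega/c)^2 - ((n - 1/2) pi / D)^2), so it changes wherever the water depth D changes along the track.

    import numpy as np

    def modal_wavenumbers(freq_hz, depth_m, c=1500.0):
        """Horizontal wavenumbers of propagating modes in an ideal waveguide."""
        w = 2.0 * np.pi * freq_hz
        n = np.arange(1, 200)
        kz = (n - 0.5) * np.pi / depth_m       # vertical wavenumbers
        kr2 = (w / c) ** 2 - kz ** 2
        return np.sqrt(kr2[kr2 > 0])           # keep propagating modes only

    for depth in (100.0, 80.0):                # a step change in water depth
        k = modal_wavenumbers(50.0, depth)
        print(f"depth {depth} m: {k.size} propagating modes, k1 = {k[0]:.4f} rad/m")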

  17. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  18. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
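
    For readers unfamiliar with scoring rules, the standard Brier score is the simplest concrete example (shown here as background; it is not the alternative evaluation approach proposed in the paper): the mean squared difference between forecast probabilities and the 0/1 outcomes, with lower values indicating better forecasts.

    import numpy as np

    def brier_score(forecast_probs, outcomes):
        p = np.asarray(forecast_probs, dtype=float)
        y = np.asarray(outcomes, dtype=float)
        return float(np.mean((p - y) ** 2))    # lower is better

    print(brier_score([0.9, 0.8, 0.1, 0.3], [1, 1, 0, 0]))  # sharp, well calibrated
    print(brier_score([0.5, 0.5, 0.5, 0.5], [1, 1, 0, 0]))  # uninformative baseline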

  19. Aspect-dependent soil saturation and insight into debris-flow initiation during extreme rainfall in the Colorado Front Range

    Science.gov (United States)

    Ebel, Brian A.; Rengers, Francis K.; Tucker, Gregory E.

    2015-01-01

    Hydrologic processes during extreme rainfall events are poorly characterized because of the rarity of measurements. Improved understanding of hydrologic controls on natural hazards is needed because of the potential for substantial risk during extreme precipitation events. We present field measurements of the degree of soil saturation and estimates of available soil-water storage during the September 2013 Colorado extreme rainfall event at burned (wildfire in 2010) and unburned hillslopes with north- and south-facing slope aspects. Soil saturation was more strongly correlated with slope aspect than with recent fire history; south-facing hillslopes became fully saturated while north-facing hillslopes did not. Our results suggest multiple explanations for why aspect-dependent hydrologic controls favor saturation development on south-facing slopes, causing reductions in effective stress and triggering of slope failures during extreme rainfall. Aspect-dependent hydrologic behavior may result from (1) a larger gravel and stone fraction, and hence lower soil-water storage capacity, on south-facing slopes, and (2) lower weathered-bedrock permeability on south-facing slopes, because of lower tree density and associated deep roots penetrating bedrock as well as less intense weathering, inhibiting soil drainage.

  20. Self-produced Time Intervals Are Perceived as More Variable and/or Shorter Depending on Temporal Context in Subsecond and Suprasecond Ranges

    Directory of Open Access Journals (Sweden)

    Keita Mitani

    2016-06-01

    The processing of time intervals is fundamental for sensorimotor and cognitive functions. Perceptual and motor timing are often performed concurrently (e.g., playing a musical instrument). Although previous studies have shown the influence of body movements on time perception, how we perceive self-produced time intervals has remained unclear. Furthermore, it has been suggested that the timing mechanisms are distinct for the sub- and suprasecond ranges. Here, we compared perceptual performances for self-produced and passively presented time intervals in random contexts (i.e., multiple target intervals presented in a session) across the sub- and suprasecond ranges (Experiment 1) and within the sub- (Experiment 2) and suprasecond (Experiment 3) ranges, and in a constant context (i.e., a single target interval presented in a session) in the sub- and suprasecond ranges (Experiment 4). We show that self-produced time intervals were perceived as shorter and more variable across the sub- and suprasecond ranges and within the suprasecond range but not within the subsecond range in a random context. In a constant context, the self-produced time intervals were perceived as more variable in the suprasecond range but not in the subsecond range. The impairing effects indicate that motor timing interferes with perceptual timing. The dependence of impairment on temporal contexts suggests multiple timing mechanisms for the subsecond and suprasecond ranges. In addition, violation of the scalar property (i.e., a constant variability to target interval ratio) was observed between the sub- and suprasecond ranges. The violation was clearer for motor timing than for perceptual timing. This suggests that the multiple timing mechanisms for the sub- and suprasecond ranges overlap more for perception than for motor. Moreover, the central tendency effect (i.e., where shorter base intervals are overestimated and longer base intervals are underestimated) disappeared with subsecond

  1. Altered Long- and Short-Range Functional Connectivity in Patients with Betel Quid Dependence: A Resting-State Functional MRI Study

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2016-12-01

    Objective: Addiction is a chronic relapsing brain disease. Brain structural abnormalities may constitute an abnormal neural network that underlies the risk of drug dependence. We hypothesized that individuals with Betel Quid Dependence (BQD) have functional connectivity alterations that can be described by long- and short-range functional connectivity density (FCD) maps. Methods: We tested this hypothesis using functional magnetic resonance imaging (fMRI) data from subjects of the Han ethnic group in Hainan, China. Here, we examined BQD individuals (n = 33) and age-, sex-, and education-matched healthy controls (HCs) (n = 32) in a rs-fMRI study to observe FCD alterations associated with the severity of BQD. Results: Compared with HCs, long-range FCD was decreased in the right anterior cingulate cortex (ACC) and increased in the left cerebellum posterior lobe (CPL) and bilateral inferior parietal lobule (IPL) in the BQD group. Short-range FCD was reduced in the right ACC and left dorsolateral prefrontal cortex (dlPFC), and increased in the left CPL. The short-range FCD alteration in the right ACC displayed a negative correlation with the Betel Quid Dependence Scale (BQDS) (r = -0.432, P = 0.012), and the long-range FCD alteration of the left IPL showed a positive correlation with the duration of BQD (r = 0.519, P = 0.002) in BQD individuals. Conclusions: fMRI revealed differences in long- and short-range FCD in BQD individuals, and these alterations might be due to BQ chewing, BQ dependency, or risk factors for developing BQD.

  2. Temperature Dependence of Thin Film Spiral Inductors on Alumina Over a Temperature Range of 25 to 475 C

    Science.gov (United States)

    Ponchak, George E.; Jordan, Jennifer L.; Scardelletti, Maximilian C.

    2010-01-01

    In this paper, we present an analysis of inductors on an Alumina substrate over the temperature range of 25 to 475 C. Five sets of inductors, each set consisting of a 1.5, 2.5, 3.5, and a 4.5 turn inductor with different line width and spacing, were measured on a high temperature probe station from 10 MHz to 30 GHz. From these measured characteristics, it is shown that the inductance is nearly independent of temperature for low frequencies compared to the self resonant frequency, the parasitic capacitances are independent of temperature, and the resistance varies nearly linearly with temperature. These characteristics result in the self resonant frequency decreasing by only a few percent as the temperature is increased from 25 to 475 C, but the maximum quality factor decreases by a factor of 2 to 3. These observations based on measured data are confirmed through 2D simulations using Sonnet software.

  3. Quantum wave packet calculation of reaction probabilities, cross sections, and rate constants for the C(1D) + HD reaction

    Science.gov (United States)

    Gogtas, Fahrettin; Bulut, Niyazi; Akpinar, Sinan

    The time-dependent real wave packet method has been used to study the C(1D) + HD reaction. The state-to-state and state-to-all reactive scattering probabilities for a broad range of energies are calculated at zero total angular momentum. The probabilities for J > 0 are estimated from accurately computed J = 0 probabilities by using the J-shifting approximation. The integral cross sections for a large energy range, and thermal rate constants are calculated.
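
    The J-shifting step can be sketched as follows: the J = 0 reaction probability curve is shifted in energy by an effective rotational term, P_J(E) ≈ P_0(E - B_eff J(J+1)), and the integral cross section is the partial-wave sum σ(E) = (π/k²) Σ_J (2J+1) P_J(E). In the sketch below the probability curve and the effective rotational constant are illustrative placeholders, not the values computed in the paper.

    import numpy as np

    def p0(E):
        """Stand-in for a computed J = 0 reaction probability curve (E in eV)."""
        return 1.0 / (1.0 + np.exp(-(E - 0.05) / 0.01))

    def cross_section(E, mu_amu=1.2, B_eff=1.0e-4, Jmax=120):
        amu, hbar, eV = 1.66054e-27, 1.05457e-34, 1.60218e-19
        k2 = 2.0 * mu_amu * amu * E * eV / hbar**2     # collision wavenumber squared, m^-2
        J = np.arange(Jmax + 1)
        pJ = p0(E - B_eff * J * (J + 1))               # J-shifted probabilities
        return np.pi / k2 * np.sum((2 * J + 1) * pJ) * 1e20   # Angstrom^2

    for E in (0.05, 0.1, 0.2):
        print(E, round(cross_section(E), 2))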

  4. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  5. Gender-Dependent Differences in Hip Range of Motion and Impingement Testing in Asymptomatic College Freshman Athletes.

    Science.gov (United States)

    Czuppon, Sylvia; Prather, Heidi; Hunt, Devyani M; Steger-May, Karen; Bloom, Nancy J; Clohisy, John C; Larsen, Richard; Harris-Hayes, Marcie

    2017-07-01

    Athletic activity is a proposed factor in the development and progression of intra-articular hip pathology. Early diagnosis and preventive treatments in "at-risk" athletes are needed. Our primary objective was to report hip range of motion (ROM) and prevalence of positive impingement testing in asymptomatic college freshman athletes. Our secondary objective was to determine whether an association exists between hip ROM and a positive flexion-adduction-internal rotation (FADIR) test. Cross-sectional study. Collegiate athletic campus. Four hundred thirty (299 male, 131 female) freshman athletes reporting no current or previous hip pain. During the athletes' preseason medical screening, trained examiners performed a hip-specific exam to obtain data for hip ROM and impingement testing. Bilateral passive ROM measures included hip flexion, and hip internal and external rotation with the hip flexed 0° and 90°. Mean age of male participants was 18.5 ± 0.8 and female participants was 18.3 ± 0.6 years (P = .003). Male participants demonstrated less hip ROM than female participants in flexion (115.8 ± 11.2° versus 122.0 ± 10.5°). In asymptomatic college freshman athletes, male athletes generally demonstrated less hip ROM than female athletes. In addition, a positive FADIR was more prevalent than previously reported in healthy young adults. Preseason screenings that use these baseline data in conjunction with other examination findings may allow identification of athletes at future risk for hip pain and/or injury. IV. Copyright © 2017 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  6. TH-C-19A-07: Output Factor Dependence On Range and Modulation for a New Proton Therapy System

    Energy Technology Data Exchange (ETDEWEB)

    Sun, B; Zhao, T; Grantham, K; Goddu, S; Santanam, L; Klein, E [Washington University, St. Louis, MO (United States)

    2014-06-15

    Purpose: Proton treatment planning systems are not able to accurately predict output factors and do not calculate monitor units (MU) for proton fields. Output factors (cGy/MU) for patient-specific fields are usually measured in phantoms or modeled empirically. The purpose of this study is to predict the output factors (OFs) for a given proton range (R90) and modulation width (Mod) for the first Mevion S250 proton therapy system. Methods: Using water phantoms and a calibrated ionization chamber-electrometer, over 100 OFs were measured for various R90 and Mod combinations for 24 different options. OFs were measured at the center of the Mod, which coincided with the isocenter. The measured OFs were fitted using an analytic model developed by Kooy (Phys.Med.Biol. 50, 2005) for each option and a derived universal empirical-based polynomial as a function of R90 and Mod for all options. Options are devised for ranges of R90 and Mod. The predicted OFs from both models were compared to measurements. Results: Using the empirical-based model, the values could be predicted to within 3% for at least 90% of measurements and within 5% for 98% of the measurements. Using the analytic model to fit each option with the same effective source position, the prediction is much more accurate. The maximal deviation between measured and predicted OFs is within 2% and the averaged root-mean-square is 1.5%. Conclusion: Although the measured data were not exhaustive, both models predicted OFs within acceptable uncertainty. Both models are currently used for a sanity check of our continual patient field OF measurements. As we acquire more patient-field OFs, the model will be refined with an ultimate goal of eliminating the time-consuming patient-specific OF measurements.
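
    The empirical modeling step can be illustrated with an ordinary least-squares fit of a low-order polynomial OF(R90, Mod). The data points and the particular polynomial terms below are placeholders, not the clinical measurements or the published model.

    import numpy as np

    def design_matrix(r90, mod):
        r, m = np.asarray(r90, float), np.asarray(mod, float)
        return np.column_stack([np.ones_like(r), r, m, r * m, r**2, m**2])

    # Hypothetical calibration measurements: (R90 in cm, Mod in cm, OF in cGy/MU).
    r90 = np.array([10, 10, 15, 15, 20, 20, 25, 25], float)
    mod = np.array([5, 10, 5, 10, 10, 15, 10, 15], float)
    of_meas = np.array([1.02, 0.98, 1.00, 0.96, 0.94, 0.90, 0.92, 0.88])

    coef, *_ = np.linalg.lstsq(design_matrix(r90, mod), of_meas, rcond=None)
    of_pred = design_matrix(r90, mod) @ coef
    print(np.max(np.abs(of_pred - of_meas) / of_meas))  # worst-case relative error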

  7. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
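
    The headline effect can be reproduced numerically for the log-normal case. The sketch below repeatedly estimates the distribution from a small sample, sets the threshold at the estimated quantile for a nominal 1% failure probability, and averages the true exceedance probability; the average comes out above the nominal level. It is a numerical illustration under assumed parameter values, not the article's exact calculations.

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    mu_true, sigma_true = 0.0, 1.0
    nominal_p, n_obs, n_rep = 0.01, 30, 20_000
    z_nominal = 2.3263478740408408              # standard normal 99th percentile

    true_failure_probs = []
    for _ in range(n_rep):
        sample = rng.lognormal(mu_true, sigma_true, n_obs)
        mu_hat = np.mean(np.log(sample))
        sigma_hat = np.std(np.log(sample), ddof=1)
        threshold_log = mu_hat + z_nominal * sigma_hat      # estimated 99% quantile (log scale)
        z = (threshold_log - mu_true) / sigma_true
        true_failure_probs.append(0.5 * math.erfc(z / math.sqrt(2)))  # P(X > threshold)

    print(f"nominal: {nominal_p}, expected failure frequency: {np.mean(true_failure_probs):.4f}")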

  8. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  9. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper. A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  10. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  11. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  12. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  13. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  14. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  15. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability - frequentist, propensity, classical, Bayesian, and objective Bayesian - and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  16. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  17. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an on-going study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes measured using time-resolved laser-induced fluorescence on a slow atom beam with branching fractions measured from high resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing 100's to 1000's of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We will report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5% and the transition probabilities range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.

  18. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended ...

  20. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.), corresponding to this lecture, is written in French and is provided in the proceedings of the book SOS 2008.

  1. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  2. Experience-dependent enhancement of pitch-specific responses in the auditory cortex is limited to acceleration rates in normal voice range.

    Science.gov (United States)

    Krishnan, A; Gandour, J T; Suresh, C H

    2015-09-10

    The aim of this study is to determine how pitch acceleration rates within and outside the normal pitch range may influence latency and amplitude of cortical pitch-specific responses (CPR) as a function of language experience (Chinese, English). Responses were elicited from a set of four pitch stimuli chosen to represent a range of acceleration rates (two each inside and outside the normal voice range) imposed on the high rising Mandarin Tone 2. Pitch-relevant neural activity, as reflected in the latency and amplitude of scalp-recorded CPR components, varied depending on language-experience and pitch acceleration of dynamic, time-varying pitch contours. Peak latencies of CPR components were shorter in the Chinese than the English group across stimuli. Chinese participants showed greater amplitude than English for CPR components at both frontocentral and temporal electrode sites in response to pitch contours with acceleration rates inside the normal voice pitch range as compared to pitch contours with acceleration rates that exceed the normal range. As indexed by CPR amplitude at the temporal sites, a rightward asymmetry was observed for the Chinese group only. Only over the right temporal site was amplitude greater in the Chinese group relative to the English. These findings may suggest that the neural mechanism(s) underlying processing of pitch in the right auditory cortex reflect experience-dependent modulation of sensitivity to acceleration in just those rising pitch contours that fall within the bounds of one's native language. More broadly, enhancement of native pitch stimuli and stronger rightward asymmetry of CPR components in the Chinese group is consistent with the notion that long-term experience shapes adaptive, distributed hierarchical pitch processing in the auditory cortex, and reflects an interaction with higher order, extrasensory processes beyond the sensory memory trace. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  3. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  4. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  5. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as it would have the same ontological status with, for instance, Euclidean Geometry or Peano Arithmetics. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arxiv papers.

  6. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  7. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  8. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  9. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  10. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  11. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  12. Temperature dependence of the thermal diffusivity of GaAs in the 100-305 K range measured by the pulsed photothermal displacement technique

    Science.gov (United States)

    Soltanolkotabi, M.; Bennis, G. L.; Gupta, R.

    1999-01-01

    We have measured the variation of the value of the thermal diffusivity of semi-insulating GaAs in the 100-305 K range. The method used is the pulsed photothermal displacement technique. This is a noncontact technique, and the value of the thermal diffusivity is derived from the temporal evolution of the signal rather than its amplitude. This makes the technique less susceptible to uncertainties. We find that the temperature dependence of the thermal conductivity of semi-insulating GaAs follows a power law as T^-1.62, in disagreement with results obtained previously. Possible reasons for the deviation within this very important intermediate temperature range are discussed.

  13. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  14. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  15. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  16. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  17. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  18. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  19. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  20. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  1. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  2. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  3. Selective excavation of decalcified dentin using a mid-infrared tunable nanosecond pulsed laser: wavelength dependency in the 6 μm wavelength range

    Science.gov (United States)

    Ishii, Katsunori; Saiki, Masayuki; Yoshikawa, Kazushi; Yasuo, Kenzo; Yamamoto, Kazuyo; Awazu, Kunio

    2011-07-01

    Selective caries treatment has been anticipated as an essential application of dentistry. In the clinic, some lasers have already realized optical drilling of dental hard tissue. However, conventional lasers lack selectivity and still depend on the dentist's skill. Based on the absorption properties of carious dentin, the 6 μm wavelength range shows specific absorption bands and promising characteristics for excavation. The objective of this study is to develop selective excavation of carious dentin using laser ablation in the 6 μm wavelength range. A mid-infrared tunable pulsed laser was obtained by a difference-frequency generation technique, and the wavelength was tuned around the absorption bands called amide 1 and amide 2. In the wavelength range from 5.75 to 6.60 μm, a difference in ablation depth between demineralized and normal dentin was observed. At a wavelength of 6.02 μm and an average power density of 15 W/cm2, demineralized dentin was removed selectively with a less-invasive effect on normal dentin. The wavelength of 6.42 μm required a higher average power density, but also showed the possibility of selective ablation. This study provides valuable insight into the wavelength choice for a novel dental laser device under development for minimal intervention dentistry.

  4. Carbon dots with strong excitation-dependent fluorescence changes towards pH. Application as nanosensors for a broad range of pH

    Energy Technology Data Exchange (ETDEWEB)

    Barati, Ali [Faculty of Chemistry, Institute for Advanced Studies in Basic Sciences, Zanjan (Iran, Islamic Republic of); Department of Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Shamsipur, Mojtaba, E-mail: mshamsipur@yahoo.com [Department of Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Abdollahi, Hamid, E-mail: abd@iasbs.ac.ir [Faculty of Chemistry, Institute for Advanced Studies in Basic Sciences, Zanjan (Iran, Islamic Republic of)

    2016-08-10

    In this study, preparation of novel pH-sensitive N-doped carbon dots (NCDs) using glucose and urea is reported. The prepared NCDs present strong excitation-dependent fluorescence changes towards pH, which is a new behavior for these nanomaterials. By taking advantage of this unique behavior, two separate ratiometric pH sensors using emission spectra of the NCDs for both acidic (pH 2.0 to 8.0) and basic (pH 7.0 to 14.0) ranges of pH are constructed. Additionally, by considering the entire Excitation–Emission Matrix (EEM) of NCDs as the analytical signal and using a suitable multivariate calibration method, a broad range of pH from 2.0 to 14.0 was well calibrated. The multivariate calibration method was independent of the concentration of NCDs and resulted in a very low average prediction error of 0.067 pH units. No changes in the predicted pH under UV irradiation (for 3 h) and at high ionic strength (up to 2 M NaCl) indicated the high stability of this pH nanosensor. The practicality of this pH nanosensor for pH determination in real water samples was validated with good accuracy and repeatability. - Highlights: • Novel pH-sensitive carbon dots with strong FL changes towards pH are reported. • Ratiometric FL pH-sensors for both acidic and basic ranges of pH are constructed. • Multivariate calibration methods were used to calibrate a broad range of pH. • Using EEM of carbon dots and ANN, pH from 2.0 to 14.0 was well calibrated. • The pH prediction is stable even at high ionic strength up to 2 M NaCl.

  5. Temperature- and frequency-dependent dielectric properties of biological tissues within the temperature and frequency ranges typically used for magnetic resonance imaging-guided focused ultrasound surgery.

    Science.gov (United States)

    Fu, Fanrui; Xin, Sherman Xuegang; Chen, Wufan

    2014-02-01

    This study aimed to obtain the temperature- and frequency-dependent dielectric properties of tissues subjected to magnetic resonance (MR) scanning for MR imaging-guided focused ultrasound surgery (MRgFUS). These variables are necessary to calculate radio frequency electromagnetic fields distribution and specific radio frequency energy absorption rate (SAR) in the healthy tissues surrounding the target tumours, and their variation may affect the efficacy of advanced RF pulses. The dielectric properties of porcine uterus, liver, kidney, urinary bladder, skeletal muscle, and fat were determined using an open-ended coaxial probe method. The temperature range was set from 36 °C to 60 °C; and the frequencies were set at 42.58 (1 T), 64 (1.5 T), 128 (3 T), 170 (4 T), 298 (7 T), 400 (9 T), and 468 MHz (11 T). Within the temperature and frequency ranges, the dielectric constants were listed as follows: uterus 49.6-121.64, liver 44.81-127.68, kidney 37.3-169.26, bladder 42.43-125.95, muscle 58.62-171.7, and fat 9.2327-20.2295. The following conductivities were obtained at the same temperature and frequency ranges: uterus 0.5506-1.4419, liver 0.5174-0.9709, kidney 0.8061-1.3625, bladder 0.6766-1.1817, muscle 0.8983-1.3083, and fat 0.1552-0.2316. The obtained data are consistent with the temperature and frequency ranges typically used in MRgFUS and thus can be used as reference to calculate radio frequency electromagnetic fields and SAR distribution inside the healthy tissues subjected to MR scanning for MRgFUS.

  6. Paired-pulse transcranial magnetic stimulation reveals probability-dependent changes in functional connectivity between right inferior frontal cortex and primary motor cortex during go/no-go performance

    Directory of Open Access Journals (Sweden)

    A Dilene van Campen

    2013-11-01

    The functional role of the right inferior frontal cortex (rIFC) in mediating human behavior is the subject of ongoing debate. Activation of the rIFC has been associated with both response inhibition and with signaling action adaptation demands resulting from unpredicted events. The goal of this study is to investigate the role of rIFC by combining a go/no-go paradigm with paired-pulse transcranial magnetic stimulation (ppTMS) over rIFC and the primary motor cortex (M1) to probe the functional connectivity between these brain areas. Participants performed a go/no-go task with 20% or 80% of the trials requiring response inhibition (no-go trials) in a classic and a reversed version of the task, respectively. Responses were slower to infrequent compared to frequent go trials, while commission errors were more prevalent to infrequent compared to frequent no-go trials. We hypothesized that if rIFC is involved primarily in response inhibition, then rIFC should exert an inhibitory influence over M1 on no-go (inhibition) trials regardless of no-go probability. If, by contrast, rIFC has a role on unexpected trials other than just response inhibition, then rIFC should influence M1 on infrequent trials regardless of response demands. We observed that rIFC suppressed M1 excitability during frequent no-go trials, but not during infrequent no-go trials, suggesting that the role of rIFC in response inhibition is context dependent rather than generic. Importantly, rIFC was found to facilitate M1 excitability on all low-frequency trials, irrespective of whether the infrequent event involved response inhibition, a finding more in line with a predictive coding framework of cognitive control.

  7. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  10. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  11. Challenging Adiabatic Time-dependent Density Functional Theory with a Hubbard Dimer: The Case of Time-Resolved Long-Range Charge Transfer

    CERN Document Server

    Fuks, Johanna I

    2014-01-01

    We explore an asymmetric two-fermion Hubbard dimer to test the accuracy of the adiabatic approximation of time-dependent density functional theory in modelling time-resolved charge transfer. We show that the model shares essential features of a ground state long-range molecule in real-space, and by applying a resonant field we show that the model also reproduces essential traits of the CT dynamics. The simplicity of the model allows us to propagate with an "adiabatically-exact" approximation, i.e. one that uses the exact ground-state exchange-correlation functional, and compare with the exact propagation. This allows us to study the impact of the time-dependent charge-transfer step feature in the exact correlation potential of real molecules on the resulting dynamics. Tuning the parameters of the dimer allows a study both of charge-transfer between open-shell fragments and between closed-shell fragments. We find that the adiabatically-exact functional is unable to properly transfer charge, even in situations ...

  12. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving ...

  13. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
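    The gradient property stated in this abstract can be illustrated with the most familiar special case: for the multinomial logit model, the logsum (expected maximum utility) function plays the role of such a generating function, and its gradient is the vector of logit choice probabilities. The sketch below is a generic textbook check of that fact under this assumption; it is not code from the paper, and the utility values are arbitrary.

    ```python
    # Generic illustration: the logsum G(u) = log(sum_j exp(u_j)) acts as a
    # choice-probability generating function for the multinomial logit model;
    # its gradient equals the logit (softmax) choice probabilities. The gradient
    # is checked numerically with central finite differences.
    import numpy as np

    def cpgf_logsum(u):
        return np.log(np.sum(np.exp(u)))

    def logit_probabilities(u):
        e = np.exp(u - u.max())      # numerically stabilized softmax
        return e / e.sum()

    u = np.array([1.0, 0.2, -0.5, 2.0])   # arbitrary systematic utilities
    h = 1e-6
    grad = np.array([(cpgf_logsum(u + h * e_i) - cpgf_logsum(u - h * e_i)) / (2 * h)
                     for e_i in np.eye(len(u))])

    print(grad)                                                  # gradient of the CPGF
    print(logit_probabilities(u))                                # logit choice probabilities
    print(np.allclose(grad, logit_probabilities(u), atol=1e-6))  # True
    ```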

  14. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Contents: Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  15. Randomized placebo-controlled dose-ranging and pharmacodynamics study of roxadustat (FG-4592) to treat anemia in nondialysis-dependent chronic kidney disease (NDD-CKD) patients.

    Science.gov (United States)

    Besarab, Anatole; Provenzano, Robert; Hertel, Joachim; Zabaneh, Raja; Klaus, Stephen J; Lee, Tyson; Leong, Robert; Hemmerich, Stefan; Yu, Kin-Hung Peony; Neff, Thomas B

    2015-10-01

    Roxadustat (FG-4592) is an oral hypoxia-inducible factor prolyl hydroxylase inhibitor that stimulates erythropoiesis. This Phase 2a study tested efficacy (Hb response) and safety of roxadustat in anemic nondialysis-dependent chronic kidney disease (NDD-CKD) subjects. NDD-CKD subjects with hemoglobin (Hb) ≤11.0 g/dL were sequentially enrolled into four dose cohorts and randomized to roxadustat or placebo two times weekly (BIW) or three times weekly (TIW) for 4 weeks, in an approximate roxadustat:placebo ratio of 3:1. Efficacy was assessed by (i) mean Hb change (ΔHb) from baseline (BL) and (ii) proportion of Hb responders (ΔHb ≥ 1.0 g/dL). Pharmacodynamic evaluation was performed in a subset of subjects. Safety was evaluated by adverse event frequency/severity. Of 116 subjects receiving treatment, 104 completed 4 weeks of dosing and 96 were evaluable for efficacy. BL characteristics for roxadustat and placebo groups were comparable. In roxadustat-treated subjects, Hb levels increased from BL in a dose-related manner in the 0.7, 1.0, 1.5 and 2.0 mg/kg groups. Maximum ΔHb within the first 6 weeks was significantly higher in the 1.5 and 2.0 mg/kg groups than in the placebo subjects. Hb responder rates were dose dependent and ranged from 30% in the 0.7 mg/kg BIW group to 100% in the 2.0 mg/kg BIW and TIW groups versus 13% in placebo. Roxadustat transiently and moderately increased endogenous erythropoietin and reduced hepcidin. Adverse events were similar in the roxadustat and placebo groups. Roxadustat produced dose-dependent increases in blood Hb among anemic NDD-CKD patients in a placebo-controlled trial. Clintrials.gov #NCT00761657. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA.

  16. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  17. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
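    As a generic illustration of the idea (not of the specific weighted estimators compared in this paper), the sketch below estimates P(X > t) from the empirical distribution inside the data range and extrapolates beyond the data by fitting an exponential tail to exceedances over a high threshold. The synthetic data, threshold choice, and exponential tail shape are all assumptions.

    ```python
    # Generic sketch (not the paper's estimators): empirical tail probability inside
    # the data range, exponential-tail extrapolation beyond it, fitted to exceedances
    # over a high threshold. All numerical choices are assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.sort(rng.exponential(scale=2.0, size=500))   # synthetic sample

    def tail_probability(x, t, threshold_quantile=0.9):
        u = np.quantile(x, threshold_quantile)          # tail-fitting threshold
        if t <= x[-1]:
            return np.mean(x > t)                       # empirical estimate inside the data
        beta = np.mean(x[x > u] - u)                    # MLE scale of the exponential excesses
        return (1.0 - threshold_quantile) * np.exp(-(t - u) / beta)

    for t in (3.0, 8.0, 20.0):                          # the last value lies beyond the sample
        print(t, tail_probability(x, t), np.exp(-t / 2.0))   # estimate vs. true tail probability
    ```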

  18. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  19. Tensile properties of the hip joint ligaments are largely variable and age-dependent - An in-vitro analysis in an age range of 14-93 years.

    Science.gov (United States)

    Schleifenbaum, Stefan; Prietzel, Torsten; Hädrich, Carsten; Möbius, Robert; Sichting, Freddy; Hammer, Niels

    2016-10-03

    Hip joint stability is maintained by the surrounding ligaments, muscles, and the atmospheric pressure exerted via these structures. It is unclear whether the ligaments are capable of preventing dislocation solely due to their tensile properties, and to what extent they undergo age-related changes. This study aimed to obtain stress-strain data of the hip ligaments over a large age range. Stress-strain data of the iliofemoral (IL), ischiofemoral (IS) and pubofemoral ligament (PF) were obtained from cadavers ranging between 14 and 93 years using a highly standardized setting. Maximum strains were compared to the distances required for dislocation. Elastic modulus was 24.4 (IL), 22.4 (IS) and 24.9 N/mm^2 (PF), respectively. Maximum strain was 84.5%, 86.1%, 72.4% and ultimate stress 10.0, 7.7 and 6.5 N/mm^2 for the IL, IS and PF, respectively. None of these values varied significantly between ligaments or sides. The IS' elastic modulus was higher and maximum strain lower in males. Lower elastic moduli of the PF and higher maximum strains for the IS and PF were revealed in the ≥55 compared to the <55 age group. The tensile properties of the hip ligaments are largely variable. The IS and PF change age-dependently. Though the hip ligaments contribute to hip stability, the IS and cranial IL may not prevent dislocation due to their elasticity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. An all-timescales rainfall probability distribution

    Science.gov (United States)

    Papalexiou, S. M.; Koutsoyiannis, D.

    2009-04-01

    The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance as typically the hydraulic design strongly depends on the rainfall model choice. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search in the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.

  1. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  2. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  3. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  4. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  5. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  6. The Theory of Probability

    Indian Academy of Sciences (India)

    The Theory of Probability. Andrei Nikolaevich Kolmogorov. Resonance – Journal of Science Education, Classics, Volume 3, Issue 4, April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  7. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Probability Theory Without Tears! S Ramasubramanian. Resonance – Journal of Science Education, Book Review, Volume 1, Issue 2, February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  8. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  9. On Randomness and Probability

    Indian Academy of Sciences (India)

    On Randomness and Probability: How to Mathematically Model Uncertain Events. Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India. Resonance – Journal of Science Education, Volume 1, Issue 2.

  10. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations ... which reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures ...

  13. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using the information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
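    The final step described here, turning event counts into a probability of occurrence, is a routine rate estimate. A minimal generic sketch is shown below with placeholder counts; the actual counts come from the cited Framatome ANP 2001a data and are not reproduced here, and the Jeffreys interval is only one common choice, not necessarily the method used in the calculation.

    ```python
    # Generic rate-estimation sketch with placeholder counts (not the values from
    # the cited report): point estimate and a Jeffreys (Beta(0.5, 0.5) prior) 95%
    # interval for a per-movement misload probability.
    from scipy.stats import beta

    misload_events = 4          # hypothetical number of observed misload events
    assembly_moves = 250_000    # hypothetical number of fuel assembly movements

    p_hat = misload_events / assembly_moves
    lower = beta.ppf(0.025, misload_events + 0.5, assembly_moves - misload_events + 0.5)
    upper = beta.ppf(0.975, misload_events + 0.5, assembly_moves - misload_events + 0.5)
    print(f"point estimate: {p_hat:.2e}   95% interval: ({lower:.2e}, {upper:.2e})")
    ```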

  14. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  15. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  16. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ^2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
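    A familiar concrete instance of such a variance superposition, used here purely as an assumed illustration and not taken from the paper, is the Student-t law: smearing the Gaussian variance v with an inverse-gamma distribution reproduces a t distribution. The sketch below checks this by sampling; the degrees of freedom and sample size are arbitrary.

    ```python
    # Assumed illustration: superposing zero-mean Gaussians over the variance v with
    # an inverse-gamma smearing distribution reproduces a Student-t law with nu
    # degrees of freedom; verified by sampling and a Kolmogorov-Smirnov comparison.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    nu, n = 4.0, 200_000                  # degrees of freedom and sample size (assumed)

    v = stats.invgamma.rvs(nu / 2.0, scale=nu / 2.0, size=n, random_state=rng)
    x = rng.normal(0.0, np.sqrt(v))       # Gaussian draws with smeared variance

    result = stats.kstest(x, "t", args=(nu,))
    print(result.statistic, result.pvalue)   # small statistic / large p-value: close match
    ```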

  17. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  18. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; The Working Group of 1980; From classical repetition to practica...

  19. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data for binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of second-order cone programming problems. Moreover, we extend the linear SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
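    The "maximum probability of misclassification" that MPM minimizes rests on a distribution-free (Chebyshev-type) bound that uses only a class mean and covariance. A minimal sketch evaluating that bound for a fixed hyperplane, with made-up means and covariances (it does not perform the MPM or SMPM optimization itself):

      import numpy as np

      def worst_case_error(w, b, mu, Sigma):
          """Distribution-free upper bound on P(w.x <= b) over all distributions
          with mean mu and covariance Sigma (multivariate Chebyshev bound)."""
          margin = float(w @ mu - b)
          if margin <= 0:           # mean already on the wrong side: bound is trivial
              return 1.0
          d2 = margin**2 / float(w @ Sigma @ w)
          return 1.0 / (1.0 + d2)

      # Illustrative two-class setting (values are made up for the example).
      mu_pos, mu_neg = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
      Sigma = np.eye(2) * 0.5
      w, b = np.array([1.0, 1.0]), 0.0   # separating hyperplane w.x = b

      print(worst_case_error(w, b, mu_pos, Sigma))    # bound for the positive class (0.2)
      print(worst_case_error(-w, -b, mu_neg, Sigma))  # bound for the negative class (0.2)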

  20. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  1. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  2. Pseudorapidity dependence of long-range two-particle correlations in $p$Pb collisions at $\\sqrt{s_{NN}}=5.02$ TeV

    Energy Technology Data Exchange (ETDEWEB)

    Khachatryan, Vardan; et al.

    2016-04-18

    Two-particle correlations in pPb collisions at a nucleon-nucleon center-of-mass energy of 5.02 TeV are studied as a function of the pseudorapidity separation ($\Delta\eta$) of the particle pair at small relative azimuthal angle ($|\Delta\phi| < \pi/3$). The correlations are decomposed into a jet component that dominates the short-range correlations ($|\Delta\eta| < 1$), and a component that persists at large $\Delta\eta$ and may originate from collective behavior of the produced system. The events are classified in terms of the multiplicity of the produced particles. Finite azimuthal anisotropies are observed in high-multiplicity events. The second and third Fourier components of the particle-pair azimuthal correlations, $V_{2}$ and $V_{3}$, are extracted after subtraction of the jet component. The single-particle anisotropy parameters $v_{2}$ and $v_{3}$ are normalized by their lab-frame mid-rapidity value and are studied as a function of $\eta_{\mathrm{cm}}$. The normalized $v_{2}$ distribution is found to be asymmetric about $\eta_{\mathrm{cm}} = 0$, with smaller values observed at forward pseudorapidity, corresponding to the direction of the proton beam, while no significant pseudorapidity dependence is observed for the normalized $v_{3}$ distribution within the statistical uncertainties.
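    The extraction itself is not spelled out in the abstract; the factorization conventionally used in such long-range correlation analyses is (notation added here, not quoted from the paper):

      \[
      \frac{1}{N_{\mathrm{pair}}}\frac{dN^{\mathrm{pair}}}{d\Delta\phi} \;\propto\; 1 + 2\sum_{n} V_{n\Delta}\cos(n\,\Delta\phi),
      \qquad
      v_{n}(\eta) \;=\; \frac{V_{n\Delta}(\eta,\ \eta^{\mathrm{ref}})}{\sqrt{V_{n\Delta}(\eta^{\mathrm{ref}},\ \eta^{\mathrm{ref}})}},
      \]

    after which $v_{2}(\eta)$ and $v_{3}(\eta)$ are divided by their mid-rapidity values to give the normalized distributions discussed above.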

  3. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  4. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  5. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  6. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to basis VII of the international seismic scale of intensity, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters described above is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation and have drawn Map No. 1, in percentage. The epicenters with intensity from X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the seismic contingent phenomenon, and it improves the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given side.

  7. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  8. Haavelmo's Probability Approach and the Cointegrated VAR

    DEFF Research Database (Denmark)

    Juselius, Katarina

    dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically, the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2

  9. A nonparametric method for predicting survival probabilities

    NARCIS (Netherlands)

    van der Klaauw, B.; Vriend, S.

    2015-01-01

    Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose

  10. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the

  11. Robust Model-Free Multiclass Probability Estimation

    Science.gov (United States)

    Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng

    2010-01-01

    Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386
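    A rough sketch of the weighted-classifier idea for the binary case (a simplification for illustration, not the authors' exact algorithm; the data set and choice of large-margin classifier are arbitrary): train classifiers whose class weights target different probability thresholds and read off where the predicted label flips.

      import numpy as np
      from sklearn.svm import LinearSVC
      from sklearn.datasets import make_classification

      # Hypothetical data; labels in {-1, +1}.
      X, y = make_classification(n_samples=500, n_features=5, random_state=0)
      y = 2 * y - 1

      def estimate_probabilities(X_train, y_train, X_query, grid=np.linspace(0.05, 0.95, 19)):
          """Estimate P(y=+1 | x) by bracketing: a large-margin classifier trained with
          class weights (1-pi, pi) targets the threshold P(y=+1|x) > pi, so the
          predicted label flips roughly at pi = P(y=+1|x)."""
          preds = []
          for pi in grid:
              clf = LinearSVC(class_weight={1: 1.0 - pi, -1: pi}, C=1.0, max_iter=5000)
              clf.fit(X_train, y_train)
              preds.append(clf.predict(X_query) == 1)
          preds = np.array(preds)            # shape: (len(grid), n_query)
          # Probability estimate: fraction of thresholds still predicting +1.
          return preds.mean(axis=0)

      print(estimate_probabilities(X, y, X[:5]))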

  12. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range from 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographic specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
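    As an illustration of the curve-fitting step, a three-parameter logistic can be fit to (PGA, observed liquefaction fraction) pairs; the functional form and data below are illustrative assumptions, not the published curves:

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic3(pga, a, b, c):
          """Three-parameter logistic: upper plateau a, midpoint b, width c."""
          return a / (1.0 + np.exp(-(pga - b) / c))

      # Hypothetical (PGA [g], observed liquefaction fraction) pairs for one geologic unit.
      pga = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40, 0.50])
      frac = np.array([0.01, 0.03, 0.08, 0.18, 0.35, 0.44, 0.48])

      params, _ = curve_fit(logistic3, pga, frac, p0=[0.5, 0.2, 0.08])
      a, b, c = params
      print(f"P(liquefaction | PGA=0.25 g) = {logistic3(0.25, a, b, c):.2f}")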

  13. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  14. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median, as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, one finds further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
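    Two of the claims above can be made concrete with standard exchangeability algebra (notation added here: $m_j$ denotes the $j$-th moment of the distribution of the definitive number $p$):

      \[
      P(r \text{ occurrences in } n \text{ trials})
      = \binom{n}{r}\,\mathbb{E}\!\left[p^{r}(1-p)^{n-r}\right]
      = \binom{n}{r}\sum_{k=0}^{n-r}\binom{n-r}{k}(-1)^{k}\,m_{r+k},
      \]

    which involves only the first $n$ moments $m_1,\dots,m_n$; and, for the coin example,

      \[
      P(\text{head}_2 \mid \text{head}_1) = \frac{m_2}{m_1} \;\ge\; m_1 = P(\text{head}_1),
      \]

    with equality only when $p$ is known with certainty, since $m_2 \ge m_1^2$ by the nonnegativity of the variance of $p$.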

  15. Criterion 1: Conservation of biological diversity - Indicator 8: The number of forest dependent species that occupy a small portion of their former range

    Science.gov (United States)

    Curtis H. Flather; Carolyn Hull Sieg; Michael S. Knowles; Jason McNees

    2003-01-01

    This indicator measures the portion of a species' historical distribution that is currently occupied as a surrogate measure of genetic diversity. Based on data for 1,642 terrestrial animals associated with forests, most species (88 percent) were found to fully occupy their historic range - at least as measured by coarse state-level occurrence patterns. Of the 193...

  16. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to their anxiety level, meaning that a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students benefited from the probability workshop: their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  17. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, several influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
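    A minimal sketch of such a binary logit specification on synthetic data (field names, coefficients and the outcome below are fabricated solely so the example runs; they are not the paper's data):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 1000

      # Synthetic loan records with the kinds of fields discussed above
      # (sum of contract, remoteness of the borrower, month of birth).
      df = pd.DataFrame({
          "contract_sum": rng.uniform(500, 20_000, n),
          "remoteness_km": rng.uniform(0, 300, n),
          "birth_month": rng.integers(1, 13, n),
      })
      # Synthetic outcome: 1 = loan returned, generated only to make the example run.
      logit_true = -1.0 + 1e-4 * df.contract_sum + 0.004 * df.remoteness_km - 0.05 * df.birth_month
      df["returned"] = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

      X = sm.add_constant(df[["contract_sum", "remoteness_km", "birth_month"]])
      model = sm.Logit(df["returned"], X).fit(disp=0)
      print(model.params)              # signs indicate direction of each effect
      print(model.predict(X.iloc[:3])) # predicted probabilities of returning a loan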

  18. Return probability: Exponential versus Gaussian decay

    Energy Technology Data Exchange (ETDEWEB)

    Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)

    2006-02-13

    We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes: one characterized by a Gaussian decay of the return probability, the other by the well-known exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.
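    For reference, the quantity in question is the survival (return) probability of the initial state; its short-time Gaussian and exponential regimes are the standard results (with $\hbar = 1$ and $\sigma_E^{2}$ the energy variance of the initial state):

      \[
      W(t) = \bigl|\langle\psi_0|\,e^{-iHt}\,|\psi_0\rangle\bigr|^{2},
      \qquad
      W(t) \simeq e^{-\sigma_E^{2}t^{2}} \ \ (t \to 0),
      \qquad
      W(t) \simeq e^{-\Gamma t} \ \ \text{for a Breit-Wigner strength function of width } \Gamma.
      \]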

  19. Randomized placebo-controlled dose-ranging and pharmacodynamics study of roxadustat (FG-4592) to treat anemia in nondialysis-dependent chronic kidney disease (NDD-CKD) patients

    OpenAIRE

    Besarab, Anatole; Provenzano, Robert; Hertel, Joachim; Zabaneh, Raja; Klaus, Stephen J; Lee, Tyson; Leong, Robert; Hemmerich, Stefan; Yu, Kin-Hung Peony; Neff, Thomas B

    2015-01-01

    Background Roxadustat (FG-4592) is an oral hypoxia-inducible factor prolyl hydroxylase inhibitor that stimulates erythropoiesis. This Phase 2a study tested efficacy (Hb response) and safety of roxadustat in anemic nondialysis-dependent chronic kidney disease (NDD-CKD) subjects. Methods NDD-CKD subjects with hemoglobin (Hb) ≤11.0 g/dL were sequentially enrolled into four dose cohorts and randomized to roxadustat or placebo two times weekly (BIW) or three times weekly (TIW) for 4 weeks, in an a...

  20. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
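    A toy bounded-accumulation simulation of the proposed mechanism, in which a prior-dependent bias signal grows with elapsed decision time (parameters are arbitrary; this is not the authors' fitted model):

      import numpy as np

      rng = np.random.default_rng(2)

      def simulate_trial(coherence, prior_bias, dt=0.001, bound=1.0, k=8.0,
                         noise=1.0, max_t=3.0):
          """Toy bounded-accumulation trial: momentary evidence with drift k*coherence
          plus a prior-dependent bias increment that grows with elapsed decision time."""
          x, t = 0.0, 0.0
          while t < max_t:
              t += dt
              evidence = k * coherence * dt + noise * np.sqrt(dt) * rng.normal()
              bias_increment = prior_bias * dt * t   # dynamic bias: grows with time
              x += evidence + bias_increment
              if abs(x) >= bound:
                  return (1 if x > 0 else -1), t
          return (1 if x > 0 else -1), max_t

      # With zero-strength stimuli, a positive prior bias pushes choices toward +1.
      choices = [simulate_trial(coherence=0.0, prior_bias=1.5)[0] for _ in range(500)]
      print(np.mean(np.array(choices) == 1))   # fraction of +1 choices, above 0.5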

  1. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  2. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in Probability and Statistics. Descriptive statistics are presented first, followed by a review of probability. Discrete and continuous distributions are presented. Sampling and estimation, together with hypothesis testing, are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  3. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
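    For the well-mixed (complete-graph) baseline against which such clique-based graphs are compared, the fixation probability of a single mutant of fitness r has the closed form (1 - 1/r)/(1 - r^(-N)); a minimal Monte-Carlo check of that baseline (an illustration, not the paper's clique-based simulations):

      import numpy as np

      rng = np.random.default_rng(3)

      def moran_fixation(N=20, r=1.5, trials=5000):
          """Monte Carlo fixation probability of one mutant of fitness r in a
          well-mixed (complete-graph) Moran birth-death process."""
          fixations = 0
          for _ in range(trials):
              i = 1                                   # number of mutants
              while 0 < i < N:
                  p_mut_birth = i * r / (i * r + (N - i))
                  if rng.random() < p_mut_birth:      # a mutant reproduces...
                      if rng.random() < (N - i) / N:  # ...and replaces a resident
                          i += 1
                  else:                               # a resident reproduces...
                      if rng.random() < i / N:        # ...and replaces a mutant
                          i -= 1
              fixations += (i == N)
          return fixations / trials

      N, r = 20, 1.5
      analytic = (1 - 1 / r) / (1 - r ** (-N))
      print(moran_fixation(N, r), analytic)           # should agree within MC error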

  4. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between...... the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under...... certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models....

  5. A comparison of wrist function, range of motion and pain between sports and non sports wheelchair-dependent persons with carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Farshad Okhovatian

    2012-04-01

    Full Text Available Background and Aim: Carpal tunnel syndrome is common among handicapped people who use wheelchairs, and repeated wrist movements increase the risk of this syndrome. In the present study, wrist function, pain and range of motion were compared between athlete and non-athlete handicapped people suffering from carpal tunnel syndrome. Materials and Methods: In this descriptive study, all members of the handicapped basketball team in Tehran (35 persons) and 33 wheelchair-bound non-athlete handicapped persons residing in Tehran sanitariums (similar with respect to age, weight, height, years of wheelchair use and level of disability) were studied. A clinical questionnaire and a nerve conduction study were used to diagnose carpal tunnel syndrome, a VAS scale to measure pain, a goniometer to measure wrist range of motion, and a self-administered questionnaire to investigate severity of symptoms and function. Results: The findings of this study indicated that there was no significant difference between the athlete and non-athlete handicapped groups with carpal tunnel syndrome in the prevalence of carpal tunnel syndrome, severity of pain, function or wrist range of motion (p>0.05). Among the 35 athletes, 6 persons (mean age: 36±3.11, mean weight: 68±4.74, mean height: 172±7) and among the 33 non-athletes, 5 persons (mean age: 41±7.1, mean weight: 73±3, mean height: 173±5) had carpal tunnel syndrome. Conclusion: Contrary to what is commonly supposed, repeated wrist movement is not the only factor predisposing athlete handicapped people to carpal tunnel syndrome, so other influencing factors should be considered.

  6. Analysis of selected fungi variation and its dependence on season and mountain range in southern Poland-key factors in drawing up trial guidelines for aeromycological monitoring.

    Science.gov (United States)

    Pusz, Wojciech; Weber, Ryszard; Dancewicz, Andrzej; Kita, Włodzimierz

    2017-09-27

    The aim of the study was to identify fungal spores, in particular those of plant pathogenic fungi, occurring in the air in selected mountain ranges. The results revealed not only the array of fungal species migrating with air currents from the Czech Republic and Slovakia but also how the season of the year affects the distribution of spores. Such studies may lay a foundation for future aeromycological monitoring, in accordance with the requirements for integrated plant protection. Aeromycological research was carried out between 2013 and 2016 at 3-month intervals in mountainous areas along the southern borders of Poland: the Bieszczady, the Pieniny, the Giant Mountains (Karkonosze) and the Babia Góra Massif. The research relied on the impact method, employing an Air Ideal 3P sampler that collects fungal spores by drawing in atmospheric air. Regardless of altitudinal zonation, changing weather conditions appeared to be the main reason for the variations in the numbers of the fungal spores under study in those years.

  7. Failure probability of regional flood defences

    Directory of Open Access Journals (Sweden)

    Lendering Kasper

    2016-01-01

    Full Text Available Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This paper proposes a methodology to quantify the probability of flooding of regional flood defence systems, which requires several additions to the methodology used for the primary flood defence system. These additions concern a method to account for regulation of regional water levels, the possibility of (reduced) intrusion resistance due to maintenance dredging in regional waters, the probability of traffic loads, and the influence of dependence between regional water levels and the phreatic surface of a regional flood defence. In addition, reliability updating is used to demonstrate the potential for updating the probability of failure of regional flood defences with performance observations. The results demonstrated that the proposed methodology can be used to determine the probability of flooding of a regional flood defence system. In doing so, the methodology contributes to improving flood risk management in these systems.
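    A generic load-resistance Monte Carlo sketch of how such a failure probability can be estimated (the distributions, parameters and the crude load-to-phreatic-surface coupling below are assumptions for illustration, not the paper's models):

      import numpy as np

      rng = np.random.default_rng(4)
      n = 1_000_000

      # Hypothetical annual maximum water level (load) and effective retaining height
      # (resistance), both in metres; distributions and parameters are illustrative only.
      load = rng.gumbel(loc=0.8, scale=0.15, size=n)        # regulated regional water level
      resistance = rng.normal(loc=1.4, scale=0.12, size=n)  # effective retaining height

      # Simple dependence illustration: a wetter year also raises the phreatic surface,
      # modelled here as a crude reduction of resistance correlated with the load.
      resistance -= 0.1 * (load - load.mean())

      p_failure = np.mean(load > resistance)
      print(f"estimated annual failure probability: {p_failure:.2e}")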

  8. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  9. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  10. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  11. Bond length (Ti-O) dependence of nano ATO3-based (A = Pb, Ba, Sr) perovskite structures: Optical investigation in IR range

    Science.gov (United States)

    Ghasemifard, Mahdi; Ghamari, Misagh; Okay, Cengiz

    2018-01-01

    In the current study, ABO3 (A = Pb, Ba, Sr; B = Ti) perovskite structures are produced by the auto-combustion route using citric acid (CA) as fuel and nitric acid (NA) as oxidizer. The X-ray diffraction (XRD) patterns confirmed perovskite nanostructures with cubic, tetragonal, and rhombohedral symmetry for SrTiO3, PbTiO3, and BaTiO3, respectively. Using Scherrer’s equation and the XRD patterns, the average crystallite sizes of the samples were obtained. The effect of the Ti-O bond length on the structure of the samples was evaluated: the type of structure obtained depends on the Ti-O bond length, which is in turn influenced by the A2+ substitution. Microstructural studies of nanostructures calcined at 850 °C confirmed the formation of polyhedral particles with a narrow size distribution. The values of the optical band gaps were measured and the impact of A2+ was discussed. Optical properties such as the complex refractive index and the dielectric function were calculated from IR spectroscopy using the Kramers-Kronig (K-K) relations. Lead, the densest of the A-site elements considered, changes the optical constants remarkably by altering the titanium-oxygen distance in the TiO6 groups.
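    For reference, Scherrer's equation relates the XRD line broadening to the mean crystallite size (the shape factor $K$ is commonly taken as about 0.9; the value used by the authors is not stated here):

      \[
      D = \frac{K\lambda}{\beta\cos\theta},
      \]

    where $\lambda$ is the X-ray wavelength, $\beta$ the instrument-corrected full width at half maximum of the reflection (in radians) and $\theta$ the Bragg angle.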

  12. Porous silicon-VO2 based hybrids as possible optical temperature sensor: Wavelength-dependent optical switching from visible to near-infrared range

    Energy Technology Data Exchange (ETDEWEB)

    Antunez, E. E.; Salazar-Kuri, U.; Estevez, J. O.; Basurto, M. A.; Agarwal, V., E-mail: vagarwal@uaem.mx [Centro de Investigación en Ingeniería y Ciencias Aplicadas, Instituto de Investigación en Ciencias Básicas y Aplicadas, UAEM, Av. Universidad 1001, Col. Chamilpa, Cuernavaca, Mor. 62209 (Mexico); Campos, J. [Instituto de Energías Renovables, UNAM, Priv. Xochicalco S/N, Temixco, Mor. 62580 (Mexico); Jiménez Sandoval, S. [Laboratorio de Investigación en Materiales, Centro de Investigación y estudios Avanzados del Instituto Politécnico Nacional, Unidad Querétaro, Qro. 76001 (Mexico)

    2015-10-07

    Morphological properties of thermochromic VO2/porous silicon based hybrids reveal the growth of well-crystallized nanometer-scale features of VO2 as compared with the typical submicron granular structure obtained in thin films deposited on flat substrates. Structural characterization performed as a function of temperature via grazing incidence X-ray diffraction and micro-Raman demonstrates a reversible semiconductor-metal transition of the hybrid, changing from a low-temperature monoclinic VO2(M) to a high-temperature tetragonal rutile VO2(R) crystalline structure, coupled with a decrease in phase transition temperature. The effective optical response, studied in terms of the red/blue shift of the reflectance spectra, results in wavelength-dependent optical switching with temperature. As compared to a VO2 film on a crystalline silicon substrate, the hybrid structure is found to demonstrate up to a 3-fold increase in the change of reflectivity with temperature, an enlarged hysteresis loop and a wider operational window for its potential application as an optical temperature sensor. Such silicon based hybrids represent an exciting class of functional materials that display thermally triggered optical switching, combining the characteristics of each of the constituent blocks with device compatibility with standard integrated circuit technology.

  13. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  14. Wide-range antifungal antagonism of Paenibacillus ehimensis IB-X-b and its dependence on chitinase and beta-1,3-glucanase production.

    Science.gov (United States)

    Aktuganov, G; Melentjev, A; Galimzianova, N; Khalikova, E; Korpela, T; Susi, P

    2008-07-01

    Previously, we isolated a strain of Bacillus that had antifungal activity and produced lytic enzymes with fungicidal potential. In the present study, we identified the bacterium as Paenibacillus ehimensis and further explored its antifungal properties. In liquid co-cultivation assays, P. ehimensis IB-X-b decreased biomass production of several pathogenic fungi by 45%-75%. The inhibition was accompanied by degradation of fungal cell walls and alterations in hyphal morphology. Residual medium from cultures of P. ehimensis IB-X-b inhibited fungal growth, indicating the inhibitors were secreted into the medium. Of the 2 major lytic enzymes, chitinases were only induced by chitin-containing substrates, whereas beta-1,3-glucanase showed steady levels in all carbon sources. Both purified chitinase and beta-1,3-glucanase degraded cell walls of macerated fungal mycelia, whereas only the latter also degraded cell walls of intact mycelia. The results indicate synergism between the antifungal action mechanisms of these enzymes in which beta-1,3-glucanase is the initiator of the cell wall hydrolysis, whereas the degradation process is reinforced by chitinases. Paenibacillus ehimensis IB-X-b has pronounced antifungal activity with a wide range of fungi and has potential as a biological control agent against plant pathogenic fungi.

  15. Pseudorapidity dependence of long-range two-particle correlations in pPb collisions at $\\sqrt{s_{\\mathrm{NN}}}=$ 5.02 TeV

    CERN Document Server

    Khachatryan, Vardan; Tumasyan, Armen; Adam, Wolfgang; et al. (CMS Collaboration)
Jose; Fouz, Maria Cruz; Garcia-Abia, Pablo; Gonzalez Lopez, Oscar; Goy Lopez, Silvia; Hernandez, Jose M; Josa, Maria Isabel; Navarro De Martino, Eduardo; Pérez-Calero Yzquierdo, Antonio María; Puerta Pelayo, Jesus; Quintario Olmeda, Adrián; Redondo, Ignacio; Romero, Luciano; Santaolalla, Javier; Senghi Soares, Mara; Albajar, Carmen; de Trocóniz, Jorge F; Missiroli, Marino; Moran, Dermot; Cuevas, Javier; Fernandez Menendez, Javier; Folgueras, Santiago; Gonzalez Caballero, Isidro; Palencia Cortezon, Enrique; Vizan Garcia, Jesus Manuel; Cabrillo, Iban Jose; Calderon, Alicia; Castiñeiras De Saa, Juan Ramon; De Castro Manzano, Pablo; Fernandez, Marcos; Garcia-Ferrero, Juan; Gomez, Gervasio; Lopez Virto, Amparo; Marco, Jesus; Marco, Rafael; Martinez Rivero, Celso; Matorras, Francisco; Piedra Gomez, Jonatan; Rodrigo, Teresa; Rodríguez-Marrero, Ana Yaiza; Ruiz-Jimeno, Alberto; Scodellaro, Luca; Trevisani, Nicolò; Vila, Ivan; Vilar Cortabitarte, Rocio; Abbaneo, Duccio; Auffray, Etiennette; Auzinger, Georg; Bachtis, Michail; Baillon, Paul; Ball, Austin; Barney, David; Benaglia, Andrea; Bendavid, Joshua; Benhabib, Lamia; Berruti, Gaia Maria; Bloch, Philippe; Bocci, Andrea; Bonato, Alessio; Botta, Cristina; Breuker, Horst; Camporesi, Tiziano; Castello, Roberto; Cerminara, Gianluca; D'Alfonso, Mariarosaria; D'Enterria, David; Dabrowski, Anne; Daponte, Vincenzo; David Tinoco Mendes, Andre; De Gruttola, Michele; De Guio, Federico; De Roeck, Albert; De Visscher, Simon; Di Marco, Emanuele; Dobson, Marc; Dordevic, Milos; Dorney, Brian; Du Pree, Tristan; Duggan, Daniel; Dünser, Marc; Dupont, Niels; Elliott-Peisert, Anna; Franzoni, Giovanni; Fulcher, Jonathan; Funk, Wolfgang; Gigi, Dominique; Gill, Karl; Giordano, Domenico; Girone, Maria; Glege, Frank; Guida, Roberto; Gundacker, Stefan; Guthoff, Moritz; Hammer, Josef; Harris, Philip; Hegeman, Jeroen; Innocente, Vincenzo; Janot, Patrick; Kirschenmann, Henning; Kortelainen, Matti J; Kousouris, Konstantinos; Krajczar, Krisztian; Lecoq, Paul; Lourenco, Carlos; Lucchini, Marco Toliman; Magini, Nicolo; Malgeri, Luca; Mannelli, Marcello; Martelli, Arabella; Masetti, Lorenzo; Meijers, Frans; Mersi, Stefano; Meschi, Emilio; Moortgat, Filip; Morovic, Srecko; Mulders, Martijn; Nemallapudi, Mythra Varun; Neugebauer, Hannes; Orfanelli, Styliani; Orsini, Luciano; Pape, Luc; Perez, Emmanuelle; Peruzzi, Marco; Petrilli, Achille; Petrucciani, Giovanni; Pfeiffer, Andreas; Pierini, Maurizio; Piparo, Danilo; Racz, Attila; Reis, Thomas; Rolandi, Gigi; Rovere, Marco; Ruan, Manqi; Sakulin, Hannes; Schäfer, Christoph; Schwick, Christoph; Seidel, Markus; Sharma, Archana; Silva, Pedro; Simon, Michal; Sphicas, Paraskevas; Steggemann, Jan; Stieger, Benjamin; Stoye, Markus; Takahashi, Yuta; Treille, Daniel; Triossi, Andrea; Tsirou, Andromachi; Veres, Gabor Istvan; Wardle, Nicholas; Wöhri, Hermine Katharina; Zagoździńska, Agnieszka; Zeuner, Wolfram Dietrich; Bertl, Willi; Deiters, Konrad; Erdmann, Wolfram; Horisberger, Roland; Ingram, Quentin; Kaestli, Hans-Christian; Kotlinski, Danek; Langenegger, Urs; Rohe, Tilman; Bachmair, Felix; Bäni, Lukas; Bianchini, Lorenzo; Casal, Bruno; Dissertori, Günther; Dittmar, Michael; Donegà, Mauro; Eller, Philipp; Grab, Christoph; Heidegger, Constantin; Hits, Dmitry; Hoss, Jan; Kasieczka, Gregor; Lecomte, Pierre; Lustermann, Werner; Mangano, Boris; Marionneau, Matthieu; Martinez Ruiz del Arbol, Pablo; Masciovecchio, Mario; Meister, Daniel; Micheli, Francesco; Musella, Pasquale; Nessi-Tedaldi, Francesca; Pandolfi, Francesco; Pata, Joosep; Pauss, 
Felicitas; Perrozzi, Luca; Quittnat, Milena; Rossini, Marco; Schönenberger, Myriam; Starodumov, Andrei; Takahashi, Maiko; Tavolaro, Vittorio Raoul; Theofilatos, Konstantinos; Wallny, Rainer; Aarrestad, Thea Klaeboe; Amsler, Claude; Caminada, Lea; Canelli, Maria Florencia; Chiochia, Vincenzo; De Cosa, Annapaola; Galloni, Camilla; Hinzmann, Andreas; Hreus, Tomas; Kilminster, Benjamin; Lange, Clemens; Ngadiuba, Jennifer; Pinna, Deborah; Rauco, Giorgia; Robmann, Peter; Salerno, Daniel; Yang, Yong; Cardaci, Marco; Chen, Kuan-Hsin; Doan, Thi Hien; Jain, Shilpi; Khurana, Raman; Konyushikhin, Maxim; Kuo, Chia-Ming; Lin, Willis; Lu, Yun-Ju; Pozdnyakov, Andrey; Yu, Shin-Shan; Kumar, Arun; Chang, Paoti; Chang, You-Hao; Chang, Yu-Wei; Chao, Yuan; Chen, Kai-Feng; Chen, Po-Hsun; Dietz, Charles; Fiori, Francesco; Grundler, Ulysses; Hou, George Wei-Shu; Hsiung, Yee; Liu, Yueh-Feng; Lu, Rong-Shyang; Miñano Moya, Mercedes; Petrakou, Eleni; Tsai, Jui-fa; Tzeng, Yeng-Ming; Asavapibhop, Burin; Kovitanggoon, Kittikul; Singh, Gurpreet; Srimanobhas, Norraphat; Suwonjandee, Narumon; Adiguzel, Aytul; Cerci, Salim; Demiroglu, Zuhal Seyma; Dozen, Candan; Dumanoglu, Isa; Gecit, Fehime Hayal; Girgis, Semiray; Gokbulut, Gul; Guler, Yalcin; Gurpinar, Emine; Hos, Ilknur; Kangal, Evrim Ersin; Kayis Topaksu, Aysel; Onengut, Gulsen; Ozcan, Merve; Ozdemir, Kadri; Ozturk, Sertac; Tali, Bayram; Topakli, Huseyin; Zorbilmez, Caglar; Bilin, Bugra; Bilmis, Selcuk; Isildak, Bora; Karapinar, Guler; Yalvac, Metin; Zeyrek, Mehmet; Gülmez, Erhan; Kaya, Mithat; Kaya, Ozlem; Yetkin, Elif Asli; Yetkin, Taylan; Cakir, Altan; Cankocak, Kerem; Sen, Sercan; Vardarlı, Fuat Ilkehan; Grynyov, Boris; Levchuk, Leonid; Sorokin, Pavel; Aggleton, Robin; Ball, Fionn; Beck, Lana; Brooke, James John; Clement, Emyr; Cussans, David; Flacher, Henning; Goldstein, Joel; Grimes, Mark; Heath, Greg P; Heath, Helen F; Jacob, Jeson; Kreczko, Lukasz; Lucas, Chris; Meng, Zhaoxia; Newbold, Dave M; Paramesvaran, Sudarshan; Poll, Anthony; Sakuma, Tai; Seif El Nasr-storey, Sarah; Senkin, Sergey; Smith, Dominic; Smith, Vincent J; Belyaev, Alexander; Brew, Christopher; Brown, Robert M; Calligaris, Luigi; Cieri, Davide; Cockerill, David JA; Coughlan, John A; Harder, Kristian; Harper, Sam; Olaiya, Emmanuel; Petyt, David; Shepherd-Themistocleous, Claire; Thea, Alessandro; Tomalin, Ian R; Williams, Thomas; Worm, Steven; Baber, Mark; Bainbridge, Robert; Buchmuller, Oliver; Bundock, Aaron; Burton, Darren; Casasso, Stefano; Citron, Matthew; Colling, David; Corpe, Louie; Dauncey, Paul; Davies, Gavin; De Wit, Adinda; Della Negra, Michel; Dunne, Patrick; Elwood, Adam; Futyan, David; Hall, Geoffrey; Iles, Gregory; Lane, Rebecca; Lucas, Robyn; Lyons, Louis; Magnan, Anne-Marie; Malik, Sarah; Nash, Jordan; Nikitenko, Alexander; Pela, Joao; Pesaresi, Mark; Raymond, David Mark; Richards, Alexander; Rose, Andrew; Seez, Christopher; Tapper, Alexander; Uchida, Kirika; Vazquez Acosta, Monica; Virdee, Tejinder; Zenz, Seth Conrad; Cole, Joanne; Hobson, Peter R; Khan, Akram; Kyberd, Paul; Leslie, Dawn; Reid, Ivan; Symonds, Philip; Teodorescu, Liliana; Turner, Mark; Borzou, Ahmad; Call, Kenneth; Dittmann, Jay; Hatakeyama, Kenichi; Liu, Hongxuan; Pastika, Nathaniel; Charaf, Otman; Cooper, Seth; Henderson, Conor; Rumerio, Paolo; Arcaro, Daniel; Avetisyan, Aram; Bose, Tulika; Gastler, Daniel; Rankin, Dylan; Richardson, Clint; Rohlf, James; Sulak, Lawrence; Zou, David; Alimena, Juliette; Berry, Edmund; Cutts, David; Ferapontov, Alexey; Garabedian, Alex; Hakala, John; Heintz, Ulrich; Jesus, Orduna; 
Laird, Edward; Landsberg, Greg; Mao, Zaixing; Narain, Meenakshi; Piperov, Stefan; Sagir, Sinan; Syarif, Rizki; Breedon, Richard; Breto, Guillermo; Calderon De La Barca Sanchez, Manuel; Chauhan, Sushil; Chertok, Maxwell; Conway, John; Conway, Rylan; Cox, Peter Timothy; Erbacher, Robin; Funk, Garrett; Gardner, Michael; Ko, Winston; Lander, Richard; Mclean, Christine; Mulhearn, Michael; Pellett, Dave; Pilot, Justin; Ricci-Tam, Francesca; Shalhout, Shalhout; Smith, John; Squires, Michael; Stolp, Dustin; Tripathi, Mani; Wilbur, Scott; Yohay, Rachel; Cousins, Robert; Everaerts, Pieter; Florent, Alice; Hauser, Jay; Ignatenko, Mikhail; Saltzberg, David; Takasugi, Eric; Valuev, Vyacheslav; Weber, Matthias; Burt, Kira; Clare, Robert; Ellison, John Anthony; Gary, J William; Hanson, Gail; Heilman, Jesse; Paneva, Mirena Ivova; Jandir, Pawandeep; Kennedy, Elizabeth; Lacroix, Florent; Long, Owen Rosser; Malberti, Martina; Olmedo Negrete, Manuel; Shrinivas, Amithabh; Wei, Hua; Wimpenny, Stephen; Yates, Brent; Branson, James G; Cerati, Giuseppe Benedetto; Cittolin, Sergio; D'Agnolo, Raffaele Tito; Derdzinski, Mark; Holzner, André; Kelley, Ryan; Klein, Daniel; Letts, James; Macneill, Ian; Olivito, Dominick; Padhi, Sanjay; Pieri, Marco; Sani, Matteo; Sharma, Vivek; Simon, Sean; Tadel, Matevz; Vartak, Adish; Wasserbaech, Steven; Welke, Charles; Würthwein, Frank; Yagil, Avraham; Zevi Della Porta, Giovanni; Bradmiller-Feld, John; Campagnari, Claudio; Dishaw, Adam; Dutta, Valentina; Flowers, Kristen; Franco Sevilla, Manuel; Geffert, Paul; George, Christopher; Golf, Frank; Gouskos, Loukas; Gran, Jason; Incandela, Joe; Mccoll, Nickolas; Mullin, Sam Daniel; Richman, Jeffrey; Stuart, David; Suarez, Indara; West, Christopher; Yoo, Jaehyeok; Anderson, Dustin; Apresyan, Artur; Bornheim, Adolf; Bunn, Julian; Chen, Yi; Duarte, Javier; Mott, Alexander; Newman, Harvey B; Pena, Cristian; Spiropulu, Maria; Vlimant, Jean-Roch; Xie, Si; Zhu, Ren-Yuan; Andrews, Michael Benjamin; Azzolini, Virginia; Calamba, Aristotle; Carlson, Benjamin; Ferguson, Thomas; Paulini, Manfred; Russ, James; Sun, Menglei; Vogel, Helmut; Vorobiev, Igor; Cumalat, John Perry; Ford, William T; Gaz, Alessandro; Jensen, Frank; Johnson, Andrew; Krohn, Michael; Mulholland, Troy; Nauenberg, Uriel; Stenson, Kevin; Wagner, Stephen Robert; Alexander, James; Chatterjee, Avishek; Chaves, Jorge; Chu, Jennifer; Dittmer, Susan; Eggert, Nicholas; Mirman, Nathan; Nicolas Kaufman, Gala; Patterson, Juliet Ritchie; Rinkevicius, Aurelijus; Ryd, Anders; Skinnari, Louise; Soffi, Livia; Sun, Werner; Tan, Shao Min; Teo, Wee Don; Thom, Julia; Thompson, Joshua; Tucker, Jordan; Weng, Yao; Wittich, Peter; Abdullin, Salavat; Albrow, Michael; Apollinari, Giorgio; Banerjee, Sunanda; Bauerdick, Lothar AT; Beretvas, Andrew; Berryhill, Jeffrey; Bhat, Pushpalatha C; Bolla, Gino; Burkett, Kevin; Butler, Joel Nathan; Cheung, Harry; Chlebana, Frank; Cihangir, Selcuk; Elvira, Victor Daniel; Fisk, Ian; Freeman, Jim; Gottschalk, Erik; Gray, Lindsey; Green, Dan; Grünendahl, Stefan; Gutsche, Oliver; Hanlon, Jim; Hare, Daryl; Harris, Robert M; Hasegawa, Satoshi; Hirschauer, James; Hu, Zhen; Jayatilaka, Bodhitha; Jindariani, Sergo; Johnson, Marvin; Joshi, Umesh; Klima, Boaz; Kreis, Benjamin; Lammel, Stephan; Linacre, Jacob; Lincoln, Don; Lipton, Ron; Liu, Tiehui; Lopes De Sá, Rafael; Lykken, Joseph; Maeshima, Kaori; Marraffino, John Michael; Maruyama, Sho; Mason, David; McBride, Patricia; Merkel, Petra; Mrenna, Stephen; Nahn, Steve; Newman-Holmes, Catherine; O'Dell, Vivian; Pedro, Kevin; Prokofyev, 
Oleg; Rakness, Gregory; Sexton-Kennedy, Elizabeth; Soha, Aron; Spalding, William J; Spiegel, Leonard; Stoynev, Stoyan; Strobbe, Nadja; Taylor, Lucas; Tkaczyk, Slawek; Tran, Nhan Viet; Uplegger, Lorenzo; Vaandering, Eric Wayne; Vernieri, Caterina; Verzocchi, Marco; Vidal, Richard; Wang, Michael; Weber, Hannsjoerg Artur; Whitbeck, Andrew; Acosta, Darin; Avery, Paul; Bortignon, Pierluigi; Bourilkov, Dimitri; Carnes, Andrew; Carver, Matthew; Curry, David; Das, Souvik; Field, Richard D; Furic, Ivan-Kresimir; Gleyzer, Sergei V; Konigsberg, Jacobo; Korytov, Andrey; Kotov, Khristian; Ma, Peisen; Matchev, Konstantin; Mei, Hualin; Milenovic, Predrag; Mitselmakher, Guenakh; Rank, Douglas; Rossin, Roberto; Shchutska, Lesya; Snowball, Matthew; Sperka, David; Terentyev, Nikolay; Thomas, Laurent; Wang, Jian; Wang, Sean-Jiun; Yelton, John; Hewamanage, Samantha; Linn, Stephan; Markowitz, Pete; Martinez, German; Rodriguez, Jorge Luis; Ackert, Andrew; Adams, Jordon Rowe; Adams, Todd; Askew, Andrew; Bein, Samuel; Bochenek, Joseph; Diamond, Brendan; Haas, Jeff; Hagopian, Sharon; Hagopian, Vasken; Johnson, Kurtis F; Khatiwada, Ajeeta; Prosper, Harrison; Weinberg, Marc; Baarmand, Marc M; Bhopatkar, Vallary; Colafranceschi, Stefano; Hohlmann, Marcus; Kalakhety, Himali; Noonan, Daniel; Roy, Titas; Yumiceva, Francisco; Adams, Mark Raymond; Apanasevich, Leonard; Berry, Douglas; Betts, Russell Richard; Bucinskaite, Inga; Cavanaugh, Richard; Evdokimov, Olga; Gauthier, Lucie; Gerber, Cecilia Elena; Hofman, David Jonathan; Kurt, Pelin; O'Brien, Christine; Sandoval Gonzalez, Irving Daniel; Turner, Paul; Varelas, Nikos; Wu, Zhenbin; Zakaria, Mohammed; Bilki, Burak; Clarida, Warren; Dilsiz, Kamuran; Durgut, Süleyman; Gandrajula, Reddy Pratap; Haytmyradov, Maksat; Khristenko, Viktor; Merlo, Jean-Pierre; Mermerkaya, Hamit; Mestvirishvili, Alexi; Moeller, Anthony; Nachtman, Jane; Ogul, Hasan; Onel, Yasar; Ozok, Ferhat; Penzo, Aldo; Snyder, Christina; Tiras, Emrah; Wetzel, James; Yi, Kai; Anderson, Ian; Barnett, Bruce Arnold; Blumenfeld, Barry; Eminizer, Nicholas; Fehling, David; Feng, Lei; Gritsan, Andrei; Maksimovic, Petar; Martin, Christopher; Osherson, Marc; Roskes, Jeffrey; Cocoros, Alice; Sarica, Ulascan; Swartz, Morris; Xiao, Meng; Xin, Yongjie; You, Can; Baringer, Philip; Bean, Alice; Benelli, Gabriele; Bruner, Christopher; Kenny III, Raymond Patrick; Majumder, Devdatta; Malek, Magdalena; Mcbrayer, William; Murray, Michael; Sanders, Stephen; Stringer, Robert; Wang, Quan; Ivanov, Andrew; Kaadze, Ketino; Khalil, Sadia; Makouski, Mikhail; Maravin, Yurii; Mohammadi, Abdollah; Saini, Lovedeep Kaur; Skhirtladze, Nikoloz; Toda, Sachiko; Lange, David; Rebassoo, Finn; Wright, Douglas; Anelli, Christopher; Baden, Drew; Baron, Owen; Belloni, Alberto; Calvert, Brian; Eno, Sarah Catherine; Ferraioli, Charles; Gomez, Jaime; Hadley, Nicholas John; Jabeen, Shabnam; Kellogg, Richard G; Kolberg, Ted; Kunkle, Joshua; Lu, Ying; Mignerey, Alice; Shin, Young Ho; Skuja, Andris; Tonjes, Marguerite; Tonwar, Suresh C; Apyan, Aram; Barbieri, Richard; Baty, Austin; Bierwagen, Katharina; Brandt, Stephanie; Busza, Wit; Cali, Ivan Amos; Demiragli, Zeynep; Di Matteo, Leonardo; Gomez Ceballos, Guillelmo; Goncharov, Maxim; Gulhan, Doga; Iiyama, Yutaro; Innocenti, Gian Michele; Klute, Markus; Kovalskyi, Dmytro; Lai, Yue Shi; Lee, Yen-Jie; Levin, Andrew; Luckey, Paul David; Marini, Andrea Carlo; Mcginn, Christopher; Mironov, Camelia; Narayanan, Siddharth; Niu, Xinmei; Paus, Christoph; Roland, Christof; Roland, Gunther; Salfeld-Nebgen, Jakob; Stephans, 
George; Sumorok, Konstanty; Varma, Mukund; Velicanu, Dragos; Veverka, Jan; Wang, Jing; Wang, Ta-Wei; Wyslouch, Bolek; Yang, Mingming; Zhukova, Victoria; Dahmes, Bryan; Evans, Andrew; Finkel, Alexey; Gude, Alexander; Hansen, Peter; Kalafut, Sean; Kao, Shih-Chuan; Klapoetke, Kevin; Kubota, Yuichi; Lesko, Zachary; Mans, Jeremy; Nourbakhsh, Shervin; Ruckstuhl, Nicole; Rusack, Roger; Tambe, Norbert; Turkewitz, Jared; Acosta, John Gabriel; Oliveros, Sandra; Avdeeva, Ekaterina; Bartek, Rachel; Bloom, Kenneth; Bose, Suvadeep; Claes, Daniel R; Dominguez, Aaron; Fangmeier, Caleb; Gonzalez Suarez, Rebeca; Kamalieddin, Rami; Knowlton, Dan; Kravchenko, Ilya; Meier, Frank; Monroy, Jose; Ratnikov, Fedor; Siado, Joaquin Emilo; Snow, Gregory R; Alyari, Maral; Dolen, James; George, Jimin; Godshalk, Andrew; Harrington, Charles; Iashvili, Ia; Kaisen, Josh; Kharchilava, Avto; Kumar, Ashish; Rappoccio, Salvatore; Roozbahani, Bahareh; Alverson, George; Barberis, Emanuela; Baumgartel, Darin; Chasco, Matthew; Hortiangtham, Apichart; Massironi, Andrea; Morse, David Michael; Nash, David; Orimoto, Toyoko; Teixeira De Lima, Rafael; Trocino, Daniele; Wang, Ren-Jie; Wood, Darien; Zhang, Jinzhong; Bhattacharya, Saptaparna; Hahn, Kristan Allan; Kubik, Andrew; Low, Jia Fu; Mucia, Nicholas; Odell, Nathaniel; Pollack, Brian; Schmitt, Michael Henry; Sung, Kevin; Trovato, Marco; Velasco, Mayda; Brinkerhoff, Andrew; Dev, Nabarun; Hildreth, Michael; Jessop, Colin; Karmgard, Daniel John; Kellams, Nathan; Lannon, Kevin; Marinelli, Nancy; Meng, Fanbo; Mueller, Charles; Musienko, Yuri; Planer, Michael; Reinsvold, Allison; Ruchti, Randy; Smith, Geoffrey; Taroni, Silvia; Valls, Nil; Wayne, Mitchell; Wolf, Matthias; Woodard, Anna; Antonelli, Louis; Brinson, Jessica; Bylsma, Ben; Durkin, Lloyd Stanley; Flowers, Sean; Hart, Andrew; Hill, Christopher; Hughes, Richard; Ji, Weifeng; Ling, Ta-Yung; Liu, Bingxuan; Luo, Wuming; Puigh, Darren; Rodenburg, Marissa; Winer, Brian L; Wulsin, Howard Wells; Driga, Olga; Elmer, Peter; Hardenbrook, Joshua; Hebda, Philip; Koay, Sue Ann; Lujan, Paul; Marlow, Daniel; Medvedeva, Tatiana; Mooney, Michael; Olsen, James; Palmer, Christopher; Piroué, Pierre; Stickland, David; Tully, Christopher; Zuranski, Andrzej; Malik, Sudhir; Barker, Anthony; Barnes, Virgil E; Benedetti, Daniele; Bortoletto, Daniela; Gutay, Laszlo; Jha, Manoj; Jones, Matthew; Jung, Andreas Werner; Jung, Kurt; Kumar, Ajay; Miller, David Harry; Neumeister, Norbert; Radburn-Smith, Benjamin Charles; Shi, Xin; Shipsey, Ian; Silvers, David; Sun, Jian; Svyatkovskiy, Alexey; Wang, Fuqiang; Xie, Wei; Xu, Lingshan; Parashar, Neeti; Stupak, John; Adair, Antony; Akgun, Bora; Chen, Zhenyu; Ecklund, Karl Matthew; Geurts, Frank JM; Guilbaud, Maxime; Li, Wei; Michlin, Benjamin; Northup, Michael; Padley, Brian Paul; Redjimi, Radia; Roberts, Jay; Rorie, Jamal; Tu, Zhoudunming; Zabel, James; Betchart, Burton; Bodek, Arie; de Barbaro, Pawel; Demina, Regina; Eshaq, Yossof; Ferbel, Thomas; Galanti, Mario; Garcia-Bellido, Aran; Han, Jiyeon; Harel, Amnon; Hindrichs, Otto; Khukhunaishvili, Aleko; Lo, Kin Ho; Petrillo, Gianluca; Tan, Ping; Verzetti, Mauro; Chou, John Paul; Contreras-Campana, Emmanuel; Ferencek, Dinko; Gershtein, Yuri; Halkiadakis, Eva; Heindl, Maximilian; Hidas, Dean; Hughes, Elliot; Kaplan, Steven; Kunnawalkam Elayavalli, Raghav; Lath, Amitabh; Nash, Kevin; Saka, Halil; Salur, Sevil; Schnetzer, Steve; Sheffield, David; Somalwar, Sunil; Stone, Robert; Thomas, Scott; Thomassen, Peter; Walker, Matthew; Foerster, Mark; Riley, Grant; Rose, Keith; 
Spanier, Stefan; Thapa, Krishna; Bouhali, Othmane; Castaneda Hernandez, Alfredo; Celik, Ali; Dalchenko, Mykhailo; De Mattia, Marco; Delgado, Andrea; Dildick, Sven; Eusebi, Ricardo; Gilmore, Jason; Huang, Tao; Kamon, Teruki; Krutelyov, Vyacheslav; Mueller, Ryan; Osipenkov, Ilya; Pakhotin, Yuriy; Patel, Rishi; Perloff, Alexx; Rose, Anthony; Safonov, Alexei; Tatarinov, Aysen; Ulmer, Keith; Akchurin, Nural; Cowden, Christopher; Damgov, Jordan; Dragoiu, Cosmin; Dudero, Phillip Russell; Faulkner, James; Kunori, Shuichi; Lamichhane, Kamal; Lee, Sung Won; Libeiro, Terence; Undleeb, Sonaina; Volobouev, Igor; Appelt, Eric; Delannoy, Andrés G; Greene, Senta; Gurrola, Alfredo; Janjam, Ravi; Johns, Willard; Maguire, Charles; Mao, Yaxian; Melo, Andrew; Ni, Hong; Sheldon, Paul; Tuo, Shengquan; Velkovska, Julia; Xu, Qiao; Arenton, Michael Wayne; Cox, Bradley; Francis, Brian; Goodell, Joseph; Hirosky, Robert; Ledovskoy, Alexander; Li, Hengne; Lin, Chuanzhe; Neu, Christopher; Sinthuprasith, Tutanon; Sun, Xin; Wang, Yanchu; Wolfe, Evan; Wood, John; Xia, Fan; Clarke, Christopher; Harr, Robert; Karchin, Paul Edmund; Kottachchi Kankanamge Don, Chamath; Lamichhane, Pramod; Sturdy, Jared; Belknap, Donald; Carlsmith, Duncan; Cepeda, Maria; Dasu, Sridhara; Dodd, Laura; Duric, Senka; Gomber, Bhawna; Grothe, Monika; Herndon, Matthew; Hervé, Alain; Klabbers, Pamela; Lanaro, Armando; Levine, Aaron; Long, Kenneth; Loveless, Richard; Mohapatra, Ajit; Ojalvo, Isabel; Perry, Thomas; Pierro, Giuseppe Antonio; Polese, Giovanni; Ruggles, Tyler; Sarangi, Tapas; Savin, Alexander; Sharma, Archana; Smith, Nicholas; Smith, Wesley H; Taylor, Devin; Verwilligen, Piet; Woods, Nathaniel

    2017-07-31

    Two-particle correlations in pPb collisions at a nucleon-nucleon center-of-mass energy of 5.02 TeV are studied as a function of the pseudorapidity separation ($\Delta\eta$) of the particle pair at small relative azimuthal angle ($|\Delta\phi| < \pi/3$). The correlations are decomposed into a jet component that dominates the short-range correlations ($|\Delta\eta| < 1$), and a component that persists at large $\Delta\eta$ and may originate from collective behavior of the produced system. The events are classified in terms of the multiplicity of the produced particles. Finite azimuthal anisotropies are observed in high-multiplicity events. The second and third Fourier components of the particle-pair azimuthal correlations, $V_2$ and $V_3$, are extracted after subtraction of the jet component. The single-particle anisotropy parameters $v_2$ and $v_3$ are normalized by their lab-frame mid-rapidity value and are studied as a function of $\eta_{\text{cm}}$. The normalized $v_2$ distribution is foun...
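
    The pair Fourier harmonics mentioned here can be illustrated with a short numerical sketch. The snippet below is a minimal illustration (not the analysis code of the paper): it computes $V_n = \langle\cos(n\,\Delta\phi)\rangle$ from the azimuthal angles of particles in one toy event and, assuming the factorization $V_n \approx v_n^2$ for particles in the same kinematic range, extracts a single-particle $v_n$. The event sample, the absence of a jet-component subtraction, and all variable names are hypothetical.

        import numpy as np

        def pair_harmonics(phi, n_max=3):
            """Pair Fourier harmonics V_n = <cos(n * dphi)> over all distinct pairs."""
            dphi = phi[:, None] - phi[None, :]            # all pair angle differences
            mask = ~np.eye(len(phi), dtype=bool)          # drop self-pairs
            return np.array([np.mean(np.cos(n * dphi[mask])) for n in range(1, n_max + 1)])

        rng = np.random.default_rng(0)
        # Toy event: azimuthal angles sampled with a built-in v2 = 0.1 modulation
        # (rejection sampling against 1 + 2*0.1*cos(2*phi)).
        phi = []
        while len(phi) < 2000:
            cand = rng.uniform(-np.pi, np.pi)
            if rng.uniform(0, 1.2) < 1 + 2 * 0.1 * np.cos(2 * cand):
                phi.append(cand)
        phi = np.array(phi)

        V = pair_harmonics(phi)
        v = np.sqrt(np.clip(V, 0, None))                  # factorization assumption: V_n ~ v_n^2
        print("V_2 =", V[1], " v_2 =", v[1])              # v_2 should come out near 0.1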

  16. Ferromagnetism in ferroelectric BaTiO3 induced by vacancies: Sensitive dependence on charge state, origin of magnetism, and temperature range of existence

    Science.gov (United States)

    Raeliarijaona, Aldo; Fu, Huaxiang

    2017-10-01

    Using density-functional calculations we investigate the possibility and underlying mechanism of generating ferromagnetism (FM) in ferroelectric BaTiO3 by native vacancies. For the same vacancy species but different charge states (e.g., $V_{\mathrm{O}}^{0}$ vs $V_{\mathrm{O}}^{2+}$), our paper reveals a marked difference in magnetic behaviors. For instance, while $V_{\mathrm{O}}^{0}$ is ferromagnetic, $V_{\mathrm{O}}^{2+}$ is not. This sensitive dependence, which has often been overlooked, highlights the critical importance of taking into account different charge states. Furthermore, while oxygen vacancies have often been used in experiments to explain the vacancy-induced FM, our calculation demonstrates that Ti vacancies, in particular $V_{\mathrm{Ti}}^{3-}$ and $V_{\mathrm{Ti}}^{2-}$ with low formation energies, generate even stronger ferromagnetism in BaTiO3, with a magnetic moment 400% larger than that of $V_{\mathrm{O}}^{0}$. Interestingly, this strong FM of $V_{\mathrm{Ti}}$ can be further enhanced by hole doping. Although both cation vacancies ($V_{\mathrm{Ti}}^{q}$) and anion vacancies ($V_{\mathrm{O}}^{0}$) induce FM, their mechanisms differ drastically. FM of anion vacancies originates from the spin-polarized electrons at Ti sites, but FM of cation vacancies stems from the spin-polarized holes at O sites. This paper also sheds light on vacancy-induced FM by discovering that the spin densities of all three considered vacancy species are highly extended in real space, distributed far away from the vacancy. Moreover, we predict that the ferromagnetism caused by $V_{\mathrm{Ti}}^{3-}$ is able to survive at high temperatures, which is promising for room-temperature spintronic or multiferroic applications.

  17. Cosmogenic exposure dating of boulders and bedrock in Denmark: wide range in ages reflects strong dependence of post-depositional stability on specific glacial landforms

    Science.gov (United States)

    Houmark-Nielsen, Michael; Linge, Henriette; Fabel, Derek; Xu, Sheng

    2010-05-01

    The timing of ice-sheet fluctuations, as indicated by glacier advances and retreats, is detected from a wide range of geochronological techniques, including varve counting, and radiocarbon and luminescence dating of proglacial and inter-till sediments. A robust Late Weichselian chronology of deglacial ice sheet fluctuations in southwestern Scandinavia indicates that the decline of the Scandinavian Ice Sheet from the Last Glacial Maximum position at c. 23-21 kyr (thousands of years) ago in central Denmark occurred through recessional stages and readvances. Active glaciers withdrew from eastern Denmark 17-16 kyr ago and left the southwestern Baltic basin ice free at the beginning of the Bølling interstade c. 14.5 kyr ago. The withdrawal left behind belts of elongate end moraines and streamlined ground moraine as large ice masses were successively isolated, causing massive downwasting until c. 12-11 kyr ago. In eastern Denmark and southernmost Sweden this led to the formation of complex superimposed glacial landscapes originally covered with a wealth of erratic boulders. Hitherto untried cosmogenic nuclide surface exposure dating was applied to sites in eastern Denmark to test the method against independent chronologies. Samples collected from erratics, moraines and ice-sculpted bedrock were prepared at the Cosmogenic Nuclide Laboratory at the University of Glasgow, and AMS measurements were carried out at the Scottish Universities Environmental Research Centre (SUERC) AMS facility. Procedural-blank-corrected 10Be concentrations were converted to in situ 10Be surface exposure ages using the online CRONUS-Earth 10Be-26Al exposure age calculator Version 2.2. Exposure ages from 35 samples range between 11.5 and 20 kyr, 18 of which lie within the expected age envelope. Two samples show overestimated ages, apparently due to cosmogenic nuclide inheritance from previous exposure episodes. The remaining 17, two of which have suffered from exhumation, are younger than predicted
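
    The conversion from nuclide concentration to exposure age mentioned above is done with the CRONUS-Earth calculator, which handles site-specific production-rate scaling, shielding and erosion corrections. Purely as an illustration of the underlying relation, the sketch below evaluates the simple zero-erosion solution of dN/dt = P - lambda*N for 10Be; the concentration and production-rate values are hypothetical placeholders, not data from this study.

        import numpy as np

        LAMBDA_BE10 = np.log(2) / 1.387e6   # 10Be decay constant [1/yr], half-life ~1.387 Myr

        def exposure_age(N, P, lam=LAMBDA_BE10):
            """Zero-erosion exposure age from N = (P/lam) * (1 - exp(-lam*t)).

            N : 10Be concentration [atoms/g quartz]
            P : local production rate [atoms/g/yr] (site-specific scaling required)
            """
            return -np.log(1.0 - N * lam / P) / lam

        # Hypothetical example: N = 7e4 atoms/g at a site with P = 5 atoms/g/yr
        print(f"t ~ {exposure_age(7e4, 5.0):.0f} yr")   # on the order of 14 kyr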

  18. Speciation of lead, copper, zinc and antimony in water draining a shooting range--time-dependent metal accumulation and biomarker responses in brown trout (Salmo trutta L.).

    Science.gov (United States)

    Heier, Lene Sørlie; Lien, Ivar B; Strømseng, Arnljot E; Ljønes, Marita; Rosseland, Bjørn Olav; Tollefsen, Knut-Erik; Salbu, Brit

    2009-06-15

    The speciation of Pb, Cu, Zn and Sb in a shooting-range run-off stream was studied during a period of 23 days. In addition, metal accumulation in gills and liver, red blood cell ALA-D activity, hepatic metallothionein (Cd/Zn-MT) and the oxidative stress index (GSSG/tGSH levels) in brown trout (Salmo trutta L.) exposed to the stream were investigated. Fish, contained in cages, were exposed and sampled after 0, 2, 4, 7, 9, 11 and 23 days of exposure. Trace metals in the water were fractionated in situ according to size (nominal molecular mass) and charge properties. During the experimental period an episode with higher runoff occurred, resulting in increased levels of metals in the stream. Pb and Cu were mainly found as high molecular mass species, while Zn and Sb were mostly present as low molecular mass species. Pb, Cu and Sb accumulated on gills, in addition to Al originating from natural sources in the catchment. Pb, Cu and Sb were also detected at elevated concentrations in the liver. Blood glucose and plasma Na and Cl levels were significantly altered during the exposure period, and these changes are attributed to elevated concentrations of Pb, Cu and Al. A significant suppression of ALA-D was detected after 11 days. Significant differences were detected in Cd/Zn-MT and oxidative stress (tGSH/GSSG) responses at Day 4. For Pb, the results show a clear link between the HMM (high molecular mass) positively charged Pb species, their accumulation on gills and in the liver, and the suppression of ALA-D. Thus, high-flow episodes can remobilise metals from the catchment, inducing stress in aquatic organisms.

  19. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
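
    The workflow described here, fitting a consistent nonparametric learner and reading off class probabilities rather than hard labels, can be sketched in a few lines. The paper provides R code; the snippet below is an illustrative Python (scikit-learn) analogue on synthetic data, not the authors' implementation, and the dataset and parameter choices are placeholders.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import brier_score_loss

        # Synthetic binary-response data standing in for a clinical data set
        X, y = make_classification(n_samples=2000, n_features=10, n_informative=5, random_state=1)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

        machines = {
            "random forest": RandomForestClassifier(n_estimators=500, random_state=1),
            "k-nearest neighbors": KNeighborsClassifier(n_neighbors=50),
        }

        for name, m in machines.items():
            m.fit(X_tr, y_tr)
            p = m.predict_proba(X_te)[:, 1]          # individual probability estimates, not labels
            print(f"{name}: Brier score = {brier_score_loss(y_te, p):.3f}")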

  20. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  1. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible to a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  2. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  3. Quantum Probability and Operant Conditioning: Behavioral Uncertainty in Reinforcement Learning

    OpenAIRE

    Alonso, E.; Mondragon, E.

    2014-01-01

    An implicit assumption in the study of operant conditioning and reinforcement learning is that behavior is stochastic, in that it depends on the probability that an outcome follows a response and on how the presence or absence of the output affects the frequency of the response. In this paper we argue that classical probability is not the right tool to represent uncertainty in operant conditioning and propose an interpretation of behavioral states in terms of quantum probability instead.

  4. Calculations of the dominant long-range, spin-independent contributions to the interaction energy between two nonrelativistic Dirac fermions from double-boson exchange of spin-0 and spin-1 bosons with spin-dependent couplings

    Science.gov (United States)

    Aldaihan, S.; Krause, D. E.; Long, J. C.; Snow, W. M.

    2017-05-01

    Various theories beyond the Standard Model predict new particles with masses in the sub-eV range with very weak couplings to ordinary matter which can possess spin-dependent couplings to electrons and nucleons. Present laboratory constraints on exotic spin-dependent interactions with pseudoscalar and axial couplings for exchange boson masses between meV and eV are very poor compared to constraints on spin-independent interactions in the same mass range arising from spin-0 and spin-1 boson exchange. It is therefore interesting to analyze in a general way how one can use the strong experimental bounds on spin-independent interactions to also constrain spin-dependent interactions by considering higher-order exchange processes. The exchange of a pair of bosons between two fermions with spin-dependent couplings will possess contributions which flip spins twice and thereby generate a polarization-independent interaction energy which can add coherently between two unpolarized objects. In this paper we derive the dominant long-range contributions to the interaction energy between two nonrelativistic spin-1/2 Dirac fermions from double exchange of spin-0 and spin-1 bosons proportional to couplings of the form $g_P^4$, $g_S^2 g_P^2$, and $g_V^2 g_A^2$. Our results for $g_P^4$ are in agreement with previous calculations that have appeared in the literature. We demonstrate the usefulness of this analysis to constrain spin-dependent couplings by presenting the results of a reanalysis of data from a short-range gravity experiment to derive an improved constraint on $(g_P^N)^2$, the pseudoscalar coupling for nucleons, in the range between 40 and 200 μm of about a factor of 5 compared to previous limits. We hope that the expressions derived in this work will be employed by other researchers in the future to evaluate whether or not they can constrain exotic spin-dependent interactions from spin-independent measurements. The spin-independent contribution from 2-boson exchange with axial vector couplings

  5. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  6. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  7. Accurate measurement of the energy dependence of the process $e^{+} + e^{-} \to e^{\pm} + e^{\mp}$ in the s-range 1.44-9.0 $\mathrm{GeV}^{2}$

    CERN Document Server

    Bernardini, M; Brunini, P L; Fiorentino, E; Massam, Thomas; Monari, L; Palmonari, F; Rimondi, F; Zichichi, A

    1973-01-01

    The analysis of 12827 $e^{+} + e^{-} \to e^{\pm} + e^{\mp}$ events observed in the s-range 1.44-9.0 $\mathrm{GeV}^{2}$ allows measurement of the energy dependence of the cross-section for the most typical QED process, with ±2% accuracy. Within this limit the data follow QED, with first-order radiative corrections included.

  8. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available In this first paper of 1911, relating to the sex ratio at birth, Gini recast Laplace's rule of succession in a Bayesian form. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and in introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from observed data.
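
    The Bayesian reading of the rule of succession sketched in this record can be written in one line. Assuming a Beta$(a,b)$ prior for the unknown probability $p$ (the record does not specify the parametrization, so $a$ and $b$ are generic here) and $k$ observed successes in $n$ trials, the posterior and the predictive probability of a further success are

        $$ p \mid k \;\sim\; \mathrm{Beta}(a+k,\; b+n-k), \qquad \Pr(\text{success on trial } n+1 \mid k) \;=\; \frac{a+k}{a+b+n}, $$

    which reduces to Laplace's original rule $(k+1)/(n+2)$ under the uniform prior $a=b=1$.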

  9. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by (DNV, 2011) and model performance is evaluated. Also, the effects that weather forecast uncertainty has on the output Probabilities of Failure are analysed and reported....

  10. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  11. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights; Why different gambling ...

  12. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  13. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
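
    A rough numerical sketch of the idea of drawing intervals around the points of a normal probability plot is given below. It uses the fact that, under normality with known parameters, the probability-integral transform of the i-th order statistic follows a Beta(i, n-i+1) distribution, and builds pointwise 1-α intervals from its quantiles; the simultaneous intervals of the paper require a further (for example simulation-based) adjustment of the level, which is not attempted here. All names and the significance level are illustrative.

        import numpy as np
        from scipy import stats

        def normal_plot_envelope(x, alpha=0.05):
            """Pointwise (1 - alpha) envelope for the ordered, standardized sample."""
            n = len(x)
            i = np.arange(1, n + 1)
            # Under H0, F(X_(i)) ~ Beta(i, n - i + 1); map its quantiles through Phi^{-1}
            lo = stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n - i + 1))
            hi = stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n - i + 1))
            z = np.sort((x - x.mean()) / x.std(ddof=1))     # plug-in standardization
            return z, lo, hi

        rng = np.random.default_rng(0)
        z, lo, hi = normal_plot_envelope(rng.normal(size=100))
        print("all points inside pointwise envelope:", bool(np.all((z >= lo) & (z <= hi))))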

  14. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector $m_i = (m_{i1}, m_{i2}, \dots, m_{i10})$ denote the credit ratings of the ten companies in the $i$-th quarter. The vector $m_{i+1}$ in the next quarter is modelled as dependent on the vector $m_i$ via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability $P_{kl}(i,j)$ of getting $m_{i+1,j} = l$ given that $m_{ij} = k$ is then computed from the conditional distribution. It is found that the variation of the transition probability $P_{kl}(i,j)$ as $i$ varies gives an indication of the possible transition of the credit rating of the $j$-th company in the near future.
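
    The paper computes $P_{kl}(i,j)$ from a fitted power-normal mixture; as a simpler point of reference, the sketch below computes the purely empirical transition matrix obtained by counting quarter-to-quarter rating moves, pooled over companies and time. The rating sequences are synthetic and all names are placeholders.

        import numpy as np

        def empirical_transition_matrix(ratings, n_states):
            """Row-normalized counts of transitions ratings[t] -> ratings[t+1]."""
            counts = np.zeros((n_states, n_states))
            for company in ratings:                       # one rating path per company
                for k, l in zip(company[:-1], company[1:]):
                    counts[k, l] += 1
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

        rng = np.random.default_rng(0)
        # Synthetic quarterly ratings for 10 companies over 60 quarters, 5 rating classes
        paths = np.clip(np.cumsum(rng.integers(-1, 2, size=(10, 60)), axis=1) + 2, 0, 4)

        P = empirical_transition_matrix(paths, n_states=5)
        print(np.round(P, 2))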

  15. A Priori Probability Distribution of the Cosmological Constant

    OpenAIRE

    Weinberg, Steven

    2000-01-01

    In calculations of the probability distribution for the cosmological constant, it has been previously assumed that the a priori probability distribution is essentially constant in the very narrow range that is anthropically allowed. This assumption has recently been challenged. Here we identify large classes of theories in which this assumption is justified.

  16. Probable doxycycline-induced acute pancreatitis.

    Science.gov (United States)

    Moy, Brian T; Kapila, Nikhil

    2016-03-01

    A probable case of doxycycline-induced acute pancreatitis (AP) is reported. A 51-year-old man was admitted to the emergency department with a one-week history of extreme fatigue, malaise, and confusion. Three days earlier he had been started on empirical doxycycline therapy for presumed Lyme disease; he was taking no other medications at the time of admission. A physical examination was remarkable for abdominal tenderness. Relevant laboratory data included a lipase concentration of 5410 units/L (normal range, 13-60 units/L), an amylase concentration of 1304 units/L (normal range, 28-100 units/L), and a glycosylated hemoglobin concentration of 15.2% (above the normal range). Doxycycline was discontinued. With vasopressor support, aggressive fluid resuscitation, hemodialysis, and an insulin infusion, the patient's clinical course rapidly improved over five days. Scoring of the case via the method of Naranjo et al. yielded a score of 6, indicating a probable adverse reaction to doxycycline. In summary, a man developed AP three days after starting therapy with oral doxycycline, and the association between drug and reaction was determined to be probable. His case appears to be the third reported case of doxycycline-associated AP, although tigecycline, tetracycline, and minocycline have also been implicated as causes of AP. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  17. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  18. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibration may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance that a vibration isolation system exceeds the vibration criteria is evaluated. Optimal system parameters - damping and natural frequency - are then selected so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
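
    The core probability statement, a Gaussian relative displacement compared against a deterministic vibration criterion, can be written down directly. In the sketch below the RMS displacement sigma is treated as a given input (in practice it would come from a random-vibration analysis of the isolator for the chosen damping and natural frequency), and the criterion value is a placeholder, not the actual VC-D or VC-E limit.

        import numpy as np
        from scipy import stats

        def prob_exceed(criterion, sigma):
            """P(|X| > criterion) for a zero-mean Gaussian displacement X with std sigma."""
            return 2.0 * stats.norm.sf(criterion / sigma)

        # Placeholder numbers: RMS relative displacement 0.3 um, criterion 1.0 um
        sigma, criterion = 0.3, 1.0
        p = prob_exceed(criterion, sigma)
        print(f"probability of exceeding the criterion: {p:.4f}")   # ~0.0009 here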

  19. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  20. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients.

    Science.gov (United States)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch; Winge, Kristian; Friis, Søren

    2016-01-01

    Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. From the Danish National Patient Registry, we identified 782 patients diagnosed with conditions potentially compatible with probable MSA (International Classification of Diseases, version 10 (ICD-10) codes G23.2, G23.8 and G23.9) during 1994-2009. Through medical record review, we narrowed our sample to 115 patients who fulfilled the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years (range 0-15 years). One-third of patients experienced a transient improvement in motor symptoms with use of levodopa. Median survival from disease onset was 6.9 years (range 1-16 years, 95% CI 6.3-7.5) with no apparent variation according to gender or subtype. Our nationwide approach corroborated that MSA is associated with diverse and grave symptoms, only limited response to levodopa, and poor prognosis. © 2016 S. Karger AG, Basel.

  1. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimated. This makes it possible to assess...

  2. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to subtract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, irrespective of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
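
    As an illustration of the weighted-factor index mentioned above, the sketch below combines a few normalised catchment descriptors into a per-cell score; the factor names and weights are invented for the example and are not those of the study:

    ```python
    # Illustrative only: a per-cell flood-probability index as a weighted sum of
    # normalised physical catchment descriptors (PCDs). Names/weights are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    n_cells = 10

    factors = {                                   # hypothetical normalised factors in [0, 1]
        "topographic_wetness": rng.random(n_cells),
        "impervious_land_use": rng.random(n_cells),
        "low_permeability_soil": rng.random(n_cells),
    }
    weights = {"topographic_wetness": 0.5,
               "impervious_land_use": 0.3,
               "low_permeability_soil": 0.2}      # chosen arbitrarily, sum to 1

    index = sum(w * factors[name] for name, w in weights.items())
    print("Flood-probability index per cell:", np.round(index, 2))
    ```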

  3. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
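
    The constraint mentioned in this abstract can be made concrete for two events: any joint probability must satisfy P(A|B)·P(B) = P(B|A)·P(A), so conditionals chosen arbitrarily may violate it. The check below is a simple sketch with invented numbers:

    ```python
    # Consistency check for two events: a joint probability exists for the given
    # marginals and conditionals only if P(A|B) P(B) equals P(B|A) P(A).
    def joint_exists(p_a, p_b, p_a_given_b, p_b_given_a, tol=1e-9):
        return abs(p_a_given_b * p_b - p_b_given_a * p_a) < tol

    # Consistent choice: both sides give P(A and B) = 0.1.
    print(joint_exists(p_a=0.4, p_b=0.2, p_a_given_b=0.5, p_b_given_a=0.25))  # True
    # Inconsistent choice: 0.9 * 0.2 = 0.18 but 0.1 * 0.4 = 0.04.
    print(joint_exists(p_a=0.4, p_b=0.2, p_a_given_b=0.9, p_b_given_a=0.1))   # False
    ```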

  4. A study of the temperature dependence of the infrared absorption cross-sections of 2,2,3,3,3-pentafluoropropanol in the range of 298-362 K

    Science.gov (United States)

    Godin, Paul J.; Cabaj, Alex; Xu, Li-Hong; Le Bris, Karine; Strong, Kimberly

    2017-01-01

    Absorption cross-sections of 2,2,3,3,3-pentafluoropropanol (PFPO) were derived from Fourier transform infrared spectra recorded from 565 to 3400 cm-1 with a resolution of 0.1 cm-1 over a temperature range of 298-362 K. These results were compared to previously published theoretical density functional theory (DFT) calculations and experimental measurements made at room temperature. We find good agreement between our experimentally derived results, DFT calculations, and previously published data. The only temperature dependence observed was in the centroid shift of the 850-1500 cm-1 band and in the amplitude of some of the absorption peaks. However, this temperature dependence does not result in a significant trend in integrated band strength as a function of temperature. We calculate an average integrated band strength of (1.991±0.001)×10-16 cm molecule-1 for PFPO over the spectral range studied. Radiative efficiencies (REs) and the global warming potential (GWP) for PFPO were also derived. We find an average RE of 0.2603 ± 0.0007 W m-2 ppbv-1 and a GWP100 of 19.8. The calculated radiative efficiencies show no dependence on temperature, and our findings are consistent with previous studies, increasing our confidence in the value of the GWP of PFPO.

  5. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  6. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  7. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  8. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  9. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  10. Methods for combining experts' probability assessments.

    Science.gov (United States)

    Jacobs, R A

    1995-09-01

    This article reviews statistical techniques for combining multiple probability distributions. The framework is that of a decision maker who consults several experts regarding some events. The experts express their opinions in the form of probability distributions. The decision maker must aggregate the experts' distributions into a single distribution that can be used for decision making. Two classes of aggregation methods are reviewed. When using a supra Bayesian procedure, the decision maker treats the expert opinions as data that may be combined with its own prior distribution via Bayes' rule. When using a linear opinion pool, the decision maker forms a linear combination of the expert opinions. The major feature that makes the aggregation of expert opinions difficult is the high correlation or dependence that typically occurs among these opinions. A theme of this paper is the need for training procedures that result in experts with relatively independent opinions or for aggregation methods that implicitly or explicitly model the dependence among the experts. Analyses are presented that show that m dependent experts are worth the same as k independent experts where k ≤ m. In some cases, an exact value for k can be given; in other cases, lower and upper bounds can be placed on k.
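
    A minimal sketch of the linear opinion pool described above; the experts, outcome categories, and weights are invented for illustration:

    ```python
    # Linear opinion pool: the decision maker forms a weighted average of the
    # experts' probability distributions over the same discrete outcomes.
    import numpy as np

    expert_distributions = np.array([
        [0.70, 0.20, 0.10],   # expert 1 over outcomes (rain, cloud, sun)
        [0.50, 0.30, 0.20],   # expert 2
        [0.60, 0.35, 0.05],   # expert 3
    ])
    weights = np.array([0.5, 0.3, 0.2])      # decision maker's weights, sum to 1

    pooled = weights @ expert_distributions  # one aggregated distribution
    print("Pooled distribution:", pooled, "sum =", pooled.sum())
    ```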

  11. SNOWY RANGE WILDERNESS, WYOMING.

    Science.gov (United States)

    Houston, Robert S.; Bigsby, Philip R.

    1984-01-01

    A mineral survey of the Snowy Range Wilderness in Wyoming was undertaken and was followed up with more detailed geologic and geochemical surveys, culminating in diamond drilling of one hole in the Snowy Range Wilderness. No mineral deposits were identified in the Snowy Range Wilderness, but inasmuch as low-grade uranium and associated gold resources were identified in rocks similar to those of the northern Snowy Range Wilderness in an area about 5 mi northeast of the wilderness boundary, the authors conclude that the northern half of the wilderness has a probable-resource potential for uranium and gold. Closely spaced drilling would be required to completely evaluate this mineral potential. The geologic terrane precludes the occurrence of fossil fuels.

  12. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  13. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
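
    The quantity discussed above can be computed by brute force for a tiny alphabet and short words; the letter probabilities and word length below are invented, and the candidate words are simply tried in decreasing order of probability:

    ```python
    # Expected number of guesses when words are guessed in decreasing order of
    # probability, under a first-order (independent-letter) model. Toy example.
    from itertools import product

    letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # hypothetical letter model
    word_length = 3

    words = []
    for letters in product(letter_probs, repeat=word_length):
        p = 1.0
        for ch in letters:
            p *= letter_probs[ch]
        words.append(("".join(letters), p))

    words.sort(key=lambda wp: wp[1], reverse=True)  # most probable word first
    expected = sum(rank * p for rank, (_, p) in enumerate(words, start=1))
    print(f"Expected guesses: {expected:.3f} out of {len(words)} candidate words")
    ```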

  14. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities. Subject RIV: BA - General Mathematics. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  15. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; however, in reality, the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...

  16. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1

  17. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  18. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1

  19. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  20. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    not depend on the type of variable action. A probability based calibration of pressure coefficients have been carried out using pressure measurements on the standard CAARC building modelled on scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted...... to Gumbel distributions, and these fits are found to represent the data measured with good accuracy. The pressure distributions found have been used in a calibration of partial factors, which should achieve a certain theoretical target reliability index. For a target annual reliability index of 4...

  1. Calibration of Probabilities: The State of the Art to 1980

    Science.gov (United States)

    1981-06-01

    propositions can be characterized according to the number of alternatives they offer: No alternatives: "What is absinthe?" The assessor provides an... alternative: "Absinthe is a precious stone. What is the probability that this statement is true?" Again, the relevant range of the probability scale is... 0 to 1. Two alternatives: "Absinthe is (a) a precious stone; (b) a liqueur." With the half-range method, the assessor first selects the more likely

  2. Atomic transition probabilities of Er I

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Ave., Madison, WI 53706 (United States); Wyart, J-F, E-mail: jelawler@wisc.edu, E-mail: jean-francois.wyart@lac.u-psud.fr, E-mail: eadenhar@wisc.edu [Laboratoire Aime Cotton, CNRS (UPR3321), Bat. 505, Centre Universitaire Paris-Sud, 91405-Orsay (France)

    2010-12-14

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.

  3. Atomic transition probabilities of Gd I

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Avenue, Madison, WI 53706 (United States); Bilty, K A, E-mail: jelawler@wisc.edu, E-mail: biltyka@uwec.edu, E-mail: eadenhar@wisc.edu [Department of Physics and Astronomy, University of Wisconsin-Eau Claire, Eau Claire, WI 54702 (United States)

    2011-05-14

    Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of the first and second spectra lines is also described and demonstrated in this work.

  4. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  5. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  6. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  7. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own right but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  8. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  9. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native... Probability, Statistics and Truth (Allen & Unwin, London). 4. de Finetti B (1970) Logical foundations and measurement of subjective probability... F.P. Ramsey: Philosophical Papers, ed Mellor DH (Cambridge University Press, Cambridge). 7. Savage L (1972) The Foundations of Statistics (Dover

  10. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1

  11. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  12. Atomic Transition Probabilities in Ti I

    Science.gov (United States)

    Nitz, David E.; Siewert, Lowell K.; Schneider, Matthew N.

    2001-05-01

    We have measured branching fractions and atomic transition probabilities in Ti I for 50 visible and near-IR transitions which connect odd-parity levels lying 25000 cm-1 to 27000 cm-1 above the ground state to low-lying even parity levels. Branching fractions are obtained from the analysis of six hollow cathode emission spectra recorded using the Fourier transform spectrometer at the National Solar Observatory, supplemented in cases susceptible to radiation-trapping problems by conventional emission spectroscopy using a commercial sealed lamp operated at very low discharge current. The absolute scale for normalizing the branching fractions is established using radiative lifetimes from time-resolved laser-induced fluorescence measurements. (S. Salih and J.E. Lawler, Astronomy and Astrophysics 239, 407 (1990).) Uncertainties of the transition probabilities range from ±5% for the stronger branches to ±20% for the weaker ones. Among the 16 lines for which previously-measured transition probabilities are listed in the NIST critical compilation, (G. A. Martin, J. R. Fuhr, and W. L. Wiese, J. Phys. Chem. Ref. Data 17, Suppl. 3, 85 (1988).) several significant discrepancies are noted.

  13. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.

  14. The effect of probability discounting on reward seeking: a three-dimensional perspective

    Directory of Open Access Journals (Sweden)

    Yannick-Andre Breton

    2014-08-01

    Rats will work for electrical stimulation pulses of the medial forebrain bundle. The rewarding effect arises from the volleys of action potentials fired by the stimulation and subsequent spatio-temporal integration of their post-synaptic impact. The proportion of time allocated to self-stimulation depends on the intensity of the rewarding effect as well as on other key determinants of decision-making, such as subjective opportunity costs and reward probability. We have proposed that a 3D model relating time allocation to the intensity and cost of reward can distinguish manipulations acting prior to the output of the spatio-temporal integrator from those acting at or beyond it. Here, we test this proposition by varying reward probability, a variable that influences the computation of payoff in the 3D model downstream from the output of the integrator. On riskless trials, reward was delivered on every occasion that the rat held down the lever for a cumulative duration called the "price," whereas on risky trials, reward was delivered with probability 0.75 or 0.50. According to the model, the 3D structure relating time allocation to reward intensity and price is shifted leftward along the price axis by reductions in reward probability; the magnitude of the shift estimates the change in subjective probability. The predictions were borne out: reducing reward probability shifted the 3D structure systematically along the price axis while producing only small, inconsistent displacements along the pulse-frequency axis. The results confirm that the model can accurately distinguish manipulations acting at or beyond the spatio-temporal integrator and strengthen the conclusions of previous studies showing similar shifts following dopaminergic manipulations. Subjective and objective reward probabilities appeared indistinguishable over the range of 0.5 ≤ p ≤ 1.0.

  15. Associations between the probabilities of frequency-specific hearing loss and unaided APHAB scores.

    Science.gov (United States)

    Löhler, J; Wollenberg, B; Schlattmann, P; Hoang, N; Schönweiler, R

    2017-03-01

    The Abbreviated Profile of Hearing Aid Benefit (APHAB) questionnaire reports subjective hearing impairments in four typical conditions. We investigated the association between the frequency-specific probability of hearing loss and scores from the unaided APHAB (APHABu) to determine whether the APHABu could be useful in primary diagnoses of hearing loss, in addition to pure tone and speech audiometry. This retrospective study included database records from 6558 patients (average age 69.0 years). We employed a multivariate generalised linear mixed model to analyse the probabilities of hearing losses (severity range 20-75 dB, evaluated in 5-dB steps), measured at different frequencies (0.5, 1.0, 2.0, 4.0, and 8.0 kHz), for nearly all combinations of APHABu subscale scores (subscale scores from 20 to 80%, evaluated in steps of 5%). We calculated the probability of hearing loss for 28,561 different combinations of APHABu subscale scores (results available online). In general, the probability of hearing loss was positively associated with the combined APHABu score (i.e. increasing probability with increasing scores). However, this association was negative at one frequency (8 kHz). The highest probabilities were for a hearing loss of 45 dB at test frequency 2.0 kHz, but with a wide spread. We showed that the APHABu subscale scores were associated with the probability of hearing loss measured with audiometry. This information could enrich the expert's evaluation of the subject's hearing loss, and it might help resolve suspicious cases of aggravation. The 0.5 and 8.0 kHz frequencies influenced hearing loss less than the frequencies in-between, and 2.0 kHz was most influential on intermediate-degree hearing loss (around 45 dB), which corresponded to the frequency-dependence of speech intelligibility measured with speech audiometry.

  16. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies--this is a problem of frequentist probability theory; using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  17. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  18. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  19. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...... to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...

  20. Probability of Failure in Hypersonic Engines Using Large Deviations

    OpenAIRE

    Papanicolaou, George; West, Nicholas; Yang, Tzu-Wei

    2012-01-01

    We consider a reduced order model of an air-breathing hypersonic engine with a time-dependent stochastic inflow that may cause the failure of the engine. The probability of failure is analyzed by the Freidlin-Wentzell theory, the large deviation principle for finite dimensional stochastic differential equations. We compute the asymptotic failure probability by numerically solving the constrained optimization related to the large deviation problem. A large-deviation-based importance sampling s...

  1. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
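
    A small simulation sketch of the two-pass consistency measure described above, contrasting a posterior-probability-matching responder with a deterministic maximum-a-posteriori responder on an equal-variance signal-detection task; the sensitivity and trial count are invented:

    ```python
    # Two-pass consistency: the same stimuli are judged twice with fresh internal
    # noise; we count how often each model repeats its own response.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials, d_prime = 2000, 0.5
    signal = rng.integers(0, 2, n_trials)              # 0 = noise, 1 = signal

    def respond(matching: bool) -> np.ndarray:
        obs = rng.normal(signal * d_prime, 1.0)        # internal response
        post = 1.0 / (1.0 + np.exp(-d_prime * (obs - d_prime / 2.0)))  # P(signal | obs)
        if matching:                                   # respond "signal" with prob = posterior
            return (rng.random(n_trials) < post).astype(int)
        return (post > 0.5).astype(int)                # deterministic MAP observer

    for label, matching in [("posterior matching", True), ("deterministic MAP", False)]:
        pass1, pass2 = respond(matching), respond(matching)
        print(f"{label}: repeat-consistency = {(pass1 == pass2).mean():.3f}")
    ```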

  2. Posterior Probability Matching and Human Perceptual Decision Making

    Science.gov (United States)

    Murray, Richard F.; Patel, Khushbu; Yee, Alan

    2015-01-01

    Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models’ performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods provide new tools

  3. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  4. Collision energy dependence of the HD(nu' = 2) product rotational distribution of the H + D2 reaction in the range 1.30-1.89 eV.

    Science.gov (United States)

    Ausfelder, Florian; Pomerantz, Andrew E; Zare, Richard N; Althorpe, Stuart C; Aoiz, F J; Banares, Luis; Castillo, Jesus F

    2004-02-15

    An experimental and theoretical investigation of the collision energy dependence of the HD(nu' = 2,j') rotational product state distribution for the H + D2 reaction in the collision energy range of Ecol = 1.30-1.89 eV has been carried out. Theoretical results based on time-dependent and time-independent quantum mechanical methods agree nearly perfectly with each other, and the agreement with the experiment is good at low collision energies and very good at high collision energies. This behavior is in marked contrast to a previous report on the HD(nu' = 3,j') product state rotational distribution [Pomerantz et al., J. Chem. Phys. 120, 3244 (2004)] where a systematic difference between experiment and theory was observed, especially at the highest collision energies. The reason for this different behavior is not yet understood. In addition, this study employs Doppler-free spectroscopy to resolve an ambiguity in the E, F-X resonantly enhanced multiphoton ionization transition originating from the HD(nu' = 2,j' = 1) state, which is found to be caused by an accidental blending with the transition coming from the HD(nu' = 1,j' = 14) state. Copyright 2004 American Institute of Physics

  5. On the Frequency and Voltage-Dependent Profiles of the Surface States and Series Resistance of Au/ZnO/n-Si Structures in a Wide Range of Frequency and Voltage

    Science.gov (United States)

    Nikravan, Afsoun; Badali, Yosef; Altındal, Şemsettin; Uslu, İbrahim; Orak, İkram

    2017-10-01

    In order to interpret the electrical characteristics of fabricated Au/ZnO/n-Si structures as a function of frequency and voltage well, their capacitance-voltage (C-V) and conductance-voltage (G/ω-V) measurements were carried out in a wide range of frequencies (0.7 kHz-2 MHz) and voltages (± 6 V) by 50 mV steps at room temperature. Both the C-V and G/ω-V plots have reverse, depletion, and accumulation regions, like those of metal-insulator/oxide semiconductor (MIS or MOS) structures. The values of doped-donor atoms (ND), Fermi energy level (EF), barrier height (ΦB), and series resistance (Rs) of the structure were obtained as a function of frequency and voltage. While the value of ND decreases with increasing frequency almost exponentially, the value of the depletion width (WD) increases. The values of C and G/ω increase with decreasing frequency because the surface states (Nss) are able to follow the alternating current (AC) signal, resulting in excess capacitance (Cex) and conductance (Gex/ω), which depend on their relaxation time and the frequency of the AC signal. The voltage-dependent profiles of Nss were obtained from both the high-low frequency capacitance and Hill-Colleman methods. The other important parameter, Rs, of the structure was also obtained from the Nicollian and Brews method as a function of voltage.

  6. A novel mechanism for Ca2+/calmodulin-dependent protein kinase II targeting to L-type Ca2+ channels that initiates long-range signaling to the nucleus.

    Science.gov (United States)

    Wang, Xiaohan; Marks, Christian R; Perfitt, Tyler L; Nakagawa, Terunaga; Lee, Amy; Jacobson, David A; Colbran, Roger J

    2017-10-20

    Neuronal excitation can induce new mRNA transcription, a phenomenon called excitation-transcription (E-T) coupling. Among several pathways implicated in E-T coupling, activation of voltage-gated L-type Ca2+ channels (LTCCs) in the plasma membrane can initiate a signaling pathway that ultimately increases nuclear CREB phosphorylation and, in most cases, expression of immediate early genes. Initiation of this long-range pathway has been shown to require recruitment of Ca2+-sensitive enzymes to a nanodomain in the immediate vicinity of the LTCC by an unknown mechanism. Here, we show that activated Ca2+/calmodulin-dependent protein kinase II (CaMKII) strongly interacts with a novel binding motif in the N-terminal domain of CaV1 LTCC α1 subunits that is not conserved in CaV2 or CaV3 voltage-gated Ca2+ channel subunits. Mutations in the CaV1.3 α1 subunit N-terminal domain or in the CaMKII catalytic domain that largely prevent the in vitro interaction also disrupt CaMKII association with intact LTCC complexes isolated by immunoprecipitation. Furthermore, these same mutations interfere with E-T coupling in cultured hippocampal neurons. Taken together, our findings define a novel molecular interaction with the neuronal LTCC that is required for the initiation of a long-range signal to the nucleus that is critical for learning and memory. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  7. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  8. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  9. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Research on the framing effect in risky choice has mostly used tasks that examine the effect of only one probability or risk level on the choice between non-risky and risky options. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money, and human lives. It has been confirmed that the decision-making domain moderates the framing effect. In the monetary domain, general risk aversion was confirmed, as registered in earlier research. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decision-making about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decision-making about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame in the tasks affects the change in the preference order only when the possibility of gain (expressed in probability) is estimated as sufficiently high.

  10. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.

  11. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  12. A practical overview on probability distributions

    OpenAIRE

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-01-01

    The aim of this paper is to give a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a bino...

  13. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  14. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road 1 x 4000 cells (5 m x 20 km) in a 'T' formation with another road 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area for the 500 iterations (ĀBL) is about 3000 m
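
    A much-simplified sketch of this kind of Monte-Carlo experiment is given below: landslide areas are drawn from an inverse-gamma distribution (with illustrative parameters, not the fitted three-parameter distribution of the study), dropped as square footprints onto a gridded region, and intersections with a single straight road are counted. The grid and landslide count are reduced for speed:

    ```python
    # Toy Monte Carlo: count landslides whose square footprint overlaps a single
    # straight east-west road crossing the grid. All parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(7)
    cell = 5.0                      # metres per cell
    grid = 2000                     # 2000 x 2000 cells = 10 km x 10 km
    road_row = grid // 2            # row index of the road

    def one_event(n_landslides=100):
        # Inverse-gamma areas via 1/Gamma; shape and scale are made up.
        areas_m2 = 1.0 / rng.gamma(shape=2.4, scale=1.0 / 1000.0, size=n_landslides)
        half_sides = np.sqrt(areas_m2) / cell / 2.0          # half-side in cells
        rows = rng.uniform(0, grid, n_landslides)            # footprint centres
        return int((np.abs(rows - road_row) <= half_sides).sum())

    blockages = [one_event() for _ in range(500)]
    print("Mean road blockages per triggering event:", np.mean(blockages))
    ```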

  15. HUMAN FAILURE EVENT DEPENDENCE: WHAT ARE THE LIMITS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

    Objectives: This paper discusses the differences between classical human reliability analysis (HRA) dependence and the full spectrum of probabilistic dependence. Positive influence suggests an error increases the likelihood of subsequent errors or success increases the likelihood of subsequent success. Currently the typical method for dependence in HRA implements the Technique for Human Error Rate Prediction (THERP) positive dependence equations. This assumes that the dependence between two human failure events varies at discrete levels between zero and complete dependence (as defined by THERP). Dependence in THERP does not consistently span dependence values between 0 and 1. In contrast, probabilistic dependence employs Bayes' Law and addresses a continuous range of dependence. Methods: Using the laws of probability, complete dependence and maximum positive dependence do not always agree. Maximum dependence is when two events overlap to their fullest amount. Maximum negative dependence is the smallest amount that two events can overlap. When the minimum probability of two events overlapping is less than independence, negative dependence occurs. For example, negative dependence is when an operator fails to actuate Pump A, thereby increasing his or her chance of actuating Pump B. The initial error actually increases the chance of subsequent success. Results: Comparing THERP and probability theory yields different results in certain scenarios, with the latter addressing negative dependence. Given that most human failure events are rare, the minimum overlap is typically 0, and when the second event is smaller than the first, the maximum dependence is less than 1, as defined by Bayes' Law. As such, alternative dependence equations are provided along with a look-up table defining the maximum and maximum negative dependence given the probabilities of two events. Conclusions: THERP dependence has been used ubiquitously for decades, and has provided approximations of
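
    A small sketch of the probabilistic dependence bounds discussed above (the Fréchet bounds on the joint probability), assuming only that the marginal probabilities of two human failure events are known; these are not the THERP equations themselves.

```python
def dependence_bounds(p_a: float, p_b: float):
    """Return the conditional P(B|A) implied by the bounds on P(A and B)."""
    joint_max = min(p_a, p_b)                 # maximum (positive) dependence
    joint_min = max(0.0, p_a + p_b - 1.0)     # maximum negative dependence
    joint_indep = p_a * p_b                   # independence reference point
    return {
        "P(B|A) at max dependence": joint_max / p_a,
        "P(B|A) under independence": joint_indep / p_a,
        "P(B|A) at max negative dependence": joint_min / p_a,
    }

# For two rare events the conditional probability at maximum dependence is
# p_b / p_a when p_b < p_a, i.e. strictly less than complete dependence (1.0).
print(dependence_bounds(p_a=1e-2, p_b=1e-3))
```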

  16. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  17. Constraints on probability distributions of grammatical forms

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandar

    2007-01-01

    In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm could be extended to other criteria as well; hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of homogeneity of the probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian, and the relative entropy was calculated for each paradigm. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75-0.9. The nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
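
    A minimal sketch of the homogeneity measure used above: Shannon entropy of the frequency distribution of forms within one paradigm, normalized by its maximum. The frequency counts below are invented for illustration only.

```python
import math

def relative_entropy(counts):
    """Entropy of the form distribution divided by log2(number of forms)."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    h = -sum(p * math.log2(p) for p in probs)
    h_max = math.log2(len(counts))            # uniform distribution over forms
    return h / h_max

# Hypothetical case frequencies for one noun paradigm (e.g. 6 cases).
paradigm_counts = [420, 180, 95, 150, 60, 95]
print(round(relative_entropy(paradigm_counts), 3))   # values near 1 = homogeneous
```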

  18. Probable Cause: A Decision Making Framework.

    Science.gov (United States)

    1984-08-01

    primacy more likely than recency, and vice versa? Third, the updating of causal beliefs depends on positive as well as negative evidence. Therefore, a full...order, contiguity in time and space, and similarity of cause and effect. In doing so, we show how these cues can conflict with probabilistic ideas. A...causal chain between an effect and its presumed cause. The model is used to discuss a wide range of studies on causal judgments and explicates

  19. Home range and travels

    Science.gov (United States)

    Stickel, L.F.; King, John A.

    1968-01-01

    The concept of home range was expressed by Seton (1909) in the term 'home region,' which Burt (1940, 1943) clarified with a definition of home range and exemplified in a definitive study of Peromyscus in the field. Burt pointed out the ever-changing characteristics of home-range area and the consequent absence of boundaries in the usual sense--a finding verified by investigators thereafter. In the studies summarized in this paper, sizes of home ranges of Peromyscus varied within two magnitudes, approximately from 0.1 acre to ten acres, in 34 studies conducted in a variety of habitats from the seaside dunes of Florida to the Alaskan forests. Variation in sizes of home ranges was correlated with both environmental and physiological factors; with habitat it was conspicuous, both in the same and different regions. Food supply also was related to size of home range, both seasonally and in relation to habitat. Home ranges generally were smallest in winter and largest in spring, at the onset of the breeding season. Activity and size also were affected by changes in weather. Activity was least when temperatures were low and nights were bright. Effects of rainfall were variable. Sizes varied according to sex and age; young mice remained in the parents' range until they approached maturity, when they began to travel more widely. Adult males commonly had larger home ranges than females, although there were a number of exceptions. An inverse relationship between population density and size of home range was shown in several studies and probably is the usual relationship. A basic need for activity and exploration also appeared to influence size of home range. Behavior within the home range was discussed in terms of travel patterns, travels in relation to home sites and refuges, territory, and stability of size of home range. Travels within the home range consisted of repeated use of well-worn trails to sites of food, shelter, and refuge, plus more random exploratory travels

  20. Per-event probability of hepatitis C infection during sharing of injecting equipment.

    Directory of Open Access Journals (Sweden)

    Lies Boelen

    BACKGROUND: Shared injecting apparatus during drug use is the primary risk factor for hepatitis C virus (HCV) transmission. AIMS: To estimate the per-event probability of HCV infection during a sharing event, and the transmission probability of HCV from contaminated injecting apparatus. METHODS: Estimates were obtained using a maximum likelihood method, with IDU and sharing events estimated from behavioural data. SETTINGS: Cohort study in multiple correctional centres in New South Wales, Australia. PARTICIPANTS: Subjects (N = 500) with a lifetime history of injecting drug use (IDU) who were followed up between 2005 and 2012. During follow-up, interviews for risk behaviours were taken and blood sampling (HCV-antibody and RNA testing) was performed. MEASUREMENTS: Self-reported frequencies of injecting drugs and sharing events, as well as other risk behaviours and details on the nature of injecting events. FINDINGS: The best estimate of the per-event probability of infection was 0.57% (CI: 0.32-1.05%). A sensitivity analysis on the likely effect of under-reporting of sharing of the injecting apparatus indicated that the per-event infection probability may be as low as 0.17% (95% CI: 0.11%-0.25%). The transmission probability was similarly shown to range up to 6%, dependent on the presumed prevalence of the virus in injecting equipment. CONCLUSIONS: The transmission probability of HCV during a sharing event is small. Hence, strategies to reduce the frequency of sharing of injecting equipment are required, as well as interventions focused on decreasing the per-event risk.
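
    A minimal sketch of a per-event probability estimate of this kind, assuming each participant reports n_i sharing events and an infection indicator, and that risk is independent per event so P(infection) = 1 - (1 - p)^n_i. The data below are synthetic, not the cohort's, and the fitting is a plain one-parameter maximum likelihood optimization.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
true_p = 0.006
n_events = rng.poisson(lam=80, size=500)               # sharing events per person
infected = rng.random(500) < 1 - (1 - true_p) ** n_events

def neg_log_likelihood(p):
    p_inf = 1 - (1 - p) ** n_events
    # Clip to avoid log(0) for participants with very few reported events.
    p_inf = np.clip(p_inf, 1e-12, 1 - 1e-12)
    return -np.sum(np.where(infected, np.log(p_inf), np.log(1 - p_inf)))

fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 0.5), method="bounded")
print("per-event probability estimate: %.4f" % fit.x)
```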

  1. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  2. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is illumination of climate changes where fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  3. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  4. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  5. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  6. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...

  7. Probability output modeling for support vector machines

    Science.gov (United States)

    Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper we propose an approach to modeling the posterior probability output of multi-class SVMs. A sigmoid function is used to estimate the posterior probability output in binary classification. The multi-class posterior probabilities are then obtained by directly solving equations that combine the probability outputs of the binary classifiers using Bayes' rule. The differences among the two-class SVM classifiers are taken into account by assigning them different weights, based on their posterior probabilities, when combining their probability outputs. Comparative experiments show that our method achieves better classification precision and a better posterior probability distribution than the pairwise coupling method and Hastie's optimization method.
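
    The paper's weighted Bayes-rule combination is not reproduced here; the sketch below shows only the binary building block it mentions, Platt-style sigmoid scaling of SVM decision values P(y=1|f) = 1/(1+exp(A*f+B)). The decision values are synthetic and the fit uses plain gradient descent rather than Platt's original procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic decision values f(x) for two classes, and their labels.
f = np.concatenate([rng.normal(-1.5, 1.0, 200), rng.normal(1.5, 1.0, 200)])
y = np.concatenate([np.zeros(200), np.ones(200)])

def sigmoid_prob(f, A, B):
    return 1.0 / (1.0 + np.exp(A * f + B))

A, B, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    p = sigmoid_prob(f, A, B)
    grad_z = y - p                       # d(neg log-likelihood)/d(A*f + B)
    A -= lr * np.mean(grad_z * f)
    B -= lr * np.mean(grad_z)

print("fitted A=%.2f B=%.2f" % (A, B))
print("P(class 1 | f=2.0) ~ %.3f" % sigmoid_prob(2.0, A, B))
```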

  8. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  9. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  10. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  11. Probability of flooding: An uncertainty analysis

    NARCIS (Netherlands)

    Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.

    1998-01-01

    In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could

  12. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  13. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  14. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR § 1.1623 (Mass Media Services, General Procedures), Probability calculation: (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...

  15. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  16. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  17. Conditional probability of rainfall extremes across multiple durations

    Science.gov (United States)

    Le, Phuong Dong; Leonard, Michael; Westra, Seth

    2017-04-01

    The conditional probability that extreme rainfall will occur at one location given that it is occurring at another location is critical in engineering design and management circumstances, including planning of evacuation routes and the siting of emergency infrastructure. A challenge with this conditional simulation is that in many situations the interest is not so much the conditional distributions of rainfall of the same duration at two locations, but rather the conditional distribution of flooding in two neighbouring catchments, which may be influenced by rainfall of different critical durations. To deal with this challenge, a model that can consider both spatial and duration dependence of extremes is required. The aim of this research is to develop a model that incorporates both spatial dependence and duration dependence into the dependence structure of extreme rainfall. To achieve this aim, this study is a first attempt at combining extreme rainfall for multiple durations within a spatial extreme model framework based on max-stable process theory. Max-stable processes provide a general framework for modelling multivariate extremes with spatial dependence, but only for extreme rainfall of a single duration. To achieve dependence across multiple timescales, this study proposes a new approach that adds elements representing duration dependence of extremes to the covariance matrix of the max-stable model. To improve the efficiency of calculation, a re-parameterization proposed by Koutsoyiannis et al. (1998) is used to reduce the number of parameters that must be estimated. This re-parameterization enables the GEV parameters to be represented as a function of timescale. A stepwise framework has been adopted to achieve the overall aims of this research. Firstly, the re-parameterization is used to define a new set of common parameters for the marginal distribution across multiple durations. Secondly, spatial interpolation of the new parameter set is used to
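
    A minimal sketch of the general idea of representing GEV marginal parameters as a function of duration. The scaling form mu(d) = mu0 / (d + theta)^eta (and similarly for the scale parameter) and all numeric values are illustrative assumptions, not the study's fitted re-parameterization.

```python
import numpy as np
from scipy.stats import genextreme

theta, eta = 0.2, 0.7               # duration-scaling parameters (assumed)
mu0, sigma0, xi = 40.0, 12.0, 0.1   # reference GEV parameters (assumed)

def gev_params(duration_hours):
    """Location and scale shrink with duration; shape is held constant."""
    scale_factor = (duration_hours + theta) ** eta
    return mu0 / scale_factor, sigma0 / scale_factor, xi

for d in (1, 6, 24):                # rainfall intensity extremes for 1 h, 6 h, 24 h
    mu, sigma, xi_d = gev_params(d)
    # scipy's genextreme uses c = -xi relative to the usual GEV shape convention.
    q100 = genextreme.ppf(0.99, c=-xi_d, loc=mu, scale=sigma)
    print("duration %2d h: 100-year intensity ~ %.1f mm/h" % (d, q100))
```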

  18. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  19. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  20. Projecting Climate Change Impacts on Wildfire Probabilities

    Science.gov (United States)

    Westerling, A. L.; Bryant, B. P.; Preisler, H.

    2008-12-01

    We present preliminary results of the 2008 Climate Change Impact Assessment for wildfire in California, part of the second biennial science report to the California Climate Action Team organized via the California Climate Change Center by the California Energy Commission's Public Interest Energy Research Program pursuant to Executive Order S-03-05 of Governor Schwarzenegger. In order to support decision making by the State pertaining to mitigation of and adaptation to climate change and its impacts, we model wildfire occurrence monthly from 1950 to 2100 under a range of climate scenarios from the Intergovernmental Panel on Climate Change. We use six climate change models (GFDL CM2.1, NCAR PCM1, CNRM CM3, MPI ECHAM5, MIROC3.2 med, NCAR CCSM3) under two emissions scenarios--A2 (CO2 850 ppm max atmospheric concentration) and B1 (CO2 550 ppm max concentration). Climate model output has been downscaled to a 1/8 degree (~12 km) grid using two alternative methods: a Bias Correction and Spatial Downscaling (BCSD) and a Constructed Analogues (CA) downscaling. Hydrologic variables have been simulated from temperature, precipitation, wind and radiation forcing data using the Variable Infiltration Capacity (VIC) Macroscale Hydrologic Model. We model wildfire as a function of temperature, moisture deficit, and land surface characteristics using nonlinear logistic regression techniques. Previous work on wildfire climatology and seasonal forecasting has demonstrated that these variables account for much of the inter-annual and seasonal variation in wildfire. The results of this study are monthly gridded probabilities of wildfire occurrence by fire size class, and estimates of the number of structures potentially affected by fires. In this presentation we will explore the range of modeled outcomes for wildfire in California, considering the effects of emissions scenarios, climate model sensitivities, downscaling methods, hydrologic simulations, statistical model specifications for

  1. Adolescents' misinterpretation of health risk probability expressions.

    Science.gov (United States)

    Cohn, L D; Schydlower, M; Foley, J; Copeland, R L

    1995-05-01

    To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Cross-sectional. Adolescent medicine and pediatric orthopedic outpatient units. 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).

  2. A practical overview on probability distributions.

    Science.gov (United States)

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-03-01

    The aim of this paper is a general definition of probability, of its main mathematical features and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would like to predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Probability distributions are briefly described, together with some examples of their possible applications.

  3. Support Theory: A Nonextensional Representation of Subjective Probability.

    Science.gov (United States)

    Tversky, Amos; Koehler, Derek J.

    1994-01-01

    A new theory of subjective probability is presented. According to this theory, different descriptions of the same event can give rise to different judgments. Experimental evidence supporting this theory is summarized, demonstrating that the theory provides a unified treatment of a wide range of empirical findings. (SLD)

  4. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
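
    A minimal sketch of combination rule (v) above: the integrated probability per pixel is the maximum of the release probability and the product of the impact probability and the zonal release probability. The arrays below are synthetic placeholders for the statistical and angle-of-reach models the authors actually fit.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (100, 100)                                  # toy raster

p_release = rng.beta(1, 20, size=shape)             # pixel-based release probability
p_impact = rng.beta(2, 10, size=shape)              # from the angle-of-reach CDF
p_zonal = 0.35                                      # P(>= 1 release pixel in the zone)

p_integrated = np.maximum(p_release, p_impact * p_zonal)
print("mean integrated probability: %.3f" % p_integrated.mean())
```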

  5. On the probability distribution for the cosmological constant

    Science.gov (United States)

    Elizalde, E.; Gaztañaga, E.

    1990-01-01

    The behaviour in Coleman's approach of the probability distribution for the cosmological constant Λ is shown to depend rather strongly on the corrections to the effective action. In particular, when one includes terms proportional to Λ2, the infinite peak in the probability density at Λ=0 smoothly disappears (provided that the coefficient of Λ2 is positive). A random distribution for Λ can then be obtained (as a limiting case) in a domain around Λ=0. This is in accordance with the results of an approach recently proposed by Fischler, Klebanov, Polchinski and Susskind.

  6. Incorporating detection probability into northern Great Plains pronghorn population estimates

    Science.gov (United States)

    Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.

    2014-01-01

    Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify probabilities that detecting pronghorn might be influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
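
    A minimal sketch of a detection-probability model of the kind described above: logistic regression of a detection indicator on group size, activity and percent vegetation, followed by a Horvitz-Thompson style correction of the raw counts. The coefficients and group data are invented for illustration, not the fitted model.

```python
import numpy as np

beta = {"intercept": -0.6, "group_size": 0.15, "active": 0.8, "pct_veg": 0.01}

def detection_probability(group_size, active, pct_veg):
    """Logistic model: probability that an aerial crew detects this group."""
    z = (beta["intercept"] + beta["group_size"] * group_size
         + beta["active"] * active + beta["pct_veg"] * pct_veg)
    return 1.0 / (1.0 + np.exp(-z))

# Correct each observed group count by its estimated detection probability.
groups = [(4, 1, 30), (12, 0, 55), (7, 1, 10)]       # (size, active, % vegetation)
estimate = sum(size / detection_probability(size, act, veg)
               for size, act, veg in groups)
print("detection-corrected abundance estimate: %.1f animals" % estimate)
```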

  7. Probability analysis of position errors using uncooled IR stereo camera

    Science.gov (United States)

    Oh, Jun Ho; Lee, Sang Hwa; Lee, Boo Hwan; Park, Jong-Il

    2016-05-01

    This paper analyzes the randomness of 3D positions when tracking moving objects with an infrared (IR) stereo camera, and proposes a probability model of 3D positions. The proposed probability model integrates two random error phenomena. One is the pixel quantization error, which is caused by the discrete pixel sampling used in estimating the disparity values of a stereo camera. The other is the timing jitter, which results from the irregular acquisition timing of uncooled IR cameras. This paper derives a probability distribution function by combining the jitter model with the pixel quantization error. To verify the proposed probability function of 3D positions, experiments on tracking fast-moving objects are performed using an IR stereo camera system. The 3D depths of the moving object are estimated by stereo matching and compared with ground truth obtained by a laser scanner system. According to the experiments, the 3D depths of the moving object fall within the statistically reliable range derived from the proposed probability distribution. It is expected that the proposed probability model of 3D positions can be applied to various IR stereo camera systems that deal with fast-moving objects.
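
    A minimal sketch of the pixel-quantization part of the error model discussed above: with depth Z = f*B/d, a +/-0.5 pixel uncertainty in disparity d maps to an asymmetric depth interval. The focal length, baseline and disparity values are illustrative assumptions, not the paper's camera parameters, and the timing-jitter term is not modelled here.

```python
f_pixels = 800.0      # focal length in pixels (assumed)
baseline = 0.30       # stereo baseline in metres (assumed)

def depth_from_disparity(d):
    return f_pixels * baseline / d

for d in (10.0, 20.0, 40.0):                     # disparity in pixels
    z = depth_from_disparity(d)
    z_far = depth_from_disparity(d - 0.5)        # disparity underestimated
    z_near = depth_from_disparity(d + 0.5)       # disparity overestimated
    print("d=%4.1f px: Z=%5.2f m, quantization range [%5.2f, %5.2f] m"
          % (d, z, z_near, z_far))
```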

  8. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
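
    A minimal sketch of fitting a pairwise (Ising-type) model to binned binary data with a naive mean-field inversion, one standard approximate method of the kind whose accuracy such studies examine. The spike data below are synthetic and independent, so the inferred couplings should be close to zero; this is not the paper's simulated cortical network.

```python
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_bins = 10, 20000
s = (rng.random((n_bins, n_neurons)) < 0.1).astype(float)   # binarized spikes
s = 2 * s - 1                                               # map to {-1, +1}

m = s.mean(axis=0)                                  # magnetizations <s_i>
C = np.cov(s, rowvar=False)                         # connected correlations

# Naive mean-field inversion: J_ij = -(C^{-1})_ij for i != j,
# h_i = atanh(m_i) - sum_j J_ij m_j.
J = -np.linalg.inv(C)
np.fill_diagonal(J, 0.0)
h = np.arctanh(m) - J @ m

print("largest |J_ij| inferred:", np.abs(J).max().round(3))
```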

  9. Climate driven range divergence among host species affects range-wide patterns of parasitism

    Directory of Open Access Journals (Sweden)

    Richard E. Feldman

    2017-01-01

    Species interactions like parasitism influence the outcome of climate-driven shifts in species ranges. For some host species, parasitism can only occur in that part of its range that overlaps with a second host species. Thus, predicting future parasitism may depend on how the ranges of the two hosts change in relation to each other. In this study, we tested whether the climate-driven species range shift of Odocoileus virginianus (white-tailed deer) accounts for predicted changes in parasitism of two other species from the family Cervidae, Alces alces (moose) and Rangifer tarandus (caribou), in North America. We used MaxEnt models to predict the recent (2000) and future (2050) ranges (probabilities of occurrence) of the cervids and a parasite, Parelaphostrongylus tenuis (brainworm), taking into account range shifts of the parasite’s intermediate gastropod hosts. Our models predicted that range overlap between A. alces/R. tarandus and P. tenuis will decrease between 2000 and 2050, an outcome that reflects decreased overlap between A. alces/R. tarandus and O. virginianus and not the parasites themselves. Geographically, our models predicted increasing potential occurrence of P. tenuis where A. alces/R. tarandus are likely to decline, but minimal spatial overlap where A. alces/R. tarandus are likely to increase. Thus, parasitism may exacerbate climate-mediated southern contraction of A. alces and R. tarandus ranges but will have limited influence on northward range expansion. Our results suggest that the spatial dynamics of one host species may be the driving force behind future rates of parasitism for another host species.

  10. Comparison of probability density functions for analyzing irradiance statistics due to atmospheric turbulence.

    Science.gov (United States)

    Mclaren, Jason R W; Thomas, John C; Mackintosh, Jessica L; Mudge, Kerry A; Grant, Kenneth J; Clare, Bradley A; Cowley, William G

    2012-09-01

    A large number of model probability density functions (PDFs) are used to analyze atmospheric scintillation statistics. We have analyzed scintillation data from two different experimental setups covering a range of scintillation strengths to determine which candidate model PDFs best describe the experimental data. The PDFs were fitted to the experimental data using the method of least squares. The root-mean-squared fitting error was used to monitor the goodness of fit. The results of the fitting were found to depend strongly on the scintillation strength. We find that the log normally modulated Rician and the log normal PDFs are the best fit to the experimental data over the range of scintillation strengths encountered.
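
    A minimal sketch of the fitting procedure described above: a candidate model PDF (here the log normal, one of the candidates the study finds to fit well) is fitted to a measured irradiance histogram by least squares, and the root-mean-squared fitting error serves as the goodness-of-fit measure. The "measured" data are synthetic log-normal samples, not the experimental scintillation records.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(11)
irradiance = rng.lognormal(mean=0.0, sigma=0.4, size=50000)   # normalized intensity

hist, edges = np.histogram(irradiance, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def lognormal_pdf(x, mu, sigma):
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * sigma * np.sqrt(2 * np.pi))

params, _ = curve_fit(lognormal_pdf, centers, hist, p0=(0.0, 0.5))
rms_error = np.sqrt(np.mean((lognormal_pdf(centers, *params) - hist) ** 2))
print("fitted mu=%.3f sigma=%.3f, RMS error=%.4f" % (params[0], params[1], rms_error))
```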

  11. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  12. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915
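
    A minimal sketch of the probability gain criterion referred to above: the expected improvement in the probability of a correct categorization from querying a feature, computed from a toy joint distribution of a two-level category C and a binary feature F. The numbers are illustrative, not the experiments' stimuli.

```python
import numpy as np

p_c = np.array([0.7, 0.3])                  # prior over two categories
p_f_given_c = np.array([[0.8, 0.2],         # P(F=1|C=0), P(F=0|C=0)
                        [0.3, 0.7]])        # P(F=1|C=1), P(F=0|C=1)

p_f = p_c @ p_f_given_c                     # P(F=1), P(F=0)
p_c_given_f = (p_f_given_c * p_c[:, None]) / p_f    # posterior P(C|F), columns = F

prior_accuracy = p_c.max()
expected_posterior_accuracy = np.sum(p_f * p_c_given_f.max(axis=0))
probability_gain = expected_posterior_accuracy - prior_accuracy
print("probability gain of querying F: %.3f" % probability_gain)
```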

  13. Experience matters: information acquisition optimizes probability gain.

    Science.gov (United States)

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.

  14. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  15. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  16. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions makes it possible to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
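
    A minimal sketch of the maximum entropy assignment in its simplest classical setting: given discrete energy levels and a mean-energy constraint, the entropy-maximizing probabilities take the Boltzmann form p_i proportional to exp(-beta*E_i), with the Lagrange multiplier beta solved numerically. The energies and the target mean are illustrative assumptions; the paper's quantum-statistics constructions are not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq

energies = np.array([0.0, 1.0, 2.0, 3.0])
target_mean_energy = 1.2

def mean_energy(beta):
    w = np.exp(-beta * energies)
    p = w / w.sum()
    return p @ energies

# Solve the constraint <E> = target for the Lagrange multiplier beta.
beta = brentq(lambda b: mean_energy(b) - target_mean_energy, -10.0, 10.0)
p = np.exp(-beta * energies)
p /= p.sum()
print("beta=%.3f, probabilities:" % beta, np.round(p, 3))
```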

  17. Anticipating Central Asian Water Stress: Variation in River Flow Dependency on Melt Waters from Alpine to Plains in the Remote Tien Shan Range, Kyrgyzstan Using a Rapid Hydro Assessment Methodology

    Science.gov (United States)

    Hill, A. F.; Wilson, A. M.; Williams, M. W.

    2016-12-01

    The future of mountain water resources in High Asia is of high interest to water managers, development organizations and policy makers given large populations downstream reliant on snow- and ice-sourced river flow. Together with historical and cultural divides among ex-Soviet republics, a lack of central water management following the Soviet break-up has led to water stress as trans-boundary waters weave through and along borders. New upstream hydropower development, a thirsty downstream agricultural sector and a shrinking Aral Sea have led to increasing tension in the region. Despite these pressures and in contrast to eastern High Asia's Himalayan basins (Ganges, Brahmaputra), little attention has been given to western High Asia draining the Pamir and Tien Shan ranges (Syr Darya and Amu Darya basins) to better understand the hydrology of this vast and remote area. Difficult access and challenging terrain exacerbate challenges to working in this remote mountain region. As part of the Contributions to High Asia Runoff from Ice and Snow (CHARIS) project, we asked: how does river flow source water composition change over an alpine-to-plains domain of Kyrgyzstan's Naryn River in the Syr Darya basin? In addition, what may the future hold for river flow in Central Asia given the differing responses of snow and ice to climate changes? Utilizing a Rapid Hydrologic Assessment methodology including a suite of pre-field mapping techniques, we collected in situ water chemistry data at targeted, remote mountain sites over 450 km of the Naryn River over an elevation gradient from glacial headwaters to the lower-lying areas - places where people, hydropower and agriculture utilize water. Chemical and isotope tracers were used to separate stream flow to understand relative dependency on melt waters as the river moves downstream from glaciers and snow-covered areas. This case study demonstrates a technique to acquire field data over large scales in remote regions that facilitates

  18. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities with a independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\\Delta P|/P$ is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.

  19. Atomic transition probabilities of Ce I from Fourier transform spectra

    Science.gov (United States)

    Lawler, J. E.; Chisholm, J.; Nitz, D. E.; Wood, M. P.; Sobeck, J.; Den Hartog, E. A.

    2010-04-01

    Atomic transition probabilities for 2874 lines of the first spectrum of cerium (Ce I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2009 J. Phys. B: At. Mol. Opt. Phys. 42 085006). The wavelength range of the data set is from 360 to 1500 nm. Comparisons are made to previous investigations which are less extensive. Accurate Ce i transition probabilities are needed for lighting research and development on metal halide high-intensity discharge lamps.
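
    A minimal sketch of how branching fractions and radiative lifetimes combine into transition probabilities in measurements of this kind: for an upper level u with lifetime tau_u, each line l receives A_ul = BF_ul / tau_u. The branching fractions and lifetime below are invented for illustration, not Ce I values.

```python
tau_u = 8.5e-9                      # radiative lifetime of the upper level (s)
branching_fractions = {             # intensity-derived branching fractions, summing to 1
    "line_560nm": 0.62,
    "line_742nm": 0.27,
    "line_1103nm": 0.11,
}

for line, bf in branching_fractions.items():
    A = bf / tau_u                  # transition probability (s^-1)
    print("%s: A = %.2e s^-1" % (line, A))
```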

  20. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up. An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  1. Atomic transition probabilities of Ce I from Fourier transform spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Wood, M P; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Ave., Madison, WI 53706 (United States); Chisholm, J [Department of Physics, Boston College, 140 Commonwealth Ave., Chestnut Hill, MA 02467 (United States); Nitz, D E [Department of Physics, St. Olaf College, 1520 St. Olaf Ave., Northfield, MN 55057 (United States); Sobeck, J, E-mail: jelawler@wisc.ed, E-mail: chishojd@bc.ed, E-mail: nitz@stolaf.ed, E-mail: mpwood@wisc.ed, E-mail: jsobeck@uchicago.ed, E-mail: eadenhar@wisc.ed [Department of Astronomy and Astrophysics, University of Chicago, 5640 Ellis Ave., Chicago, IL 60637 (United States)

    2010-04-28

    Atomic transition probabilities for 2874 lines of the first spectrum of cerium (Ce I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2009 J. Phys. B: At. Mol. Opt. Phys. 42 085006). The wavelength range of the data set is from 360 to 1500 nm. Comparisons are made to previous investigations which are less extensive. Accurate Ce i transition probabilities are needed for lighting research and development on metal halide high-intensity discharge lamps.

  2. Fracture probability along a fatigue crack path

    Energy Technology Data Exchange (ETDEWEB)

    Makris, P. [Technical Univ., Athens (Greece)

    1995-03-01

    Long experience has shown that the strength of materials under fatigue load has a stochastic behavior, which can be expressed through the fracture probability. This paper deals with a new analytically derived law for the distribution of the fracture probability along a fatigue crack path. The knowledge of the distribution of the fatigue fracture probability along the crack path helps connect stress conditions to the expected fatigue life of a structure under stochastically varying loads. (orig.)

  3. Probability and statistics: models for research

    National Research Council Canada - National Science Library

    Bailey, Daniel Edgar

    1971-01-01

    This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...

  4. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    Science.gov (United States)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via the knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  5. Advantages of the probability amplitude over the probability density in quantum mechanics

    OpenAIRE

    Kurihara, Yoshimasa; Quach, Nhi My Uyen

    2013-01-01

    We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...

  6. A two-locus forensic match probability for subdivided populations.

    Science.gov (United States)

    Ayres, K L

    2000-01-01

    A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
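
    The product rule mentioned above, and the subpopulation correction it builds on, can be made concrete with the single-locus match probabilities of Balding and Nichols, which formulae of this kind extend to two loci. A minimal sketch follows; the coancestry coefficient theta and the allele frequencies are illustrative assumptions, not values from the paper.

      # Single-locus match probabilities with a coancestry correction theta
      # (Balding-Nichols style). A naive multi-locus value is the product of
      # single-locus values, which is exactly the independence assumption
      # relaxed by the two-locus formulae discussed above.
      def homozygote_match(p, theta):
          """P(suspect is AA | offender is AA) in a subdivided population."""
          return ((2 * theta + (1 - theta) * p) * (3 * theta + (1 - theta) * p)
                  / ((1 + theta) * (1 + 2 * theta)))

      def heterozygote_match(pa, pb, theta):
          """P(suspect is AB | offender is AB) in a subdivided population."""
          return (2 * (theta + (1 - theta) * pa) * (theta + (1 - theta) * pb)
                  / ((1 + theta) * (1 + 2 * theta)))

      theta = 0.03  # illustrative coancestry coefficient
      locus1 = homozygote_match(p=0.10, theta=theta)
      locus2 = heterozygote_match(pa=0.15, pb=0.20, theta=theta)
      print(locus1, locus2, locus1 * locus2)  # product = independence assumption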

  7. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
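
    A small worked illustration of the Equation tested by the probabilistic truth-table task: under the conditional-probability reading, the not-p cases are irrelevant and P(if p then q) is estimated as P(q|p), whereas a material-conditional reading counts every case except p-and-not-q. The frequencies below are invented for illustration.

      # Hypothetical frequencies of the four truth-table cases.
      cases = {("p", "q"): 30, ("p", "not-q"): 10,
               ("not-p", "q"): 25, ("not-p", "not-q"): 35}

      total = sum(cases.values())
      p_cases = cases[("p", "q")] + cases[("p", "not-q")]

      # The Equation: P(if p then q) = P(q | p); not-p cases do not matter.
      p_conditional = cases[("p", "q")] / p_cases

      # Material-conditional reading: true in every case except p and not-q.
      p_material = (total - cases[("p", "not-q")]) / total

      print(f"P(q|p) = {p_conditional:.2f}, material reading = {p_material:.2f}")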

  8. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly (p < 0.05) dependent on several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
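
    A hedged sketch of the kind of regression analysis described: a logistic model of seizure detection (detected or not) against patient- and seizure-level factors, fitted here to synthetic data because the clinical recordings are not available; all coefficients below are invented placeholders, not study results.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 266                                    # number of seizures in the study

      years_epilepsy = rng.uniform(0, 40, n)
      female = rng.integers(0, 2, n)
      complex_partial = rng.integers(0, 2, n)

      # Synthetic "truth" loosely mimicking the reported directions of effect;
      # the coefficients are invented for illustration only.
      logit = -0.5 + 0.05 * years_epilepsy + 1.0 * complex_partial - 0.4 * female
      detected = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([years_epilepsy, complex_partial, female]))
      fit = sm.Logit(detected, X).fit(disp=False)
      print(fit.params)                 # fitted log-odds effect of each factor
      print(fit.predict(X).mean())      # average predicted detection probability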

  9. Caustic-induced features in microlensing magnification probability distributions

    Science.gov (United States)

    Rauch, Kevin P.; Mao, Shude; Wambsganss, Joachim; Paczynski, Bohdan

    1992-01-01

    Numerical simulations have uncovered a previously unrecognized 'bump' in the macroimage magnification probabilities produced by a planar distribution of point masses. The result could be relevant to cases of microlensing by star fields in single galaxies, for which this lensing geometry is an excellent approximation. The bump is produced by bright pairs of microimages formed by sources lying near the caustics of the lens. The numerically calculated probabilities for the magnifications in the range between 3 and 30 are significantly higher than those given by the asymptotic relation derived by Schneider. The bump present in the two-dimensional lenses appears not to exist in the magnification probability distribution produced by a fully three-dimensional lens.

  10. Probability distributions for the magnification of quasars due to microlensing

    Science.gov (United States)

    Wambsganss, Joachim

    1992-01-01

    Gravitational microlensing can magnify the flux of a lensed quasar considerably and therefore possibly influence quasar source counts or the observed quasar luminosity function. A large number of distributions of magnification probabilities due to gravitational microlensing for finite sources are presented, with a reasonable coverage of microlensing parameter space (i.e., surface mass density, external shear, mass spectrum of lensing objects). These probability distributions were obtained from smoothing two-dimensional magnification patterns with Gaussian source profiles. Different source sizes ranging from 10^14 cm to 5 x 10^16 cm were explored. The probability distributions show a large variety of shapes. Coefficients of fitted slopes for large magnifications are presented.
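
    The smoothing step described can be mimicked in a few lines: convolve a magnification pattern with Gaussian source profiles of increasing size and histogram the result to obtain magnification probability distributions. The "pattern" below is synthetic log-normal noise standing in for a real ray-shooting magnification map, so only the mechanics are illustrated.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(1)

      # Stand-in for a ray-shooting magnification pattern; a real map would come
      # from inverse ray shooting through a random star field.
      pattern = rng.lognormal(mean=0.0, sigma=1.0, size=(1024, 1024))

      for sigma_pixels in (2, 8, 32):            # increasing source size in pixels
          smoothed = gaussian_filter(pattern, sigma=sigma_pixels)
          hist, edges = np.histogram(np.log10(smoothed), bins=100, density=True)
          peak = edges[np.argmax(hist)]
          print(f"source sigma = {sigma_pixels:2d} px: "
                f"peak of p(log10 mu) near {peak:+.2f}, "
                f"P(mu > 3) = {(smoothed > 3).mean():.4f}")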

  11. Perfect Precision Detecting Probability Of An Atom Via Sgc Mechanism

    Science.gov (United States)

    Hamedi, H. R.

    2015-06-01

    This letter investigates a scheme for highly efficient two-dimensional (2D) atom localization via scanning probe absorption in a Y-type four-level atomic scheme with two orthogonal standing waves. It is shown that because of the position dependent atom-field interaction, the spatial probability distribution of the atom can be directly determined via monitoring the probe absorption and gain spectra. The impact of different controlling parameters of the system on 2D localization is studied. We find that owing to the effect of spontaneously generated coherence (SGC), the atom can be localized at a particular position and the maximal probability of detecting the atom within the sub-wavelength domain of the two orthogonal standing waves reaches one hundred percent. Phase control of the position dependent probe absorption is then discussed. The presented scheme may be helpful in laser cooling or atom nanolithography via high precision and high resolution atom localization.

  12. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  13. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  14. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, the probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  15. Analytical Study of Thermonuclear Reaction Probability Integrals

    OpenAIRE

    Chaudhry, M.A.; Haubold, H. J.; Mathai, A. M.

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.

  16. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  17. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  18. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  19. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  20. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  1. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  2. prep misestimates the probability of replication

    NARCIS (Netherlands)

    Iverson, G.; Lee, M.D.; Wagenmakers, E.-J.

    2009-01-01

    The probability of "replication," prep, has been proposed as a means of identifying replicable and reliable effects in the psychological sciences. We conduct a basic test of prep that reveals that it misestimates the true probability of replication, especially for small effects. We show how these

  3. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  4. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  5. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence probability are

  6. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  7. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
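
    The binomial logic behind the point estimate method can be shown in a few lines: if all n flaws must be found, the probability of passing the demonstration is POD^n, and the familiar 29-of-29 criterion corresponds to demonstrating a 90% POD at 95% confidence. These are generic binomial facts offered as a companion to the abstract, not a reproduction of the NASA procedure.

      import numpy as np

      n = 29                                     # flaws in the demonstration set
      true_pod = np.array([0.85, 0.90, 0.95, 0.99])

      # Probability of passing a "detect all n flaws" demonstration.
      ppd = true_pod ** n
      for pod, p in zip(true_pod, ppd):
          print(f"true POD = {pod:.2f} -> probability of passing = {p:.3f}")

      # Lower 95%-confidence bound on POD after n/n detections: solve POD^n = 0.05.
      pod_lower = 0.05 ** (1.0 / n)
      print(f"{n}/{n} successes demonstrate POD >= {pod_lower:.3f} at 95% confidence")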

  8. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  9. Probability distribution of the number of deceits in collective robotics

    OpenAIRE

    Murciano, Antonio; Zamora, Javier; Lopez-Sanchez, Jesus; Rodriguez-Santamaria, Emilia

    2002-01-01

    The benefit obtained by a selfish robot by cheating in a real multirobotic system can be represented by the random variable Xn,q: the number of cheating interactions needed before all the members in a cooperative team of robots, playing a TIT FOR TAT strategy, recognize the selfish robot. Stability of cooperation depends on the ratio between the benefit obtained by selfish and cooperative robots. In this paper, we establish the probability model for Xn,q. If the values...

  10. Levy's zero-one law in game-theoretic probability

    OpenAIRE

    Shafer, Glenn; Vovk, Vladimir; Takemura, Akimichi

    2009-01-01

    We prove a game-theoretic version of Levy's zero-one law, and deduce several corollaries from it, including non-stochastic versions of Kolmogorov's zero-one law, the ergodicity of Bernoulli shifts, and a zero-one law for dependent trials. Our secondary goal is to explore the basic definitions of game-theoretic probability theory, with Levy's zero-one law serving a useful role.

  11. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenged in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have been less studied. Therefore, it is particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon-phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  12. Physical foundations of the probability of biogenesis.

    Science.gov (United States)

    Bogdanski, C A

    1975-01-01

    For biogenesis, a particular kind of systemogenesis, to occur, certain physical and informational requirements were indispensable, especially: (1) the self-organization of a new kind of negative feedbacks supported by the trans-substantial channels of information. These were certainly at first organized only on the submolecular level, each of them consisting of many charge-transfer connections. The accomplishment of requirement (1) depends on (2) and (3): (2) the organization of a sufficiently complex structure inside the agglomerated system. We should mention here the deagglomerated inorganic systems according to the archaic models: 'A' (Atoms) and 'P' (astro-Planetary systems). In these prebiotic models the self-regulation background consists of a kind of negative opposing forces. (3) the availability of molecules in which the structural complexity is sufficiently high to be able to contribute to the organization of the trans-substantial channels. Biogenesis thus involved the spontaneous trans-substantialization of information channels in feedbacks. The trans-substantial channels are spontaneously organized in the biotic model, but they are also present in many technical electronic models of systems constructed by man. Therefore, it is no wonder that biogenesis probably occurred at the 10^-6 m size level (compare the diameter of the microspheres of Fox and the concept of Ponnamperuma, who mentions the size of the contemporary Micrococcus). Such a system position, inside the biotic unicellular sub-band (actually extending from 10^-6 to 10^-1 m), is favourable for the organization of the high complexity of the structurogenic process trajectories. It is at the nearest possible level to the region on the dimensional scale where a maximal plurality of the different joining forces exists...

  13. Relativistic Many-body Møller-Plesset Perturbation Theory Calculations of the Energy Levels and Transition Probabilities in Na- to P-like Xe Ions

    Energy Technology Data Exchange (ETDEWEB)

    Vilkas, M J; Ishikawa, Y; Trabert, E

    2007-03-27

    Relativistic multireference many-body perturbation theory calculations have been performed on Xe(43+)-Xe(39+) ions, resulting in energy levels, electric dipole transition probabilities, and level lifetimes. The second-order many-body perturbation theory calculation of energy levels included mass shifts, frequency-dependent Breit correction and Lamb shifts. The calculated transition energies and E1 transition rates are used to present synthetic spectra in the extreme ultraviolet range for some of the Xe ions.

  14. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale invariant probability distribution.

  15. Failure-probability driven dose painting

    Energy Technology Data Exchange (ETDEWEB)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  16. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  17. Technical note: Influence of the phantom material on the absorbed-dose energy dependence of the EBT3 radiochromic film for photons in the energy range 3 keV-18 MeV.

    Science.gov (United States)

    Hermida-López, M; Lüdemann, L; Flühs, A; Brualla, L

    2014-11-01

    Water is the reference medium for radiation therapy dosimetry, but for film dosimetry it is more practical to use a solid phantom. As the composition of solid phantoms differs from that of water, the energy dependence of film exposed within solid phantoms may also differ. The energy dependence of a radiochromic film for a given beam quality Q (energy for monoenergetic beams) has two components: the intrinsic energy dependence and the absorbed-dose energy dependence f(Q), the latter of which can be calculated through a Monte Carlo simulation of radiation transport. The authors used Monte Carlo simulations to study the influence of the phantom material on the f(Q) of the EBT3 radiochromic film (Ashland Specialty Ingredients, Wayne, NJ) for photon beams with energies between 3 keV and 18 MeV. All simulations were carried out with the general-purpose Monte Carlo code penelope 2011. The geometrical model consisted of a cylindrical phantom, with the film positioned at different depths depending on the initial photon energy. The authors simulated monoenergetic parallel photon beams and x-ray beams from a superficial therapy system. To validate their choice of simulation parameters, they also calculated f(Q) for older film models, EBT and EBT2, comparing with published results. In addition to water, they calculated f(Q) of the EBT3 film for solid phantom materials commonly used for film dosimetry: RW1 and RW3 (PTW-Freiburg, Freiburg, Germany), Solid Water (Gammex-RMI, Madison, WI), and PMMA. Finally, they combined their calculated f(Q) with published overall energy response data to obtain the intrinsic energy dependence of the EBT3 film in water. The calculated f(Q) for EBT and EBT2 films was statistically compatible with previously published data. Between 10 keV and 18 MeV, the variation found in f(Q) of the EBT3 film for water was within 2.3%, with a standard statistical uncertainty less than 1%. If the quantity dose-to-water in the phantom is considered, which is the

  18. Technical Note: Influence of the phantom material on the absorbed-dose energy dependence of the EBT3 radiochromic film for photons in the energy range 3 keV–18 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Hermida-López, M., E-mail: mhermida@vhebron.net [NCTeam, Strahlenklinik, Universitätsklinikum Essen, Hufelandstraße 55, Essen D-45122, Germany and Servei de Física i Protecció Radiològica, Hospital Universitari Vall d’Hebron, Pg. Vall d’Hebron 119-129, Barcelona 08035 (Spain); Lüdemann, L.; Flühs, A. [Medical Physics, Strahlenklinik, Universitätsklinikum Essen, Hufelandstraße 55, Essen D-45122 (Germany); Brualla, L. [NCTeam, Strahlenklinik, Universitätsklinikum Essen, Hufelandstraße 55, Essen D-45122 (Germany)

    2014-11-01

    Purpose: Water is the reference medium for radiation therapy dosimetry, but for film dosimetry it is more practical to use a solid phantom. As the composition of solid phantoms differs from that of water, the energy dependence of film exposed within solid phantoms may also differ. The energy dependence of a radiochromic film for a given beam quality Q (energy for monoenergetic beams) has two components: the intrinsic energy dependence and the absorbed-dose energy dependence f(Q), the latter of which can be calculated through a Monte Carlo simulation of radiation transport. The authors used Monte Carlo simulations to study the influence of the phantom material on the f(Q) of the EBT3 radiochromic film (Ashland Specialty Ingredients, Wayne, NJ) for photon beams with energies between 3 keV and 18 MeV. Methods: All simulations were carried out with the general-purpose Monte Carlo code PENELOPE 2011. The geometrical model consisted of a cylindrical phantom, with the film positioned at different depths depending on the initial photon energy. The authors simulated monoenergetic parallel photon beams and x-ray beams from a superficial therapy system. To validate their choice of simulation parameters, they also calculated f(Q) for older film models, EBT and EBT2, comparing with published results. In addition to water, they calculated f(Q) of the EBT3 film for solid phantom materials commonly used for film dosimetry: RW1 and RW3 (PTW-Freiburg, Freiburg, Germany), Solid Water (Gammex-RMI, Madison, WI), and PMMA. Finally, they combined their calculated f(Q) with published overall energy response data to obtain the intrinsic energy dependence of the EBT3 film in water. Results: The calculated f(Q) for EBT and EBT2 films was statistically compatible with previously published data. Between 10 keV and 18 MeV, the variation found in f(Q) of the EBT3 film for water was within 2.3%, with a standard statistical uncertainty less than 1%. If the quantity dose-to-water in the phantom is

  19. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr

    2016-08-01

    In this paper, we propose and derive a slotted-time model for analyzing the burst blocking probability in Optical Burst Switched (OBS) networks. We evaluated the immediate and delayed signaling reservation schemes. The proposed model compares the performance of both just-in-time (JIT) and just-enough-time (JET) signaling protocols associated with void/non-void filling link scheduling schemes. It also considers scenarios with no wavelength conversion and with limited-range wavelength conversion. Our model is distinguished by being adaptable to different offset-time and burst length distributions. We observed that, by applying a limited range of wavelength conversion, the burst blocking probability is reduced by several orders of magnitude, yielding a better burst delivery ratio compared with full wavelength conversion.
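
    The slotted-time model itself is not reproduced here, but the classical Erlang-B formula gives a quick reference point for burst blocking on a link without wavelength conversion, which analyses of this kind are usually benchmarked against. The offered load and wavelength counts below are arbitrary illustrative values, not figures from the paper.

      def erlang_b(offered_load, channels):
          """Blocking probability for `channels` wavelengths under Poisson offered
          load (in Erlangs), using the numerically stable Erlang-B recursion."""
          b = 1.0
          for k in range(1, channels + 1):
              b = offered_load * b / (k + offered_load * b)
          return b

      load = 24.0                       # offered load in Erlangs (illustrative)
      for wavelengths in (16, 32, 64):
          print(f"{wavelengths:2d} wavelengths: "
                f"blocking probability = {erlang_b(load, wavelengths):.2e}")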

  20. A Course on Elementary Probability Theory

    OpenAIRE

    Lo, Gane Samb

    2017-01-01

    This book introduces the theory of probabilities from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip him with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then, the theory of probabilities is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related t...

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  2. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  3. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  4. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  5. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
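
    For Im(z) > 0 the complex probability (Faddeeva) function can be written as w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt, and the Gauss-Hermite rule replaces the integral with a weighted sum over the Hermite roots, which is the approach discussed above. The sketch below compares that sum with scipy's wofz reference implementation; as the report notes, accuracy degrades as z approaches the real axis.

      import numpy as np
      from numpy.polynomial.hermite import hermgauss
      from scipy.special import wofz

      def faddeeva_gauss_hermite(z, n=64):
          """Approximate w(z) for Im(z) > 0 by n-point Gauss-Hermite quadrature:
          w(z) ~ (i/pi) * sum_k w_k / (z - x_k), with x_k, w_k the Hermite roots
          and weights for the weight function exp(-t^2)."""
          nodes, weights = hermgauss(n)
          return 1j / np.pi * np.sum(weights / (z - nodes))

      for z in (2.0 + 1.0j, 0.5 + 0.5j, 1.0 + 0.05j):
          approx = faddeeva_gauss_hermite(z)
          exact = wofz(z)
          print(f"z = {z}: GH = {approx:.6f}, wofz = {exact:.6f}, "
                f"|error| = {abs(approx - exact):.2e}")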

  6. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from...... the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  8. Tomographic probability representation in the problem of transitions between the Landau levels

    OpenAIRE

    Zhebrak, E. D.

    2012-01-01

    The problem of the motion of a charged particle in an electromagnetic field is considered in terms of the tomographic probability representation. The coherent and Fock states of a charge moving in a varying homogeneous magnetic field are studied in the tomographic probability representation of quantum mechanics. The states are expressed in terms of quantum tomograms. The Fock state tomograms are given in the form of probability distributions described by multivariable Hermite polynomials with time-depende...

  9. Mutation of His465 Alters the pH-dependent Spectroscopic Properties of Escherichia coli Glutamate Decarboxylase and Broadens the Range of Its Activity toward More Alkaline pH

    NARCIS (Netherlands)

    Pennacchietti, E.; Lammens, T.M.; Capitani, G.; Franssen, M.C.R.; John, R.A.; Bossa, F.; Biase, De D.

    2009-01-01

    Glutamate decarboxylase (GadB) from Escherichia coli is a hexameric, pyridoxal 5'-phosphate-dependent enzyme catalyzing CO2 release from the α-carboxyl group of L-glutamate to yield γ-aminobutyrate. GadB exhibits an acidic pH optimum and undergoes a spectroscopically detectable and strongly

  10. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  11. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates on a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km². The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in the probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
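
    A much-reduced sketch of the Monte Carlo logic described, written against a standard dimensionless form of the infinite-slope factor of safety; the parameter distributions below are invented placeholders for a single grid cell rather than the gridded soil, vegetation and recharge inputs used by the Landlab component.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 50_000                                  # Monte Carlo samples for one cell

      slope = np.deg2rad(32.0)                    # slope angle
      phi = np.deg2rad(rng.uniform(30.0, 38.0, n))            # friction angle
      cohesion = rng.uniform(2e3, 8e3, n)                      # root + soil cohesion (Pa)
      depth = rng.triangular(0.5, 1.0, 2.0, n)                 # soil depth (m)
      wetness = np.clip(rng.normal(0.6, 0.25, n), 0.0, 1.0)    # relative wetness hw/h

      rho_s, rho_w, g = 1700.0, 1000.0, 9.81      # soil/water density, gravity

      # Infinite-slope factor of safety with a relative-wetness term:
      # FS = C/(rho_s g h sin(a) cos(a)) + (1 - w rho_w/rho_s) tan(phi)/tan(a)
      fs = (cohesion / (rho_s * g * depth * np.sin(slope) * np.cos(slope))
            + (1.0 - wetness * rho_w / rho_s) * np.tan(phi) / np.tan(slope))

      print(f"probability of failure (FS < 1): {(fs < 1.0).mean():.3f}")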

  12. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    Science.gov (United States)

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or by non-random lack of some information in a subgroup of the population. The aim is to provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and children's IQ at 7 years. This methodology allows the analysis to be corrected by weighting the observations with the probability of being selected. The IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: firstly, we consider the entire population at study and calculate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. Then the analysis is performed only on the non-missing observations, using a weighted model. IPW is a technique that allows the selection process to be embedded in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO2 on the verbal intelligence quotient of children is stronger than the effect shown by the analysis performed without regard to the selection processes.
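
    A bare-bones version of the procedure described: fit a logistic model for the probability of being observed, weight each complete case by the inverse of its fitted probability, and re-fit the outcome model with those weights. The data are simulated and the variable names are placeholders, not those of the cohort study.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 2000
      no2 = rng.normal(40, 10, n)                 # exposure (placeholder units)
      ses = rng.normal(0, 1, n)                   # predictor of follow-up
      iq = 100 - 0.2 * no2 + 3.0 * ses + rng.normal(0, 10, n)

      # Non-random selection into follow-up (depends on SES and mildly on exposure).
      p_obs = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * ses - 0.02 * no2)))
      observed = rng.binomial(1, p_obs).astype(bool)

      # Step 1: model the probability of being observed from full-cohort covariates.
      Xsel = sm.add_constant(np.column_stack([ses, no2]))
      p_hat = sm.Logit(observed.astype(int), Xsel).fit(disp=False).predict(Xsel)

      # Step 2: weight complete cases by 1/p_hat and re-fit the outcome model.
      w = 1.0 / p_hat[observed]
      Xout = sm.add_constant(no2[observed])
      naive = sm.OLS(iq[observed], Xout).fit()
      ipw = sm.WLS(iq[observed], Xout, weights=w).fit()
      print("unweighted NO2 effect:  ", round(naive.params[1], 3))
      print("IPW-weighted NO2 effect:", round(ipw.params[1], 3))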

  13. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    Directory of Open Access Journals (Sweden)

    Peter Caley

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes.
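
    For concreteness, the simplest constant-sighting-rate member of this family of models is the classic Solow (1993) test: given n sightings in an observation window (0, T) and a stationary Poisson sighting process, the probability of a terminal gap at least as long as the one observed is (t_n / T)^n. The Bayesian, density-dependent models of the paper relax exactly these assumptions; the sighting times below are invented.

      def solow_pvalue(sighting_times, horizon):
          """P(no sightings after the last record | extant, stationary Poisson
          sightings) = (t_n / horizon) ** n, the classic Solow (1993) test."""
          n = len(sighting_times)
          t_last = max(sighting_times)
          return (t_last / horizon) ** n

      # Hypothetical discovery times, in years since monitoring began.
      sightings = [0.5, 1.1, 1.8, 2.2]
      for horizon in (4.0, 6.0, 10.0):          # years of monitoring to date
          p = solow_pvalue(sightings, horizon)
          print(f"after {horizon:4.1f} years: P(gap this long | extant) = {p:.3f}")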

  14. Short-range fundamental forces

    CERN Document Server

    Antoniadis, I; Buchner, M; Fedorov, V V; Hoedl, S; Lambrecht, A; Nesvizhevsky, V V; Pignol, G; Protasov, K V; Reynaud, S; Sobolev, Yu

    2011-01-01

    We consider theoretical motivations to search for extra short-range fundamental forces as well as experiments constraining their parameters. The forces could be of two types: 1) spin-independent forces, 2) spin-dependent axion-like forces. Different experimental techniques are sensitive in respective ranges of characteristic distances. The techniques include measurements of gravity at short distances, searches for extra interactions on top of the Casimir force, precision atomic and neutron experiments. We focus on neutron constraints, thus the range of characteristic distances considered here corresponds to the range accessible for neutron experiments.

  15. The rate of beneficial mutations surfing on the wave of a range expansion.

    Directory of Open Access Journals (Sweden)

    Rémi Lehe

    Many theoretical and experimental studies suggest that range expansions can have severe consequences for the gene pool of the expanding population. Due to strongly enhanced genetic drift at the advancing frontier, neutral and weakly deleterious mutations can reach large frequencies in the newly colonized regions, as if they were surfing the front of the range expansion. These findings raise the question of how frequently beneficial mutations successfully surf at shifting range margins, thereby promoting adaptation towards a range-expansion phenotype. Here, we use individual-based simulations to study the surfing statistics of recurrent beneficial mutations on wave-like range expansions in linear habitats. We show that the rate of surfing depends on two strongly antagonistic factors, the probability of surfing given the spatial location of a novel mutation and the rate of occurrence of mutations at that location. The surfing probability strongly increases towards the tip of the wave. Novel mutations are unlikely to surf unless they enjoy a spatial head start compared to the bulk of the population. The needed head start is shown to be proportional to the inverse fitness of the mutant type, and only weakly dependent on the carrying capacity. The precise location dependence of surfing probabilities is derived from the non-extinction probability of a branching process within a moving field of growth rates. The second factor is the mutation occurrence which strongly decreases towards the tip of the wave. Thus, most successful mutations arise at an intermediate position in the front of the wave. We present an analytic theory for the tradeoff between these factors that allows to predict how frequently substitutions by beneficial mutations occur at invasion fronts. We find that small amounts of genetic drift increase the fixation rate of beneficial mutations at the advancing front, and thus could be important for adaptation during species invasions.

  16. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
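
    A minimal sketch of the individual-wave calculation outlined above, assuming Rayleigh-distributed individual wave heights with exceedance probability exp(-2 (h/Hs)^2); the significant wave height, the number of waves per lifetime, and the target encounter probability are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative values (not from the paper): design significant wave height and
# number of individual waves expected within the structure lifetime.
HS = 8.0          # significant wave height [m]
N_WAVES = 1.0e6   # individual waves over the lifetime

def exceedance_single(h, hs=HS):
    """Rayleigh exceedance probability of an individual wave height h,
    P(H > h) = exp(-2 (h / hs)^2)."""
    return np.exp(-2.0 * (h / hs) ** 2)

def encounter_probability(h, n=N_WAVES, hs=HS):
    """Probability that at least one of n individual waves exceeds h."""
    return 1.0 - (1.0 - exceedance_single(h, hs)) ** n

# Expected maximum individual wave height (mode of the maximum in Rayleigh seas).
h_max_expected = HS * np.sqrt(np.log(N_WAVES) / 2.0)

# Design individual wave height with a 10% encounter probability over the lifetime.
heights = np.linspace(HS, 3.0 * HS, 2001)
h_design = heights[np.argmin(np.abs(encounter_probability(heights) - 0.10))]

print(f"expected maximum wave     ~ {h_max_expected:.1f} m")
print(f"10% encounter wave height ~ {h_design:.1f} m")
```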

  17. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
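
    A minimal sketch of a logistic-regression fire-probability model of the kind described above, using synthetic stand-ins for building attributes; the feature names, coefficients, and data are illustrative assumptions rather than the Czech incident data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for building attributes (area in m^2, age in years,
# solid-fuel heating as a 0/1 indicator); the real study uses incident records.
n = 2000
area = rng.uniform(50, 2000, n)
age = rng.uniform(0, 120, n)
solid_fuel = rng.integers(0, 2, n)
X = np.column_stack([area, age, solid_fuel])

# Synthetic "fire occurred" labels generated from an assumed latent model.
logits = -4.0 + 0.0008 * area + 0.01 * age + 0.8 * solid_fuel
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Predicted fire probability for one hypothetical building, as would be
# mapped per building in a probability map.
building = np.array([[450.0, 80.0, 1.0]])
print(f"predicted fire probability: {model.predict_proba(building)[0, 1]:.3f}")
```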

  18. Probability Analysis of a Quantum Computer

    OpenAIRE

    Einarsson, Göran

    2003-01-01

    The quantum computer algorithm by Peter Shor for factorization of integers is studied. The quantum nature of a QC makes its outcome random. The output probability distribution is investigated and the chances of a successful operation are determined.

  19. Nanoformulations and Clinical Trial Candidates as Probably ...

    African Journals Online (AJOL)

    Nanoformulations and Clinical Trial Candidates as Probably Effective and Safe Therapy for Tuberculosis. Madeeha Laghari, Yusrida Darwis, Abdul Hakeem Memon, Arshad Ali Khan, Ibrahim Mohammed Tayeb Abdulbaqi, Reem Abou Assi ...

  20. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  1. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  2. Zika Probably Not Spread Through Saliva: Study

    Science.gov (United States)

    ... page: https://medlineplus.gov/news/fullstory_167531.html Zika Probably Not Spread Through Saliva: Study Research with ... HealthDay News) -- Scientists have some interesting news about Zika: You're unlikely to get the virus from ...

  3. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer...

  4. On $\varphi$-families of probability distributions

    OpenAIRE

    Rui F. Vigelis; Cavalcante, Charles C.

    2011-01-01

    We generalize the exponential family of probability distributions. In our approach, the exponential function is replaced by a $\varphi$-function, resulting in a $\varphi$-family of probability distributions. We show how $\varphi$-families are constructed. In a $\varphi$-family, the analogue of the cumulant-generating function is a normalizing function. We define the $\varphi$-divergence as the Bregman divergence associated to the normalizing function, providing a generalization of the Kullbac...

  5. An Illustrative Problem in Computational Probability.

    Science.gov (United States)

    1980-06-01

    easily evaluated. In general, the probabilities may be computed by the numerical solution of simple differential equations ... algorithmically tractable solutions to problems in probability adds an interesting new dimension to their analysis. In the construction of efficient ... significance. This serves to illustrate our first point: mathematically equivalent solutions may be vastly different in their suitability for

  6. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  7. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results" Special emphases on simulation and discrete decision theory Mathematically-rich, but self-contained text, at a gentle pace Review of calculus and linear algebra in an appendix Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  8. Maximum likelihood estimation for predicting the probability of obtaining variable shortleaf pine regeneration densities

    Science.gov (United States)

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2003-01-01

    A logistic equation is the basis for a model that predicts the probability of obtaining regeneration at specified densities. The density of regeneration (trees/ha) for which an estimate of probability is desired can be specified by means of independent variables in the model. When estimating parameters, the dependent variable is set to 1 if the regeneration density (...

  9. Analysis of feedbacks between nucleation rate, survival probability and cloud condensation nuclei formation

    Science.gov (United States)

    Westervelt, D. M.; Pierce, J. R.; Adams, P. J.

    2014-06-01

    Aerosol nucleation is an important source of particle number in the atmosphere. However, in order to become cloud condensation nuclei (CCN), freshly nucleated particles must undergo significant condensational growth while avoiding coagulational scavenging. In an effort to quantify the contribution of nucleation to CCN, this work uses the GEOS-Chem-TOMAS global aerosol model to calculate changes in CCN concentrations against a broad range of nucleation rates and mechanisms. We then quantify the factors that control CCN formation from nucleation, including daily nucleation rates, growth rates, coagulation sinks, condensation sinks, survival probabilities, and CCN formation rates, in order to examine feedbacks that may limit growth of nucleated particles to CCN. Nucleation rate parameterizations tested in GEOS-Chem-TOMAS include ternary nucleation (with multiple tuning factors), activation nucleation (with two pre-factors), binary nucleation, and ion-mediated nucleation. We find that nucleation makes a significant contribution to boundary layer CCN(0.2%), but this contribution is only modestly sensitive to the choice of nucleation scheme, ranging from 49 to 78% increase in concentrations over a control simulation with no nucleation. Moreover, a two order-of-magnitude increase in the globally averaged nucleation rate (via changes to tuning factors) results in small changes (less than 10%) to global CCN(0.2%) concentrations. To explain this, we present a simple theory showing that survival probability has an exponentially decreasing dependence on the square of the condensation sink. This functional form stems from a negative correlation between condensation sink and growth rate and a positive correlation between condensation sink and coagulational scavenging. Conceptually, with a fixed condensable vapor budget (sulfuric acid and organics), any increase in CCN concentrations due to higher nucleation rates necessarily entails an increased aerosol surface area in the
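
    A back-of-the-envelope sketch of the feedback argument above, assuming (purely for illustration) that the coagulation sink scales linearly with the condensation sink while the growth rate scales inversely with it; the proportionality constants are invented, but their combination reproduces the exp(-k CS^2) form of the survival probability described in the text.

```python
import numpy as np

# Illustrative proportionality constants, not values from GEOS-Chem-TOMAS.
A_COAG = 5.0e-3    # coagulation sink per unit condensation sink
B_GROWTH = 2.0     # growth rate times condensation sink (nm per hour x CS)
D_GROWTH = 50.0    # diameter growth needed to reach CCN sizes [nm]

def survival_probability(cs):
    """P_surv ~ exp(-CoagS * t_growth), with CoagS = A * CS and
    t_growth = D / GR where GR = B / CS, hence P_surv ~ exp(-(A * D / B) * CS^2)."""
    coag_sink = A_COAG * cs
    growth_time = D_GROWTH / (B_GROWTH / cs)
    return np.exp(-coag_sink * growth_time)

for cs in (0.5, 1.0, 2.0, 4.0):
    print(f"condensation sink {cs:>3}: survival probability ~ {survival_probability(cs):.3f}")
```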

  10. Mixture probability distribution functions to model wind speed distributions

    Energy Technology Data Exchange (ETDEWEB)

    Kollu, Ravindra; Rayapudi, Srinivasa Rao; Pakkurthi, Krishna Mohan [J.N.T. Univ., Kakinada (India). Dept. of Electrical and Electronics Engineering; Narasimham, S.V.L. [J.N.T. Univ., Andhra Pradesh (India). Computer Science and Engineering Dept.

    2012-11-01

    Accurate wind speed modeling is critical in estimating wind energy potential for harnessing wind power effectively. The quality of wind speed assessment depends on the capability of the chosen probability density function (PDF) to describe the measured wind speed frequency distribution. The objective of this study is to describe (model) wind speed characteristics using three mixture probability density functions, Weibull-extreme value distribution (GEV), Weibull-lognormal, and GEV-lognormal, which were not tried before. Statistical parameters such as maximum error in the Kolmogorov-Smirnov test, root mean square error, Chi-square error, coefficient of determination, and power density error are considered as judgment criteria to assess the fitness of the probability density functions. Results indicate that the Weibull-GEV PDF is able to describe unimodal as well as bimodal wind distributions accurately whereas the GEV-lognormal PDF is able to describe the familiar bell-shaped unimodal distribution well. Results show that mixture probability functions are better alternatives to conventional Weibull, two-component mixture Weibull, gamma, and lognormal PDFs to describe wind speed characteristics. (orig.)
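
    A hedged sketch of fitting one such mixture (Weibull-GEV) by maximum likelihood, using synthetic wind speeds in place of measured data; the starting values and data-generating parameters are illustrative assumptions, and the GEV shape parameter follows scipy's genextreme sign convention.

```python
import numpy as np
from scipy import stats, optimize

# Synthetic "measured" wind speeds from a bimodal source (stand-in for site data).
speeds = np.concatenate([
    stats.weibull_min.rvs(2.0, scale=6.0, size=1500, random_state=1),
    stats.weibull_min.rvs(3.5, scale=13.0, size=500, random_state=2),
])

def neg_log_likelihood(params):
    """Weibull-GEV mixture: w * Weibull(k, A) + (1 - w) * GEV(c, mu, sigma),
    with c being scipy's genextreme shape parameter."""
    w, k, A, c, mu, sigma = params
    if not (0.0 < w < 1.0 and k > 0.0 and A > 0.0 and sigma > 0.0):
        return np.inf
    pdf = (w * stats.weibull_min.pdf(speeds, k, scale=A)
           + (1.0 - w) * stats.genextreme.pdf(speeds, c, loc=mu, scale=sigma))
    return -np.sum(np.log(pdf + 1e-300))

start = [0.7, 2.0, 7.0, 0.1, 12.0, 3.0]
result = optimize.minimize(neg_log_likelihood, start, method="Nelder-Mead",
                           options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
print("fitted [w, k, A, c, mu, sigma]:", np.round(result.x, 3))
```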

  11. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
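
    A minimal sketch of the truncated exponential law for slip, with a density normalized on [0, s_max] and inverse-CDF sampling for synthetic rupture scenarios; the scale and the maximum-slip cap are illustrative values, not fits to SRCMOD.

```python
import numpy as np

def trunc_exp_pdf(s, scale, s_max):
    """Truncated exponential density on [0, s_max]:
    f(s) = exp(-s / scale) / (scale * (1 - exp(-s_max / scale)))."""
    norm = scale * (1.0 - np.exp(-s_max / scale))
    return np.where((s >= 0.0) & (s <= s_max), np.exp(-s / scale) / norm, 0.0)

def trunc_exp_rvs(scale, s_max, size, rng=np.random.default_rng(0)):
    """Inverse-CDF sampling of slip values, e.g. for synthetic rupture scenarios."""
    u = rng.random(size)
    return -scale * np.log(1.0 - u * (1.0 - np.exp(-s_max / scale)))

# Illustrative parameters (not fitted to SRCMOD): scale 1.2 m, physical cap 6 m.
samples = trunc_exp_rvs(1.2, 6.0, 100_000)
print(f"sample mean slip ~ {samples.mean():.2f} m, maximum ~ {samples.max():.2f} m")
```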

  12. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    Full Text Available The intended aim of this article is to present approximation results for exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a distance function, enables determination of the range of the EMERGENCY signal for a pre-set confidence level.
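
    A hedged sketch of the logistic approximation described above, fitting P(d) = 1 / (1 + exp((d - d50) / s)) to synthetic reception measurements and inverting it for the signal range at a chosen confidence level; the distances, probabilities, and the 0.9 level are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(d, d50, slope):
    """Reception probability as a logistic function of distance d:
    P(d) = 1 / (1 + exp((d - d50) / slope))."""
    return 1.0 / (1.0 + np.exp((d - d50) / slope))

# Synthetic stand-ins for measured reception probabilities at set distances [km].
distance = np.array([5, 10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
p_measured = np.array([0.99, 0.98, 0.95, 0.90, 0.75, 0.50, 0.25, 0.10, 0.03])

(d50, slope), _ = curve_fit(logistic, distance, p_measured, p0=[50.0, 10.0])

# Range of the EMERGENCY signal at a 0.9 reception-probability (confidence) level.
p_level = 0.9
range_at_level = d50 + slope * np.log(1.0 / p_level - 1.0)
print(f"d50 ~ {d50:.1f} km, range at P = {p_level}: ~ {range_at_level:.1f} km")
```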

  13. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    Science.gov (United States)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    Riverbank erosion is a natural geomorphologic process that affects the fluvial environment. The most important issue concerning riverbank erosion is the identification of the vulnerable locations. An alternative to the usual hydrodynamic models to predict vulnerable locations is to quantify the probability of erosion occurrence. This can be achieved by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. Thus, riverbank erosion can be determined by a regression model using independent variables that are considered to affect the erosion process. The impact of such variables may vary spatially, therefore, a non-stationary regression model is preferred instead of a stationary equivalent. Locally Weighted Regression (LWR) is proposed as a suitable choice. This method can be extended to predict the binary presence or absence of erosion based on a series of independent local variables by using the logistic regression model. It is referred to as Locally Weighted Logistic Regression (LWLR). Logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable (e.g. binary response) based on one or more predictor variables. The method can be combined with LWR to assign weights to local independent variables of the dependent one. LWR allows model parameters to vary over space in order to reflect spatial heterogeneity. The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. Then, a logistic regression is formed, which predicts success or failure of a given binary variable (e.g. erosion presence or absence) for any value of the independent variables. The
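
    A minimal sketch of the locally weighted logistic idea, not the authors' implementation: observations are weighted by a Gaussian kernel of their distance to the prediction site, and a standard logistic regression is fitted with those sample weights. The coordinates, predictors, and bandwidth are synthetic, illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic stand-ins: site coordinates, two local predictors (e.g. bank slope,
# flow velocity), and observed erosion presence/absence at each site.
n = 400
coords = rng.uniform(0, 10, size=(n, 2))
X = rng.normal(size=(n, 2))
logits = 0.3 * coords[:, 0] - 1.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

def lwlr_predict(site_xy, site_features, bandwidth=2.0):
    """Fit a logistic regression weighted by a Gaussian kernel of the distance
    between each observation and the prediction site, then return P(erosion)."""
    dist = np.linalg.norm(coords - site_xy, axis=1)
    weights = np.exp(-0.5 * (dist / bandwidth) ** 2)
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y, sample_weight=weights)
    return model.predict_proba(site_features.reshape(1, -1))[0, 1]

p = lwlr_predict(np.array([8.0, 3.0]), np.array([0.5, -0.2]))
print(f"local erosion probability ~ {p:.2f}")
```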

  14. Energy dependent response of the Fricke gel dosimeter prepared with 270 Bloom gelatine for photons in the energy range 13.93 keV-6 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Cavinato, C.C. [Instituto de Pesquisas Energeticas e Nucleares, IPEN-CNEN/SP, Av. Prof. Lineu Prestes, 2242, Cidade Universitaria, 05508-000 Sao Paulo, SP (Brazil); Campos, L.L., E-mail: lcrodri@ipen.b [Instituto de Pesquisas Energeticas e Nucleares, IPEN-CNEN/SP, Av. Prof. Lineu Prestes, 2242, Cidade Universitaria, 05508-000 Sao Paulo, SP (Brazil)

    2010-07-21

    The spectrophotometric energy dependent response to photons with effective energies between 13.93 keV and 6 MeV of the Fricke xylenol gel (FXG) dosimeter developed at IPEN, prepared using 270 Bloom gelatine, was evaluated in order to verify the possible dosimeter application in other medicine areas in addition to radiosurgery, for example, breast radiotherapy and blood bags radiosterilization. Other dosimetric characteristics were also evaluated. The obtained results indicate that the FXG dosimeter can contribute to dosimetry in different medical application areas including magnetic resonance imaging (MRI) evaluation technique that permits three-dimensional (3D) dose distribution evaluation.

  15. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events, by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
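
    A hedged sketch of the conjugate Poisson-Gamma update underlying such an analysis, with illustrative event counts, record length, and prior rather than the Santa Barbara Channel or Port Valdez data.

```python
from scipy import stats

# Illustrative inputs (not the study's data): n_events dated landslides observed
# over a record of t_record kyr, and a Gamma(alpha0, beta0) prior on the rate.
n_events = 4
t_record = 12.0            # thousand years
alpha0, beta0 = 1.0, 1.0   # weakly informative prior (shape, rate)

# Conjugate Poisson-Gamma update: posterior is Gamma(alpha0 + n, beta0 + T).
alpha_post = alpha0 + n_events
beta_post = beta0 + t_record
posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)

lam_mean = posterior.mean()
lo, hi = posterior.ppf([0.025, 0.975])
p_next = 1.0 - (beta_post / (beta_post + 1.0)) ** alpha_post  # marginal, next 1 kyr
print(f"rate ~ {lam_mean:.3f} per kyr (95% credible interval {lo:.3f}-{hi:.3f})")
print(f"P(at least one landslide in the next 1 kyr) ~ {p_next:.3f}")
```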

  16. Probability of pregnancy in beef heifers

    Directory of Open Access Journals (Sweden)

    D.P. Faria

    2014-12-01

    Full Text Available This study aimed to evaluate the influence of initial weight, initial age, average daily gain in initial weight, average daily gain in total weight and genetic group on the probability of pregnancy in primiparous females of the Nellore, 1/2 Simmental + 1/2 Nellore, and 3/4 Nellore + 1/4 Simmental genetic groups. Data were collected from the livestock file of the Farpal Farm, located in the municipality of Jaíba, Minas Gerais State, Brazil. The pregnancy diagnosis results (success = 1 and failure = 0 were used to determine the probability of pregnancy that was modeled using logistic regression by the Proc Logistic procedure available on SAS (Statistical..., 2004 software, from the regressor variables initial weight, average daily gain in initial weight, average daily gain in total weight, and genetic group. Initial weight (IW was the most important variable in the probability of pregnancy in heifers, and 1-kg increments in IW allowed for increases of 5.8, 9.8 and 3.4% in the probability of pregnancy in Nellore, 1/2 Simmental + 1/2 Nellore and, 3/4 Nellore + 1/4 Simmental heifers, respectively. The initial age influenced the probability of pregnancy in Nellore heifers. From the estimates of the effects of each variable it was possible to determine the minimum initial weights for each genetic group. This information can be used to monitor the development of heifers until the breeding season and increase the pregnancy rate.

  17. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Michael C. Wittmann

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  18. The role of probabilities in physics.

    Science.gov (United States)

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. High-frequency ultrasonographic imaging of the endothelium-dependent flow-mediated dilatation (FMD) in a brachial artery: normative ranges in a group of low CV risk subjects of different ages.

    Science.gov (United States)

    Ryliskyte, Ligita; Ghiadoni, Lorenzo; Plantinga, Yvonne; Janaviciene, Silvija; Petrulioniene, Zaneta; Laucevicius, Aleksandras; Gintautas, Jonas

    2004-01-01

    High-frequency ultrasonographic imaging of flow-mediated dilatation (FMD) in a brachial artery, as a non-invasive technique, was used for the clinical evaluation of endothelial function (EF) in 115 subjects (ages 44.19 +/- 12.23 years, range 26 to 83) free of coronary heart disease or its equivalents. Our aim was to study the normative ranges for FMD in subjects of different age groups with low cardiovascular risk factors. The mean FMD was 8.23 +/- 4.51% (range 0 to 20.9%). Multivariate analysis revealed that there were only two independent predictors of FMD: resting vessel diameter (r = -0.45) and age, with the fitted regression FMD = 25.5 - 0.17 x age - 2.6 x resting vessel diameter. Our study demonstrates that FMD in low cardiovascular risk patients inversely correlates with age as well as brachial artery diameter. Normative ranges of FMD could be predicted for different age groups. In addition to conventional methods for the assessment of cardiovascular risk by using a population-based approach (score indexes such as SCORE, FRAMINGHAM, PROCAM), high-frequency ultrasonographic imaging of flow-mediated dilatation (FMD) in the brachial artery is now becoming an accepted method for the assessment of an individual patient's cardiovascular risk. Although preliminary guidelines have been published, this technique has interpretive limitations. In a study that was published earlier, the authors estimated diameter-related normal ranges of FMD. Several articles reported cut points between control and diseased groups. Data about the impact of age on FMD are also available in the literature. However, this is the first attempt to classify normal values into groups according to age and diameter.

  20. Depth- and range-dependent variation in the performance of aquatic telemetry systems: understanding and predicting the susceptibility of acoustic tag–receiver pairs to close proximity detection interference

    Directory of Open Access Journals (Sweden)

    Stephen R. Scherrer

    2018-01-01

    Full Text Available Background Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition where receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm that transmission multipaths reflected off the environment, arriving at a receiver with sufficient delay relative to the direct signal, cause CPDI. We propose a ray-propagation based model to estimate the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. Methods A series of experiments were designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm multipaths arriving after the receiver’s blanking interval cause CPDI effects. Results Analysis of empirical data estimated the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations. From these results, CPDI was
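
    A minimal sketch of the image-source geometry behind such a propagation model: the delay of the sea-surface-reflected arrival relative to the direct arrival is compared with an assumed receiver blanking interval, and CPDI is flagged when the multipath arrives after that interval. The sound speed, blanking interval, depths, and ranges are illustrative assumptions, not the study's fitted model.

```python
import numpy as np

SOUND_SPEED = 1500.0       # m/s, nominal seawater value (assumed)
BLANKING_INTERVAL = 0.260  # s, assumed receiver blanking interval (illustrative)

def surface_multipath_delay(range_m, tag_depth, receiver_depth):
    """Delay of the sea-surface-reflected arrival relative to the direct arrival,
    using the image-source construction for the reflected path."""
    direct = np.hypot(range_m, tag_depth - receiver_depth)
    reflected = np.hypot(range_m, tag_depth + receiver_depth)
    return (reflected - direct) / SOUND_SPEED

def cpdi_expected(range_m, tag_depth, receiver_depth):
    """Flag CPDI when the surface multipath arrives after the blanking interval,
    so the reflected copy can corrupt decoding of the direct transmission."""
    return surface_multipath_delay(range_m, tag_depth, receiver_depth) > BLANKING_INTERVAL

for depth in (25.0, 300.0):
    for horizontal_range in (50.0, 200.0, 500.0):
        flag = cpdi_expected(horizontal_range, depth, depth)
        print(f"depth {depth:>5.0f} m, range {horizontal_range:>5.0f} m: CPDI expected = {flag}")
```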

  1. Early appraisal of the fixation probability in directed networks

    Science.gov (United States)

    Barbosa, Valmir C.; Donangelo, Raul; Souza, Sergio R.

    2010-10-01

    In evolutionary dynamics, the probability that a mutation spreads through the whole population, having arisen from a single individual, is known as the fixation probability. In general, it is not possible to find the fixation probability analytically given the mutant’s fitness and the topological constraints that govern the spread of the mutation, so one resorts to simulations instead. Depending on the topology in use, a great number of evolutionary steps may be needed in each of the simulation events, particularly in those that end with the population containing mutants only. We introduce two techniques to accelerate the determination of the fixation probability. The first one skips all evolutionary steps in which the number of mutants does not change and thereby reduces the number of steps per simulation event considerably. This technique is computationally advantageous for some of the so-called layered networks. The second technique, which is not restricted to layered networks, consists of aborting any simulation event in which the number of mutants has grown beyond a certain threshold value and counting that event as having led to a total spread of the mutation. For advantageous mutations in large populations and regardless of the network’s topology, we demonstrate, both analytically and by means of simulations, that using a threshold of about [N/(r-1)]^(1/4) mutants, where N is the number of simulation events and r is the ratio of the mutants’ fitness to that of the remainder of the population, leads to an estimate of the fixation probability that deviates in no significant way from that obtained from the full-fledged simulations. We have observed speedups of two orders of magnitude for layered networks with 10000 nodes.
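
    A hedged sketch of the two techniques for a well-mixed Moran process, used here as a stand-in for the layered networks of the paper: steps that leave the mutant count unchanged are skipped, and an optional threshold of about [N/(r-1)]^(1/4) mutants aborts an event and counts it as fixation. The population size, fitness ratio, and number of events are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def fixation_estimate(pop_size, r, n_events, threshold=None):
    """Estimate the fixation probability of a single advantageous mutant in a
    well-mixed Moran process. Steps that leave the mutant count unchanged are
    skipped (conditional on a change, the count rises with probability r/(r+1)),
    and an event is optionally aborted, and counted as fixation, once the mutant
    count exceeds the threshold."""
    p_up = r / (r + 1.0)
    fixations = 0
    for _ in range(n_events):
        m = 1
        while 0 < m < pop_size:
            if threshold is not None and m > threshold:
                fixations += 1
                break
            m += 1 if rng.random() < p_up else -1
        else:
            fixations += (m == pop_size)
    return fixations / n_events

POP, R, N_EVENTS = 500, 1.5, 2000
threshold = int(np.ceil((N_EVENTS / (R - 1.0)) ** 0.25))   # about [N/(r-1)]^(1/4)
analytic = (1.0 - 1.0 / R) / (1.0 - 1.0 / R ** POP)
print(f"analytic Moran fixation probability: {analytic:.3f}")
print(f"full simulation estimate           : {fixation_estimate(POP, R, N_EVENTS):.3f}")
print(f"threshold-aborted estimate (T={threshold})   : {fixation_estimate(POP, R, N_EVENTS, threshold):.3f}")
```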

  2. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability...

  3. Classicality versus quantumness in Born's probability

    Science.gov (United States)

    Luo, Shunlong

    2017-11-01

    Born's rule, which postulates the probability of a measurement outcome in a quantum state, is pivotal to interpretations and applications of quantum mechanics. By exploiting the departure of the product of two Hermitian operators in Born's rule from Hermiticity, we prescribe an intrinsic and natural scheme to decompose Born's probability into a classical part and a quantum part, which have significant implications in quantum information theory. The classical part constitutes the information compatible with the associated measurement operator, while the quantum part represents the quantum coherence of the state with respect to the measurement operator. Fundamental properties of the decomposition are revealed. As applications, we establish several trade-off relations for the classicality and quantumness in Born's probability, which may be interpreted as alternative realizations of Heisenberg's uncertainty principle. The results shed physical lights on related issues concerning quantification of complementarity, coherence, and uncertainty, as well as the classical-quantum interplay.

  4. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  5. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  7. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  8. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  9. Explosion probability of unexploded ordnance: expert beliefs.

    Science.gov (United States)

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  10. Energy dependence of effective atomic numbers for photon energy absorption and photon interaction: Studies of some biological molecules in the energy range 1 keV-20 MeV

    DEFF Research Database (Denmark)

    Manohara, S.R.; Hanagodimath, S.M.; Gerward, Leif

    2008-01-01

    , linolenic, arachidonic, and arachidic acids), nucleotide bases (adenine, guanine, cytosine, uracil, and thymine), and carbohydrates (glucose, sucrose, raffinose, and starch). The Z(PEA, eff) and Z(PI, eff) values have been found to change with energy and composition of the biological molecules. The energy......Effective atomic numbers for photon energy absorption, Z(PEA,eff), and for photon interaction, Z(PI,eff), have been calculated by a direct method in the photon-energy region from 1 keV to 20 MeV for biological molecules, such as fatty acids (lauric, myristic, palmitic, stearic, oleic, linoleic...... dependence of the mass attenuation coefficient, Z(PEA, eff), and the mass energy-absorption coefficient, Z(PI, eff), is shown graphically and in tabular form. Significant differences of 17%-38% between Z(PI, eff) and Z(PEA, eff) occur in the energy region 5-100 keV. The reasons for these differences...

  11. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that a certain number of the smallest statistics come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.

  12. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  13. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  14. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  15. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  16. Probability Inequalities for a Gladiator Game

    OpenAIRE

    Yosef Rinott; Marco Scarsini; Yaming Yu

    2011-01-01

    Based on a model introduced by Kaminsky, Luks, and Nelson (1984), we consider a zero-sum allocation game called the Gladiator Game, where two teams of gladiators engage in a sequence of one-to-one fights in which the probability of winning is a function of the gladiators' strengths. Each team's strategy consists of the allocation of its total strength among its gladiators. We find the Nash equilibria of the game and compute its value. To do this, we study interesting majorization-type probability...

  17. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  18. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  19. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall

  20. Concepts of probability in radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    Bernhard Weninger

    2011-12-01

    Full Text Available In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.