WorldWideScience

Sample records for presumed probability density

  1. On the shapes of the presumed probability density function for the modeling of turbulence-radiation interactions

    International Nuclear Information System (INIS)

    Liu, L.H.; Xu, X.; Chen, Y.L.

    2004-01-01

    The laminar flamelet equations in combination with the joint probability density function (PDF) transport equation of mixture fraction and turbulence frequency have been used to simulate turbulent jet diffusion flames. To check the suitability of the presumed shapes of the PDF for the modeling of turbulence-radiation interactions (TRI), two types of presumed joint PDFs are constructed by using the second-order moments of temperature and the species concentrations, which are derived by the laminar flamelet model. The time-averaged radiative source terms and the time-averaged absorption coefficients are calculated by the presumed joint PDF approaches, and compared with those obtained by the laminar flamelet model. By comparison, it is shown that there are obvious differences between the results of the independent PDF approach and the laminar flamelet model. Generally, the results of the dependent PDF approach agree better with those of the flamelet model. For the modeling of TRI, the dependent PDF approach is superior to the independent PDF approach.

  2. Evaluation of Presumed Probability-Density-Function Models in Non-Premixed Flames by using Large Eddy Simulation

    International Nuclear Information System (INIS)

    Cao Hong-Jun; Zhang Hui-Qiang; Lin Wen-Yi

    2012-01-01

    Four kinds of presumed probability-density-function (PDF) models for non-premixed turbulent combustion are evaluated in flames with various stoichiometric mixture fractions by using large eddy simulation (LES). The LES code is validated against the experimental data of a classical turbulent jet flame (Sandia flame D). The mean and rms temperatures obtained by the presumed PDF models are compared with the LES results. The β-function model achieves a good prediction for different flames. The rms temperature predicted by the double-δ-function model is very small and unphysical in the vicinity of the maximum mean temperature. The clipped-Gaussian model and the multi-δ-function model give worse predictions on the extremely fuel-rich or fuel-lean side due to clipping at the boundary of the mixture fraction space. The results also show that the overall prediction performance of the presumed PDF models is better at moderate stoichiometric mixture fractions than at very small or very large ones.
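
    A minimal sketch of the presumed-PDF machinery both records above rely on: the β parameters are fixed by moment matching to the resolved mean and variance of the mixture fraction, and any flamelet quantity is then averaged against that PDF. The flamelet profile T(Z) and all numbers below are hypothetical placeholders, not values from either paper.

```python
# Sketch: averaging a flamelet quantity over a presumed beta-PDF of the
# mixture fraction Z. T_flamelet is a made-up stand-in profile.
from scipy.stats import beta as beta_dist

def presumed_beta_average(f, z_mean, z_var):
    """Average f(Z) over a beta PDF with the given mean and variance."""
    # Moment matching: a = Zm*g, b = (1 - Zm)*g with g = Zm(1-Zm)/var - 1
    g = z_mean * (1.0 - z_mean) / z_var - 1.0  # requires var < Zm(1-Zm)
    a, b = z_mean * g, (1.0 - z_mean) * g
    return beta_dist.expect(f, args=(a, b))

# Hypothetical flamelet temperature profile peaking at a stoichiometric Z_st
z_st = 0.35
T_flamelet = lambda z: 300.0 + 1700.0 * min(z / z_st, (1.0 - z) / (1.0 - z_st))

print(presumed_beta_average(T_flamelet, z_mean=0.3, z_var=0.02))
```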

  3. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  4. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  5. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation.
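
    In symbols, the relation the abstract describes can be sketched as follows (the notation f for the functional dependence is ours, not the paper's):

```latex
% PDF as a delta function averaged over the underlying random variables:
% if Y = f(X) and X has density p_X, then
p_Y(y) \;=\; \bigl\langle\, \delta\bigl(y - f(X)\bigr) \,\bigr\rangle
       \;=\; \int \delta\bigl(y - f(x)\bigr)\, p_X(x)\, \mathrm{d}x ,
% and the random variable transformation theorem follows by carrying out
% the integral over the delta function.
```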

  6. Gaussian mixture probability hypothesis density filter for multipath multitarget tracking in over-the-horizon radar

    Science.gov (United States)

    Qin, Yong; Ma, Hong; Chen, Jinfeng; Cheng, Li

    2015-12-01

    Conventional multitarget tracking systems presume that each target can produce at most one measurement per scan. Due to the multiple ionospheric propagation paths in over-the-horizon radar (OTHR), this assumption is not valid. To solve this problem, this paper proposes a novel tracking algorithm based on the theory of finite set statistics (FISST) called the multipath probability hypothesis density (MP-PHD) filter in cluttered environments. First, the FISST is used to derive the update equation, and then Gaussian mixture (GM) is introduced to derive the closed-form solution of the MP-PHD filter. Moreover, the extended Kalman filter (EKF) is presented to deal with the nonlinear problem of the measurement model in OTHR. Eventually, the simulation results are provided to demonstrate the effectiveness of the proposed filter.

  7. Comparison of density estimators. [Estimation of probability density functions]

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.; Monahan, J.F.

    1977-09-01

    Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to change in their parameters, and to attempt to discover at what point a sample is so small that density estimation can no longer be worthwhile. (RWR)

  8. On Farmer's line, probability density functions, and overall risk

    International Nuclear Information System (INIS)

    Munera, H.A.; Yadigaroglu, G.

    1986-01-01

    Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value.

  9. What is presumed when we presume consent?

    Directory of Open Access Journals (Sweden)

    Pierscionek Barbara K

    2008-04-01

    Background: The organ donor shortfall in the UK has prompted calls to introduce legislation to allow for presumed consent: if there is no explicit objection to donation of an organ, consent should be presumed. The current debate has not taken into account accepted meanings of presumption in law and science, or the consequences for rights of ownership that would arise should presumed consent become law. In addition, arguments revolve around the rights of the competent autonomous adult but do not always consider the more serious implications for children or the disabled. Discussion: Any action or decision made on a presumption is accepted in law and science as one based on judgement of a provisional situation. It should therefore allow the possibility of reversing the action or decision. Presumed consent to organ donation will not permit such reversal. Placing prime importance on the functionality of body organs and their capacity to sustain life, rather than on explicit consent of the individual, will lead to further debate about rights of ownership and potentially to questions about financial incentives and to whom benefits should accrue. Factors that influence donor rates are not fully understood, and attitudes of the public to presumed consent require further investigation. Presuming consent will also necessitate considering how such a measure would be applied in situations involving children and mentally incompetent adults. Summary: The presumption of consent to organ donation cannot be understood in the same way as presumption when applied to science or law. Consideration should be given to the consequences of presuming consent and to the questions of ownership and organ monetary value, as these questions are likely to arise should presumed consent be permitted. In addition, the implications of presumed consent for children and adults who are unable to object to organ donation require serious contemplation if these most vulnerable ...

  10. Probability-density-function characterization of multipartite entanglement

    International Nuclear Information System (INIS)

    Facchi, P.; Florio, G.; Pascazio, S.

    2006-01-01

    We propose a method to characterize and quantify multipartite entanglement for pure states. The method hinges upon the study of the probability density function of bipartite entanglement and is tested on an ensemble of qubits in a variety of situations. This characterization is also compared to several measures of multipartite entanglement.

  11. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  12. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Novel visualization methods are presented for spatial probability density function data. These are spatial datasets, where each pixel is a random variable that has multiple samples which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.

  13. An empirical probability model of detecting species at low densities.

    Science.gov (United States)

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.

  14. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
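
    A minimal sketch of the sampling-and-sorting step described above, assuming uniform-in-time samples of a pure sinusoid; the sample rate, carrier frequency, and bin count are arbitrary illustrative choices, not values from the report.

```python
# Sketch: histogram (empirical PDF) of samples from one full cycle of a
# sinusoid; for a pure tone this approaches the arcsine density.
import numpy as np

fs, f0 = 100_000.0, 1_000.0       # sample rate and carrier frequency
n = int(fs / f0)                  # samples in one full cycle
t = np.arange(n) / fs
samples = np.sin(2.0 * np.pi * f0 * t)

pdf, edges = np.histogram(samples, bins=16, range=(-1.0, 1.0), density=True)
print(np.round(pdf, 2))           # mass piles up near +/-1 (arcsine law)
```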

  15. Blue functions: probability and current density propagators in non-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Withers, L P Jr

    2011-01-01

    Like a Green function to propagate a particle's wavefunction in time, a Blue function is introduced to propagate the particle's probability and current density. Accordingly, the complete Blue function has four components. They are constructed from path integrals involving a quantity like the action that we call the motion. The Blue function acts on the displaced probability density as the kernel of an integral operator. As a result, we find that the Wigner density occurs as an expression for physical propagation. We also show that, in quantum mechanics, the displaced current density is conserved bilocally (in two places at one time), as expressed by a generalized continuity equation. (paper)

  16. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    System degradation is usually caused by the degradation of multiple parameters. Assessing the reliability of such a system with the universal generating function is of low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on probability density evolution with multiple parameters is therefore presented for complexly degraded systems. First, the system output function is constructed according to the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  17. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  18. Interactive design of probability density functions for shape grammars

    KAUST Repository

    Dang, Minh; Lienhard, Stefan; Ceylan, Duygu; Neubert, Boris; Wonka, Peter; Pauly, Mark

    2015-01-01

    A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density

  19. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    Science.gov (United States)

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.

  20. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft sphere at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
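
    Written out, with the caveat that the shell-factor relation below assumes F is the magnitude of the force vector in three dimensions, a common convention that the abstract itself does not spell out:

```latex
% Gaussian form of the force PDF from the DFT derivation:
P(\mathbf{F}) \;\propto\; e^{-A F^{2}},
% with A fixed by the fluid density, the temperature and the Fourier
% transform of the pair potential. For the distribution of magnitudes,
W(F) \;=\; 4\pi F^{2}\, P(F).
```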

  1. Interactive visualization of probability and cumulative density functions

    KAUST Repository

    Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.

    2012-01-01

    The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.

  2. The effect of incremental changes in phonotactic probability and neighborhood density on word learning by preschool children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose Phonotactic probability or neighborhood density has predominately been defined using gross distinctions (i.e., low vs. high). The current studies examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method The full range of probability or density was examined by sampling five nonwords from each of four quartiles. Three- and 5-year-old children received training on nonword-nonobject pairs. Learning was measured in a picture-naming task immediately following training and 1 week after training. Results were analyzed using multi-level modeling. Results A linear spline model best captured nonlinearities in phonotactic probability. Specifically, word learning improved as probability increased in the lowest quartile, worsened as probability increased in the midlow quartile, and then remained stable and poor in the two highest quartiles. An ordinary linear model sufficiently described neighborhood density. Here, word learning improved as density increased across all quartiles. Conclusion Given these different patterns, phonotactic probability and neighborhood density appear to influence different word learning processes. Specifically, phonotactic probability may affect recognition that a sound sequence is an acceptable word in the language and is a novel word for the child, whereas neighborhood density may influence creation of a new representation in long-term memory. PMID:23882005

  3. Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation

    Directory of Open Access Journals (Sweden)

    Michal Halas

    2012-01-01

    This article deals with modelling the probability density function of IPTV traffic packet delay variation; such a model is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of the network nodes. To separate these (at least three) types of delay variation, we need a way to measure each of them separately. This work is aimed at the delay variation caused by queueing systems, which has the main influence on the form of the probability density function.

  4. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  5. Continuation of probability density functions using a generalized Lyapunov approach

    NARCIS (Netherlands)

    Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.

    2017-01-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial

  6. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Our development of a Fast Mutual Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.

  7. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...

  8. Probability Density Estimation Using Neural Networks in Monte Carlo Calculations

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Cho, Jin Young; Song, Jae Seung; Kim, Chang Hyo

    2008-01-01

    The Monte Carlo neutronics analysis requires the capability to estimate a tally distribution, such as an axial power distribution or a flux gradient in a fuel rod. This problem can be regarded as a probability density function estimation from an observation set. We apply the neural network based density estimation method to an observation and sampling weight set produced by the Monte Carlo calculations. The neural network method is compared with the histogram and the functional expansion tally method for estimating a non-smooth density, a fission source distribution, and an absorption rate's gradient in a burnable absorber rod. The application results show that the neural network method can approximate a tally distribution quite well. (authors)

  9. Probability density function method for variable-density pressure-gradient-driven turbulence and mixing

    International Nuclear Information System (INIS)

    Bakosi, Jozsef; Ristorcelli, Raymond J.

    2010-01-01

    Probability density function (PDF) methods are extended to variable-density pressure-gradient-driven turbulence. We apply the new method to compute the joint PDF of density and velocity in a non-premixed binary mixture of different-density molecularly mixing fluids under gravity. The full time-evolution of the joint PDF is captured in the highly non-equilibrium flow: starting from a quiescent state, transitioning to fully developed turbulence and finally dissipated by molecular diffusion. High-Atwood-number effects (as distinguished from the Boussinesq case) are accounted for: both hydrodynamic turbulence and material mixing are treated at arbitrary density ratios, with the specific volume, mass flux and all their correlations in closed form. An extension of the generalized Langevin model, originally developed for the Lagrangian fluid particle velocity in constant-density shear-driven turbulence, is constructed for variable-density pressure-gradient-driven flows. The persistent small-scale anisotropy, a fundamentally 'non-Kolmogorovian' feature of flows under external acceleration forces, is captured by a tensorial diffusion term based on the external body force. The material mixing model for the fluid density, an active scalar, is developed based on the beta distribution. The beta-PDF is shown to be capable of capturing the mixing asymmetry and that it can accurately represent the density through transition, in fully developed turbulence and in the decay process. The joint model for hydrodynamics and active material mixing yields a time-accurate evolution of the turbulent kinetic energy and Reynolds stress anisotropy without resorting to gradient diffusion hypotheses, and represents the mixing state by the density PDF itself, eliminating the need for dubious mixing measures. Direct numerical simulations of the homogeneous Rayleigh-Taylor instability are used for model validation.

  10. Unification of field theory and maximum entropy methods for learning probability densities

    Science.gov (United States)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  11. On the evolution of the density probability density function in strongly self-gravitating systems

    International Nuclear Information System (INIS)

    Girichidis, Philipp; Konstandin, Lukas; Klessen, Ralf S.; Whitworth, Anthony P.

    2014-01-01

    The time evolution of the probability density function (PDF) of the mass density is formulated and solved for systems in free-fall using a simple approximate function for the collapse of a sphere. We demonstrate that a pressure-free collapse results in a power-law tail on the high-density side of the PDF. The slope quickly asymptotes to the functional form P_V(ρ) ∝ ρ^(-1.54) for the (volume-weighted) PDF and P_M(ρ) ∝ ρ^(-0.54) for the corresponding mass-weighted distribution. From the simple approximation of the PDF we derive analytic descriptions for mass accretion, finding that dynamically quiet systems with narrow density PDFs lead to retarded star formation and low star formation rates (SFRs). Conversely, strong turbulent motions that broaden the PDF accelerate the collapse, causing a bursting mode of star formation. Finally, we compare our theoretical work with observations. The measured SFRs are consistent with our model during the early phases of the collapse. Comparison of observed column density PDFs with those derived from our model suggests that observed star-forming cores are roughly in free-fall.
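
    The two quoted slopes differ by exactly one, which is what mass weighting does to a volume-weighted density PDF; as a one-line check:

```latex
% Mass weighting multiplies the volume-weighted PDF by \rho (up to
% normalization), shifting a power-law exponent by one:
P_M(\rho) \;\propto\; \rho\, P_V(\rho),
\qquad
\rho \cdot \rho^{-1.54} \;=\; \rho^{-0.54}.
```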

  12. Assumed Probability Density Functions for Shallow and Deep Convection

    OpenAIRE

    Steven K Krueger; Peter A Bogenschutz; Marat Khairoutdinov

    2010-01-01

    The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PD...

  13. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
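
    A minimal Monte Carlo sketch of the idea in the closing sentence, testing a single draw against a Gaussian null; the null distribution and sample counts are arbitrary illustrative choices, not the paper's construction.

```python
# Sketch: how often does the specified density assign an equal-or-smaller
# density value to its own samples than to the observed draw?
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def low_density_pvalue(x_obs, dist=norm, n_mc=100_000):
    """Estimate P( f(X) <= f(x_obs) ) for X ~ dist by simulation."""
    f_obs = dist.pdf(x_obs)
    f_mc = dist.pdf(dist.rvs(size=n_mc, random_state=rng))
    return float(np.mean(f_mc <= f_obs))

print(low_density_pvalue(0.5))  # typical draw: large p-value
print(low_density_pvalue(4.0))  # deep in the tail: small p-value
```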

  14. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper in virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  15. Unification of field theory and maximum entropy methods for learning probability densities

    OpenAIRE

    Kinney, Justin B.

    2014-01-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy de...

  16. Visualizing classical and quantum probability densities for momentum using variations on familiar one-dimensional potentials

    International Nuclear Information System (INIS)

    Robinett, R.W.

    2002-01-01

    After briefly reviewing the definitions of classical probability densities for position, P_CL(x), and for momentum, P_CL(p), we present several examples of classical mechanical potential systems, mostly variations on such familiar cases as the infinite well and the uniformly accelerated particle, for which the classical distributions can be easily derived and visualized. We focus especially on a simple potential which interpolates between the symmetric linear potential, V(x) = F|x|, and the infinite well, which can illustrate, in a mathematically straightforward way, how the divergent δ-function classical probability density for momentum for the infinite well can be seen to arise. Such examples can help students understand the quantum mechanical momentum-space wavefunction (and its corresponding probability density) in much the same way that other semiclassical techniques, such as the WKB approximation, can be used to visualize position-space wavefunctions. (author)

  17. Probability density of wave function of excited photoelectron: understanding XANES features

    Czech Academy of Sciences Publication Activity Database

    Šipr, Ondřej

    2001-01-01

    Roč. 8 (2001), s. 232-234. ISSN 0909-0495. R&D Projects: GA ČR GA202/99/0404. Institutional research plan: CEZ:A02/98:Z1-010-914. Keywords: XANES; PED (probability density of wave function). Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.519, year: 2001

  18. The Effect of Incremental Changes in Phonotactic Probability and Neighborhood Density on Word Learning by Preschool Children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability or neighborhood density has predominately been defined through the use of gross distinctions (i.e., low vs. high). In the current studies, the authors examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The authors examined the full range of…

  19. Stochastic chaos induced by diffusion processes with identical spectral density but different probability density functions.

    Science.gov (United States)

    Lei, Youming; Zheng, Fan

    2016-12-01

    Stochastic chaos induced by diffusion processes, with identical spectral density but different probability density functions (PDFs), is investigated in selected lightly damped Hamiltonian systems. The threshold amplitude of diffusion processes for the onset of chaos is derived by using the stochastic Melnikov method together with a mean-square criterion. Two quasi-Hamiltonian systems, namely, a damped single pendulum and a damped Duffing oscillator perturbed by stochastic excitations, are used as illustrative examples. Four different cases of stochastic processes are taken as the driving excitations. It is shown that in these two systems the spectral density of diffusion processes completely determines the threshold amplitude for chaos, regardless of the shape of their PDFs, Gaussian or otherwise. Furthermore, the mean top Lyapunov exponent is employed to verify the analytical results. The results obtained by numerical simulations are in accordance with the analytical results. This demonstrates that the stochastic Melnikov method is effective in predicting the onset of chaos in the quasi-Hamiltonian systems.

  1. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  2. The influence of phonotactic probability and neighborhood density on children's production of newly learned words.

    Science.gov (United States)

    Heisler, Lori; Goffman, Lisa

    A word learning paradigm was used to teach children novel words that varied in phonotactic probability and neighborhood density. The effects of frequency and density on speech production were examined when phonetic forms were non-referential (i.e., when no referent was attached) and when phonetic forms were referential (i.e., when a referent was attached through fast mapping). Two methods of analysis were included: (1) kinematic variability of speech movement patterning; and (2) measures of segmental accuracy. Results showed that phonotactic frequency influenced the stability of movement patterning whereas neighborhood density influenced phoneme accuracy. Motor learning was observed in both non-referential and referential novel words. Forms with low phonotactic probability and low neighborhood density showed a word learning effect when a referent was assigned during fast mapping. These results elaborate on and specify the nature of interactivity observed across lexical, phonological, and articulatory domains.

  3. Protein single-model quality assessment by feature-based probability density functions.

    Science.gov (United States)

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The web server of Qprob is available at http://calla.rnet.missouri.edu/qprob/, and the software is freely available through it.

  4. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

    Directory of Open Access Journals (Sweden)

    Robert G. Staudte

    2018-04-01

    We demonstrate that questions of convergence and divergence regarding shapes of distributions can be addressed in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The class of pdQs are densities of continuous distributions with common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we repeatedly apply the pdQ mapping and find that further applications of it are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.
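
    A sketch of the pdQ construction and its Kullback–Leibler divergence from uniformity, assuming the normalization is a plain integral of the composition over the unit interval; the paper's exact conventions may differ.

```python
# Sketch: probability density quantile (pdQ) of a distribution and its
# KL divergence from the uniform density on (0, 1).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def pdq(dist):
    """Return the normalized composition u -> f(Q(u))."""
    raw = lambda u: dist.pdf(dist.ppf(u))
    c, _ = quad(raw, 0.0, 1.0)
    return lambda u: raw(u) / c

q = pdq(norm)
kl, _ = quad(lambda u: q(u) * np.log(q(u)), 1e-9, 1.0 - 1e-9)
print(f"KL divergence of the normal pdQ from uniformity: {kl:.4f}")
```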

  5. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA).

  6. Therapeutic High-Density Barium Enema in a Case of Presumed Diverticular Hemorrhage

    Directory of Open Access Journals (Sweden)

    Nonthalee Pausawasdi

    2011-02-01

    Many patients with lower gastrointestinal bleeding do not have an identifiable source of bleeding at colonoscopy. A significant percentage of these patients will have recurrent bleeding. In many patients, the presence of multiple diverticula leads to a diagnosis of presumed diverticular bleeding. Current treatment options include therapeutic endoscopy, angiography, or surgical resection, all of which depend on the identification of the diverticular source of bleeding. This report describes a case of recurrent bleeding in an elderly patient with diverticula but no identifiable source treated successfully with barium impaction therapy. This therapeutic modality does not depend on the identification of the bleeding diverticular lesion and was well tolerated by our 86-year-old patient.

  7. Noise-level determination for discrete spectra with Gaussian or Lorentzian probability density functions

    International Nuclear Information System (INIS)

    Moriya, Netzer

    2010-01-01

    A method, based on binomial filtering, to estimate the noise level of an arbitrary, smoothed pure signal, contaminated with an additive, uncorrelated noise component is presented. If the noise characteristics of the experimental spectrum are known, as for instance the type of the corresponding probability density function (e.g., Gaussian), the noise properties can be extracted. In such cases, both the noise level, as may arbitrarily be defined, and a simulated white noise component can be generated, such that the simulated noise component is statistically indistinguishable from the true noise component present in the original signal. In this paper we present a detailed analysis of the noise level extraction when the additive noise is Gaussian or Lorentzian. We show that the statistical parameters in these cases (mainly the variance and the half width at half maximum, respectively) can directly be obtained from the experimental spectrum even when the pure signal is erratic. Further discussion is given for cases where the noise probability density function is initially unknown.
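
    A rough sketch of this kind of estimate, using a normalized binomial kernel as the smoother; the kernel order and the uncorrected residual-based estimate are illustrative simplifications, not the paper's calibrated procedure.

```python
# Sketch: estimate the additive noise level of a smooth signal by
# binomial filtering and measuring the residual.
import numpy as np
from scipy.special import binom

def binomial_kernel(order):
    k = binom(order, np.arange(order + 1))
    return k / k.sum()

def noise_std_estimate(spectrum, order=8):
    smooth = np.convolve(spectrum, binomial_kernel(order), mode="same")
    return (spectrum - smooth).std(ddof=1)  # slightly biased low

# Synthetic check: smooth pure signal plus known Gaussian noise
rng = np.random.default_rng(1)
x = np.linspace(-5.0, 5.0, 2000)
noisy = np.exp(-x**2) + rng.normal(0.0, 0.05, x.size)
print(noise_std_estimate(noisy))  # close to the true 0.05
```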

  8. Momentum Probabilities for a Single Quantum Particle in Three-Dimensional Regular "Infinite" Wells: One Way of Promoting Understanding of Probability Densities

    Science.gov (United States)

    Riggs, Peter J.

    2013-01-01

    Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…

  9. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been ...
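
    For flavor, a textbook Latin hypercube sampler in a few lines; this is the generic construction, not NESSUS's implementation.

```python
# Sketch: basic Latin hypercube sampling of d standard-normal variables:
# one point per equal-probability stratum in each dimension, with the
# strata independently shuffled across dimensions.
import numpy as np
from scipy.stats import norm

def latin_hypercube_normal(n, d, rng):
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n, d))) / n
    return norm.ppf(u)

rng = np.random.default_rng(42)
samples = latin_hypercube_normal(n=100, d=3, rng=rng)
print(samples.mean(axis=0), samples.std(axis=0))  # near 0 and 1
```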

  10. Probability density of tunneled carrier states near heterojunctions calculated numerically by the scattering method.

    Energy Technology Data Exchange (ETDEWEB)

    Wampler, William R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Myers, Samuel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Modine, Normand A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.

  11. Continuation of probability density functions using a generalized Lyapunov approach

    Energy Technology Data Exchange (ETDEWEB)

    Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)

    2017-05-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
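
    The small-noise Gaussian approximation behind this can be illustrated with a dense solver; the paper's actual contribution is the iterative low-rank solution of the generalized Lyapunov equation, which this sketch does not attempt. The matrices below are arbitrary examples.

```python
# Sketch: near a stable fixed point of dx = A x dt + B dW, the stationary
# covariance C of the linearized dynamics solves the Lyapunov equation
#   A C + C A^T + B B^T = 0,
# giving a Gaussian approximation of the local PDF with covariance C.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])   # Jacobian at the fixed point (example)
B = 0.1 * np.eye(2)           # small-noise forcing (example)

C = solve_continuous_lyapunov(A, -B @ B.T)
print(C)                      # covariance of the Gaussian PDF approximation
```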

  12. Probability density functions for CP-violating rephasing invariants

    Science.gov (United States)

    Fortin, Jean-François; Giasson, Nicolas; Marleau, Luc

    2018-05-01

    The implications of the anarchy principle for CP violation in the lepton sector are investigated. A systematic method is introduced to compute the probability density functions for the CP-violating rephasing invariants of the PMNS matrix from the Haar measure relevant to the anarchy principle. In contrast to the hierarchical CKM matrix, it is shown that the Haar measure, and hence the anarchy principle, are very likely to lead to the observed PMNS matrix. Predictions for the CP-violating Dirac rephasing invariant |j_D| and the Majorana rephasing invariant |j_1| are also obtained. They correspond to 〈|j_D|〉_Haar = π/105 ≈ 0.030 and 〈|j_1|〉_Haar = 1/(6π) ≈ 0.053, respectively, in agreement with the experimental hint from T2K of |j_D^exp| ≈ 0.032 ± 0.005 (or ≈ 0.033 ± 0.003) for the normal (or inverted) hierarchy.
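
The Haar-measure averages quoted above are easy to check by direct sampling: a Haar-random unitary can be drawn via the QR decomposition of a complex Gaussian matrix (with the standard column-phase fix), and the Dirac invariant averaged over many draws. The sketch below is a generic Monte Carlo check, not the authors' derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(n):
    """Haar-distributed random unitary via QR of a complex Ginibre matrix."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))           # fix column phases to obtain Haar measure

# Monte Carlo average of the Dirac rephasing (Jarlskog-type) invariant
samples = np.empty(100_000)
for i in range(samples.size):
    u = haar_unitary(3)
    samples[i] = abs(np.imag(u[0, 0] * u[1, 1] * np.conj(u[0, 1]) * np.conj(u[1, 0])))

print(samples.mean(), np.pi / 105)       # both should be close to 0.030
```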

  13. Conjunctivitis presumably due to Acanthamoeba

    Directory of Open Access Journals (Sweden)

    Ana Cristina de Carvalho Ruthes

    2004-12-01

    PURPOSE: To describe four cases of conjunctivitis presumably due to Acanthamoeba, considering diagnosis, signs, symptoms, and treatment. METHODS: We reviewed the medical records of all patients with a clinical diagnosis of Acanthamoeba conjunctivitis seen between September 1998 and January 2002 at the Hospital de Olhos do Paraná (HOP). All eyes were submitted to an investigation protocol that included complete ophthalmologic examination, microscopic examination, and culture of conjunctival smears. RESULTS: Laboratory examination of the conjunctival smears revealed Acanthamoeba by direct examination, thereafter confirmed by culture; cysts and trophozoites of the protozoan were observed, with good agreement between culture and direct examination. Most patients reported long-standing red eyes and ocular irritation. CONCLUSION: According to the literature reviewed, this is the first report of conjunctivitis presumably due to Acanthamoeba. Selected patients refractory to the usual treatment of external ocular infection should be considered for appropriate laboratory workup in search of the etiology of the disease.

  14. Exact joint density-current probability function for the asymmetric exclusion process.

    Science.gov (United States)

    Depken, Martin; Stinchcombe, Robin

    2004-07-23

    We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
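
The density-current statistics studied analytically above can also be seen empirically. The sketch below runs a random-sequential simulation of the open-boundary TASEP (a standard simulation stand-in for the process; the system size, rates, and sampling windows are arbitrary choices) and accumulates the joint histogram of bulk density and midpoint current.

```python
import numpy as np

rng = np.random.default_rng(1)
L, alpha, beta = 64, 0.75, 0.75          # open-boundary TASEP parameters
tau = np.zeros(L, dtype=int)             # site occupation numbers
rho_samples, J_samples = [], []
hops = 0

for sweep in range(40_000):
    for _ in range(L + 1):               # random-sequential update, one "time unit"
        i = rng.integers(0, L + 1)
        if i == 0:                       # particle injection at the left boundary
            if tau[0] == 0 and rng.random() < alpha:
                tau[0] = 1
        elif i == L:                     # particle ejection at the right boundary
            if tau[-1] == 1 and rng.random() < beta:
                tau[-1] = 0
        elif tau[i - 1] == 1 and tau[i] == 0:
            tau[i - 1], tau[i] = 0, 1    # bulk hop to the right
            if i == L // 2:
                hops += 1                # current through the middle bond
    if (sweep + 1) % 10 == 0:
        if sweep >= 10_000:              # discard burn-in sweeps
            rho_samples.append(tau.mean())
            J_samples.append(hops / 10.0)
        hops = 0

# Empirical joint distribution of bulk density and current
H, rho_edges, J_edges = np.histogram2d(rho_samples, J_samples, bins=30, density=True)
print(np.mean(rho_samples), np.mean(J_samples))   # ~0.5 and ~0.25 in this phase
```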

  15. Dictionary-Based Stochastic Expectation–Maximization for SAR Amplitude Probability Density Function Estimation

    OpenAIRE

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B.

    2006-01-01

    In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cov...

  16. Audio Query by Example Using Similarity Measures between Probability Density Functions of Features

    Directory of Open Access Journals (Sweden)

    Marko Helén

    2010-01-01

    This paper proposes a query by example system for generic audio. We estimate the similarity of the example signal and the samples in the queried database by calculating the distance between the probability density functions (pdfs) of their frame-wise acoustic features. Since the features are continuous valued, we propose to model them using Gaussian mixture models (GMMs) or hidden Markov models (HMMs). The models parametrize each sample efficiently and retain sufficient information for similarity measurement. To measure the distance between the models, we apply a novel Euclidean distance, approximations of Kullback-Leibler divergence, and a cross-likelihood ratio test. The performance of the measures was tested in simulations where audio samples are automatically retrieved from a general audio database, based on the estimated similarity to a user-provided example. The simulations show that the distance between probability density functions is an accurate measure for similarity. Measures based on GMMs or HMMs are shown to produce better results than those of the existing methods based on simpler statistics or histograms of the features. A good performance with low computational cost is obtained with the proposed Euclidean distance.
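
A minimal sketch of the GMM-based similarity measurement: fit a mixture to the frame-wise features of each sample and estimate a symmetrized Kullback-Leibler divergence by Monte Carlo, since no closed form exists between GMMs. The random "features" and mixture sizes are placeholders, and scikit-learn's `GaussianMixture` stands in for whatever estimator the authors used.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Hypothetical frame-wise feature vectors (e.g. MFCC-like) for two audio samples
X1 = rng.normal(0.0, 1.0, size=(500, 12))
X2 = rng.normal(0.3, 1.2, size=(500, 12))

g1 = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(X1)
g2 = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(X2)

def mc_kl(p, q, n=20_000):
    """Monte Carlo estimate of KL(p||q) using samples drawn from p."""
    x, _ = p.sample(n)
    return np.mean(p.score_samples(x) - q.score_samples(x))

sym = mc_kl(g1, g2) + mc_kl(g2, g1)      # symmetrized divergence for ranking
print(sym)
```

Ranking database entries by this divergence against the query model is the retrieval step; smaller values indicate more similar feature distributions.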

  17. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot be measured directly, it cannot be predicted. However, changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based

  18. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials...
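
One standard way to build such a PDF from Hermite polynomials is the Gram-Charlier A expansion around a Gaussian, which perturbs the normal density with skewness and excess-kurtosis terms; whether this matches the authors' exact construction is an assumption, but it illustrates the form.

```python
import numpy as np

def gram_charlier_pdf(w, skew=0.3, kurt=3.2):
    """Hermite-polynomial (Gram-Charlier A) expansion around a Gaussian:
    p(w) = phi(w) * [1 + (S/6) He3(w) + ((K-3)/24) He4(w)],
    with probabilists' Hermite polynomials He3, He4."""
    phi = np.exp(-0.5 * w**2) / np.sqrt(2.0 * np.pi)
    he3 = w**3 - 3.0 * w
    he4 = w**4 - 6.0 * w**2 + 3.0
    return phi * (1.0 + skew / 6.0 * he3 + (kurt - 3.0) / 24.0 * he4)

w = np.linspace(-5, 5, 1001)
p = gram_charlier_pdf(w)
print(np.trapz(p, w))            # ~1: the correction terms integrate to zero
print(np.trapz(w**3 * p, w))     # ~0.3: the skewness entering through He3
```

Truncated expansions of this kind can dip slightly negative in the far tails for strong skewness, which is one reason published formulations differ in detail.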

  19. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the curse of dimensionality afflicting traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these needs to be relatively low. In order to handle this problem an approach is suggested, which...

  20. Spatial correlations and probability density function of the phase difference in a developed speckle-field: numerical and natural experiments

    International Nuclear Information System (INIS)

    Mysina, N Yu; Maksimova, L A; Ryabukho, V P; Gorbatenko, B B

    2015-01-01

    The statistical properties of the phase difference of oscillations in speckle fields at two points in the far-field diffraction region are investigated for different shapes of the scatterer aperture. Statistical and spatial nonuniformity of the probability density function of the field phase difference is established. Numerical experiments show that, for speckle fields with an oscillating alternating-sign transverse correlation function, a significant nonuniformity of the probability density function of the phase difference in the correlation region of the field complex amplitude is observed, with the most probable values 0 and π. A natural statistical interference experiment using Young diagrams has confirmed the results of the numerical experiments. (laser applications and other topics in quantum electronics)

  1. The role of methotrexate in resolving ocular inflammation after specific therapy for presumed latent syphilitic uveitis and presumed tuberculosis-related uveitis.

    Science.gov (United States)

    Sahin, Ozlem; Ziaei, Alireza

    2014-07-01

    This study was designed to investigate whether the antiinflammatory and antiproliferative activity of oral and intravitreal methotrexate (MTX) suppresses intraocular inflammation in patients with presumed latent syphilitic uveitis and presumed tuberculosis-related uveitis. Interventional prospective study including three cases with presumed latent syphilitic uveitis treated with intravenous penicillin and oral MTX, and two cases with presumed tuberculosis-related uveitis treated with standard antituberculosis therapy and intravitreal MTX injections. Treatment efficacy of all cases was assessed by best-corrected visual acuity, fundus fluorescein angiography, and optical coherence tomography. Four eyes of 3 patients with presumed latent syphilitic uveitis had improved best-corrected visual acuity, suppression of intraocular inflammation, and resolution of cystoid macular edema in 6 months with oral MTX therapy. No recurrence of intraocular inflammation was observed in 6 months to 18 months of follow-up period after cessation of MTX. Two eyes of two patients with presumed tuberculosis-related uveitis showed improved best-corrected visual acuity, suppression of intraocular inflammation, and resolution of cystoid macular edema after intravitreal injections of MTX. No recurrence of intraocular inflammation was observed in 6 months to 8 months of follow-up period after cessation of antituberculous therapy. For the first time in the treatment of presumed latent syphilitic uveitis and presumed tuberculosis-related uveitis, we believe that MTX might have an adjunctive role to suppress intraocular inflammation, reduce uveitic macular edema, and prevent the recurrences of the diseases.

  2. Turbulent combustion modelization via a tabulation method of detailed kinetic chemistry coupled to Probability Density Function. Application to aeronautical engines

    Energy Technology Data Exchange (ETDEWEB)

    Rullaud, M

    2004-06-01

    A new modelization of turbulent combustion is proposed with detailed chemistry and probability density functions (PDFs). The objective is to capture temperature and species concentrations, mainly CO. The PCM-FTC model, Presumed Conditional Moment - Flame Tabulated Chemistry, is based on the tabulation of laminar premixed and diffusion flames to capture the partial premixing present in aeronautical engines. Presumed PDFs are introduced to predict averaged values. The tabulation method is based on the analysis of the chemical structure of laminar premixed and diffusion flames. Hypotheses are presented, tested and validated against Sandia experimental jet-flame data. The model is then introduced into a turbulent flow simulation software. Three configurations are retained to quantify the predictive capability of this formulation: the Sandia D and F flames and the Stanford lifted methane/air jet flames. Good agreement is observed between experiments and simulations, demonstrating the validity of the method. (author)
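
The presumed-PDF averaging at the heart of such models reduces to weighting a tabulated flamelet profile by an assumed mixture-fraction PDF, typically a beta distribution parameterized by the local mean and variance. A minimal sketch, with a made-up flamelet temperature profile standing in for the tabulated chemistry:

```python
import numpy as np
from scipy.stats import beta

def presumed_beta_mean(phi_of_Z, Zmean, Zvar, n=2001):
    """Average a flamelet quantity phi(Z) over a presumed beta-PDF of the
    mixture fraction Z, parameterized by its mean and variance."""
    gamma = Zmean * (1.0 - Zmean) / Zvar - 1.0   # requires 0 < Zvar < Zmean(1-Zmean)
    a, b = Zmean * gamma, (1.0 - Zmean) * gamma
    Z = np.linspace(1e-6, 1.0 - 1e-6, n)
    w = beta.pdf(Z, a, b)
    return np.trapz(phi_of_Z(Z) * w, Z) / np.trapz(w, Z)

# Hypothetical flamelet temperature profile peaking near stoichiometric Z = 0.3
T = lambda Z: 300.0 + 1700.0 * np.exp(-((Z - 0.3) / 0.1) ** 2)
print(presumed_beta_mean(T, Zmean=0.3, Zvar=0.01))
```

Increasing `Zvar` spreads the PDF away from the flamelet peak and lowers the averaged temperature, which is exactly the turbulence-chemistry interaction the presumed shape is meant to capture.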

  3. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    Science.gov (United States)

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: exponential decline phase, linear decline phase, equilibrium phase, and postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.

  4. 20 CFR 219.24 - Evidence of presumed death.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Evidence of presumed death. 219.24 Section... EVIDENCE REQUIRED FOR PAYMENT Evidence of Age and Death § 219.24 Evidence of presumed death. When a person cannot be proven dead but evidence of death is needed, the Board may presume he or she died at a certain...

  5. 27 CFR 70.52 - Signature presumed authentic.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Signature presumed authentic. 70.52 Section 70.52 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE... Collection of Excise and Special (Occupational) Tax Collection-General Provisions § 70.52 Signature presumed...

  6. Probability density function evolution of power systems subject to stochastic variation of renewable energy

    Science.gov (United States)

    Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.

    2018-05-01

    As renewable energies are increasingly integrated into power systems, there is growing interest in the stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by the penetration of renewables and consequently to analyse its impact on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) of the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. This equation is solved numerically, with special measures taken such that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered in a single machine infinite bus power system. The numerical analysis agrees with the results given by Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
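
A rough illustration of the Monte Carlo reference side of such a validation: Euler-Maruyama integration of a noisy single-machine-infinite-bus swing equation, with the stationary JPDF of rotor angle and speed estimated from an ensemble of trajectories. This is not the FPK solver described in the paper, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical SMIB swing equation with white-noise power fluctuation:
# M d(omega) = (Pm - Pmax sin(delta) - D omega) dt + sigma dW,  d(delta) = omega dt
M, D, Pm, Pmax, sigma = 1.0, 0.15, 0.8, 1.2, 0.05
dt, n_steps, n_paths = 2e-3, 50_000, 2_000

delta = np.full(n_paths, np.arcsin(Pm / Pmax))   # start at the stable equilibrium
omega = np.zeros(n_paths)
for _ in range(n_steps):                          # Euler-Maruyama integration
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    delta += omega * dt
    omega += (Pm - Pmax * np.sin(delta) - D * omega) / M * dt + sigma / M * dW

# Empirical stationary JPDF of (delta, omega) from the ensemble of end states
H, d_edges, w_edges = np.histogram2d(delta, omega, bins=60, density=True)
print(delta.mean(), omega.std())
```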

  7. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    First-order statistics of scattered light are described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of the phase. The moment-generating function for intensity is obtained in closed form through these parameters. An example of an exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.

  8. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  9. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  10. Charged-particle thermonuclear reaction rates: II. Tables and graphs of reaction rates and probability density functions

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.

    2010-01-01

    Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
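
The tabulated quantities are straightforward to reproduce from Monte Carlo rate samples: the 0.16/0.50/0.84 quantiles give the low, median, and high rates, the lognormal parameters follow from the log-samples, and an Anderson-Darling statistic on ln(rate) gauges the lognormal fit. A sketch with synthetic samples standing in for a real rate distribution:

```python
import numpy as np
from scipy.stats import anderson

rng = np.random.default_rng(4)
# Hypothetical Monte Carlo samples of one reaction rate at one temperature
rates = rng.lognormal(mean=-20.0, sigma=0.35, size=10_000)

# Low, median, and high rates as the 0.16, 0.50, and 0.84 quantiles
low, median, high = np.quantile(rates, [0.16, 0.50, 0.84])

# Lognormal approximation: mu and sigma from the log-samples
logr = np.log(rates)
mu, sigma = logr.mean(), logr.std(ddof=1)

# Anderson-Darling statistic for normality of ln(rate) gauges how well the
# lognormal form describes the Monte Carlo rate distribution
ad = anderson(logr, dist="norm").statistic
print(low, median, high, mu, sigma, ad)
```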

  11. Interactive design of probability density functions for shape grammars

    KAUST Repository

    Dang, Minh

    2015-11-02

    A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf. First, we propose a user interface that enables a user to quickly provide preference scores for selected shapes and suggest sampling strategies to decide which models to present to the user to evaluate. Second, we propose a novel kernel function to encode the similarity between two procedural models. Third, we propose a framework to interpolate user preference scores by combining multiple techniques: function factorization, Gaussian process regression, autorelevance detection, and l1 regularization. Fourth, we modify the original grammars to generate models with a pdf proportional to the user preference scores. Finally, we provide evaluations of our user interface and framework parameters and a comparison to other exploratory modeling techniques using modeling tasks in five example shape spaces: furniture, low-rise buildings, skyscrapers, airplanes, and vegetation.

  12. Probability density function modeling of scalar mixing from concentrated sources in turbulent channel flow

    OpenAIRE

    Bakosi, J.; Franzese, P.; Boybeyi, Z.

    2010-01-01

    Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth & Pope with Durbin's method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous ...

  13. Estimation of the four-wave mixing noise probability-density function by the multicanonical Monte Carlo method.

    Science.gov (United States)

    Neokosmidis, Ioannis; Kamalakis, Thomas; Chipouras, Aristides; Sphicopoulos, Thomas

    2005-01-01

    The performance of high-powered wavelength-division multiplexed (WDM) optical networks can be severely degraded by four-wave-mixing- (FWM-) induced distortion. The multicanonical Monte Carlo method (MCMC) is used to calculate the probability-density function (PDF) of the decision variable of a receiver, limited by FWM noise. Compared with the conventional Monte Carlo method previously used to estimate this PDF, the MCMC method is much faster and can accurately estimate smaller error probabilities. The method takes into account the correlation between the components of the FWM noise, unlike the Gaussian model, which is shown not to provide accurate results.

  14. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data involve much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertain transition between the qualitative concept and the quantitative description. An improved algorithm of cloud probability distribution density based on a backward cloud generator was then proposed and used to effectively convert parcels of accurate data into concepts that can be described by proper qualitative linguistic values. The qualitative description is expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was applied to analyse the observation data of a piezometric tube in an earth-rockfill dam. Experimental results showed that the proposed algorithm is feasible; with it, the changing regularity of the piezometric tube's water level can be revealed and seepage damage in the dam body can be detected.
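
One common textbook form of the backward cloud generator recovers {Ex, En, He} from a sample via the first absolute central moment, with the forward generator then producing cloud drops. The sketch below uses that standard form with fabricated piezometric readings; it may differ in detail from the improved algorithm proposed in the paper.

```python
import numpy as np

def backward_cloud(x):
    """Standard backward cloud generator: recover the cloud numerical
    characteristics {Ex, En, He} from a data sample."""
    Ex = x.mean()
    En = np.sqrt(np.pi / 2.0) * np.abs(x - Ex).mean()
    He = np.sqrt(max(x.var(ddof=1) - En**2, 0.0))   # clamped at 0 for small samples
    return Ex, En, He

def forward_cloud(Ex, En, He, n, rng):
    """Generate n cloud drops from the numerical characteristics."""
    En_prime = rng.normal(En, He, n)                # per-drop entropy
    return rng.normal(Ex, np.abs(En_prime))         # drop positions

rng = np.random.default_rng(5)
water_level = rng.normal(52.3, 0.4, 300)            # hypothetical piezometric readings
Ex, En, He = backward_cloud(water_level)
drops = forward_cloud(Ex, En, He, 1000, rng)
print(Ex, En, He)
```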

  15. 26 CFR 301.6064-1 - Signature presumed authentic.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Signature presumed authentic. 301.6064-1 Section 301.6064-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED....6064-1 Signature presumed authentic. An individual's name signed to a return, statement, or other...

  16. Retributivist Arguments against Presuming Innocence : Answering to Duff

    NARCIS (Netherlands)

    van Dijk, A.A.

    2013-01-01

    Factors justifying not presuming innocence are generally incorporated into the Presumption of Innocence (PoI). A confusing discourse has resulted: numerous guilt-presuming acts are deemed consistent with the PoI. I argue for an unusually broad PoI: any act that might convey to a reasonable actor

  17. 10 CFR 436.13 - Presuming cost-effectiveness results.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Presuming cost-effectiveness results. 436.13 Section 436... Methodology and Procedures for Life Cycle Cost Analyses § 436.13 Presuming cost-effectiveness results. (a) If the investment and other costs for an energy or water conservation measure considered for retrofit to...

  18. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just several hundred sample functions generated by the proposed approach. Combined with the probability density evolution method (PDEM), the approach therefore enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-induced structure illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
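
For orientation, the classical spectral representation method from which the paper starts generates a stationary Gaussian sample as a cosine series with independent random phases; the paper's contribution is to replace this high-dimensional randomness with random functions of two elementary variables, which the sketch below does not attempt. The spectrum and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def srm_sample(S, w_max, N, t):
    """Classical spectral-representation sample of a stationary Gaussian
    process with one-sided power spectral density S(w)."""
    dw = w_max / N
    w = (np.arange(N) + 0.5) * dw                 # midpoint frequencies
    phi = rng.uniform(0.0, 2.0 * np.pi, N)        # N independent random phases:
    amp = np.sqrt(2.0 * S(w) * dw)                # the OSR's high-dimensional input
    return np.sum(amp[:, None] * np.cos(np.outer(w, t) + phi[:, None]), axis=0)

# Hypothetical one-sided wind-turbulence-like spectrum
S = lambda w: 1.0 / (1.0 + (10.0 * w) ** (5.0 / 3.0))
t = np.linspace(0.0, 600.0, 6001)
x = srm_sample(S, w_max=4.0, N=1024, t=t)

wq = np.linspace(1e-4, 4.0, 4000)
print(x.std() ** 2, np.trapz(S(wq), wq))          # sample variance vs. integral of S
```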

  19. Probability Density Function Method for Observing Reconstructed Attractor Structure

    Institute of Scientific and Technical Information of China (English)

    陆宏伟; 陈亚珠; 卫青

    2004-01-01

    A probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor when computing the correlation dimensions of RR intervals of ten healthy old men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, this is the first time that the PDF method has been put forward for the analysis of the reconstructed attractor structure. Numerical simulations demonstrate that the cardiac systems of healthy old men are about 6-6.5 dimensional complex dynamical systems. It is found that the PDF is not symmetrically distributed when the time delay is small, while it approaches a Gaussian distribution when the time delay is large enough. A cluster-effect mechanism is presented to explain this phenomenon. The shapes of the PDFs clearly indicate that the time delay plays a more important role in the reconstruction than the embedding dimension. The results demonstrate that the PDF method is a promising numerical approach for observing the reconstructed attractor structure and may provide more information and new diagnostic potential for the analysed cardiac system.

  20. Effects of combined dimension reduction and tabulation on the simulations of a turbulent premixed flame using a large-eddy simulation/probability density function method

    Science.gov (United States)

    Kim, Jeonglae; Pope, Stephen B.

    2014-05-01

    A turbulent lean-premixed propane-air flame stabilised by a triangular cylinder as a flame-holder is simulated to assess the accuracy and computational efficiency of combined dimension reduction and tabulation of chemistry. The computational condition matches the Volvo rig experiments. For the reactive simulation, the Lagrangian Large-Eddy Simulation/Probability Density Function (LES/PDF) formulation is used. A novel two-way coupling approach between LES and PDF is applied to obtain resolved density to reduce its statistical fluctuations. Composition mixing is evaluated by the modified Interaction-by-Exchange with the Mean (IEM) model. A baseline case uses In Situ Adaptive Tabulation (ISAT) to calculate chemical reactions efficiently. Its results demonstrate good agreement with the experimental measurements in turbulence statistics, temperature, and minor species mass fractions. For dimension reduction, 11 and 16 represented species are chosen and a variant of Rate Controlled Constrained Equilibrium (RCCE) is applied in conjunction with ISAT to each case. All the quantities in the comparison are indistinguishable from the baseline results using ISAT only. The combined use of RCCE/ISAT reduces the computational time for chemical reaction by more than 50%. However, for the current turbulent premixed flame, chemical reaction takes only a minor portion of the overall computational cost, in contrast to non-premixed flame simulations using LES/PDF, presumably due to the restricted manifold of purely premixed flame in the composition space. Instead, composition mixing is the major contributor to cost reduction since the mean-drift term, which is computationally expensive, is computed for the reduced representation. Overall, a reduction of more than 15% in the computational cost is obtained.

  1. How to determine decisional capacity in critically ill patients. Presume the patient can make decisions unless proven otherwise.

    Science.gov (United States)

    Fleming, C; Momin, Z A; Brensilver, J M; Brandstetter, R D

    1995-03-01

    Decisional capacity includes ability to comprehend information, to make an informed choice, and to communicate that choice; it is specific to the decision at hand. Presume a patient has decisional capacity; an evaluation of incapacity must be justified. Administer a standardized mental status test to help assess alertness, attention, memory, and reasoning ability. A patient scoring below 10 on the Folstein Mini-Mental State Examination (maximum score, 30) probably does not have decisional capacity; one scoring from 10 to 15 probably can designate a proxy but not make complex health care decisions. Obtain psychiatric consultations for a patient who exhibits psychological barriers to decision making.

  2. Conditional Moment Closure Modelling of a Lifted H2/N2 Turbulent Jet Flame Using the Presumed Mapping Function Approach

    Directory of Open Access Journals (Sweden)

    Ahmad El Sayed

    2015-01-01

    A lifted hydrogen/nitrogen turbulent jet flame issuing into a vitiated coflow is investigated using the conditional moment closure (CMC) supplemented by the presumed mapping function (PMF) approach for the modelling of conditional mixing and velocity statistics. Using a prescribed reference field, the PMF approach yields a presumed probability density function (PDF) for the mixture fraction, which is then used in closing the conditional scalar dissipation rate (CSDR) and conditional velocity in a fully consistent manner. These closures are applied to a lifted flame and the findings are compared to previous results obtained using β-PDF-based closures over a range of coflow temperatures (Tc). The PMF results are in line with those of the β-PDF and compare well to measurements. The transport budgets in mixture fraction and physical spaces and the radical history ahead of the stabilisation height indicate that the stabilisation mechanism is susceptible to Tc. As in the previous β-PDF calculations, autoignition around the "most reactive" mixture fraction remains the controlling mechanism for sufficiently high Tc. Departure from the β-PDF predictions is observed when Tc is decreased, as PMF predicts stabilisation by means of premixed flame propagation. This conclusion is based on the observation that lean mixtures are heated by downstream burning mixtures in a preheat zone developing ahead of the stabilisation height. The spurious sources, which stem from inconsistent CSDR modelling, are further investigated. The findings reveal that their effect is small but nonnegligible, most notably within the flame zone.

  3. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  4. Fusing probability density function into Dempster-Shafer theory of evidence for the evaluation of water treatment plant.

    Science.gov (United States)

    Chowdhury, Shakhawat

    2013-05-01

    The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including, human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts, which are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can impart subjective biases in the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using the probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function, which were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgradation for the WTP.
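
The combination step described above is Dempster's rule: products of the basic probability assignments are summed over intersecting focal elements and renormalized by the non-conflicting mass. A generic sketch follows; the status labels and BPA numbers are invented, not taken from the paper, and chaining the function combines further evidence sources (e.g. disinfection performance).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: m(C) is proportional to the sum of m1(A)*m2(B)
    over all pairs with A ∩ B = C, for nonempty C, renormalized by 1 - K."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            combined[c] = combined.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass assigned to the empty set
    k = 1.0 - conflict
    return {c: w / k for c, w in combined.items()}

# Frame of discernment for the overall WTP status (hypothetical labels)
good, marginal, poor = frozenset({"good"}), frozenset({"marginal"}), frozenset({"poor"})
theta = good | marginal | poor             # total ignorance

# BPAs for two factors, e.g. derived from cumulative probabilities of R and C
m_risk = {good: 0.6, marginal: 0.3, theta: 0.1}
m_cost = {good: 0.5, poor: 0.2, theta: 0.3}

print(dempster_combine(m_risk, m_cost))
```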

  5. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the number of triggered sampling gates, from which the photon counting probability is obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained from the statistics of a series of triggered time positions. Minimum Variance Unbiased Estimation (MVUE) is then used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity because more detected information is acquired. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high-accuracy intensity image is acquired under low-light-level conditions. (paper)

  6. Observability of the probability current density using spin rotator as a quantum clock

    International Nuclear Information System (INIS)

    Home, D.; Alok Kumar Pan; Md Manirul Ali

    2005-01-01

    An experimentally realizable scheme is formulated which can test any quantum mechanical approach for calculating the arrival time distribution. This is specifically illustrated by using the modulus of the probability current density for calculating the arrival time distribution of spin-1/2 neutral particles at the exit point of a spin rotator (SR) which contains a constant magnetic field. Such a calculated time distribution is then used for evaluating the distribution of spin orientations along different directions for these particles emerging from the SR. Based on this, the result of spin measurement along any arbitrary direction for such an ensemble is predicted. (author)

  7. Exact probability function for bulk density and current in the asymmetric exclusion process

    Science.gov (United States)

    Depken, Martin; Stinchcombe, Robin

    2005-03-01

    We examine the asymmetric simple exclusion process with open boundaries, a paradigm of driven diffusive systems, having a nonequilibrium steady-state transition. We provide a full derivation and expanded discussion and digression on results previously reported briefly in M. Depken and R. Stinchcombe, Phys. Rev. Lett. 93, 040602 (2004). In particular we derive an exact form for the joint probability function for the bulk density and current, both for finite systems, and also in the thermodynamic limit. The resulting distribution is non-Gaussian, and while the fluctuations in the current are continuous at the continuous phase transitions, the density fluctuations are discontinuous. The derivations are done by using the standard operator algebraic techniques and by introducing a modified version of the original operator algebra. As a by-product of these considerations we also arrive at a very simple way of calculating the normalization constant appearing in the standard treatment with the operator algebra. Like the partition function in equilibrium systems, this normalization constant is shown to completely characterize the fluctuations, albeit in a very different manner.

  8. Probability density functions of photochemicals over a coastal area of Northern Italy

    International Nuclear Information System (INIS)

    Georgiadis, T.; Fortezza, F.; Alberti, L.; Strocchi, V.; Marani, A.; Dal Bo', G.

    1998-01-01

    The present paper surveys the findings of experimental studies and analyses of statistical probability density functions (PDFs) applied to air pollutant concentrations to provide an interpretation of the ground-level distributions of photochemical oxidants in the coastal area of Ravenna (Italy). The atmospheric-pollution data set was collected from the local environmental monitoring network for the period 1978-1989. Results suggest that the statistical distribution of surface ozone, once normalised over the solar radiation PDF for the whole measurement period, follows a log-normal law as found for other pollutants. Although the Weibull distribution also offers a good fit of the experimental data, the area's meteorological features seem to favour the former distribution once the statistical index estimates have been analysed. Local transport phenomena are discussed to explain the data tail trends

  9. Complications of presumed ocular tuberculosis.

    Science.gov (United States)

    Hamade, Issam H; Tabbara, Khalid F

    2010-12-01

    To determine the effect of steroid treatment on visual outcome and ocular complications in patients with presumed ocular tuberculosis. Retrospective review of patients with presumptive ocular tuberculosis. The clinical diagnosis was made based on ocular findings, positive purified protein derivative (PPD) testing of more than 15 mm induration, exclusion of other causes of uveitis and positive ocular response to anti-tuberculous therapy (ATT) within 4 weeks. Group 1 included patients who had received oral prednisone or subtenon injection of triamcinolone acetonide prior to ATT. Group 2 included patients who did not receive corticosteroid therapy prior to administration of ATT.   Among 500 consecutive new cases of uveitis encountered in 1997-2007 there were 49 (10%) patients with presumed ocular tuberculosis. These comprised 28 (57%) male and 21 (43%) female patients with a mean age of 45 years (range 12-76 years). Four (20%) patients in group 1 had initial visual acuity of 20/40 or better, in comparison to eight (28%) patients in group 2. At 1-year follow-up, six (30%) patients in group 1 had a visual acuity of 20/40 or better compared with 20 (69%) patients in group 2 (p = 0.007). Of 20 eyes (26%) in group 1 that had visual acuity of < 20/50 at 1-year follow up, 14 (70%) eyes developed severe chorioretinal lesion (p = 0.019). Early administration of corticosteroids without anti-tuberculous therapy in presumed ocular tuberculosis may lead to poor visual outcome compared with patients who did not receive corticosteroids prior to presentation. Furthermore, the severity of chorioretinitis lesion in the group of patients given corticosteroid prior to ATT may account for the poor visual outcome. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.

  10. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  11. Organ procurement: let's presume consent

    OpenAIRE

    Moustarah, F

    1998-01-01

    IN WINNING FIRST PRIZE in the Logie Medical Ethics Essay Contest in 1997, Dr. Fady Moustarah made a strong and compelling argument in favour of presumed consent in the procurement of donor organs. He stressed that a major education campaign will be needed when such a policy is adopted lest some people begin to regard physicians as "organ vultures."

  12. Relationship between the Wigner function and the probability density function in quantum phase space representation

    International Nuclear Information System (INIS)

    Li Qianshu; Lue Liqiang; Wei Gongmin

    2004-01-01

    This paper discusses the relationship between the Wigner function, along with other related quasiprobability distribution functions, and the probability density distribution function constructed from the wave function of the Schroedinger equation in quantum phase space, as formulated by Torres-Vega and Frederick (TF). At the same time, a general approach in solving the wave function of the Schroedinger equation of TF quantum phase space theory is proposed. The relationship of the wave functions between the TF quantum phase space representation and the coordinate or momentum representation is thus revealed

  13. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR) and causes a noise-like granular aspect that complicates image classification. In SAR image analysis, spatial information can be a particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities arising from a complex and heterogeneous spectral response. This paper proposes the Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. This method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land-use patterns. The proposed algorithm uses a moving window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multiple components. Initially the multi-components should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPCs). Both methods enable reducing noise as well as the ordering of multi-component data in terms of image quality. In this paper, the NAPC applied to the multi-components provided large reductions in the noise levels, and color composites considering the first NAPC enhance the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained are similar to the visual interpretation of optical images from TM-Landsat and Google Maps.

  14. Head multidetector computed tomography: emergency medicine physicians overestimate the pretest probability and legal risk of significant findings.

    Science.gov (United States)

    Baskerville, Jerry Ray; Herrick, John

    2012-02-01

    This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between the pretest probability of a significant finding and the pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there were no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The legal risk presumed was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21 patients (15%; 95% confidence interval, ±5.9%) would not have been subjected to MDCT if there were no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk versus the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated the pretest probability of a significant finding on head MDCT scans and the presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Development and evaluation of probability density functions for a set of human exposure factors

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-06-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.

  16. Development and evaluation of probability density functions for a set of human exposure factors

    International Nuclear Information System (INIS)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-01-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors

  17. Presumed hereditary retinal degenerations: Ibadan experience ...

    African Journals Online (AJOL)

    This study describes the clinical presentation of RP, the prevalence of associated treatable disorders and the characteristics of patients with severe visual impairment and blindness. Method: A retrospective review of 52 cases presumed and diagnosed to have RP was performed on patients who presented at the Eye Clinic, ...

  18. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    Science.gov (United States)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources of train-bridge random vibration. A new random vibration theory for coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of the rail irregularity power spectral density was adopted to determine the representative points of spatial frequencies and phases for generating the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with a slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Finally, the Newmark-β integration method and the double-edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of the responses. A case study is presented in which the ICE-3 train travels over a three-span simply-supported high-speed railway bridge with excitation from random rail irregularity. The results showed that, compared with Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  19. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) an MCMC method for generating realizations of the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
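A minimal sketch of the kernel-density step of such a methodology is given below: it builds a Gaussian KDE from a dataset concentrated near a manifold and draws new realizations from it. The bandwidth rule and the circle example are illustrative assumptions, and the diffusion-maps reduction of the actual method is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def kde_sample(data, n_new, bandwidth=None):
    """Draw new realizations from a Gaussian kernel-density estimate of `data`.

    data : (N, d) array of observed vectors.
    A new sample is a randomly chosen data point plus Gaussian kernel noise,
    which is exactly how one samples from a KDE.
    """
    N, d = data.shape
    if bandwidth is None:
        # Silverman-type rule of thumb for multivariate Gaussian kernels
        bandwidth = (4.0 / (N * (d + 2.0))) ** (1.0 / (d + 4.0))
    idx = rng.integers(0, N, size=n_new)
    return data[idx] + bandwidth * data.std(axis=0) * rng.standard_normal((n_new, d))

# Example: 100 points concentrated near a circle (a 1-D manifold in 2-D)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
obs = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((100, 2))
new = kde_sample(obs, 1000)
```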

  20. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

Quantum mechanics and probability theory share one peculiarity. Both have well-established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  1. The non-Gaussian joint probability density function of slope and elevation for a nonlinear gravity wave field. [in ocean surface

    Science.gov (United States)

    Huang, N. E.; Long, S. R.; Bliven, L. F.; Tung, C.-C.

    1984-01-01

    On the basis of the mapping method developed by Huang et al. (1983), an analytic expression for the non-Gaussian joint probability density function of slope and elevation for nonlinear gravity waves is derived. Various conditional and marginal density functions are also obtained through the joint density function. The analytic results are compared with a series of carefully controlled laboratory observations, and good agreement is noted. Furthermore, the laboratory wind wave field observations indicate that the capillary or capillary-gravity waves may not be the dominant components in determining the total roughness of the wave field. Thus, the analytic results, though derived specifically for the gravity waves, may have more general applications.

  2. Aetiological study of the presumed ocular histoplasmosis syndrome in the Netherlands

    NARCIS (Netherlands)

    Ongkosuwito, J.V.; Kortbeek, L.M.; Lelij, van der A.; Molicka, E.; Kijlstra, A.; Smet, de M.D.; Suttrop-Schulten, M.S.A.

    1999-01-01

    Aim. To investigate whether presumed ocular histoplasmosis syndrome in the Netherlands is caused by Histoplasma capsulatum and whether other risk factors might play a role in the pathogenesis of this syndrome. Methods. 23 patients were clinically diagnosed as having presumed ocular histoplasmosis

  3. Influence of nucleon density distribution in nucleon emission probability

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.

    2014-01-01

Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study the total neutron emission in these reactions, to which all or many of these decay modes may contribute. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that, once the populations of different states are determined, the emission probability governs the double differential neutron yield

  4. Word Recognition and Nonword Repetition in Children with Language Disorders: The Effects of Neighborhood Density, Lexical Frequency, and Phonotactic Probability

    Science.gov (United States)

    Rispens, Judith; Baker, Anne; Duinmeijer, Iris

    2015-01-01

    Purpose: The effects of neighborhood density (ND) and lexical frequency on word recognition and the effects of phonotactic probability (PP) on nonword repetition (NWR) were examined to gain insight into processing at the lexical and sublexical levels in typically developing (TD) children and children with developmental language problems. Method:…

  5. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2014-01-01

This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of PHD filtering. In contrast to existing methods, the concept of multiple vehicle cooperative localization with spatial registration is proposed for the first time under random finite set theory. In addition, the proposed solution also addresses the challenges of multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation results demonstrate its reliability and feasibility in large-scale environments.

  6. Continuous time random walk model with asymptotical probability density of waiting times via inverse Mittag-Leffler function

    Science.gov (United States)

    Liang, Yingjie; Chen, Wen

    2018-04-01

The mean squared displacement (MSD) of traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model has been employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variants as the asymptotical waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, a special case of which is the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function and is observed to increase even more slowly than that of the logarithmic model. Very long waiting times occur with the largest probability in the inverse Mittag-Leffler case, compared with the power-law and logarithmic models. Monte Carlo simulations of one-dimensional sample paths of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting general ultraslow random motion.
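To make the simulation step concrete, the sketch below runs a one-dimensional CTRW with a heavy-tailed, ultraslow waiting-time law. Since the inverse Mittag-Leffler density is not available in standard libraries, the traditional logarithmic special case (survival function Ψ(t) = 1/ln(e + t)) stands in; the unit-jump law and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def ctrw_msd(n_walkers, t_max, n_times=50):
    """Monte Carlo MSD of a 1-D CTRW with an ultraslow (logarithmic) waiting law.

    Waiting times come from the survival function Psi(t) = 1/ln(e + t),
    whose density decays like 1/(t ln^2 t), via inverse-transform sampling;
    jumps are +/-1 with equal probability.
    """
    times = np.logspace(0.0, np.log10(t_max), n_times)
    msd = np.zeros(n_times)
    for _ in range(n_walkers):
        t, x, k = 0.0, 0.0, 0
        pos = np.zeros(n_times)
        while t < t_max and k < n_times:
            u = 1.0 - rng.random()                      # uniform on (0, 1]
            t += np.exp(min(1.0 / u, 700.0)) - np.e     # waiting time (capped)
            while k < n_times and times[k] <= t:
                pos[k] = x                              # still waiting at x
                k += 1
            x += rng.choice((-1.0, 1.0))                # jump after the wait
        msd += pos**2
    return times, msd / n_walkers

times, msd = ctrw_msd(n_walkers=2000, t_max=1e6)
```

With this waiting law the estimated MSD grows roughly like ln t, the logarithmic benchmark that the paper generalizes.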

  7. An investigation of student understanding of classical ideas related to quantum mechanics: Potential energy diagrams and spatial probability density

    Science.gov (United States)

    Stephanik, Brian Michael

    This dissertation describes the results of two related investigations into introductory student understanding of ideas from classical physics that are key elements of quantum mechanics. One investigation probes the extent to which students are able to interpret and apply potential energy diagrams (i.e., graphs of potential energy versus position). The other probes the extent to which students are able to reason classically about probability and spatial probability density. The results of these investigations revealed significant conceptual and reasoning difficulties that students encounter with these topics. The findings guided the design of instructional materials to address the major problems. Results from post-instructional assessments are presented that illustrate the impact of the curricula on student learning.

  8. Improved Variable Window Kernel Estimates of Probability Densities

    OpenAIRE

    Hall, Peter; Hu, Tien Chung; Marron, J. S.

    1995-01-01

    Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...

  9. Calculation of probability density functions for temperature and precipitation change under global warming

    International Nuclear Information System (INIS)

    Watterson, Ian G.

    2007-01-01

Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be obtained simply. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections
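A sketch of the direct-integration step described above, assuming, purely for illustration, beta-distributed PDFs for the global warming and for the local change per degree of warming; the parameters below are invented, not the CMIP3-derived values.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical bounded beta PDFs: global warming G (degC) and local scaling
# S (degC of local change per degC of global warming)
g = stats.beta(2.0, 3.0, loc=1.0, scale=3.0)       # G on [1, 4]
s = stats.beta(2.5, 2.5, loc=0.5, scale=1.0)       # S on [0.5, 1.5]

def cdf_net_change(x, n=400):
    """P(G * S <= x): direct integration over the joint probability space."""
    gv = np.linspace(1.0, 4.0, n)
    inner = s.cdf(x / gv) * g.pdf(gv)   # P(S <= x/g), weighted by density of G
    return trapezoid(inner, gv)

xs = np.linspace(0.5, 6.0, 200)
cdf = np.array([cdf_net_change(x) for x in xs])
best_estimate = np.interp(0.5, cdf, xs)             # median of net change
p10, p90 = np.interp([0.1, 0.9], cdf, xs)           # likely range
```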

  10. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters such as skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, restraining real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small-sample-size effects and more accurate in sEMG PDF shape screening applications.
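The sketch below illustrates the general idea of combining kernel density estimation with a PDF shape distance. The mirror-image L2 distance used here is a simple skewness-like stand-in of our own devising, not the authors' CSM statistic.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_shape_asymmetry(x, grid_size=512):
    """A skewness-like shape statistic from a kernel density estimate.

    Builds a Gaussian KDE of the sample, then measures the L2 distance
    between the estimated PDF and its mirror image about the sample mean.
    Zero for a symmetric density; grows with asymmetry.
    """
    x = np.asarray(x, float)
    kde = gaussian_kde(x)
    m, half = x.mean(), 4.0 * x.std()
    t = np.linspace(m - half, m + half, grid_size)
    f = kde(t)
    f_mirror = kde(2.0 * m - t)
    dt = t[1] - t[0]
    return np.sqrt(np.sum((f - f_mirror) ** 2) * dt)

rng = np.random.default_rng(2)
print(kde_shape_asymmetry(rng.normal(size=60)))      # near 0: symmetric
print(kde_shape_asymmetry(rng.lognormal(size=60)))   # clearly positive
```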

  11. High-density limit of quantum chromodynamics

    International Nuclear Information System (INIS)

    Alvarez, E.

    1983-01-01

    By means of a formal expansion of the partition function presumably valid at large baryon densities, the propagator of the quarks is expressed in terms of the gluon propagator. This result is interpreted as implying that correlations between quarks and gluons are unimportant at high enough density, so that a kind of mean-field approximation gives a very accurate description of the physical system

  12. Nonspherical atomic ground-state densities and chemical deformation densities from x-ray scattering

    International Nuclear Information System (INIS)

    Ruedenberg, K.; Schwarz, W.H.E.

    1990-01-01

Presuming that chemical insight can be gained from the difference between the molecular electron density and the superposition of the ground-state densities of the atoms in a molecule, it is pointed out that, for atoms with degenerate ground states, an unpromoted "atom in a molecule" is represented by a specific ensemble of the degenerate atomic ground-state wave functions and that this ensemble is determined by the anisotropic local surroundings. The resulting atomic density contributions are termed oriented ground state densities, and the corresponding density difference is called the chemical deformation density. The constraints implied by this conceptual approach for the atomic density contributions are formulated and a method is developed for determining them from x-ray scattering data. The electron density of the appropriate promolecule and its x-ray scattering are derived, the determination of the parameters of the promolecule is outlined, and the chemical deformation density is formulated

  13. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work is the relationship between shake-off probabilities, target atomic number and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence-shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  14. Time-averaged probability density functions of soot nanoparticles along the centerline of a piloted turbulent diffusion flame using a scanning mobility particle sizer

    KAUST Repository

    Chowdhury, Snehaunshu; Boyette, Wesley; Roberts, William L.

    2017-01-01

    In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating

  15. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  16. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    important for most applications or theoretical problems of interest. In statistics ... In probability theory, statistics, statistical mechanics, communication theory, and other .... (1) by taking advantage of SMVT as a general mathematical approach.

  17. Probability density fittings of corrosion test-data: Implications on ...

    Indian Academy of Sciences (India)

    Steel-reinforced concrete; probability distribution functions; corrosion ... to be present in the corrosive system at a suitable concentration (Holoway et al 2004; Söylev & ..... voltage, equivalent to voltage drop, across a resistor divided by the ...

  18. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
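A minimal worked example of the maximum entropy assignment: under a single mean-value constraint, the maximum entropy solution on discrete outcomes has the Gibbs form p_i ∝ exp(-λ v_i), with the multiplier λ fixed by the constraint, which is exactly how Boltzmann-type distributions arise. The outcome values and target below are arbitrary illustrations.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_probs(values, mean_target):
    """Maximum-entropy probabilities on discrete outcomes given a mean constraint.

    The solution has the Gibbs form p_i ~ exp(-lam * v_i); the multiplier
    lam is chosen so that sum(p_i * v_i) equals the target mean.
    """
    v = np.asarray(values, float)

    def mean_of(lam):
        w = np.exp(-lam * (v - v.mean()))   # shifted for numerical stability
        p = w / w.sum()
        return p @ v

    lam = brentq(lambda L: mean_of(L) - mean_target, -50.0, 50.0)
    w = np.exp(-lam * (v - v.mean()))
    return w / w.sum()

# Example: outcome values 0..4, constrain the mean to 1.2
p = maxent_probs(np.arange(5), 1.2)
```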

  19. Comparative magnetic resonance imaging findings between gliomas and presumed cerebrovascular accidents in dogs.

    Science.gov (United States)

    Cervera, Vicente; Mai, Wilfried; Vite, Charles H; Johnson, Victoria; Dayrell-Hart, Betsy; Seiler, Gabriela S

    2011-01-01

    Cerebrovascular accidents, or strokes, and gliomas are common intraaxial brain lesions in dogs. An accurate differentiation of these two lesions is necessary for prognosis and treatment decisions. The magnetic resonance (MR) imaging characteristics of 21 dogs with a presumed cerebrovascular accident and 17 with a glioma were compared. MR imaging findings were reviewed retrospectively by three observers unaware of the final diagnosis. Statistically significant differences between the appearance of gliomas and cerebrovascular accidents were identified based on lesion location, size, mass effect, perilesional edema, and appearance of the apparent diffusion coefficient map. Gliomas were predominantly located in the cerebrum (76%) compared with presumed cerebrovascular accidents that were located mainly in the cerebellum, thalamus, caudate nucleus, midbrain, and brainstem (76%). Gliomas were significantly larger compared with presumed cerebrovascular accidents and more commonly associated with mass effect and perilesional edema. Wedge-shaped lesions were seen only in 19% of presumed cerebrovascular accidents. Between the three observers, 10-47% of the presumed cerebrovascular accidents were misdiagnosed as gliomas, and 0-12% of the gliomas were misdiagnosed as cerebrovascular accidents. Diffusion weighted imaging increased the accuracy of the diagnosis for both lesions. Agreement between observers was moderate (kappa = 0.48, P < 0.01).

  20. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew ... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  1. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.

  2. Evaluation of autopsy imaging (postmortem CT) to presume causes of death

    International Nuclear Information System (INIS)

    Nishihara, Keisuke; Sugihara, Shuji; Morioka, Nobuo; Sato, Shinya; Tsukamoto, Kazumichi; Ogawa, Toshihide

    2010-01-01

A total of 123 patients who arrived at the emergency room in a state of cardiopulmonary arrest were examined by CT after death. In 41 patients (33.3%), the cause of death could be presumed by autopsy imaging (Ai). With postmortem inspection and clinical information alone, the cause of death could be presumed in only 30 patients (24.4%). However, the presumption rate for the cause of death improved to 46.3% (an increase of 22.0 percentage points) when the information provided by Ai was added. (author)

  3. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)

    2016-10-15

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  4. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo

    2016-01-01

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage

  5. Estimation of Probability Density Functions of Damage Parameter for Valve Leakage Detection in Reciprocating Pump Used in Nuclear Power Plants

    Directory of Open Access Journals (Sweden)

    Jong Kyeom Lee

    2016-10-01

This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  6. Assumed Probability Density Functions for Shallow and Deep Convection

    Directory of Open Access Journals (Sweden)

    Steven K Krueger

    2010-10-01

The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PDF families are based on the double Gaussian form and the remaining two are the single Gaussian and a Double Delta Function (analogous to a mass flux model). The assumed PDF method is tested for grid sizes as small as 0.4 km to as large as 204.8 km. In addition, studies are performed for PDF sensitivity to errors in the input moments and for how well the PDFs diagnose some higher-order moments. In general, the double Gaussian PDFs more accurately represent SGS cloud structure and turbulence moments in the boundary layer compared to the single Gaussian and Double Delta Function PDFs for the range of grid sizes tested. This is especially true for small SGS cloud fractions. While the most complex PDF, Lewellen-Yoh, better represents shallow convective cloud properties (cloud fraction and liquid water mixing ratio) compared to the less complex Analytic Double Gaussian 1 PDF, there appears to be no advantage in implementing Lewellen-Yoh for deep convection. However, the Analytic Double Gaussian 1 PDF better represents the liquid water flux, is less sensitive to errors in the input moments, and diagnoses higher order moments more accurately. Between the Lewellen-Yoh and Analytic Double Gaussian 1 PDFs, it appears that neither family is distinctly better at representing cloudy layers. However, due to the reduced computational cost and fairly robust results, it appears that the Analytic Double Gaussian 1 PDF could be an ideal family for SGS cloud and turbulence
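To make the assumed-PDF machinery concrete, the sketch below diagnoses SGS cloud fraction and condensate from a single Gaussian and from a two-component (double Gaussian) PDF of total water, in the spirit of classical statistical cloud schemes; the moments, weights and saturation value are invented inputs, not values from the study.

```python
import numpy as np
from scipy.stats import norm

def cloud_diag_single_gaussian(qt_mean, qt_std, q_sat):
    """SGS cloud fraction and condensate from a single-Gaussian PDF of total water.

    Cloud fraction is the probability that total water exceeds saturation;
    condensate is the mean excess E[max(qt - q_sat, 0)].
    """
    z = (qt_mean - q_sat) / qt_std
    cf = norm.cdf(z)
    ql = qt_std * (z * norm.cdf(z) + norm.pdf(z))
    return cf, ql

def cloud_diag_double_gaussian(means, stds, weights, q_sat):
    """Same diagnostics for a weighted two-component (double Gaussian) PDF."""
    cf = ql = 0.0
    for m, s, w in zip(means, stds, weights):
        c, q = cloud_diag_single_gaussian(m, s, q_sat)
        cf += w * c
        ql += w * q
    return cf, ql

# Example: a skewed state (units g/kg) that a single Gaussian would misrepresent
cf, ql = cloud_diag_double_gaussian((8.0, 11.0), (0.5, 1.5), (0.7, 0.3), 10.0)
```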

  7. Representation of Probability Density Functions from Orbit Determination using the Particle Filter

    Science.gov (United States)

    Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell

    2012-01-01

Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using the Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining higher order statistical information obtained using the PF. Methods such as the Principal Component Analysis (PCA) are based on utilizing up to second order statistics, hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios that involve a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.

  8. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    Science.gov (United States)

    Tribelsky, Michael I.

    2002-07-01

The exact explicit expression for the probability density p_N(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of p_N(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts, as well as that for the entire profile, is obtained. A number of particular examples are considered in detail.
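For the i.i.d. case, the density of the sum is the N-fold self-convolution of the single-summand density, conveniently computed through the characteristic function. The FFT sketch below, with an exponential summand whose sum is a gamma density, illustrates the mechanics; it is a numerical stand-in, not the paper's analytic construction.

```python
import numpy as np

def pdf_of_sum_fft(pdf_vals, dx, N):
    """PDF of a sum of N i.i.d. variables via the characteristic function.

    pdf_vals : single-summand PDF sampled on a uniform grid of spacing dx.
    The grid must be wide enough to hold the N-fold sum without wrap-around.
    """
    phi = np.fft.fft(pdf_vals) * dx           # discrete characteristic function
    pN = np.fft.ifft(phi ** N).real / dx      # N-fold self-convolution
    return np.clip(pN, 0.0, None)

# Example: sum of 10 exponential(1) variables on [0, 80)
dx = 0.01
x = np.arange(0.0, 80.0, dx)
f = np.exp(-x)
f10 = pdf_of_sum_fft(f, dx, 10)               # compare with the Gamma(10, 1) density
```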

  9. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
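A toy version of the abundance calculation described above: predicted capture probabilities from a logistic model of fish length divide the counts, Horvitz-Thompson style. The coefficients and lengths are placeholders, not the fitted values from the study.

```python
import numpy as np

def capture_prob(length_mm, b0=-2.0, b1=0.01):
    """Logistic model of cumulative capture probability versus fish length.

    The coefficients are illustrative placeholders, not the paper's fit.
    """
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(length_mm, float))))

def abundance_estimate(lengths_mm):
    """Abundance estimate: each sampled fish counts as 1 / p(capture)."""
    return np.sum(1.0 / capture_prob(lengths_mm))

# Example: five captured smallmouth bass of varying length (mm)
n_hat = abundance_estimate([150, 220, 260, 310, 340])
```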

  10. [Choice between probability and value of alimentary reinforcement as means for revealing individual typological features of dog behavior].

    Science.gov (United States)

    Chilingarian, L I

    2005-01-01

Individual typological features of the behavior of dogs were investigated by means of a choice between constantly available low-value food and high-quality food presented with low probability. Animals were subjected to instrumental conditioning with the same conditioned stimuli but different types of reinforcement. Depression of a white pedal was always reinforced with a meat-bread-crumb mixture; depression of a black pedal was reinforced with two pieces of liver (with probabilities of 100, 40, 33, 20, or 0%). The choice of reinforcement depended on the probability of the valuable food and the individual typological features of the nervous system of a dog. Decreasing the probability of the valuable reinforcement to 40-20% revealed differences in the behavior of dogs. Dogs of the first group, presumably with the weak type of the nervous system, pressed the white pedal (always reinforced) more frequently than the black pedal, thus "avoiding the situation of risk" of receiving an empty cup. They displayed symptoms of neurosis: whimpering, refusal of food or of the choice of reinforcement, and obtrusive movements. Dogs of the second group, presumably with the strong type of the nervous system, pressed the black pedal (more valuable food) more frequently, despite the low probability of reward, until they obtained the valuable food. They did not show neurosis symptoms and were not afraid of the "situation of risk". A decrease in the probability of the valuable reinforcement increased the percentage of long-latency pedal depressions. This phenomenon was probably associated with the increasing involvement of cognitive processes, as the contributions of the assessments of probability and value of the reinforcement to decision making became approximately equal. Choice between the probability and value of alimentary reinforcement is a good method for revealing individual typological features of dogs.

  11. Influence of the level of fit of a density probability function to wind-speed data on the WECS mean power output estimation

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio

    2008-01-01

Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (R_a²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the R_a² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as R_a² increases
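The static estimate itself is a one-line integral of the wind-speed PDF times the power curve. The sketch below evaluates it for an assumed Weibull wind regime and an idealized power curve; the Weibull parameters and the cut-in, rated and cut-out speeds are illustrative, not the Canarian station fits.

```python
import numpy as np
from scipy import stats, integrate

def power_curve(v, v_in=3.0, v_rated=13.0, v_out=25.0, p_rated=800.0):
    """Idealized power curve (kW): cubic ramp between cut-in and rated speed."""
    v = np.asarray(v, float)
    ramp = p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)
    p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
    return np.where((v >= v_rated) & (v < v_out), p_rated, p)

def mean_power_output(pdf, curve, v_max=30.0):
    """Mean WECS power: integral of the wind-speed PDF times the power curve."""
    return integrate.quad(lambda v: pdf(v) * curve(v), 0.0, v_max, limit=200)[0]

# Assumed Weibull wind regime: shape k = 2, scale c = 8 m/s
wind = stats.weibull_min(2.0, scale=8.0)
p_mean = mean_power_output(wind.pdf, power_curve)    # mean output in kW
```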

  12. Probability function of breaking-limited surface elevation. [wind generated waves of ocean

    Science.gov (United States)

    Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.

    1989-01-01

The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking ζ_b(t) is first related to the original wave elevation ζ(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for ζ(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of ζ_b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.

  13. Probability density functions for radial anisotropy: implications for the upper 1200 km of the mantle

    Science.gov (United States)

    Beghein, Caroline; Trampert, Jeannot

    2004-01-01

The presence of radial anisotropy in the upper mantle, transition zone and top of the lower mantle is investigated by applying a model space search technique to Rayleigh and Love wave phase velocity models. Probability density functions are obtained independently for S-wave anisotropy, P-wave anisotropy, intermediate parameter η, V_P, V_S and density anomalies. The likelihoods for P-wave and S-wave anisotropy beneath continents cannot be explained by a dry olivine-rich upper mantle at depths larger than 220 km. Indeed, while shear-wave anisotropy tends to disappear below 220 km depth in continental areas, P-wave anisotropy is still present but its sign changes compared to the uppermost mantle. This could be due to an increase with depth of the amount of pyroxene relative to olivine in these regions, although the presence of water, partial melt or a change in the deformation mechanism cannot be ruled out as yet. A similar observation is made for old oceans, but not for young ones where V_SH > V_SV appears likely down to 670 km depth and V_PH > V_PV down to 400 km depth. The change of sign in P-wave anisotropy seems to be qualitatively correlated with the presence of the Lehmann discontinuity, generally observed beneath continents and some oceans but not beneath ridges. Parameter η shows a similar age-related depth pattern as shear-wave anisotropy in the uppermost mantle and it undergoes the same change of sign as P-wave anisotropy at 220 km depth. The ratio between dln V_S and dln V_P suggests that a chemical component is needed to explain the anomalies in most places at depths greater than 220 km. More tests are needed to infer the robustness of the results for density, but they do not affect the results for anisotropy.

  14. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
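As a concrete instance of MoLC, the gamma distribution with shape k and scale θ has log-cumulants c1 = E[ln X] = ψ(k) + ln θ and c2 = Var[ln X] = ψ₁(k), so the parameters follow from sample log-moments by inverting the trigamma function. A sketch under those textbook relations:

```python
import numpy as np
from scipy.special import digamma, polygamma
from scipy.optimize import brentq

def molc_gamma(x):
    """Method-of-logarithmic-cumulants estimates for Gamma(shape k, scale theta).

    c1 = mean(log x) estimates digamma(k) + log(theta);
    c2 = var(log x)  estimates trigamma(k), inverted numerically for k.
    """
    logs = np.log(np.asarray(x, float))
    c1, c2 = logs.mean(), logs.var()
    k = brentq(lambda kk: polygamma(1, kk) - c2, 1e-3, 1e3)   # trigamma inverse
    theta = np.exp(c1 - digamma(k))
    return k, theta

rng = np.random.default_rng(3)
k_hat, theta_hat = molc_gamma(rng.gamma(shape=3.0, scale=2.0, size=5000))
```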

  15. Density limit study on the W7-AS stellarator

    International Nuclear Information System (INIS)

    Grigull, P.; Giannone, L.; Stroth, U.

    1998-01-01

Data from currentless NBI discharges in W7-AS strongly indicate that the maximum density for quasi-stationary operation is limited by detachment from limiters. The threshold density at the edge scales with P_s^0.5 B^0.8 (with P_s being the net power flow across the LCMS), which is consistent with an edge-based analytic estimation presuming constant threshold downstream temperatures. (author)

  16. Jump probabilities in the non-Markovian quantum jump method

    International Nuclear Information System (INIS)

    Haerkoenen, Kari

    2010-01-01

    The dynamics of a non-Markovian open quantum system described by a general time-local master equation is studied. The propagation of the density operator is constructed in terms of two processes: (i) deterministic evolution and (ii) evolution of a probability density functional in the projective Hilbert space. The analysis provides a derivation for the jump probabilities used in the recently developed non-Markovian quantum jump (NMQJ) method (Piilo et al 2008 Phys. Rev. Lett. 100 180402).

  17. Computing rates of Markov models of voltage-gated ion channels by inverting partial differential equations governing the probability density functions of the conducting and non-conducting states.

    Science.gov (United States)

    Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew

    2016-07-01

Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By invoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
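A heavily simplified stand-in for the inversion loop: choose the rates of a hypothetical two-state (closed/open) channel to minimize an L2 cost between the model-predicted open-probability curve and pseudo-experimental data. The actual method matches probability density functions governed by a PDE system; the exponential rate parameterization below is invented purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def stationary_open_prob(rates, v):
    """Open probability of a 2-state channel with voltage-dependent rates.

    Hypothetical parameterization: opening rate a = a0*exp(v/25),
    closing rate b = b0*exp(-v/25); stationary open probability a/(a+b).
    """
    a0, b0 = rates
    a = a0 * np.exp(v / 25.0)
    b = b0 * np.exp(-v / 25.0)
    return a / (a + b)

def fit_rates(v_grid, p_open_data):
    """Invert the model by minimizing an L2 cost between model and data."""
    cost = lambda r: np.sum(
        (stationary_open_prob(np.abs(r), v_grid) - p_open_data) ** 2)
    res = minimize(cost, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
    return np.abs(res.x)

# Pseudo-data from "true" rates (0.5, 2.0) plus noise, then recovery
v = np.linspace(-60.0, 60.0, 25)
rng = np.random.default_rng(4)
data = stationary_open_prob((0.5, 2.0), v) + 0.01 * rng.standard_normal(25)
a0_hat, b0_hat = fit_rates(v, data)
```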

  18. Compensatory cerebral motor control following presumed perinatal ischemic stroke

    NARCIS (Netherlands)

    van der Hoorn, Anouk; Potgieser, Adriaan R E; Brouwer, Oebele F; de Jong, Bauke M

Case: A fifteen-year-old left-handed girl presented with right-sided focal motor seizures. Neuroimaging showed a large left-hemisphere lesion compatible with a middle cerebral artery stroke of presumed perinatal origin. She had not previously been diagnosed with a motor deficit, although neurological

  19. Novel density-based and hierarchical density-based clustering algorithms for uncertain data.

    Science.gov (United States)

    Zhang, Xianchao; Liu, Han; Zhang, Xiaotong

    2017-09-01

    Uncertain data has posed a great challenge to traditional clustering algorithms. Recently, several algorithms have been proposed for clustering uncertain data, and among them density-based techniques seem promising for handling data uncertainty. However, some issues like losing uncertain information, high time complexity and nonadaptive threshold have not been addressed well in the previous density-based algorithm FDBSCAN and hierarchical density-based algorithm FOPTICS. In this paper, we firstly propose a novel density-based algorithm PDBSCAN, which improves the previous FDBSCAN from the following aspects: (1) it employs a more accurate method to compute the probability that the distance between two uncertain objects is less than or equal to a boundary value, instead of the sampling-based method in FDBSCAN; (2) it introduces new definitions of probability neighborhood, support degree, core object probability, direct reachability probability, thus reducing the complexity and solving the issue of nonadaptive threshold (for core object judgement) in FDBSCAN. Then, we modify the algorithm PDBSCAN to an improved version (PDBSCANi), by using a better cluster assignment strategy to ensure that every object will be assigned to the most appropriate cluster, thus solving the issue of nonadaptive threshold (for direct density reachability judgement) in FDBSCAN. Furthermore, as PDBSCAN and PDBSCANi have difficulties for clustering uncertain data with non-uniform cluster density, we propose a novel hierarchical density-based algorithm POPTICS by extending the definitions of PDBSCAN, adding new definitions of fuzzy core distance and fuzzy reachability distance, and employing a new clustering framework. POPTICS can reveal the cluster structures of the datasets with different local densities in different regions better than PDBSCAN and PDBSCANi, and it addresses the issues in FOPTICS. Experimental results demonstrate the superiority of our proposed algorithms over the existing
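The central probabilistic quantity here, the probability that the distance between two uncertain objects is at most a boundary value ε, can at least be made concrete by Monte Carlo (the sampling-based route that PDBSCAN replaces with a more accurate computation). Gaussian position uncertainty is assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def prob_distance_leq(obj_a, obj_b, eps, n_mc=20000):
    """Estimate P(dist(A, B) <= eps) for uncertain objects given as Gaussian clouds.

    Each object is (mean, std): its position is mean + std * N(0, I).
    """
    (ma, sa), (mb, sb) = obj_a, obj_b
    a = ma + sa * rng.standard_normal((n_mc, len(ma)))
    b = mb + sb * rng.standard_normal((n_mc, len(mb)))
    return np.mean(np.linalg.norm(a - b, axis=1) <= eps)

p = prob_distance_leq((np.array([0.0, 0.0]), 0.3),
                      (np.array([1.0, 0.0]), 0.3), eps=1.2)
```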

  20. A Student’s t Mixture Probability Hypothesis Density Filter for Multi-Target Tracking with Outliers

    Science.gov (United States)

    Liu, Zhuowei; Chen, Shuxin; Wu, Hao; He, Renke; Hao, Lin

    2018-01-01

In multi-target tracking, outliers-corrupted process and measurement noises can severely reduce the performance of the probability hypothesis density (PHD) filter. To solve this problem, this paper proposes a novel PHD filter, called the Student's t mixture PHD (STM-PHD) filter. The proposed filter models the heavy-tailed process noise and measurement noise as Student's t distributions and approximates the multi-target intensity as a mixture of Student's t components to be propagated in time. Then, a closed-form PHD recursion is obtained based on the Student's t approximation. Our approach can make full use of the heavy-tailed characteristic of the Student's t distribution to handle situations with heavy-tailed process and measurement noises. The simulation results verify that the proposed filter can overcome the negative effect generated by outliers and maintain good tracking accuracy in the simultaneous presence of process and measurement outliers. PMID:29617348

  1. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Directory of Open Access Journals (Sweden)

    Wenhua Song

    2018-02-01

This paper outlines theoretical investigations of the probability density distribution (PDD) of ranging data for the imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on the physical model of the reflected laser pulses from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed under different fog concentrations, which offers improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models within a given margin of error of less than 1%.

  2. The effect of fog on the probability density distribution of the ranging data of imaging laser radar

    Science.gov (United States)

    Song, Wenhua; Lai, JianCheng; Ghassemlooy, Zabih; Gu, Zhiyong; Yan, Wei; Wang, Chunyong; Li, Zhenhua

    2018-02-01

This paper outlines theoretical investigations of the probability density distribution (PDD) of ranging data for the imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on the physical model of the reflected laser pulses from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed under different fog concentrations, which offers improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models within a given margin of error of less than 1%.

  3. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.

  4. Presumed Cases of Mumps in Pregnancy: Clinical and Infection Control Implications

    Directory of Open Access Journals (Sweden)

    Svjetlana Lozo

    2012-01-01

Recently, a mumps outbreak in New York and New Jersey was reported by the Centers for Disease Control and Prevention (CDC). Subsequently, the dissemination of the disease was rapid, and from June 28th, 2009 through January 29th, 2010, a total of 1,521 cases of mumps were reported in New York and New Jersey. Seven presumed cases occurred in pregnant women cared for at our institution. Mumps diagnosis, as per the NYC Department of Health and Mental Hygiene, was based on clinical manifestations, particularly parotitis. Prior immunization with mumps vaccine and negative IgM were not adequate to rule out mumps infection. All seven of our patients had exposure to mumps in either their household or their community, and some of them had symptoms of mumps. The difficulties in interpreting the serologies of these patients led to a presumed diagnosis of mumps. The diagnosis of mumps led to the isolation of patients and of health care personnel who were in contact with them. In this paper, we detail the presenting findings, diagnostic dilemmas and infection control challenges associated with presumed cases of mumps in pregnancy.

  5. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    Science.gov (United States)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when the data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probability that each is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data, and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty, achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied to uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
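
    The reweighting step described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the three candidate densities, the response function g, and the choice of the equal-weight candidate mixture as the shared importance density are all hypothetical stand-ins.

```python
import numpy as np
from scipy import stats

# Hypothetical set of plausible input models (in the paper these come from
# information-theoretic multimodel inference on the scarce data).
candidates = [stats.norm(loc=10.0, scale=1.5),
              stats.lognorm(s=0.15, scale=10.0),
              stats.gamma(a=40.0, scale=0.25)]

def g(x):
    # Toy response function standing in for, e.g., a plate buckling model.
    return 1.0 - 0.02 * (x - 10.0) ** 2

def q_pdf(x):
    # Shared importance density: here simply the equal-weight candidate mixture.
    return np.mean([c.pdf(x) for c in candidates], axis=0)

rng = np.random.default_rng(1)
n = 20_000
draws = np.stack([c.rvs(size=n, random_state=rng) for c in candidates])
comp = rng.integers(len(candidates), size=n)     # mixture component per sample
x = draws[comp, np.arange(n)]                    # samples from q
y = g(x)                                         # propagate through the model once

# Reweight the same propagated samples under each candidate model.
for i, c in enumerate(candidates):
    w = c.pdf(x) / q_pdf(x)
    w /= w.sum()
    print(f"model {i}: E[g(X)] ~ {np.sum(w * y):.4f}")
```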

  6. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations that varied in density and degree of spatial clustering. Because of the logistics and costs of large river sampling and the spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for a fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate at which occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on sampling designs for freshwater mussels in the UMR and, presumably, other large rivers.

  7. Influence of the level of fit of a density probability function to wind-speed data on the WECS mean power output estimation

    Energy Technology Data Exchange (ETDEWEB)

    Carta, Jose A. [Department of Mechanical Engineering, University of Las Palmas de Gran Canaria, Campus de Tafira s/n, 35017 Las Palmas de Gran Canaria, Canary Islands (Spain); Ramirez, Penelope; Velazquez, Sergio [Department of Renewable Energies, Technological Institute of the Canary Islands, Pozo Izquierdo Beach s/n, 35119 Santa Lucia, Gran Canaria, Canary Islands (Spain)

    2008-10-15

    Static methods based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method used in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by evaluating, usually with numerical techniques, the integral of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases. (author)
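
    The static method described above reduces to a one-dimensional quadrature. Below is a minimal sketch, not the paper's code: the Weibull wind-speed model and the simplified 330 kW power curve are assumed placeholders for the catalogue of PDFs and the measured turbine curves used in the study.

```python
import numpy as np
from scipy import stats, integrate

# Assumed wind-speed model and a simplified power curve for a 330 kW machine.
wind = stats.weibull_min(c=2.0, scale=8.0)            # shape k, scale A (m/s)

def power_curve(v, cut_in=3.0, rated_v=13.0, cut_out=25.0, rated_p=330.0):
    # Cubic rise between cut-in and rated speed, flat at rated power, zero outside.
    v = np.asarray(v, dtype=float)
    p = np.where((v >= cut_in) & (v < rated_v),
                 rated_p * (v**3 - cut_in**3) / (rated_v**3 - cut_in**3), 0.0)
    return np.where((v >= rated_v) & (v <= cut_out), rated_p, p)

# Static method: mean power = integral of (power curve x wind-speed PDF).
v = np.linspace(0.0, 30.0, 3001)
mean_power = integrate.trapezoid(power_curve(v) * wind.pdf(v), v)
print(f"estimated mean power output: {mean_power:.1f} kW")
```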

  8. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  9. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    Directory of Open Access Journals (Sweden)

    Carmen Moret-Tatay

    2018-05-01

    Full Text Available The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in empirical datasets and to determine judiciously whether there is a need for data trimming and, if so, at which points it should be done.

  10. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    Science.gov (United States)

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in empirical datasets and to determine judiciously whether there is a need for data trimming and, if so, at which points it should be done.
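
    For readers who want to reproduce the basic fit without ExGUtils itself, the ex-Gaussian is available in SciPy as exponnorm (shape K = τ/σ, loc = μ, scale = σ). The sketch below is illustrative only; the synthetic reaction-time parameters are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic "reaction times" in ms: Gaussian (mu, sigma) plus exponential tail (tau).
mu, sigma, tau = 450.0, 45.0, 100.0
rt = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

# Maximum-likelihood fit; SciPy's exponnorm is the ex-Gaussian with K = tau/sigma.
K, loc, scale = stats.exponnorm.fit(rt)
print(f"mu ~ {loc:.1f}, sigma ~ {scale:.1f}, tau ~ {K * scale:.1f}")

# One quantitative goodness-of-fit check on the MLE solution.
ks = stats.kstest(rt, stats.exponnorm(K, loc, scale).cdf)
print(f"KS statistic {ks.statistic:.3f}, p-value {ks.pvalue:.3f}")
```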

  11. Probability density function of a puff dispersing from the wall of a turbulent channel

    Science.gov (United States)

    Nguyen, Quoc; Papavassiliou, Dimitrios

    2015-11-01

    The study of dispersion of passive contaminants in turbulence has proved helpful in understanding fundamental heat and mass transfer phenomena. Many simulation and experimental works have been carried out to locate and track the motions of scalar markers in a flow. One method is to combine Direct Numerical Simulation (DNS) and Lagrangian Scalar Tracking (LST) to record the locations of markers. While this has proved useful, high computational cost remains a concern. In this study, we develop a model that can reproduce the results obtained by DNS and LST for turbulent flow. Puffs of markers with different Schmidt numbers were released into a flow field at a friction Reynolds number of 150. The point of release was at the channel wall, so that both diffusion and convection contribute to the puff dispersion pattern, defining different stages of dispersion. Based on outputs from DNS and LST, we seek the most suitable and feasible probability density function (PDF) to represent the distribution of markers in the flow field. Such a PDF would play a significant role in predicting heat and mass transfer in wall turbulence, and would prove helpful where DNS and LST are not always available.

  12. Presumed Optic Disc Melanocytoma in a Young Nigerian: A ...

    African Journals Online (AJOL)

    homogeneous soft tissue mass with broad base arising from the choroid in the optic nerve area and projecting into the vitreous cavity. No retinal detachment or sub-retinal fluid was seen. An assessment of right presumed ODM was made. She was refracted with visual acuity improvement to 6/5 in either eye and spectacles ...

  13. Detection of periprosthetic joint infections in presumed aseptic patients

    DEFF Research Database (Denmark)

    Xu, Yijuan; Lorenzen, Jan; Thomsen, Trine Rolighed

    2016-01-01

    Title: Detection of periprosthetic joint infections in presumed aseptic patients. Yijuan Xu (1), Jan Lorenzen (1), Trine Rolighed Thomsen (1, 2), Kathrin Kluba (3), Kathrin Chamaon (3), Christoph Lohmann (3). 1. Danish Technological Institute, Aarhus, Denmark; 2. Center for Microbial Communities, Department of Biotechnology, Chemistry and Environmental Engineering, Aalborg University, Denmark; 3. Department of Orthopaedics, Otto-von-Guericke University of Magdeburg, Germany. Aim: The HypOrth project (New approaches in the development of Hypoallergenic implant material in Orthopaedics: Steps to personalised medicine) aims to investigate adverse immune reactions to implant materials. For this project, it is of utmost importance to exclude patients with periprosthetic joint infections (PJIs). The aim of this study was to rule out PJIs in included patients using prolonged culture and next generation sequencing (NGS...

  14. Learning Grasp Affordance Densities

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Kroemer, Oliver

    2011-01-01

    We address the issue of learning and representing object grasp affordance models. We model grasp affordances with continuous probability density functions (grasp densities) which link object-relative grasp poses to their success probability. The underlying function representation is nonparametric and relies on kernel density estimation to provide a continuous model. Grasp densities are learned and refined from exploration, by letting a robot “play” with an object in a sequence of grasp-and-drop actions: The robot uses visual cues to generate a set of grasp hypotheses; it then executes these and records their outcomes. When a satisfactory number of grasp data is available, an importance-sampling algorithm turns these into a grasp density. We evaluate our method in a largely autonomous learning experiment run on three objects of distinct shapes. The experiment shows how learning increases success...

  15. Effects of Potential Lane-Changing Probability on Uniform Flow

    International Nuclear Information System (INIS)

    Tang Tieqiao; Huang Haijun; Shang Huayan

    2010-01-01

    In this paper, we use the car-following model with the anticipation effect of the potential lane-changing probability (Acta Mech. Sin. 24 (2008) 399) to investigate the effects of the potential lane-changing probability on uniform flow. The analytical and numerical results show that the potential lane-changing probability can enhance the speed and flow of uniform flow and that their increments are related to the density.

  16. The maximum entropy method of moments and Bayesian probability theory

    Science.gov (United States)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissue in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. In the non-Gaussian case, however, the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments is reviewed, along with some of its problems and the conditions under which it fails. In later sections, the functional form of the maximum entropy method of moments probability distribution is incorporated into Bayesian probability theory. It is shown that Bayesian probability theory solves all of the problems of the maximum entropy method of moments: one obtains posterior probabilities for the Lagrange multipliers and, finally, one can put error bars on the resulting estimated density function.
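
    The method-of-moments step can be sketched as a convex dual minimization: the maximum-entropy density has the form p(x) ∝ exp(−Σ_k λ_k x^k), and the multipliers λ_k are found by minimizing log Z(λ) + λ·m. The grid, the target moments, and the use of BFGS below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

# Grid and hypothetical target moments E[x^k], k = 1..4.
x = np.linspace(-5.0, 5.0, 1001)
m = np.array([0.0, 1.0, 0.3, 3.4])

def log_partition(lam):
    expo = -sum(l * x ** (k + 1) for k, l in enumerate(lam))
    c = expo.max()                       # shift for numerical stability
    return c + np.log(trapezoid(np.exp(expo - c), x))

def dual(lam):
    # Convex dual of the entropy maximization; its minimizer gives the multipliers.
    return log_partition(lam) + np.dot(lam, m)

res = minimize(dual, np.zeros(m.size), method="BFGS")
expo = -sum(l * x ** (k + 1) for k, l in enumerate(res.x))
p = np.exp(expo - log_partition(res.x))  # maximum-entropy density on the grid

print("recovered moments:",
      [round(float(trapezoid(p * x ** (k + 1), x)), 3) for k in range(m.size)])
```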

  17. 41 CFR 301-72.1 - Why is common carrier presumed to be the most advantageous method of transportation?

    Science.gov (United States)

    2010-07-01

    Why is common carrier presumed to be the most advantageous method of transportation? (41 CFR 301-72.1, Public Contracts and Property Management.) Travel by common carrier is presumed to be the most advantageous method of transportation because it...

  18. Probability measures, Lévy measures and analyticity in time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    2008-01-01

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators, we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  19. Probability Measures, Lévy Measures, and Analyticity in Time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  20. Mechanisms Affecting Population Density in Fragmented Habitat

    Directory of Open Access Journals (Sweden)

    Lutz Tischendorf

    2005-06-01

    Full Text Available We conducted a factorial simulation experiment to analyze the relative importance of movement pattern, boundary-crossing probability, and mortality in habitat and matrix on population density, and its dependency on habitat fragmentation, as well as inter-patch distance. We also examined how the initial response of a species to a fragmentation event may affect our observations of population density in post-fragmentation experiments. We found that the boundary-crossing probability from habitat to matrix, which partly determines the emigration rate, is the most important determinant for population density within habitat patches. The probability of crossing a boundary from matrix to habitat had a weaker, but positive, effect on population density. Movement behavior in habitat had a stronger effect on population density than movement behavior in matrix. Habitat fragmentation and inter-patch distance may have a positive or negative effect on population density. The direction of both effects depends on two factors. First, when the boundary-crossing probability from habitat to matrix is high, population density may decline with increasing habitat fragmentation. Conversely, for species with a high matrix-to-habitat boundary-crossing probability, population density may increase with increasing habitat fragmentation. Second, the initial distribution of individuals across the landscape: we found that habitat fragmentation and inter-patch distance were positively correlated with population density when individuals were distributed across matrix and habitat at the beginning of our simulation experiments. The direction of these relationships changed to negative when individuals were initially distributed across habitat only. Our findings imply that the speed of the initial response of organisms to habitat fragmentation events may determine the direction of observed relationships between habitat fragmentation and population density. The time scale of post

  1. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing, to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²−1)-dimensional volume and (N²−2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.
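
    A much cruder Monte Carlo version of such an estimate is easy to sketch for the Hilbert-Schmidt metric, since Hilbert-Schmidt-distributed density matrices can be drawn via the Ginibre construction and, for the 2×3 (qubit-qutrit) system, positivity of the partial transpose is necessary and sufficient for separability. The sample size below is an arbitrary choice, far smaller than the quasi-Monte Carlo computation described above.

```python
import numpy as np

rng = np.random.default_rng(0)
trials, separable = 50_000, 0

for _ in range(trials):
    # Hilbert-Schmidt-distributed rho = G G† / tr(G G†), G a complex 6x6 Ginibre matrix.
    G = rng.normal(size=(6, 6)) + 1j * rng.normal(size=(6, 6))
    rho = G @ G.conj().T
    rho /= np.trace(rho).real
    # Partial transpose over the qutrit factor: rho[(i,a),(j,b)] -> rho[(i,b),(j,a)].
    pt = rho.reshape(2, 3, 2, 3).transpose(0, 3, 2, 1).reshape(6, 6)
    separable += np.linalg.eigvalsh(pt).min() >= 0   # PPT <=> separable for 2x3

print(f"Hilbert-Schmidt separability probability ~ {separable / trials:.4f}")
```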

  2. Residual Defect Density in Random Disks Deposits.

    Science.gov (United States)

    Topic, Nikola; Pöschel, Thorsten; Gallas, Jason A C

    2015-08-03

    We investigate the residual distribution of structural defects in very tall packings of disks deposited randomly in large channels. By performing simulations involving the sedimentation of up to 50 × 10⁹ particles we find all deposits to consistently show a non-zero residual density of defects obeying a characteristic power law as a function of the channel width. This remarkable finding corrects the widespread belief that the density of defects should vanish algebraically with growing height. A non-zero residual density of defects implies a type of long-range spatial order in the packing, as opposed to only local ordering. In addition, we find deposits of particles to involve considerably less randomness than generally presumed.

  3. Presumed consent in organ donation: the devil is in the detail

    OpenAIRE

    Hutchinson, Odette

    2008-01-01

    This article follows the recent publication of the Organs for Donation Task Force report, "Organs for Transplants", and considers the debate surrounding a change in the law in favour of presumed consent in organ donation.

  4. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  5. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parametrized probabilities?”

  6. The probability of a tornado missile hitting a target

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1983-01-01

    It is shown that tornado missile transportation is a diffusion Markovian process. Therefore, the Green's function method is applied to estimate the probability of hitting a unit target area. This probability is expressed through a joint density of tornado intensity and path area, a probability of tornado missile injection, and a tornado missile height distribution. (orig.)

  7. Using probability density function in the procedure for recognition of the type of physical exercise

    Directory of Open Access Journals (Sweden)

    Cakić Nikola

    2017-01-01

    Full Text Available This paper presents a method for recognition of physical exercises, using only the triaxial accelerometer of a smartphone. The smartphone itself is free to move inside the subject's pocket. Exercises for leg muscle strengthening performed from the subject's standing position (squat, right knee rise, and lunge with the right leg) were analyzed. All exercises were performed with the accelerometric sensor of the smartphone placed in the pocket next to the exercising leg. To test the proposed recognition method, the knee rise exercise of the opposite leg, with the same sensor position, was randomly selected. The raw accelerometric signals were filtered with a tenth-order Butterworth low-pass filter. The filtered signal from each of the three axes was described using three signal descriptors. After the descriptors were calculated, a probability density function was constructed for each descriptor. The program implementing the proposed recognition method was executed online within an Android application on the smartphone. Signals from two male and two female subjects served as the reference for exercise recognition. The exercise recognition accuracy was 94.22% for the three performed exercises, and 85.33% for all four considered exercises.
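
    A rough sketch of the pipeline is shown below. The sampling rate, cutoff frequency, choice of the three descriptors, and the use of Gaussian kernel density estimates for the per-descriptor PDFs are all assumptions for illustration; the paper fixes its own descriptors and densities.

```python
import numpy as np
from scipy import signal, stats

fs = 50.0                                           # assumed sampling rate (Hz)
sos = signal.butter(10, 3.0, btype="low", fs=fs, output="sos")

def descriptors(raw_axis):
    # Low-pass filter one axis, then summarize it with three simple descriptors.
    x = signal.sosfiltfilt(sos, raw_axis)
    return np.array([x.mean(), x.std(), np.ptp(x)])

# Reference phase: one 1-D density (Gaussian KDE) per descriptor per exercise,
# estimated here from synthetic stand-in descriptor sets of shape (n_reps, 3).
rng = np.random.default_rng(3)
reference = {"squat": rng.normal(0.0, 1.0, (40, 3)),
             "knee_rise": rng.normal(0.8, 1.2, (40, 3))}
kdes = {name: [stats.gaussian_kde(d[:, i]) for i in range(3)]
        for name, d in reference.items()}

def classify(desc):
    # Score each exercise by the product of its per-descriptor densities.
    score = {name: np.prod([k(desc[i])[0] for i, k in enumerate(ks)])
             for name, ks in kdes.items()}
    return max(score, key=score.get)

print(classify(descriptors(rng.normal(0.0, 1.0, 200))))
```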

  8. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization provides an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
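
    Under the binomial model, the probability of passing an all-hits demonstration is simply POD^n. A small sketch with illustrative POD values (a 29-of-29 pass is the classical criterion demonstrating 0.90 POD at 95% confidence, since 0.90**29 ~ 0.047 < 0.05):

```python
from scipy.stats import binom

# Point-estimate demonstration: all n flaws at the target size must be found.
n = 29
for true_pod in (0.90, 0.95, 0.98):
    ppd = binom.pmf(n, n, true_pod)    # probability of passing the demonstration
    print(f"true POD {true_pod:.2f}: probability of passing = {ppd:.3f}")
```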

  9. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
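
    In the single-integral form, the probability of slip can be written as P(slip) = ∫ f_avail(x) · [1 − F_req(x)] dx and evaluated with the trapezoidal rule. The sketch below assumes, purely for illustration, a normal available-friction and a lognormal required-friction distribution (the paper's point is precisely that the distributions need not be normal), and cross-checks the integral by Monte Carlo.

```python
import numpy as np
from scipy import stats, integrate

# Assumed (illustrative) friction models: available COF normal, required COF lognormal.
avail = stats.norm(loc=0.50, scale=0.08)
req = stats.lognorm(s=0.25, scale=0.22)

# Single-integral form: P(slip) = integral of f_avail(x) * [1 - F_req(x)] dx,
# evaluated here with the trapezoidal rule.
x = np.linspace(0.0, 1.2, 4001)
p_slip = integrate.trapezoid(avail.pdf(x) * req.sf(x), x)
print(f"probability of slip per step ~ {p_slip:.2e}")

# Monte Carlo cross-check of the single integral.
rng = np.random.default_rng(0)
n = 1_000_000
mc = np.mean(req.rvs(n, random_state=rng) > avail.rvs(n, random_state=rng))
print(f"Monte Carlo estimate         ~ {mc:.2e}")
```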

  10. On density forecast evaluation

    NARCIS (Netherlands)

    Diks, C.

    2008-01-01

    Traditionally, probability integral transforms (PITs) have been popular means for evaluating density forecasts. For an ideal density forecast, the PITs should be uniformly distributed on the unit interval and independent. However, this is only a necessary condition, and not a sufficient one, as
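
    A minimal sketch of the PIT check: evaluate each forecast CDF at its realization and test the transforms for uniformity. The data-generating process and the deliberately misspecified forecast below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# PITs: evaluate each density forecast's CDF at the realization. For an ideal
# forecast the PITs are iid uniform on [0, 1] -- necessary, not sufficient.
rng = np.random.default_rng(42)
y = rng.normal(0.0, 1.3, 500)              # realizations
pit = stats.norm(0.0, 1.0).cdf(y)          # forecast density is too narrow here

print(stats.kstest(pit, "uniform"))        # uniformity check rejects the forecast
```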

  11. Predicting Ligand Binding Sites on Protein Surfaces by 3-Dimensional Probability Density Distributions of Interacting Atoms

    Science.gov (United States)

    Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei

    2016-01-01

    Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851

  12. Annihilation probability density and other applications of the Schwinger multichannel method to the positron and electron scattering

    International Nuclear Information System (INIS)

    Varella, Marcio Teixeira do Nascimento

    2001-12-01

    We have calculated annihilation probability densities (APD) for positron collisions against the He atom and the H₂ molecule. It was found that direct annihilation prevails at low energies, while annihilation following virtual positronium (Ps) formation is the dominant mechanism at higher energies. In room-temperature collisions (10⁻² eV) the APD spread over a considerable extension, being quite similar to the electronic densities of the targets. The capture of the positron in an electronic Feshbach resonance strongly enhanced the annihilation rate in e⁺–H₂ collisions. We also discuss strategies to improve the calculation of the annihilation parameter (Z_eff), after debugging the computational codes of the Schwinger Multichannel Method (SMC). Finally, we consider the inclusion of the Ps formation channel in the SMC and show that effective configurations (pseudo-eigenstates of the collision Hamiltonian) are able to significantly reduce the computational effort in positron scattering calculations. Cross sections for electron scattering by polyatomic molecules were obtained in three different approximations: static-exchange (SE); static-exchange-plus-polarization (SEP); and multichannel coupling. The calculations for polar targets were improved through the rotational resolution of scattering amplitudes, in which the SMC was combined with the first Born approximation (FBA). In general, elastic cross sections (SE and SEP approximations) showed good agreement with available experimental data for several targets. Multichannel calculations for e⁻–H₂O scattering, on the other hand, presented spurious structures at the electronic excitation thresholds. (author)

  13. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institute, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    A kernel density was determined from sampling points obtained in a Markov chain simulation and was taken as the importance sampling function. A Kriging metamodel was constructed in more detail in the vicinity of the limit state, and the failure probability was calculated by importance sampling performed on the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state, and a stable numerical method was proposed to find the parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possible change in the calculated failure probability due to the uncertainty of the Kriging metamodel was also calculated.

  14. DCMDN: Deep Convolutional Mixture Density Network

    Science.gov (United States)

    D'Isanto, Antonio; Polsterer, Kai Lars

    2017-09-01

    Deep Convolutional Mixture Density Network (DCMDN) estimates probabilistic photometric redshift directly from multi-band imaging data by combining a version of a deep convolutional network with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in the redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) are applied as performance criteria. DCMDN is able to predict redshift PDFs independently from the type of source, e.g. galaxies, quasars or stars and renders pre-classification of objects and feature extraction unnecessary; the method is extremely general and allows the solving of any kind of probabilistic regression problems based on imaging data, such as estimating metallicity or star formation rate in galaxies.

  15. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  16. Probability density function modeling of scalar mixing from concentrated sources in turbulent channel flow

    Science.gov (United States)

    Bakosi, J.; Franzese, P.; Boybeyi, Z.

    2007-11-01

    Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth and Pope [Phys. Fluids 29, 387 (1986)] with Durbin's [J. Fluid Mech. 249, 465 (1993)] method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous transport with a nonlocal representation of the near-wall Reynolds stress anisotropy. The presence of walls is incorporated through the imposition of no-slip and impermeability conditions on particles without the use of damping or wall-functions. Information on the turbulent time scale is supplied by the gamma-distribution model of van Slooten et al. [Phys. Fluids 10, 246 (1998)]. Two different micromixing models are compared that incorporate the effect of small scale mixing on the transported scalar: the widely used interaction by exchange with the mean and the interaction by exchange with the conditional mean model. Single-point velocity and concentration statistics are compared to direct numerical simulation and experimental data at Reτ=1080 based on the friction velocity and the channel half width. The joint model accurately reproduces a wide variety of conditional and unconditional statistics in both physical and composition space.

  17. Power probability density function control and performance assessment of a nuclear research reactor

    International Nuclear Information System (INIS)

    Abharian, Amir Esmaeili; Fadaei, Amir Hosein

    2014-01-01

    Highlights: • In this paper, the performance assessment of a static PDF control system is discussed. • The reactor PDF model is set up based on B-spline functions. • The neutronic and thermal-hydraulic equations are solved concurrently by a reformed Hansen's method. • A principle of performance assessment is put forward for PDF control of the nuclear reactor. - Abstract: One of the main issues in controlling a system is to keep track of the conditions of the system function. The performance condition of the system should be inspected continuously to keep the system in reliable working condition. In this study, the nuclear reactor is considered as a complicated system, and a principle of performance assessment is used for analyzing the performance of the power probability density function (PDF) control of a nuclear research reactor. First, the model of the power PDF is set up; then the controller is designed to make the power PDF track the given shape, making the reactor a closed-loop system. The operating data of the closed-loop reactor are used to assess the control performance against the performance assessment criteria. The modeling, controller design and performance assessment of the power PDF are all applied to the control of Tehran Research Reactor (TRR) power in a nuclear process. In this paper, the performance assessment of the static PDF control system is discussed, the efficacy and efficiency of the proposed method are investigated, and finally its reliability is proven.

  18. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities against an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.

  19. 28 CFR 104.44 - Determination of presumed noneconomic losses for decedents.

    Science.gov (United States)

    2010-07-01

    Determination of presumed noneconomic losses for decedents. (28 CFR 104.44, Judicial Administration, Department of Justice, September 11th Victim Compensation Fund of 2001, Amount of Compensation for Eligible Claimants.) § 104.44...

  20. Presuming consent in the ethics of posthumous sperm procurement and conception.

    Science.gov (United States)

    Kroon, Frederick

    2015-12-01

    This paper compares standard conceptions of consent with the conception of consent defended by Kelton Tremellen and Julian Savulescu in their attempt to re-orient the ethical debate around posthumous sperm procurement and conception, as published in Reproductive BioMedicine Online in 2015. According to their radical proposal, the surviving partner's wishes are, in effect, the only condition that needs to be considered for there to be a legitimate moral case for these procedures: the default should be presumed consent to the procedures, whether or not the agent did consent or would have consented. The present paper argues that Tremellen and Savulescu's case for this position is flawed, but offers a reconstruction that articulates what may well be a hidden, and perhaps reasonable, assumption behind the argument. But while the new argument appears more promising, the reconstruction also suggests that the position of presumed consent is currently unlikely to be acceptable as policy.

  1. Presuming consent in the ethics of posthumous sperm procurement and conception

    Directory of Open Access Journals (Sweden)

    Frederick Kroon

    2015-12-01

    Full Text Available This paper compares standard conceptions of consent with the conception of consent defended by Kelton Tremellen and Julian Savulescu in their attempt to re-orient the ethical debate around posthumous sperm procurement and conception, as published in Reproductive BioMedicine Online in 2015. According to their radical proposal, the surviving partner’s wishes are, in effect, the only condition that needs to be considered for there to be a legitimate moral case for these procedures: the default should be presumed consent to the procedures, whether or not the agent did consent or would have consented. The present paper argues that Tremellen and Savulescu’s case for this position is flawed, but offers a reconstruction that articulates what may well be a hidden, and perhaps reasonable, assumption behind the argument. But while the new argument appears more promising, the reconstruction also suggests that the position of presumed consent is currently unlikely to be acceptable as policy.

  2. Method for assessing the probability of accumulated doses from an intermittent source using the convolution technique

    International Nuclear Information System (INIS)

    Coleman, J.H.

    1980-10-01

    A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor as a result of several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of the doses from the individual releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used to calculate the convolution.
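
    The convolution step is straightforward to sketch with an FFT-based convolution on a common dose grid. The lognormal single-release density and the number of releases below are illustrative assumptions, not taken from the report.

```python
import numpy as np
from scipy import signal, stats, integrate

# Assumed single-release dose density on a uniform grid (lognormal, illustrative).
dx = 0.01
x = np.arange(0.0, 60.0, dx)
single = stats.lognorm(s=0.6, scale=2.0).pdf(x)

# Accumulated dose over five independent releases: four successive convolutions,
# each done via the FFT; the support grows with every release.
acc = single.copy()
for _ in range(4):
    acc = signal.fftconvolve(acc, single) * dx

x_acc = np.arange(acc.size) * dx
print("normalization:", round(float(integrate.trapezoid(acc, x_acc)), 4))
print("mean accumulated dose:",
      round(float(integrate.trapezoid(x_acc * acc, x_acc)), 3))
```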

  3. Absolute Kr I and Kr II transition probabilities

    International Nuclear Information System (INIS)

    Brandt, T.; Helbig, V.; Nick, K.P.

    1982-01-01

    Transition probabilities for 11 Kr I and 9 Kr II lines between 366.5 and 599.3 nm were obtained from measurements with a wall-stabilised arc at atmospheric pressure in pure krypton. The population densities of the excited krypton levels were calculated, under the assumption of LTE, from electron densities measured by laser interferometry. The uncertainties for the Kr I and Kr II data are 15% and 25%, respectively. (author)

  4. Presumed PDF modeling of microjet assisted CH4–H2/air turbulent flames

    International Nuclear Information System (INIS)

    Chouaieb, Sirine; Kriaa, Wassim; Mhiri, Hatem; Bournot, Philippe

    2016-01-01

    Highlights: • Microjet assisted CH₄–H₂/air turbulent flames are numerically investigated. • Temperature, species and soot are well predicted by the Presumed PDF model. • An inner flame is identified due to the microjet presence. • The addition of hydrogen to the microjet assisted flames enhances mixing. • Soot emission is reduced by 36% for a 10% enriched microjet assisted flame. - Abstract: The characteristics of microjet assisted CH₄–H₂/air flames in a turbulent mode are numerically investigated. Simulations are performed using the Computational Fluid Dynamics code Fluent. The Presumed PDF and the Discrete Ordinates models are considered for combustion and radiation modeling, respectively. The k–ε Realizable model is adopted as the turbulence closure model. The Tesner model is used to calculate soot particle quantities. In the first part of this paper, the Presumed PDF model is compared to the Eddy Dissipation model and to slow chemistry combustion models from the literature. Results show that the Presumed PDF model correctly predicts the thermal and species fields, as well as soot formation. The effect of hydrogen enrichment on CH₄/air confined flames under the addition of an air microjet is investigated in the second part of this work. The results show that an inner flame forms due to the air microjet in the CH₄–H₂/air flames. Moreover, increasing the hydrogen percentage in the fuel mixture leads to mixing enhancement and consequently to considerable soot emission reduction.

  5. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
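
    A minimal sketch of the idea of representing a random variable by a probability table: here, equiprobable bands that each carry the conditional mean of the variable, so that band-wise expectations are preserved. The Gaussian input, the band count, and the 1/Σ-type response are illustrative assumptions, not the codes' actual table construction.

```python
import numpy as np
from scipy import stats

def probability_table(dist, n_bands=8):
    # Equiprobable bands; each band carries the conditional mean of the variable.
    edges = dist.ppf(np.linspace(0.0, 1.0, n_bands + 1))
    values = [dist.expect(lb=lo, ub=hi, conditional=True)
              for lo, hi in zip(edges[:-1], edges[1:])]
    return np.full(n_bands, 1.0 / n_bands), np.array(values)

# Propagate an uncertain quantity through a response by enumerating the bands.
probs, sigma = probability_table(stats.norm(loc=2.0, scale=0.1))
response = 1.0 / sigma                 # hypothetical response, e.g. ~ 1/Sigma
print("E[response] via table:", float(np.sum(probs * response)))
```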

  6. Transverse comparisons between ultrasound and radionuclide parameters in children with presumed antenatally detected pelvi-ureteric junction obstruction

    Energy Technology Data Exchange (ETDEWEB)

    Duong, Hong Phuoc; Janssen, Francoise; Hall, Michelle; Ismaili, Khalid [Universite Libre de Bruxelles (ULB), Department of Pediatric Nephrology, Hopital Universitaire des Enfants Reine Fabiola, Brussels (Belgium); Piepsz, Amy [Hopital Universitaire Saint-Pierre, Department of Radioisotopes, Ghent (Belgium); Khelif, Karim; Collier, Frank [Universite Libre de Bruxelles (ULB), Department of Pediatric Urology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium); Man, Kathia de [University Hospital Ghent, Department of Nuclear Medicine, Ghent (Belgium); Damry, Nash [Universite Libre de Bruxelles (ULB), Department of Pediatric Radiology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium)

    2015-05-01

    The main criteria used for deciding on surgery in children with presumed antenatally detected pelviureteric junction obstruction (PPUJO) are the level of hydronephrosis (ultrasonography), the level of differential renal function (DRF) and the quality of renal drainage after a furosemide challenge (renography), the importance of each factor being far from generally agreed. Can we predict, on the basis of ultrasound parameters, the patient in whom radionuclide renography can be avoided? We retrospectively analysed the medical charts of 81 consecutive children with presumed unilateral PPUJO detected antenatally. Ultrasound and renographic studies performed at the same time were compared. Anteroposterior pelvic diameter (APD) and calyceal size were both divided into three levels of dilatation. Parenchymal thickness was considered either normal or significantly decreased. Acquisition of renograms under furosemide stimulation provided quantification of DRF, quality of renal drainage and cortical transit. The percentages of patients with low DRF and poor drainage were significantly higher among those with major hydronephrosis, severe calyceal dilatation or parenchymal thinning. Moreover, impaired cortical transit, which is a major risk factor for functional decline, was seen more frequently among those with very severe calyceal dilatation. However, none of the structural parameters obtained by ultrasound examination was able to predict whether the level of renal function or the quality of drainage was normal or abnormal. Alternatively, an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness were associated with a low probability of decreased renal function or poor renal drainage. In the management strategy of patients with prenatally detected PPUJO, nuclear medicine examinations may be postponed in those with an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness. On the contrary, precise estimation of DRF and renal

  7. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential-geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors' hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  8. A joint probability density function of wind speed and direction for wind energy analysis

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Bueno, Celia

    2008-01-01

    A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a singly truncated from below Normal-Weibull mixture distribution. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R². The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions, (b) takes into account the frequency of null winds, (c) represents the wind direction regimes in zones with several modes or prevailing wind directions, (d) takes into account the correlation between wind speeds and its directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy.
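
    One standard angular-linear construction behind such joint models (due to Johnson and Wehrly) sets f(v, θ) = 2π g(2π[F_V(v) − F_Θ(θ)]) f_V(v) f_Θ(θ), where g is a circular density carrying the speed-direction dependence. The sketch below uses simplified stand-ins (a plain Weibull for speed, a two-component von Mises mixture for direction, and a von Mises dependence density); the paper's marginals are richer.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

speed = stats.weibull_min(c=2.1, scale=7.5)          # marginal wind-speed model

def dir_pdf(theta, w=(0.6, 0.4), mu=(0.5, 3.6), kappa=(2.0, 1.5)):
    # Two-component von Mises mixture for wind direction on (-pi, pi].
    return sum(wi * stats.vonmises.pdf(theta, ki, loc=mi)
               for wi, mi, ki in zip(w, mu, kappa))

def dir_cdf(theta, n=2000):
    grid = np.linspace(-np.pi, theta, n)
    return trapezoid(dir_pdf(grid), grid)

def joint_pdf(v, theta, kappa_g=1.0):
    # Johnson-Wehrly coupling: a circular density g links the two marginal CDFs.
    u = 2.0 * np.pi * (speed.cdf(v) - dir_cdf(theta))
    g = stats.vonmises.pdf(u, kappa_g)
    return 2.0 * np.pi * g * speed.pdf(v) * dir_pdf(theta)

print(joint_pdf(8.0, 1.2))
```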

  9. Postfragmentation density function for bacterial aggregates in laminar flow.

    Science.gov (United States)

    Byrne, Erin; Dzul, Steve; Solomon, Michael; Younger, John; Bortz, David M

    2011-04-01

    The postfragmentation probability density of daughter flocs is one of the least well-understood aspects of modeling flocculation. We use three-dimensional positional data of Klebsiella pneumoniae bacterial flocs in suspension and the knowledge of hydrodynamic properties of a laminar flow field to construct a probability density function of floc volumes after a fragmentation event. We provide computational results which predict that the primary fragmentation mechanism for large flocs is erosion. The postfragmentation probability density function has a strong dependence on the size of the original floc and indicates that most fragmentation events result in clumps of one to three bacteria eroding from the original floc. We also provide numerical evidence that exhaustive fragmentation yields a limiting density inconsistent with the log-normal density predicted in the literature, most likely due to the heterogeneous nature of K. pneumoniae flocs. To support our conclusions, artificial flocs were generated and display similar postfragmentation density and exhaustive fragmentation. ©2011 American Physical Society

  10. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
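
    The intersection idea can be illustrated with a toy Monte Carlo that replaces the paper's detailed flow-and-transport model with a fixed rectangular source area upgradient of the well; all dimensions, densities and the domain size below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        def overlap_probability(density_per_km2, src_len_m=200.0, src_wid_m=20.0,
                                field_m=10.0, n_trials=20000):
            """Toy Monte Carlo: probability that a well's upgradient source area
            (a src_len x src_wid rectangle) overlaps at least one square septic
            drainfield of side field_m, for a given drainfield density."""
            side = 1000.0                              # 1 km x 1 km domain [m]
            n_fields = rng.poisson(density_per_km2, size=n_trials)
            hits = 0
            for n in n_fields:
                if n == 0:
                    continue
                # drainfield centres uniform over the domain; well at the centre,
                # source area extending upgradient (+x) from the well
                xy = rng.uniform(0, side, size=(n, 2)) - side / 2
                in_x = (xy[:, 0] > -field_m/2) & (xy[:, 0] < src_len_m + field_m/2)
                in_y = np.abs(xy[:, 1]) < (src_wid_m + field_m) / 2
                hits += np.any(in_x & in_y)
            return hits / n_trials

        for d in (1, 5, 10, 25):
            print(d, "systems/km^2 ->", overlap_probability(d))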

  11. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  12. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
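
    A minimal sketch of one common one-parameter generalization of the logarithm and its inverse (conventions for the deformation parameter vary between papers, so this is an assumed form rather than necessarily the authors' exact one), together with the heavy-tailed "generalized Gaussian" it induces:

        import numpy as np

        def gen_log(x, lam):
            """One-parameter generalized logarithm ln_lam(x) = (x**lam - 1)/lam,
            recovering ln(x) as lam -> 0 (a common convention; others exist)."""
            x = np.asarray(x, dtype=float)
            return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

        def gen_exp(y, lam):
            """Inverse of gen_log: (1 + lam*y)**(1/lam), -> exp(y) as lam -> 0."""
            y = np.asarray(y, dtype=float)
            return np.exp(y) if lam == 0 else np.maximum(1.0 + lam*y, 0.0)**(1.0/lam)

        def gen_gaussian_unnorm(x, lam):
            # for lam < 0 this develops power-law tails instead of exp(-x^2) decay
            return gen_exp(-np.asarray(x, dtype=float)**2, lam)

        print(gen_log(np.e, 0.0), gen_exp(1.0, 0.0))          # ~1 and ~e
        print(gen_gaussian_unnorm([0.0, 1.0, 3.0], -0.5))     # heavy-tailed values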

  13. Protein distance constraints predicted by neural networks and probability density functions

    DEFF Research Database (Denmark)

    Lund, Ole; Frimand, Kenneth; Gorodkin, Jan

    1997-01-01

    We predict interatomic C-α distances by two independent data driven methods. The first method uses statistically derived probability distributions of the pairwise distance between two amino acids, whilst the latter method consists of a neural network prediction approach equipped with windows taki...... method based on the predicted distances is presented. A homepage with software, predictions and data related to this paper is available at http://www.cbs.dtu.dk/services/CPHmodels/...

  14. Probability density adjoint for sensitivity analysis of the Mean of Chaos

    Energy Technology Data Exchange (ETDEWEB)

    Blonigan, Patrick J., E-mail: blonigan@mit.edu; Wang, Qiqi, E-mail: qiqi@mit.edu

    2014-08-01

    Sensitivity analysis, especially adjoint based sensitivity analysis, is a powerful tool for engineering design which allows for the efficient computation of sensitivities with respect to many parameters. However, these methods break down when used to compute sensitivities of long-time averaged quantities in chaotic dynamical systems. This paper presents a new method for sensitivity analysis of ergodic chaotic dynamical systems, the density adjoint method. The method involves solving the governing equations for the system's invariant measure and its adjoint on the system's attractor manifold rather than in phase-space. This new approach is derived for and demonstrated on one-dimensional chaotic maps and the three-dimensional Lorenz system. It is found that the density adjoint computes very finely detailed adjoint distributions and accurate sensitivities, but suffers from large computational costs.
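
    The density viewpoint, in which sensitivities of long-time averages are well defined only through the invariant measure, can be illustrated with a crude sketch. This is not the paper's adjoint method: it simply finite-differences long-time averages of the logistic map, with hypothetical parameter values, to show that averaging over the invariant density is what makes the derivative meaningful while single-trajectory derivatives diverge.

        def long_time_average(r, n_iter=200_000, burn=1_000, x0=0.3):
            """Mean of J(x) = x over the invariant measure of the logistic map
            x -> r x (1 - x), estimated from one long trajectory."""
            x = x0
            for _ in range(burn):
                x = r * x * (1 - x)
            total = 0.0
            for _ in range(n_iter):
                x = r * x * (1 - x)
                total += x
            return total / n_iter

        # Sensitivity d<x>/dr via finite differences of long-time averages.
        # The estimate is noisy (statistical error from finite trajectories),
        # which is the cost the density-adjoint approach aims to avoid.
        r, dr = 3.8, 1e-3
        s = (long_time_average(r + dr) - long_time_average(r - dr)) / (2 * dr)
        print("d<x>/dr ~", s)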

  15. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs

  16. Presumed atypical HDR syndrome associated with Band Keratopathy and pigmentary retinopathy.

    Science.gov (United States)

    Kim, Cinoo; Cheong, Hae Il; Kim, Jeong Hun; Yu, Young Suk; Kwon, Ji Won

    2011-01-01

    This report describes presumed atypical hypoparathyroidism, deafness, and renal dysplasia (HDR) syndrome associated with unexpected ocular findings. The patient had exotropia, bilateral band keratopathy, and pigmentary retinopathy, including attenuated retinal vessels and atrophy of the retinal pigment epithelium. Even though the calcific plaques were successfully removed, visual acuity in both eyes gradually decreased and electroretinography was extinguished. Copyright 2009, SLACK Incorporated.

  17. Evaluation of burst probability for tubes by Weibull distributions

    International Nuclear Information System (INIS)

    Kao, S.

    1975-10-01

    The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated, together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method, termed the "density-gram", is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities.
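
    A minimal sketch of the Weibull fitting and left-tail failure probability estimation described above, on made-up burst-pressure data:

        import numpy as np
        from scipy import stats

        # Hypothetical burst-pressure data [MPa]; real tube data would replace this.
        bursts = np.array([41.2, 44.8, 39.5, 47.1, 43.3,
                           45.9, 40.7, 48.2, 42.5, 46.4])

        # Fit a two-parameter Weibull (location fixed at 0, common for strength data).
        shape, loc, scale = stats.weibull_min.fit(bursts, floc=0.0)
        print(f"shape k = {shape:.2f}, scale = {scale:.1f} MPa")

        # Left-tail failure probability: P(burst pressure < operating pressure)
        p_op = 30.0
        print("P(failure below", p_op, "MPa) =",
              stats.weibull_min.cdf(p_op, shape, 0.0, scale))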

  18. Presumed Perinatal Stroke in a Child with Down Syndrome and Moyamoya Disease

    Science.gov (United States)

    Pysden, Karen; Fallon, Penny; Moorthy, Bhagavatheswaran; Ganesan, Vijeya

    2010-01-01

    Moyamoya disease describes a cerebral arteriopathy characterized by stenosis or occlusion of the terminal internal carotid and/or the proximal middle cerebral arteries. We report a female child with trisomy 21 and bilateral moyamoya disease who presented, unusually, with a presumed perinatal cerebral infarct. The clinical, radiological, and…

  19. Feline dry eye syndrome of presumed neurogenic origin: a case report

    Directory of Open Access Journals (Sweden)

    Lionel Sebbag

    2017-12-01

    Full Text Available Case summary A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values, suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. Relevance and novel information The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tear film testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.

  20. Feline dry eye syndrome of presumed neurogenic origin: a case report.

    Science.gov (United States)

    Sebbag, Lionel; Pesavento, Patricia A; Carrasco, Sebastian E; Reilly, Christopher M; Maggs, David J

    2018-01-01

    A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tearfilm testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.

  1. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    Science.gov (United States)

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., probability of detection 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.

  2. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  3. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  4. Optical coherence tomography and fundus autofluorescence findings in presumed congenital simple retinal pigment epithelium hamartoma

    Directory of Open Access Journals (Sweden)

    Baskaran, Prabu

    2017-10-01

    Full Text Available Aim: Presumed congenital simple retinal pigment epithelium hamartoma is a rare benign lesion of the macula that mimics congenital hypertrophy of the retinal pigment epithelium (RPE) and combined hamartoma of the retina and the RPE; newer imaging modalities can help in diagnosis. We report three patients with presumed congenital simple RPE hamartoma, and describe the enhanced-depth imaging optical coherence tomography (EDI-OCT) and fundus autofluorescence (FAF) findings. Methods: Two patients were asymptomatic; one had an intraocular foreign body in addition to the hamartoma. All had a similar jet black, elevated lesion in the macula, sparing the fovea. EDI-OCT showed a characteristic hyperreflective layer with complete optical shadowing of the deeper layers; FAF showed pronounced hypoautofluorescence of the lesion. Conclusion: Multimodal imaging with FAF and EDI-OCT can help to differentiate simple RPE hamartoma from similar RPE lesions, and may serve as a useful adjunct to clinical diagnosis of this rare tumor. We present the second largest series of presumed congenital simple RPE hamartoma, and, to the best of our knowledge, the first report of FAF findings of this tumor.

  5. Analytic formulation of neutrino oscillation probability in constant matter

    International Nuclear Information System (INIS)

    Kimura, Keiichi; Takamura, Akira; Yokomakura, Hidekazu

    2003-01-01

    In this paper, based on the work (Kimura K et al 2002 Phys. Lett. B 537 86), we present a simple derivation of an exact and analytic formula for the neutrino oscillation probability. We consider three-flavour neutrino oscillations in matter with constant density.
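
    The constant-density oscillation probability itself is straightforward to evaluate numerically by exponentiating the effective Hamiltonian; the paper's contribution is the exact analytic formula, so the brute-force sketch below (CP phase set to zero, illustrative parameter values) is only a numerical stand-in:

        import numpy as np
        from scipy.linalg import expm

        th12, th13, th23 = 0.59, 0.15, 0.78       # mixing angles [rad], illustrative
        dm21, dm31 = 7.4e-5, 2.5e-3               # mass-squared splittings [eV^2]

        def rot(i, j, th, n=3):
            R = np.eye(n, dtype=complex)
            R[i, i] = R[j, j] = np.cos(th)
            R[i, j], R[j, i] = np.sin(th), -np.sin(th)
            return R

        U = rot(1, 2, th23) @ rot(0, 2, th13) @ rot(0, 1, th12)  # PMNS, delta = 0

        def prob(alpha, beta, E_GeV, L_km, rho=2.8, Ye=0.5):
            """P(nu_alpha -> nu_beta) in matter of constant density rho [g/cm^3]."""
            E = E_GeV * 1e9                                           # eV
            H = U @ np.diag([0.0, dm21, dm31]) @ U.conj().T / (2 * E) # vacuum part [eV]
            H[0, 0] += 7.63e-14 * rho * Ye        # charged-current potential [eV]
            L = L_km * 5.068e9                    # km -> 1/eV (hbar*c conversion)
            amp = expm(-1j * H * L)[beta, alpha]
            return abs(amp) ** 2

        print(prob(1, 0, E_GeV=1.0, L_km=1300))   # nu_mu -> nu_e, DUNE-like baseline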

  6. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  7. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...

  8. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration exceeding regulatory thresholds. However, these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order-relation violations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables, which do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Survival probability in a one-dimensional quantum walk on a trapped lattice

    International Nuclear Information System (INIS)

    Goenuelol, Meltem; Aydiner, Ekrem; Shikano, Yutaka; Muestecaplioglu, Oezguer E

    2011-01-01

    The dynamics of the survival probability of quantum walkers on a one-dimensional lattice with a random distribution of absorbing immobile traps is investigated. The survival probability of quantum walkers is compared with that of classical walkers. Numerical and analytical observations show that the time dependence of the survival probability of quantum walkers has a piecewise stretched exponential character, depending on the density of traps. The crossover between the quantum analogues of the Rosenstock and Donsker-Varadhan behavior is identified.

  10. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
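
    A minimal sketch of a Poisson-statistics TCP calculation with inter-patient spread in radiosensitivity, in the spirit of (but much simpler than) the model described above; the β term of the linear-quadratic model is omitted and all parameter values are illustrative:

        import numpy as np

        def tcp(dose_Gy, vol_cc, alpha=0.3, sigma_alpha=0.08,
                rho=1e7, n_pop=2000, seed=0):
            """Poisson TCP for a DVH given as (dose per bin, volume per bin),
            averaged over a Gaussian inter-patient spread in alpha."""
            rng = np.random.default_rng(seed)
            a = rng.normal(alpha, sigma_alpha, n_pop)
            a = a[a > 0]                            # discard unphysical draws
            dose = np.asarray(dose_Gy, dtype=float)
            vol = np.asarray(vol_cc, dtype=float)
            # surviving clonogens per bin: rho*v*exp(-alpha*D); TCP = exp(-total)
            surv = np.array([np.exp(-(rho*vol*np.exp(-ai*dose)).sum()) for ai in a])
            return surv.mean()

        # Uniform 60 Gy to a 100 cc target vs the same target with a cold spot
        print(tcp([60.0], [100.0]))
        print(tcp([55.0, 62.5], [33.0, 67.0]))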

  11. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  12. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  13. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
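
    The two-step recipe, colouring white Gaussian noise to the target spectrum and then mapping the Gaussian marginal onto the target amplitude distribution through its quantile function, can be sketched directly (a power-law spectrum and an exponential marginal are assumed here purely for illustration):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        N = 256

        # 1) White Gaussian noise -> colored Gaussian field with a target spectrum.
        kx = np.fft.fftfreq(N)[:, None]; ky = np.fft.fftfreq(N)[None, :]
        k = np.hypot(kx, ky); k[0, 0] = k[0, 1]         # avoid divide-by-zero at DC
        psd = k**-3.0                                   # assumed power-law spectrum
        white = rng.standard_normal((N, N))
        field = np.real(np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)))
        field = (field - field.mean()) / field.std()    # standard-normal marginal

        # 2) Memoryless transform: Gaussian CDF -> uniform -> target quantile.
        #    Any amplitude PDF with an invertible CDF works; this step slightly
        #    distorts the spectrum for strongly non-Gaussian targets.
        u = stats.norm.cdf(field)
        out = stats.expon.ppf(u, scale=1.0)
        print(out.mean(), out.std())                    # ~1, ~1 for unit exponential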

  14. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  15. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground in 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage at the 380 V bus of an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A stochastic Monte Carlo simulation (SMC) study was performed to obtain, for each level of DG, the probability curves and the probability density of voltage sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the simultaneous fault analysis program ANAFAS. In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.

  16. Safety of a Brief Emergency Department Observation Protocol for Patients With Presumed Fentanyl Overdose.

    Science.gov (United States)

    Scheuermeyer, Frank X; DeWitt, Christopher; Christenson, Jim; Grunau, Brian; Kestler, Andrew; Grafstein, Eric; Buxton, Jane; Barbic, David; Milanovic, Stefan; Torkjari, Reza; Sahota, Indy; Innes, Grant

    2018-03-09

    Fentanyl overdoses are increasing and few data guide emergency department (ED) management. We evaluate the safety of an ED protocol for patients with presumed fentanyl overdose. At an urban ED, we used administrative data and explicit chart review to identify and describe consecutive patients with uncomplicated presumed fentanyl overdose (no concurrent acute medical issues) from September to December 2016. We linked regional ED and provincial vital statistics databases to ascertain admissions, revisits, and mortality. The primary outcome was a composite of admission and death within 24 hours. Other outcomes included treatment with additional ED naloxone, development of a new medical issue while in the ED, and length of stay. A prespecified subgroup analysis assessed low-risk patients with normal triage vital signs. There were 1,009 uncomplicated presumed fentanyl overdoses, mainly by injection. Median age was 34 years, 85% were men, and 82% received out-of-hospital naloxone. One patient was hospitalized and one discharged patient died within 24 hours (combined outcome 0.2%; 95% confidence interval [CI] 0.04% to 0.8%). Sixteen patients received additional ED naloxone (1.6%; 95% CI 1.0% to 2.6%), none developed a new medical issue (0%; 95% CI 0% to 0.5%), and median length of stay was 173 minutes (interquartile range 101 to 267). For the 752 low-risk patients, none were admitted or developed a new issue, and one died postdischarge; 3 (0.4%; 95% CI 0.01% to 1.3%) received ED naloxone. In our cohort of ED patients with uncomplicated presumed fentanyl overdose, typically after injection, rates of deterioration, admission, mortality, and postdischarge complications appear low; the majority can be discharged after brief observation. Patients with normal triage vital signs are unlikely to require ED naloxone. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  17. Kinetic and dynamic probability-density-function descriptions of disperse turbulent two-phase flows

    Science.gov (United States)

    Minier, Jean-Pierre; Profeta, Christophe

    2015-11-01

    This article analyzes the status of two classical one-particle probability density function (PDF) descriptions of the dynamics of discrete particles dispersed in turbulent flows. The first PDF formulation considers only the process made up of particle position and velocity, Zp=(xp,Up), and is represented by its PDF p(t; yp,Vp), which is the solution of a kinetic PDF equation obtained through a flux closure based on the Furutsu-Novikov theorem. The second PDF formulation includes fluid variables in the particle state vector, for example the fluid velocity seen by particles, Zp=(xp,Up,Us), and consequently handles an extended PDF p(t; yp,Vp,Vs), which is the solution of a dynamic PDF equation. For high-Reynolds-number fluid flows, a typical formulation of the latter category relies on a Langevin model for the trajectories of the fluid seen or, equivalently, on a Fokker-Planck equation for the extended PDF. In the present work, a new derivation of the kinetic PDF equation is worked out and new physical expressions for the dispersion tensors entering the kinetic PDF equation are obtained by starting from the extended PDF and integrating over the fluid seen. This demonstrates that, under the same assumption of a Gaussian colored noise and irrespective of the specific stochastic model chosen for the fluid seen, the kinetic PDF description is the marginal of a dynamic PDF one. However, a detailed analysis reveals that kinetic PDF models of particle dynamics in turbulent flows described by statistical correlations constitute incomplete stand-alone PDF descriptions and, moreover, that present kinetic PDF equations are mathematically ill posed. This is shown to be the consequence of the non-Markovian character of the stochastic process retained to describe the system and the use of an external colored noise. Furthermore, these developments bring out that well-posed PDF descriptions are essentially due to a proper choice of the variables selected to describe physical systems.
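
    The extended-state-vector ("dynamic PDF") approach can be sketched as a stochastic simulation: an Ornstein-Uhlenbeck (Langevin) model for the fluid velocity seen, plus linear drag for the particle. The 1-D toy below, with made-up time scales, is not the full model analysed in the paper:

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy 1-D version of the state vector (x_p, U_p, U_s)
        T_L, sigma = 0.1, 0.5     # fluid integral time scale [s], velocity rms [m/s]
        tau_p = 0.05              # particle response time [s]
        dt, n_steps, n_part = 1e-3, 5000, 10_000

        x = np.zeros(n_part); up = np.zeros(n_part); us = np.zeros(n_part)
        for _ in range(n_steps):
            dW = rng.standard_normal(n_part) * np.sqrt(dt)
            us += -us / T_L * dt + sigma * np.sqrt(2.0 / T_L) * dW  # OU fluid seen
            up += (us - up) / tau_p * dt                            # linear drag
            x += up * dt

        # One-point statistics of the dispersed phase; a histogram of x
        # approximates the position PDF that a kinetic description evolves.
        print("rms particle velocity:", up.std(), " dispersion:", x.std())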

  18. Treatment outcome in patients with presumed tubercular uveitis at a tertiary referral eye care centre in Singapore.

    Science.gov (United States)

    Ang, Leslie; Kee, Aera; Yeo, Tun Hang; Dinesh, V G; Ho, Su Ling; Teoh, Stephen C; Agrawal, Rupesh

    2018-02-01

    To report the clinical features and outcome of patients with presumed tubercular uveitis (TBU). Retrospective analysis of patients with presumed TBU at a tertiary referral eye care centre in Singapore between 2007 and 2012 was done. Main outcome measures were failure of complete resolution of uveitis or recurrence of inflammation. Fifty three patients with mean age of 44.18 ± 15.26 years with 54.72% being males were included. 19 (35.85%) had bilateral involvement, with panuveitis and anterior uveitis being the most common presentations. 36 (67.92%) patients received antitubercular therapy (ATT), and 28 received concurrent systemic steroids. 15 (28.30%) eyes of 11 (30.55%) patients in the ATT group and 4 (21.05%) eyes of 3 (17.64%) patients in the non-ATT group had treatment failure (p value = 0.51). The use of ATT, with or without concurrent corticosteroid, may not have a statistically significant impact in improving treatment success in patients with presumed TBU.

  19. Probabilities of filaments in a Poissonian distribution of points -I

    International Nuclear Information System (INIS)

    Betancort-Rijo, J.

    1989-01-01

    Statistical techniques are devised to assess the likelihood of a Poisson sample of points, in two and three dimensions, containing specific filamentary structures. For that purpose, the expression of Otto et al. (1986, Astrophys. J., 304) for the probability density of clumps in a Poissonian distribution of points is generalized for any value of the density contrast. A way of counting filaments differing from that of Otto et al. is proposed, because at low density contrast the filaments counted by Otto et al. are distributed in a clumpy fashion, each clump of filaments corresponding to a distinct observed filament. (author)

  20. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  1. Correlator bank detection of gravitational wave chirps--False-alarm probability, template density, and thresholds: Behind and beyond the minimal-match issue

    International Nuclear Information System (INIS)

    Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.

    2004-01-01

    The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness

  2. Determination of stability of epimetamorphic rock slope using Minimax Probability Machine

    Directory of Open Access Journals (Sweden)

    Manoj Kumar

    2016-01-01

    Full Text Available The article employs the Minimax Probability Machine (MPM) for the prediction of the stability status of epimetamorphic rock slopes. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d), height (H), inclination (β), cohesion (c) and internal friction angle (φ) have been used as inputs to the MPM. This study uses the MPM as a classification technique. Two models, the Linear Minimax Probability Machine (LMPM) and the Kernelized Minimax Probability Machine (KMPM), have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slopes.
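
    A linear MPM can be sketched from its defining optimization: minimize the sum of the two class-covariance norms subject to a normalization on the class-mean separation (the Lanckriet et al. formulation, which we believe underlies the LMPM here). The geotechnical inputs are replaced by synthetic 2-D data:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        X1 = rng.normal([0, 0], 1.0, size=(200, 2))    # class +1 (e.g., stable)
        X2 = rng.normal([3, 3], 1.5, size=(200, 2))    # class -1 (e.g., failed)

        m1, m2 = X1.mean(0), X2.mean(0)
        C1, C2 = np.cov(X1.T), np.cov(X2.T)

        # Linear MPM: min sqrt(w'C1w) + sqrt(w'C2w)  s.t.  w'(m1 - m2) = 1.
        # Enforce the constraint by elimination: w = w0 + B z, with w0'(m1-m2)=1
        # and B projecting onto directions orthogonal to (m1 - m2).
        d = m1 - m2
        w0 = d / (d @ d)
        B = np.eye(2) - np.outer(d, d) / (d @ d)

        def objective(z):
            w = w0 + B @ z
            return np.sqrt(w @ C1 @ w) + np.sqrt(w @ C2 @ w)

        res = minimize(objective, np.zeros(2))
        w = w0 + B @ res.x
        kappa = 1.0 / res.fun
        b = w @ m1 - kappa * np.sqrt(w @ C1 @ w)       # decision threshold
        print("worst-case misclassification bound:", 1.0 / (1.0 + kappa**2))
        print("classify a point x via sign(w @ x - b)")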

  3. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
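
    The underlying Wald sequential probability ratio test is compact enough to sketch in full; the collision-assessment specifics (base-risk prior, sequential Pc estimates) are replaced below by a generic two-hypothesis Gaussian example:

        import numpy as np
        from scipy import stats

        def sprt(samples, f0, f1, alpha=0.01, beta=0.01):
            """Wald SPRT between H0 (density f0) and H1 (density f1).
            Returns ('H0'|'H1', n_used) or ('undecided', n) if data run out."""
            A = np.log((1 - beta) / alpha)    # accept H1 when the LLR exceeds A
            B = np.log(beta / (1 - alpha))    # accept H0 when the LLR drops below B
            llr = 0.0
            for n, x in enumerate(samples, 1):
                llr += np.log(f1(x)) - np.log(f0(x))
                if llr >= A:
                    return "H1", n
                if llr <= B:
                    return "H0", n
            return "undecided", n

        # Example: is the mean of unit-variance Gaussian data 0 (H0) or 1 (H1)?
        rng = np.random.default_rng(2)
        data = rng.normal(1.0, 1.0, size=1000)
        print(sprt(data, stats.norm(0, 1).pdf, stats.norm(1, 1).pdf))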

  4. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    Science.gov (United States)

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and the seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  5. Presumed choroidal metastasis of Merkel cell carcinoma

    International Nuclear Information System (INIS)

    Small, K.W.; Rosenwasser, G.O.; Alexander, E. III; Rossitch, G.; Dutton, J.J.

    1990-01-01

    Merkel cell carcinoma is a rare skin tumor of neural crest origin and is part of the amine precursor uptake and decarboxylase system. It typically occurs on the face of elderly people. Distant metastasis is almost uniformly fatal. Choroidal metastasis, to our knowledge, has not been described. We report a patient with Merkel cell carcinoma who had a synchronous solid choroidal tumor and a biopsy-proven brain metastasis. Our 56-year-old patient presented with a rapidly growing, violaceous preauricular skin tumor. Computed tomography of the head disclosed incidental brain and choroidal tumors. Light and electron microscopy of biopsy specimens of both the skin and the brain lesions showed Merkel cell carcinoma. Ophthalmoscopy, fluorescein angiography, and A and B echography revealed a solid choroidal mass. The brain and skin tumors responded well to irradiation. A radioactive episcleral plaque was applied subsequently to the choroidal tumor. All tumors regressed, and the patient was doing well 28 months later. To our knowledge this is the first case of presumed choroidal metastasis of Merkel cell carcinoma

  6. The correlation of defect distribution in collisional phase with measured cascade collapse probability

    International Nuclear Information System (INIS)

    Morishita, K.; Ishino, S.; Sekimura, N.

    1995-01-01

    The spatial distributions of atomic displacement at the end of the collisional phase of cascade damage processes were calculated using the computer simulation code MARLOWE, which is based on the binary collision approximation (BCA). The densities of atomic displacement were evaluated in the high dense regions (HDRs) of cascades in several pure metals (Fe, Ni, Cu, Ag, Au, Mo and W). They were compared with the measured cascade collapse probabilities reported in the literature, where TEM observations were carried out using thin metal foils irradiated by low-dose ions at room temperature. We found that there exist minimum, or "critical", values of the atomic displacement density for an HDR to collapse into TEM-visible vacancy clusters. The critical densities are generally independent of the cascade energy in the same metal. Furthermore, the material dependence of the critical densities can be explained by the difference in the vacancy mobility at the melting temperature of the target materials. This critical density calibration, which is extracted from the ion-irradiation experiments and the BCA simulations, is applied to the estimation of cascade collapse probabilities in metals irradiated by fusion neutrons. (orig.)

  7. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented, and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
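
    The resampling idea can be sketched directly: compute the usual 2-D Pc by integrating the relative-position Gaussian over the hard-body disc, then resample the uncertain inputs to get a spread of Pc values. The Monte Carlo integration and the lognormal/uniform perturbations below are illustrative choices, not the paper's:

        import numpy as np

        rng = np.random.default_rng(0)

        def pc_2d(mu, cov, R, n=200_000):
            """2-D collision probability: integrate the relative-position Gaussian
            (mean mu, covariance cov, in the encounter plane) over the hard-body
            disc of radius R, here by plain Monte Carlo."""
            pts = rng.multivariate_normal(mu, cov, size=n)
            return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < R)

        mu = np.array([120.0, 80.0])              # miss vector [m], illustrative
        cov = np.array([[2500.0, 400.0],
                        [400.0, 900.0]])          # combined covariance [m^2]
        R = 20.0                                  # combined hard-body radius [m]
        print("point estimate Pc:", pc_2d(mu, cov, R))

        # Carry input uncertainty through: resample a covariance scale factor and
        # the radius to get a distribution of Pc rather than a single number.
        pcs = [pc_2d(mu, cov * s, R * r, n=50_000)
               for s, r in zip(rng.lognormal(0.0, 0.3, 50),
                               rng.uniform(0.8, 1.2, 50))]
        print("Pc spread, 5th-95th percentile:", np.percentile(pcs, [5, 95]))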

  8. The Antibiotic Prescribing Pathway for Presumed Urinary Tract Infections in Nursing Home Residents.

    Science.gov (United States)

    Kistler, Christine E; Zimmerman, Sheryl; Scales, Kezia; Ward, Kimberly; Weber, David; Reed, David; McClester, Mallory; Sloane, Philip D

    2017-08-01

    Due to the high rates of inappropriate antibiotic prescribing for presumed urinary tract infections (UTIs) in nursing home (NH) residents, we sought to examine the antibiotic prescribing pathway and the extent to which it agrees with the Loeb criteria; findings can suggest strategies for antibiotic stewardship. Chart review of 260 randomly-selected cases from 247 NH residents treated with an antibiotic for a presumed UTI in 31 NHs in North Carolina. We examined the prescribing pathway from presenting illness, to the prescribing event, illness work-up and subsequent clinical events including emergency department use, hospitalization, and death. Analyses described the decision-making processes and outcomes and compared decisions made with Loeb criteria for initiation of antibiotics. Of 260 cases, 60% had documented signs/symptoms of the presenting illness and 15% met the Loeb criteria. Acute mental status change was the most commonly documented sign/symptom (24%). NH providers (81%) were the most common prescribers and ciprofloxacin (32%) was the most commonly prescribed antibiotic. Fourteen percent of presumed UTI cases included a white blood cell count, 71% included a urinalysis, and 72% had a urine culture. Seventy-five percent of cultures grew at least one organism with ≥100,000 colony-forming units/milliliter and 12% grew multi-drug resistant organisms; 28% of antibiotics were prescribed for more than 7 days, and 7% of cases had a subsequent death, emergency department visit, or hospitalization within 7 days. Non-specific signs/symptoms appeared to influence prescribing more often than urinary tract-specific signs/symptoms. Prescribers rarely stopped antibiotics, and a minority prescribed for overly long periods. Providers may need additional support to guide the decision-making process to reduce antibiotic overuse and antibiotic resistance. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.

  9. Oak regeneration and overstory density in the Missouri Ozarks

    Science.gov (United States)

    David R. Larsen; Monte A. Metzger

    1997-01-01

    Reducing overstory density is a commonly recommended method of increasing the regeneration potential of oak (Quercus) forests. However, recommendations seldom specify the probable increase in density or the size of reproduction associated with a given residual overstory density. This paper presents logistic regression models that describe this...

  10. Wind power statistics and an evaluation of wind energy density

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, M.; Parsa, S.; Majidi, M. [Materials and Energy Research Centre, Tehran (Iran, Islamic Republic of)

    1995-11-01

    In this paper the statistical data of fifty days' wind speed measurements at the MERC solar site are used to find the wind energy density and other wind characteristics with the help of the Weibull probability distribution function. It is emphasized that the Weibull and Rayleigh probability functions are useful tools for wind energy density estimation but are not quite appropriate for properly fitting actual wind data of low mean speed, short-time records. One has to either use the actual wind data (histogram) or look for a better fit with other models of the probability function. (Author)
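
    A minimal sketch of the Weibull-based energy density estimate and its comparison with the value computed directly from the data; the closed form E = ½ ρ c³ Γ(1 + 3/k) follows from the Weibull third moment, and the data below are synthetic:

        import numpy as np
        from scipy import stats
        from scipy.special import gamma

        rng = np.random.default_rng(0)
        wind = rng.weibull(2.1, size=1200) * 6.0   # stand-in for measured speeds [m/s]

        k, loc, c = stats.weibull_min.fit(wind, floc=0.0)   # shape k, scale c
        rho = 1.225                                         # air density [kg/m^3]

        # Mean wind power density from the Weibull fit vs directly from the data
        E_weibull = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)
        E_data = 0.5 * rho * np.mean(wind**3)

        print(f"k={k:.2f}, c={c:.2f} m/s; E(Weibull)={E_weibull:.0f} W/m^2, "
              f"E(data)={E_data:.0f} W/m^2")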

  11. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  12. Secondary prevention after cerebral ischaemia of presumed arterial origin: is aspirin still the touchstone?

    NARCIS (Netherlands)

    A. Algra (Ale); P.J. Koudstaal (Peter Jan); J. van Gijn (Jan)

    1999-01-01

    Patients who have had a transient ischaemic attack or nondisabling ischaemic stroke of presumed arterial origin have an annual risk of death from all vascular causes, non-fatal stroke, or non-fatal myocardial infarction that ranges between 4% and 11% without treatment. In the

  13. Cost-Effectiveness of Laparoscopic Hysterectomy With Morcellation Compared With Abdominal Hysterectomy for Presumed Myomas.

    Science.gov (United States)

    Rutstein, Sarah E; Siedhoff, Matthew T; Geller, Elizabeth J; Doll, Kemi M; Wu, Jennifer M; Clarke-Pearson, Daniel L; Wheeler, Stephanie B

    2016-02-01

    Hysterectomy for presumed leiomyomata is one of the most common surgical procedures performed in nonpregnant women in the United States. Laparoscopic hysterectomy (LH) with morcellation is an appealing alternative to abdominal hysterectomy (AH) but may result in dissemination of malignant cells and worse outcomes in the setting of an occult leiomyosarcoma (LMS). We sought to evaluate the cost-effectiveness of LH versus AH. Decision-analytic model of 100,000 women in the United States assessing the incremental cost-effectiveness ratio (ICER) in dollars per quality-adjusted life-year (QALY) gained (Canadian Task Force classification III). U.S. hospitals. Adult premenopausal women undergoing LH or AH for presumed benign leiomyomata. We developed a decision-analytic model from a provider perspective across 5 years, comparing the cost-effectiveness of LH to AH in dollars (2014 US dollars) per QALY gained. The model included average total direct medical costs and utilities associated with the procedures, complications, and clinical outcomes. Baseline estimates and ranges for cost and probability data were drawn from the existing literature. Estimated overall deaths were lower in LH versus AH (98 vs 103). Deaths due to LMS were more common in LH versus AH (86 vs 71). Base-case assumptions estimated that average per-person costs were lower in LH versus AH, with a savings of $2193 ($24,181 vs $26,374). Over 5 years, women in the LH group experienced 4.99 QALY versus 4.91 QALY in the AH group (incremental gain of .085 QALYs). LH dominated AH in base-case estimates: LH was both less expensive and yielded greater QALY gains. The ICER was sensitive to operative costs for LH and AH. Varying the operative costs of AH yielded ICERs ranging from $87,651/QALY gained (minimum) to AH being dominated (maximum). Probabilistic sensitivity analyses, in which all input parameters and costs were varied simultaneously, demonstrated a relatively robust model. The AH approach was dominated

  14. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m². This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in the existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to give an average density of 1 landslide km⁻², i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road of 1 x 4000 cells (5 m x 20 km) in a 'T' formation with another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km² region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area for the 500 iterations (ĀBL) is about 3000 m
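
    A stripped-down version of this Monte Carlo is easy to reproduce. The sketch below substitutes a truncated Pareto for the three-parameter inverse gamma (matching only the -2.4 tail and the 400 m² rollover) and uses circular landslide footprints, so the numbers are indicative only:

        import numpy as np

        rng = np.random.default_rng(0)
        CELL = 5.0                          # m per cell side
        N_LS = 400                          # landslides per triggering event

        def sample_areas(n, a_roll=400.0, g=2.4):
            """Crude stand-in for the inverse-gamma area pdf: a Pareto with
            pdf exponent -g above the rollover area a_roll [m^2]."""
            u = rng.uniform(size=n)
            return a_roll * (1.0 - u) ** (-1.0 / (g - 1.0))

        def one_event():
            # 'T'-shaped road: full horizontal road at row 2000, vertical stem
            # at column 2000 covering the lower half of the 4000 x 4000 grid
            x = rng.uniform(0, 4000, N_LS)
            y = rng.uniform(0, 4000, N_LS)
            radii = np.sqrt(sample_areas(N_LS) / np.pi) / CELL   # in cells
            on_h = np.abs(y - 2000) < radii
            on_v = (np.abs(x - 2000) < radii) & (y <= 2000)
            return int(np.sum(on_h | on_v))

        counts = [one_event() for _ in range(500)]
        print("road blockages per event: mean", np.mean(counts), "max", max(counts))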

  15. Successful management of bilateral presumed Candida endogenous endophthalmitis following pancreatitis

    Directory of Open Access Journals (Sweden)

    Ricardo Evangelista Marrocos de Aragão

    2016-06-01

    Endogenous endophthalmitis is a rare, and frequently devastating, ophthalmic disease. It occurs mostly in immunocompromised patients, those with diabetes mellitus or cancer, or intravenous drug users. Candida infection is the most common cause of endogenous endophthalmitis. Ocular candidiasis develops within days to weeks of fungemia. The association of treatment for pancreatitis with endophthalmitis is unusual. Treatment with broad-spectrum antibiotics and total parenteral nutrition may explain the development of endogenous endophthalmitis in such cases. We report the case of a patient with pancreatitis treated with broad-spectrum antibiotics and total parenteral nutrition who developed bilateral presumed Candida endogenous endophthalmitis that was successfully treated with vitrectomy and intravitreal amphotericin B.

  16. Parametric resonance in neutrino oscillation: A guide to control the effects of inhomogeneous matter density

    International Nuclear Information System (INIS)

    Koike, Masafumi; Ota, Toshihiko; Saito, Masako; Sato, Joe

    2016-01-01

    Effects of an inhomogeneous matter density on the three-generation neutrino oscillation probability are analyzed. A realistic profile of the matter density is expanded into a Fourier series. Introducing the Fourier modes one by one, we demonstrate that each mode has its corresponding target energy: a high Fourier mode selectively modifies the oscillation probability in the low-energy region. This rule is well described by the parametric resonance between the neutrino oscillation and the matter effect. The Fourier analysis gives a simple guideline for systematically controlling the uncertainty of the oscillation probability caused by the uncertain density of matter. Precise analysis of the oscillation probability down to the low-energy region requires accurate evaluation of the Fourier coefficients of the matter density up to the corresponding high modes.
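
    The first step of such an analysis, extracting the Fourier coefficients of a density profile along the baseline, can be sketched as follows; the profile, baseline length and units are invented for illustration:

```python
import numpy as np

L = 3000.0                                   # baseline length (km), illustrative
x = np.linspace(0.0, L, 4096, endpoint=False)
# Illustrative inhomogeneous density profile rho(x) in g/cm^3:
# a crust-like mean value plus a localized excess.
rho = 2.8 + 0.4 * np.exp(-((x - 1800.0) / 150.0) ** 2)

# Fourier series rho(x) = sum_n [a_n cos(2*pi*n*x/L) + b_n sin(2*pi*n*x/L)]
coeffs = np.fft.rfft(rho) / len(x)
a_n = 2.0 * coeffs.real       # cosine coefficients (n >= 1; a_0 is coeffs[0].real)
b_n = -2.0 * coeffs.imag      # sine coefficients
for n in range(1, 6):
    print(f"mode n={n}: a_n={a_n[n]:+.4f}, b_n={b_n[n]:+.4f} g/cm^3")
```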

  17. ON THE ORIGIN OF THE HIGH COLUMN DENSITY TURNOVER IN THE H I COLUMN DENSITY DISTRIBUTION

    International Nuclear Information System (INIS)

    Erkal, Denis; Gnedin, Nickolay Y.; Kravtsov, Andrey V.

    2012-01-01

    We study the high column density regime of the H I column density distribution function and argue that there are two distinct features: a turnover at N_HI ≈ 10^21 cm^-2, which is present at both z = 0 and z ≈ 3, and a lack of systems above N_HI ≈ 10^22 cm^-2 at z = 0. Using observations of the column density distribution, we argue that the H I-H_2 transition does not cause the turnover at N_HI ≈ 10^21 cm^-2 but can plausibly explain the turnover at N_HI ≳ 10^22 cm^-2. We compute the H I column density distribution of individual galaxies in the THINGS sample and show that the turnover column density depends only weakly on metallicity. Furthermore, we show that the column density distribution of galaxies, corrected for inclination, is insensitive to the resolution of the H I map or to averaging in radial shells. Our results indicate that the similarity of H I column density distributions at z = 3 and 0 is due to the similarity of the maximum H I surface densities of high-z and low-z disks, set presumably by universal processes that shape properties of the gaseous disks of galaxies. Using fully cosmological simulations, we explore other candidate physical mechanisms that could produce a turnover in the column density distribution. We show that while turbulence within giant molecular clouds cannot affect the damped Lyα column density distribution, stellar feedback can affect it significantly if the feedback is sufficiently effective in removing gas from the central 2-3 kpc of high-redshift galaxies. Finally, we argue that it is meaningful to compare column densities averaged over ~kpc scales with those estimated from quasar spectra that probe sub-pc scales, due to the steep power spectrum of H I column density fluctuations observed in nearby galaxies.

  18. Presumed symbolic use of diurnal raptors by Neanderthals.

    Directory of Open Access Journals (Sweden)

    Eugène Morin

    In Africa and western Eurasia, occurrences of burials and utilized ocher fragments during the late Middle and early Late Pleistocene are often considered evidence for the emergence of symbolically-mediated behavior. Perhaps less controversial for the study of human cognitive evolution are finds of marine shell beads and complex designs on organic and mineral artifacts in early modern human (EMH) assemblages conservatively dated to ≈ 100-60 kilo-years (ka) ago. Here we show that, in France, Neanderthals used skeletal parts of large diurnal raptors presumably for symbolic purposes at Combe-Grenal in a layer dated to marine isotope stage (MIS) 5b (≈ 90 ka) and at Les Fieux in stratigraphic units dated to the early/middle phase of MIS 3 (60-40 ka). The presence of similar objects in other Middle Paleolithic contexts in France and Italy suggests that raptors were used as a means of symbolic expression by Neanderthals in these regions.

  19. Presumed symbolic use of diurnal raptors by Neanderthals.

    Science.gov (United States)

    Morin, Eugène; Laroulandie, Véronique

    2012-01-01

    In Africa and western Eurasia, occurrences of burials and utilized ocher fragments during the late Middle and early Late Pleistocene are often considered evidence for the emergence of symbolically-mediated behavior. Perhaps less controversial for the study of human cognitive evolution are finds of marine shell beads and complex designs on organic and mineral artifacts in early modern human (EMH) assemblages conservatively dated to ≈ 100-60 kilo-years (ka) ago. Here we show that, in France, Neanderthals used skeletal parts of large diurnal raptors presumably for symbolic purposes at Combe-Grenal in a layer dated to marine isotope stage (MIS) 5b (≈ 90 ka) and at Les Fieux in stratigraphic units dated to the early/middle phase of MIS 3 (60-40 ka). The presence of similar objects in other Middle Paleolithic contexts in France and Italy suggests that raptors were used as a means of symbolic expression by Neanderthals in these regions.

  20. Stochastic transport models for mixing in variable-density turbulence

    Science.gov (United States)

    Bakosi, J.; Ristorcelli, J. R.

    2011-11-01

    In variable-density (VD) turbulent mixing, where very-different-density materials coexist, the density fluctuations can be an order of magnitude larger than their mean. Density fluctuations are non-negligible in the inertia terms of the Navier-Stokes equation, which has both quadratic and cubic nonlinearities. Very different mixing rates of different materials give rise to large differential accelerations and some fundamentally new physics that is not seen in constant-density turbulence. In VD flows material mixing is active in a sense far stronger than that applied in the Boussinesq approximation of buoyantly-driven flows: the mass fraction fluctuations are coupled to each other and to the fluid momentum. Statistical modeling of VD mixing requires accounting for basic constraints that are not important in the small-density-fluctuation passive-scalar-mixing approximation: the unit sum of mass fractions, the bounded sample space, and the highly skewed nature of the probability densities become essential. We derive a transport equation for the joint probability of mass fractions, equivalent to a system of stochastic differential equations, that is consistent with VD mixing in multi-component turbulence and consistently reduces to passive scalar mixing in constant-density flows.
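
    As a hedged illustration of the bounded-sample-space and unit-sum constraints, the Euler-Maruyama sketch below evolves an ensemble for a binary mixture under a Wright-Fisher-type diffusion whose stationary PDF is a beta distribution; this is a stand-in of the same family as, but not necessarily identical to, the authors' model, and all coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Mass fraction Y of material 1 in a binary mixture; Y2 = 1 - Y, so the
# unit-sum constraint holds by construction.  The drift/diffusion pair is a
# Wright-Fisher-type diffusion whose stationary PDF is a beta distribution,
# keeping the sample space bounded in [0, 1].  Coefficients are illustrative.
b, S, kappa = 2.0, 0.5, 0.5        # relaxation rate, mean, diffusion strength
dt, n_steps, n_particles = 2e-3, 2000, 10_000

Y = rng.uniform(0.0, 1.0, n_particles)          # non-Gaussian initial ensemble
for _ in range(n_steps):
    drift = 0.5 * b * (S - Y)
    noise = np.sqrt(np.clip(kappa * Y * (1.0 - Y), 0.0, None) * dt)
    Y = np.clip(Y + drift * dt + noise * rng.standard_normal(n_particles), 0.0, 1.0)

print(f"mean = {Y.mean():.3f}, variance = {Y.var():.4f}, "
      f"all samples in [0, 1]: {bool((Y.min() >= 0.0) and (Y.max() <= 1.0))}")
```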

  1. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    Science.gov (United States)

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
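
    The Pareto computation itself (a rank-ordered probability density plus its cumulative distribution) is elementary; a sketch with illustrative counts loosely derived from the percentages above, not the clinic's actual case mix:

```python
import numpy as np

# Illustrative counts loosely consistent with the percentages quoted above.
diagnoses = {"dysphonia": 1321, "UVFP": 290, "cough": 226,
             "laryngotracheal stenosis": 138, "papillomatosis": 95,
             "other": 1153}

labels, counts = zip(*sorted(diagnoses.items(), key=lambda kv: -kv[1]))
density = np.array(counts) / sum(counts)      # per-category probability density
cumulative = np.cumsum(density)               # cumulative distribution

for lab, p, c in zip(labels, density, cumulative):
    print(f"{lab:25s} p = {p:6.1%}   cumulative = {c:6.1%}")
```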

  2. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  3. Decreased Serum Lipids in Patients with Probable Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Orhan Lepara

    2009-08-01

    Alzheimer's disease (AD) is a multifactorial disease, but its aetiology and pathophysiology are still not fully understood. Epidemiologic studies examining the association between lipids and dementia have reported conflicting results. High total cholesterol has been associated with both an increased, and decreased, risk of AD and/or vascular dementia (VAD), whereas other studies found no association. The aim of this study was to investigate serum lipid concentrations in patients with probable AD, as well as a possible correlation between serum lipid concentrations and cognitive impairment. Our cross-sectional study included 30 patients with probable AD and 30 age- and sex-matched control subjects. Probable AD was clinically diagnosed by NINCDS-ADRDA criteria. Serum total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C) and triglyceride (TG) levels were determined at the initial assessment using standard enzymatic colorimetric techniques. Low-density lipoprotein cholesterol (LDL-C) and very low density lipoprotein cholesterol (VLDL-C) levels were calculated. Subjects with probable AD had significantly lower serum TG (p<0.01), TC (p<0.05), LDL-C (p<0.05) and VLDL-C (p<0.01) compared to the control group. We did not observe a significant difference in HDL-C level between patients with probable AD and control subjects. A negative, although not significant, correlation between TG, TC and VLDL-C and MMSE in patients with AD was observed. In the control group of subjects there was a negative correlation between TC and MMSE, but it was not statistically significant (r = -0.28). Further studies are required to explore the possibility for serum lipids to serve as diagnostic and therapeutic markers of AD.

  4. Non-LTE population probabilities of the excited ionic levels in a steady state plasma

    International Nuclear Information System (INIS)

    Salzmann, D.

    1982-01-01

    A Complete-Steady-State (CSS) model for the charge state distribution and the ionic level population probabilities of ions in hot non-LTE plasmas is described. The following properties of this model are described: (i) it is shown that CSS covers LTE and Corona Equilibrium (CE) in the high and low electron density regimes respectively, (ii) an explicit expression is found for the low electron density asymptotic behaviour of the population probabilities, (iii) it is shown that at intermediate density regions the CSS model predicts results similar to that of the Quasi-Steady-State model, (iv) new validity limits are derived for LTE and CE, (v) the population distribution of the excited levels is revised, (vi) an analytical expression is found for the high electron density asymptotic behaviour of the population distribution, (vii) the influence of the radiation reabsorption in a spherically symmetric CSS plasma is briefly described, and (viii) the effect of the inaccuracies in the rate coefficients on the results of CSS calculations is evaluated. (author)

  5. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to the problem of simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, where the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms for the optimum receiver for signals corrupted by both additive and multiplicative noise.
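
    A minimal sketch of a stratified (discrete-mass) approximation, assuming a standard normal density split into equal-probability strata, each represented by a point mass at its conditional mean; this illustrates the idea rather than the paper's receiver algorithm:

```python
import numpy as np
from scipy.stats import norm

k = 8                                            # number of equal-probability strata
edges = norm.ppf(np.linspace(0.0, 1.0, k + 1))   # stratum boundaries (incl. +/-inf)

# Conditional mean of a standard normal over (a, b) is
# (phi(a) - phi(b)) / (Phi(b) - Phi(a)); each stratum has probability 1/k.
points = (norm.pdf(edges[:-1]) - norm.pdf(edges[1:])) * k
masses = np.full(k, 1.0 / k)

print("point masses at:", np.round(points, 4))
print("approx. mean =", float(masses @ points),
      " approx. 2nd moment =", float(masses @ points**2))
```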

  6. Probability analysis of WWER-1000 fuel elements behavior under steady-state, transient and accident conditions of reactor operation

    International Nuclear Information System (INIS)

    Tutnov, A.; Alexeev, E.

    2001-01-01

    The 'PULSAR-2' and 'PULSAR+' codes make it possible to simulate the thermo-mechanical and thermo-physical parameters of WWER fuel elements. A probabilistic approach is used instead of the traditional deterministic one to carry out a sensitivity study of fuel element behavior under the steady-state operation mode. Fuel element initial parameters are given as probability density distributions. Calculations are provided for all possible combinations of the initial data: fuel-cladding gap, fuel density and gas pressure. By dividing the values of these parameters into intervals, the final set of calculation variants is obtained; the permissible fuel-cladding gap range was divided into 10 equal parts, and fuel density and gas pressure into 5 parts each. The probability of realizing each variant is determined by multiplying the probabilities of the separate parameters, because the tolerances of these parameters are distributed independently. Simulation results are presented as probabilistic bar charts showing the probability distributions of the changes in fuel outer diameter, hoop stress kinetics and fuel temperature versus irradiation time. A normative safety factor is introduced to control the realization of any criterion and to determine the margin to criterion failure. A probabilistic analysis of fuel element behavior under a Reactivity Initiated Accident (RIA) is also performed, and the probability of fuel element depressurization under a hypothetical RIA is presented.
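
    The combinatorial core of this scheme, forming a calculation variant for every combination of discretized, independent input parameters and multiplying their interval probabilities, can be sketched as follows; the interval values and probabilities are placeholders, not WWER-1000 design data:

```python
import itertools
import numpy as np

# Discretized, independent input parameters: (value, probability) pairs.
# Values and interval probabilities are placeholders, not WWER-1000 data.
gap = list(zip(np.linspace(0.06, 0.12, 10), np.full(10, 0.1)))        # mm
density = list(zip(np.linspace(10.2, 10.7, 5), np.full(5, 0.2)))      # g/cm^3
pressure = list(zip(np.linspace(2.0, 2.5, 5), np.full(5, 0.2)))       # MPa

# One calculation variant per combination; its probability is the product,
# since the tolerances are distributed independently.
variants = [((g, d, pr), pg * pd * pp)
            for (g, pg), (d, pd), (pr, pp) in itertools.product(gap, density, pressure)]

print(len(variants), "variants; total probability =",
      round(sum(w for _, w in variants), 12))
```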

  7. Presumed cultural similarity paradox : Expatriate adjustment and performance across the border or over the globe

    NARCIS (Netherlands)

    Vromans, P.; van Engen, M.L.; Mol, S.

    2013-01-01

    Purpose To introduce the presumed cultural similarity paradox as a possible explanation for the findings that adjusting to a culturally similar country is just as difficult as adjusting to a culturally dissimilar country. We provide a conceptual framework, enabling further understanding and research

  8. Presumed cultural similarity paradox: expatriate adjustment and performance across the border or over the globe

    NARCIS (Netherlands)

    Vromans, P.; van Engen, M.; Mol, S.

    2013-01-01

    Purpose - To introduce the presumed cultural similarity paradox as a possible explanation for the findings that adjusting to a culturally similar country is just as difficult as adjusting to a culturally dissimilar country. We provide a conceptual framework, enabling further understanding and research

  9. Reactivation of presumed adenoviral keratitis after laser in situ keratomileusis.

    Science.gov (United States)

    Safak, Nilgün; Bilgihan, Kamil; Gürelik, Gökhan; Ozdek, Sengül; Hasanreisoğlu, Berati

    2002-04-01

    We report a patient with reactivation of presumed adenoviral keratoconjunctivitis after laser in situ keratomileusis (LASIK) to correct high myopia. The preoperative refraction was -13.00 diopters (D) in the right eye and -14.00 D in the left eye, and the best corrected visual acuity was 20/20 in both eyes. On the first postoperative day, mild conjunctival hyperemia and multiple subepithelial infiltrations localized in the flap zone consistent with adenoviral keratoconjunctivitis were seen. After prompt treatment, the lesions resolved. As a consequence, LASIK successfully corrected the high myopia. Adenoviral keratoconjunctivitis can be reactivated after LASIK, unlike after photorefractive keratectomy, despite the absence of symptomatic and clinical findings before the procedure.

  10. Variable kernel density estimation in high-dimensional feature spaces

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2017-02-01

    Estimating the joint probability density function of a dataset is a central task in many machine learning applications. In this work we address the fundamental problem of kernel bandwidth estimation for variable kernel density estimation in high-dimensional feature spaces.
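
    A minimal sketch of a variable-bandwidth (Abramson-type) kernel density estimator, in which a fixed-bandwidth pilot estimate sets local bandwidths so kernels widen in low-density regions; this is a generic construction, not necessarily the estimator proposed in the work above:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-2.0, 0.3, 300), rng.normal(3.0, 1.5, 700)])

def gauss_kde(grid, data, h):
    """Fixed-bandwidth Gaussian KDE evaluated on grid."""
    u = (grid[:, None] - data[None, :]) / h
    return np.mean(np.exp(-0.5 * u**2) / (np.sqrt(2.0 * np.pi) * h), axis=1)

h0 = 1.06 * x.std() * x.size ** (-0.2)        # Silverman rule for the pilot
pilot = gauss_kde(x, x, h0)                   # pilot density at the data points
lam = (pilot / np.exp(np.mean(np.log(pilot)))) ** -0.5   # Abramson local factors

grid = np.linspace(-5.0, 9.0, 400)
u = (grid[:, None] - x[None, :]) / (h0 * lam[None, :])
f_hat = np.mean(np.exp(-0.5 * u**2) / (np.sqrt(2.0 * np.pi) * h0 * lam[None, :]), axis=1)

print("integral of estimate ≈", float(np.sum(f_hat) * (grid[1] - grid[0])))
```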

  11. Mesquite seed density in fecal samples of Raramuri Criollo vs. Angus x Hereford cows grazing Chihuahuan Desert Rangeland

    Science.gov (United States)

    This study was part of a larger project investigating breed-related differences in feeding habits of Raramuri Criollo (RC) versus Angus x Hereford (AH) cows. Seed densities in fecal samples collected in July and August 2015 were analyzed to compare presumed mesquite bean consumption of RC and AH cow...

  12. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    Science.gov (United States)

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.
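
    The composition of the four ingredients above reduces to averaging a conditional VF probability over the dart-landing distribution; a sketch with invented placeholder functions, not the paper's FE-model results or police-report data:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_vf_given_radius(r_cm):
    """Conditional VF probability vs. horizontal dart-to-heart distance --
    a placeholder exponential falloff, not the paper's FE-model curve."""
    return 1e-3 * np.exp(-r_cm / 4.0)

# Dart landing scatter on the chest around the aim point (placeholder, cm).
hits = rng.normal(0.0, 15.0, size=(100_000, 2))
radii = np.hypot(hits[:, 0], hits[:, 1])

print("estimated P(VF) ≈", float(p_vf_given_radius(radii).mean()))
```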

  13. Linear and nonlinear optical signals in probability and phase-space representations

    International Nuclear Information System (INIS)

    Man'ko, Margarita A

    2006-01-01

    A review of different representations of signals, including phase-space representations and tomographic representations, is presented. The signals under consideration are either linear or nonlinear. Linear signals satisfy linear quantum-like Schroedinger and von Neumann equations. Nonlinear signals satisfy nonlinear Schroedinger equations as well as the Gross-Pitaevskii equation describing solitons in a Bose-Einstein condensate. The Ville-Wigner distributions for solitons are considered in comparison with the tomographic-probability densities describing solitons completely. Different kinds of tomography (symplectic tomography, optical tomography and Fresnel tomography) are reviewed. A new kind of map of the signals onto probability distributions of a discrete photon-number-like variable is discussed. Mutual relations between different transformations of signal functions are established in explicit form. Such characteristics of the signal-probability distribution as entropy are discussed.

  14. Plant Density Effect in Different Planting Dates on Growth Indices, Yield and

    Directory of Open Access Journals (Sweden)

    F Azizi

    2013-04-01

    In order to determine the appropriate plant density in different planting dates for sweet corn cultivar KSC403su, an experiment was conducted using a randomized complete block design in a split-plot layout with three replications at the Seed and Plant Improvement Institute in Karaj in 2006. Three planting dates (22 May, 5 June and 22 June) were assigned as main plots and three plant densities (65000, 75000 and 85000 plants per hectare) were considered as subplots. The effect of planting date on row/ear, 1000-kernel weight, biological yield and harvest index was significant at the 1% probability level, and it was significant at the 5% probability level for kernels/ear row and grain yield. All traits decreased when planting was postponed to 5 June except for row/ear, kernels/row and grain yield. Further delay in planting from 22 May to 22 June decreased grain yield significantly, by about 32.5% (from 14.45 to 9.78 ton/ha). The effect of plant density was significant at the 1% probability level for all the traits. All of the traits decreased significantly with increasing plant density except for biological yield. The highest grain yield resulted from the 65000 plants per hectare density (14.20 ton/ha). The interaction effect of planting date and plant density was significant at the 5% probability level for biological yield and harvest index, but it was not significant for the other traits. Growth indices decreased with delay in planting date and with increasing plant density; only leaf area index increased at higher plant densities. From the results of this experiment it may be concluded that the appropriate planting date to produce the highest grain yield for sweet corn cultivar KSC403su is 22 May to 5 June, and that the highest grain yield can be obtained at 65000 plants per hectare.

  15. 28 CFR 104.46 - Determination of presumed noneconomic losses for claimants who suffered physical harm.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration, Part 2 (2010-07-01 edition): Determination of presumed noneconomic losses for claimants who suffered physical harm. Section 104.46, Judicial Administration, DEPARTMENT OF JUSTICE (CONTINUED), SEPTEMBER 11TH VICTIM COMPENSATION FUND OF 2001, Amount of Compensation for...

  16. Quantum operations, state transformations and probabilities

    International Nuclear Information System (INIS)

    Chefles, Anthony

    2002-01-01

    In quantum operations, probabilities characterize both the degree of the success of a state transformation and, as density operator eigenvalues, the degree of mixedness of the final state. We give a unified treatment of pure→pure state transformations, covering both probabilistic and deterministic cases. We then discuss the role of majorization in describing the dynamics of mixing in quantum operations. The conditions for mixing enhancement for all initial states are derived. We show that mixing is monotonically decreasing for deterministic pure→pure transformations, and discuss the relationship between these transformations and deterministic local operations with classical communication entanglement transformations

  17. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  18. Modeling the radiation transfer of discontinuous canopies: results for gap probability and single-scattering contribution

    Science.gov (United States)

    Zhao, Feng; Zou, Kai; Shang, Hong; Ji, Zheng; Zhao, Huijie; Huang, Wenjiang; Li, Cunjun

    2010-10-01

    In this paper we present an analytical model for the computation of radiation transfer in discontinuous vegetation canopies. Some initial results for the gap probability and bidirectional gap probability of discontinuous vegetation canopies, which are important parameters determining the radiative environment of the canopies, are given and compared with a 3-D computer simulation model. In the model, negative exponential attenuation of light within individual plant canopies is assumed. The computation of the gap probability is then resolved by determining the entry and exit points of the ray through the individual plants via their equations in space. For the bidirectional gap probability, which determines the single-scattering contribution of the canopy, a gap statistical analysis based model was adopted to correct for the dependence of gap probabilities in the solar and viewing directions. The model incorporates structural characteristics such as plant sizes, leaf size, row spacing, foliage density, planting density, and leaf inclination distribution. Available experimental data are inadequate for a complete validation of the model, so it was evaluated against a three-dimensional computer simulation model for 3-D vegetative scenes, which shows good agreement between the two models' results. This model should be useful for the quantification of light interception and the modeling of bidirectional reflectance distributions of discontinuous canopies.
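
    The assumed negative exponential attenuation within an individual crown gives the gap probability along a ray as P_gap = exp(-G u s), with G the projection coefficient, u the foliage area volume density and s the path length through the crown. A sketch for a spherical crown with illustrative parameters:

```python
import numpy as np

def chord_through_sphere(radius, offset):
    """Path length of a ray through a spherical crown at perpendicular offset."""
    return np.where(offset < radius,
                    2.0 * np.sqrt(np.maximum(radius**2 - offset**2, 0.0)), 0.0)

G = 0.5      # projection coefficient (spherical leaf angle distribution)
u = 0.8      # foliage area volume density (m^2/m^3), illustrative
R = 1.5      # crown radius (m), illustrative

offsets = np.linspace(0.0, R, 7)
s = chord_through_sphere(R, offsets)
print("gap probability along each ray:", np.round(np.exp(-G * u * s), 3))
```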

  19. Metabolism of cholesteryl esters of rat very low density lipoproteins.

    Science.gov (United States)

    Faergeman, O; Havel, R J

    1975-06-01

    Rat very low density lipoproteins (d < 1.006), biologically labeled in esterified and free cholesterol, were obtained from serum 6 h after intravenous injection of particulate (3-H) cholesterol. When injected into recipient animals, the esterified cholesterol was cleared from plasma with a half-life of 5 min. After 15 min, 71% of the injected esterified (3-H) cholesterol had been taken up by the liver, where it was rapidly hydrolyzed. After 60 min only 3.3% of the amount injected had been transferred, via lipoproteins of intermediate density, to the low density lipoproteins of plasma (d 1.019-1.063). Both uptake in the liver and transfer to low density lipoproteins occurred without change in the distribution of 3-H in the various cholesteryl esters. 3-H appearing in esterified cholesterol of high density lipoproteins (d > 1.063) was derived from esterification, presumably by lecithin:cholesterol acyltransferase, of simultaneously injected free (3-H) cholesterol. The content of free (3-H) cholesterol in the very low density lipoproteins used for injection could be reduced substantially by incubation with erythrocytes. This procedure, however, increased the rate of clearance of the lipoproteins after injection into recipient rats. These studies show that hepatic removal is the major catabolic pathway for cholesteryl esters of rat very low density lipoproteins and that transfer to low density lipoproteins occurs to only a minor extent.

  20. Laparoscopic power morcellation of presumed fibroids.

    Science.gov (United States)

    Brolmann, Hans A; Sizzi, Ornella; Hehenkamp, Wouter J; Rossetti, Alfonso

    2016-06-01

    Uterine leiomyoma is a highly prevalent benign gynecologic neoplasm that affects women of reproductive age. Surgical procedures commonly employed to treat symptomatic uterine fibroids include myomectomy or total or sub-total hysterectomy. These procedures, when performed using minimally invasive techniques, reduce the risks of intraoperative and postoperative morbidity and mortality; however, in order to remove bulky lesions from the abdominal cavity through laparoscopic ports, a laparoscopic power morcellator must be used, a device with rapidly spinning blades to cut the uterine tissue into fragments so that it can be removed through a small incision. Although the minimally invasive approach in gynecological surgery is now firmly established in terms of recovery and quality of life, morcellation is associated with rare but sometimes serious adverse events. Parts of the morcellated specimen may be spread into the abdominal cavity and enable implantation of cells on the peritoneum. In the case of an unexpected sarcoma, dissemination may upstage the disease and affect survival. Myoma cells may give rise to 'parasitic' fibroids, but implantation of adenomyotic cells and endometriosis has also been reported. Finally, the morcellation device may cause inadvertent injury to internal structures, such as bowel and vessels, with its rotating circular knife. In this article it is described how to estimate the risk of sarcoma in a presumed fibroid based on epidemiologic, imaging and laboratory data. Furthermore, the first literature results on in-bag morcellation are reviewed. With this procedure the specimen is contained in an insufflated sterile bag while being morcellated, potentially preventing spillage of tissue but also making direct morcellation injuries unlikely.

  1. New evolution equations for the joint response-excitation probability density function of stochastic solutions to first-order nonlinear PDEs

    Science.gov (United States)

    Venturi, D.; Karniadakis, G. E.

    2012-08-01

    By using functional integral methods we determine new evolution equations satisfied by the joint response-excitation probability density function (PDF) associated with the stochastic solution to first-order nonlinear partial differential equations (PDEs). The theory is presented for both fully nonlinear and for quasilinear scalar PDEs subject to random boundary conditions, random initial conditions or random forcing terms. Particular applications are discussed for the classical linear and nonlinear advection equations and for the advection-reaction equation. By using a Fourier-Galerkin spectral method we obtain numerical solutions of the proposed response-excitation PDF equations. These numerical solutions are compared against those obtained by using more conventional statistical approaches such as probabilistic collocation and multi-element probabilistic collocation methods. It is found that the response-excitation approach yields accurate predictions of the statistical properties of the system. In addition, it allows one to directly ascertain the tails of probabilistic distributions, thus facilitating the assessment of rare events and associated risks. The computational cost of the response-excitation method is orders of magnitude smaller than that of more conventional statistical approaches if the PDE is subject to high-dimensional random boundary or initial conditions. The question of high-dimensionality for evolution equations involving multidimensional joint response-excitation PDFs is also addressed.

  2. Family of probability distributions derived from maximal entropy principle with scale invariant restrictions.

    Science.gov (United States)

    Sonnino, Giorgio; Steinbrecher, György; Cardinali, Alessandro; Sonnino, Alberto; Tlidi, Mustapha

    2013-01-01

    Using statistical thermodynamics, we derive a general expression of the stationary probability distribution for thermodynamic systems driven out of equilibrium by several thermodynamic forces. The local equilibrium is defined by imposing the minimum entropy production and the maximum entropy principle under the scale invariance restrictions. The obtained probability distribution presents a singularity that has immediate physical interpretation in terms of the intermittency models. The derived reference probability distribution function is interpreted as time and ensemble average of the real physical one. A generic family of stochastic processes describing noise-driven intermittency, where the stationary density distribution coincides exactly with the one resulting from entropy maximization, is presented.

  3. Undiagnosed and comorbid disorders in patients with presumed chronic fatigue syndrome.

    Science.gov (United States)

    Mariman, An; Delesie, Liesbeth; Tobback, Els; Hanoulle, Ignace; Sermijn, Erica; Vermeir, Peter; Pevernagie, Dirk; Vogelaers, Dirk

    2013-11-01

    To assess undiagnosed and comorbid disorders in patients referred to a tertiary care center with a presumed diagnosis of chronic fatigue syndrome (CFS). Patients referred for chronic unexplained fatigue entered an integrated diagnostic pathway, including internal medicine assessment, psychodiagnostic screening, physiotherapeutic assessment and polysomnography+multiple sleep latency testing. Final diagnosis resulted from a multidisciplinary team discussion. Fukuda criteria were used for the diagnosis of CFS, DSM-IV-TR criteria for psychiatric disorders, ICSD-2 criteria for sleep disorders. Out of 377 patients referred, 279 (74.0%) were included in the study [84.9% female; mean age 38.8 years (SD 10.3)]. A diagnosis of unequivocal CFS was made in 23.3%. In 21.1%, CFS was associated with a sleep disorder and/or psychiatric disorder, not invalidating the diagnosis of CFS. A predominant sleep disorder was found in 9.7%, 19.0% had a psychiatric disorder and 20.8% a combination of both. Only 2.2% were diagnosed with a classical internal disease. In the total sample, a sleep disorder was found in 49.8%, especially obstructive sleep apnea syndrome, followed by psychophysiologic insomnia and periodic limb movement disorder. A psychiatric disorder was diagnosed in 45.2%, mostly mood and anxiety disorders. A multidisciplinary approach to presumed CFS yields unequivocal CFS in only a minority of patients, and reveals a broad spectrum of exclusionary or comorbid conditions within the domains of sleep medicine and psychiatry. These findings favor a systematic diagnostic approach to CFS, suitable to identify a wide range of diagnostic categories that may be subject to dedicated care.

  4. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
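
    The same physical problem can also be attacked by Monte-Carlo histogramming rather than COVAL's numerical transformation of distributions; a load/strength reliability sketch with illustrative input distributions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Inputs: a random load and a random strength (illustrative distributions).
load = rng.lognormal(mean=1.0, sigma=0.25, size=1_000_000)
strength = rng.normal(loc=5.0, scale=0.5, size=1_000_000)

margin = strength - load                    # the function of the input variables
pdf, edges = np.histogram(margin, bins=200, density=True)   # empirical density

print("P(failure) = P(margin < 0) ≈", float(np.mean(margin < 0.0)))
```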

  5. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
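
    For comparison with the age-structured results, the fixation probability of a single mutant with constant selective advantage r in the classical, unstructured Moran process of size N has the closed form rho = (1 - 1/r) / (1 - r^-N):

```python
import numpy as np

def moran_fixation_probability(r: float, N: int) -> float:
    """Fixation probability of one mutant of constant fitness r in a
    population of size N (standard birth-death Moran process)."""
    if np.isclose(r, 1.0):
        return 1.0 / N            # neutral limit
    return (1.0 - 1.0 / r) / (1.0 - r ** (-N))

for r in (0.95, 1.0, 1.05):
    print(f"r = {r:4.2f}, N = 100: rho = {moran_fixation_probability(r, 100):.3e}")
```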

  6. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  7. Shorter Leukocyte Telomere Length in Relation to Presumed Nonalcoholic Fatty Liver Disease in Mexican-American Men in NHANES 1999–2002

    Directory of Open Access Journals (Sweden)

    Janet M. Wojcicki

    2017-01-01

    Leukocyte telomere length is shorter in response to chronic disease processes associated with inflammation such as diabetes mellitus and coronary artery disease. Data from the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2002 were used to explore the relationship between leukocyte telomere length and presumed NAFLD, as indicated by elevated serum alanine aminotransferase (ALT) levels, obesity, or abdominal obesity. Logistic regression models were used to evaluate the relationship between telomere length and presumed markers of NAFLD, adjusting for possible confounders. There was no relationship between elevated ALT levels, abdominal obesity, or obesity and telomere length in adjusted models in NHANES (OR 1.13, 95% CI 0.48–2.65; OR 1.17, 95% CI 0.52–2.62, respectively). Mexican-American men had shorter telomere length in relation to presumed NAFLD (OR 0.07, 95% CI 0.006–0.79) and when using different indicators of NAFLD (OR 0.012, 95% CI 0.0006–0.24). Men of Mexican origin with presumed NAFLD had shorter telomere length than men in other population groups. Longitudinal studies are necessary to evaluate the role of telomere length as a potential predictor in assessing the pathogenesis of NAFLD in Mexicans.

  8. Probability theory for 3-layer remote sensing in ideal gas law environment.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2013-08-26

    We extend the probability model for 3-layer radiative transfer [Opt. Express 20, 10004 (2012)] to ideal gas conditions where a correlation exists between transmission and temperature of each of the 3 layers. The effect on the probability density function for the at-sensor radiances is surprisingly small, and thus the added complexity of addressing the correlation can be avoided. The small overall effect is due to (a) small perturbations by the correlation on variance population parameters and (b) cancellation of perturbation terms that appear with opposite signs in the model moment expressions.

  9. Probabilities for gravitational lensing by point masses in a locally inhomogeneous universe

    International Nuclear Information System (INIS)

    Isaacson, J.A.; Canizares, C.R.

    1989-01-01

    Probability functions for gravitational lensing by point masses that incorporate Poisson statistics and flux conservation are formulated in the Dyer-Roeder construction. Optical depths to lensing for distant sources are calculated using both the method of Press and Gunn (1973) which counts lenses in an otherwise empty cone, and the method of Ehlers and Schneider (1986) which projects lensing cross sections onto the source sphere. These are then used as parameters of the probability density for lensing in the case of a critical (q0 = 1/2) Friedmann universe. A comparison of the probability functions indicates that the effects of angle-averaging can be well approximated by adjusting the average magnification along a random line of sight so as to conserve flux. 17 references

  10. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  11. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with a d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
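
    A small numerical illustration of the violated additivity relation, using a 2-dimensional example in which the two rays are non-commuting and intersect only at the origin (the state and subspaces are arbitrary choices, not taken from the paper):

```python
import numpy as np

def projector(v):
    """Rank-1 projector onto the ray spanned by v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

P1 = projector(np.array([1.0, 0.0]))      # subspace H1
P2 = projector(np.array([1.0, 1.0]))      # subspace H2, non-commuting with P1
P_join = np.eye(2)                        # H1 v H2: the two rays span C^2
P_meet = np.zeros((2, 2))                 # H1 ^ H2: the rays meet only at 0

rho = projector(np.array([0.3, 1.0]))     # an arbitrary pure state
prob = lambda P: float(np.trace(rho @ P).real)

print("p(H1) + p(H2)           =", prob(P1) + prob(P2))
print("p(H1 v H2) + p(H1 ^ H2) =", prob(P_join) + prob(P_meet))   # differs
print("[P1, P2] =\n", P1 @ P2 - P2 @ P1)  # nonzero commutator drives the gap
```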

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Echocardiographic findings in infants with presumed congenital Zika syndrome: Retrospective case series study

    Science.gov (United States)

    Santos, Cleusa C.; Feitosa, Fabiana G.; Ribeiro, Maria C.; Menge, Paulo; Lira, Izabelle M.

    2017-01-01

    Objective To report the echocardiographic evaluation of 103 infants with presumed congenital Zika syndrome. Methods An observational retrospective study was performed at Instituto de Medicina Integral Prof. Fernando Figueira (IMIP), Recife, Brazil. 103 infants with presumed congenital Zika syndrome. All infants had microcephaly and head computed tomography findings compatible with congenital Zika syndrome. Zika IgM antibody was detected in cerebrospinal fluid samples of 23 infants. In 80 infants, the test was not performed because it was not available at that time. All infants had negative serology for HIV, syphilis, rubella, cytomegalovirus and toxoplasmosis. A complete transthoracic two-dimensional, M-mode, continuous wave and pulsed wave Doppler and color Doppler echocardiographic (PHILIPS HD11XE or HD15) examination was performed on all infants. Results 14/103 (13.5%) echocardiograms were compatible with congenital heart disease: 5 with an ostium secundum atrial septal defect, 8 had a hemodynamically insignificant small apical muscular ventricular septal defect and one infant with dyspnea had a large membranous ventricular septal defect. The echocardiograms considered normal included 45 infants with a persistent foramen ovale and 16 with a minimum patent ductus arteriosus. Conclusions Preliminarily this study suggests that congenital Zika syndrome may be associated with an increased prevalence of congenital heart disease. However, the types of defects noted were septal defects, a proportion of which would not be hemodynamically significant. PMID:28426680

  14. Echocardiographic findings in infants with presumed congenital Zika syndrome: Retrospective case series study.

    Directory of Open Access Journals (Sweden)

    Danielle Di Cavalcanti

    To report the echocardiographic evaluation of 103 infants with presumed congenital Zika syndrome. An observational retrospective study was performed at Instituto de Medicina Integral Prof. Fernando Figueira (IMIP), Recife, Brazil, on 103 infants with presumed congenital Zika syndrome. All infants had microcephaly and head computed tomography findings compatible with congenital Zika syndrome. Zika IgM antibody was detected in cerebrospinal fluid samples of 23 infants. In 80 infants, the test was not performed because it was not available at that time. All infants had negative serology for HIV, syphilis, rubella, cytomegalovirus and toxoplasmosis. A complete transthoracic two-dimensional, M-mode, continuous wave and pulsed wave Doppler and color Doppler echocardiographic (PHILIPS HD11XE or HD15) examination was performed on all infants. 14/103 (13.5%) echocardiograms were compatible with congenital heart disease: 5 with an ostium secundum atrial septal defect, 8 had a hemodynamically insignificant small apical muscular ventricular septal defect and one infant with dyspnea had a large membranous ventricular septal defect. The echocardiograms considered normal included 45 infants with a persistent foramen ovale and 16 with a minimum patent ductus arteriosus. Preliminarily, this study suggests that congenital Zika syndrome may be associated with an increased prevalence of congenital heart disease. However, the types of defects noted were septal defects, a proportion of which would not be hemodynamically significant.

  15. Regularized Regression and Density Estimation based on Optimal Transport

    KAUST Repository

    Burger, M.

    2012-03-11

    The aim of this paper is to investigate a novel nonparametric approach for estimating and smoothing density functions as well as probability densities from discrete samples based on a variational regularization method with the Wasserstein metric as a data fidelity. The approach allows a unified treatment of discrete and continuous probability measures and is hence attractive for various tasks. In particular, the variational model for special regularization functionals yields a natural method for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations and provide a detailed analysis. Moreover, we compute special self-similar solutions for standard regularization functionals and we discuss several computational approaches and results.

  16. Exploring effective interactions through transition charge density ...

    Indian Academy of Sciences (India)

    tematics like reduced transition probabilities B(E2) and static quadrupole moments Q(2) ... approximations of solving large scale shell model problems in Monte Carlo meth- ... We present the theoretical study of transition charge densities.

  17. Information geometry of density matrices and state estimation

    International Nuclear Information System (INIS)

    Brody, Dorje C

    2011-01-01

    Given a pure state vector |x) and a density matrix ρ-hat, the function p(x|ρ-hat) = (x|ρ-hat|x) defines a probability density on the space of pure states parameterised by density matrices. The associated Fisher-Rao information measure is used to define a unitary invariant Riemannian metric on the space of density matrices. An alternative derivation of the metric, based on square-root density matrices and trace norms, is provided. This is applied to the problem of quantum-state estimation. In the simplest case of unitary parameter estimation, new higher-order corrections to the uncertainty relations, applicable to general mixed states, are derived. (fast track communication)

  18. Evaluation of neutron flux density and power density with SPN-detectors and micro calorimeters

    International Nuclear Information System (INIS)

    Gehre, G.; Rindelhardt, U.; Seidenkranz, T.; Hogel, J.; Jirousek, V.; Vazek, J.

    1983-02-01

    During investigations with a specially equipped fuel assembly in the Rheinsberg nuclear power station, the neutron flux and the power density were evaluated from measurements with SPN detectors and micro calorimeters. The reliability of both detector types, their measurement accuracy under different physical conditions and the usefulness of the developed calculation models are discussed in detail. The thermal flux and the power density evaluated with SPNDs agree well with theoretical results. The values obtained through micro calorimeter measurements are systematically lower, by about 18%. This deviation is probably a result of differences in the calculation models used. (author)

  19. [Consideration of algorithms to presume the lesion location by using X-ray images of the stomach--geometric analysis of four direction radiography for the U region].

    Science.gov (United States)

    Henmi, Shuichi

    2013-01-01

    The author considered algorithms to presume the lesion location from a series of X-ray images obtained by four-direction radiography without a blind area for the U region of the stomach. The objects of analysis were six cases in which protruding lesions were noticed in the U region. Firstly, from the length of the short axis and the measured lateral width of the U region projected on the film, we presumed the length of the longitudinal axis and the angle between the short axis and the film. Secondly, for the right and left sides of every image, we calculated the ratio of the distance to the stomach wall to the lateral width at the height passing through the center of the lesion. Using the lesion locations calculated from these values, we presumed the location at which the values from two images almost agreed to be the lesion location. As a result of the analysis, there were some cases in which the lesion location could be presumed with certainty or with some uncertainty, while in other cases the lesion location could not be presumed. Since the form of the U region can be distorted by a change of position, or the angle between the longitudinal axis and the sagittal plane can change, errors may have been introduced into the calculation, which is considered to be why the lesion location could not always be presumed.

  20. Inference of core barrel motion from neutron noise spectral density. [PWR]

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.C.; Shahrokhi, F.; Kryter, R.C.

    1977-03-15

    A method was developed for inference of core barrel motion from the following statistical descriptors: cross-power spectral density, autopower spectral density, and amplitude probability density. To quantify the core barrel motion in a typical pressurized water reactor (PWR), a scale factor was calculated in both one- and two-dimensional geometries using forward, variational, and perturbation methods of discrete ordinates neutron transport. A procedure for selection of the proper frequency band limits for the statistical descriptors was developed. It was found that although perturbation theory is adequate for the calculation of the scale factor, two-dimensional geometric effects are important enough to rule out the use of a one-dimensional approximation for all but the crudest calculations. It was also found that contributions of gamma rays can be ignored and that the results are relatively insensitive to the cross-section set employed. The proper frequency band for the statistical descriptors is conveniently determined from the coherence and phase information from two ex-core power range neutron monitors positioned diametrically across the reactor vessel. Core barrel motion can then be quantified from the integral of the band-limited cross-power spectral density of two diametrically opposed ex-core monitors or, if the coherence between the pair is greater than or equal to 0.7, from a properly band-limited amplitude probability density function. Wide-band amplitude probability density functions were demonstrated to yield erroneous estimates for the magnitude of core barrel motion.
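
    A sketch of the quantification step described above, integrating the band-limited cross-power spectral density of two diametrically opposed detectors; the synthetic signals, band limits and scale factor are placeholders (a real scale factor would come from the transport calculations):

```python
import numpy as np
from scipy.signal import csd, coherence
from scipy.integrate import trapezoid

fs = 200.0                                    # sampling rate (Hz)
t = np.arange(0.0, 600.0, 1.0 / fs)
rng = np.random.default_rng(3)

motion = np.sin(2.0 * np.pi * 8.0 * t)        # pretend 8 Hz beam-mode motion
x = motion + rng.standard_normal(t.size)      # ex-core detector 1
y = -motion + rng.standard_normal(t.size)     # detector 2, 180 deg out of phase

f, Pxy = csd(x, y, fs=fs, nperseg=4096)       # cross-power spectral density
f, Cxy = coherence(x, y, fs=fs, nperseg=4096)
band = (f >= 7.0) & (f <= 9.0)                # band picked from coherence/phase

rms = np.sqrt(trapezoid(np.abs(Pxy[band]), f[band]))
scale_factor = 1.0                            # would come from transport theory
print("peak coherence in band:", float(Cxy[band].max()))
print("inferred RMS motion (arb. units):", scale_factor * rms)
```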

  1. White matter hyperintensities of presumed vascular origin: a population-based study in rural Ecuador (The Atahualpa Project).

    Science.gov (United States)

    Del Brutto, Oscar H; Mera, Robertino M; Del Brutto, Victor J; Zambrano, Mauricio; Lama, Julio

    2015-04-01

    Cerebral small vessel disease is probably one of the most common pathogenetic mechanisms underlying stroke in Latin America. However, the importance of silent markers of small vessel disease, including white matter hyperintensities of presumed vascular origin, has not been assessed so far. The study aims to evaluate prevalence and correlates of white matter hyperintensities in community-dwelling elders living in Atahualpa (rural Ecuador). Atahualpa residents aged ≥ 60 years were identified during a door-to-door survey and invited to undergo brain magnetic resonance imaging for identification and grading white matter hyperintensities and other markers of small vessel disease. Using multivariate logistic regression models, we evaluated whether white matter hyperintensities is associated with demographics, cardiovascular health status, stroke, cerebral microbleeds, and cortical atrophy, after adjusting for the other variables. Out of 258 enrolled persons (mean age, 70 ± 8 years; 59% women), 172 (67%) had white matter hyperintensities, which were moderate to severe in 63. Analyses showed significant associations of white matter hyperintensities presence and severity with age and cardiovascular health status, as well as with overt and silent strokes, and a trend for association with cerebral microbleeds and cortical atrophy. Prevalence and correlates of white matter hyperintensities in elders living in rural Ecuador is almost comparable with that reported from industrialized nations, reinforcing the concept that the burden of small vessel disease is on the rise in underserved Latin American populations.

  2. On the magnetization process and the associated probability in anisotropic cubic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Khedr, D.M., E-mail: doaamohammed88@gmail.com [Department of Basic Science, Modern Academy of Engineering and Technology at Maadi, Cairo (Egypt); Aly, Samy H.; Shabara, Reham M. [Department of Physics, Faculty of Science at Damietta, University of Damietta, Damietta (Egypt); Yehia, Sherif [Department of Physics, Faculty of Science at Helwan, University of Helwan, Helwan (Egypt)

    2017-05-15

    We present a theoretical method to calculate specific magnetic properties, e.g. magnetization curves, magnetic susceptibility and probability landscapes, along the [100], [110] and [111] crystallographic directions of a crystal of cubic symmetry. The probability landscape displays the evolution of the most probable angular orientation of the magnetization vector for selected temperatures and magnetic fields. Our method is based on the premises of classical statistical mechanics. The energy density used in the partition function is the sum of the magnetic anisotropy and Zeeman energies; no other energies, e.g. elastic or magnetoelastic terms, are considered in the present work. Model cubic systems of diverse anisotropies are analyzed first, and subsequently material magnetic systems of cubic symmetry, namely iron, nickel and Co_x Fe_(100−x) compounds, are discussed. We highlight a correlation between magnetization curves and the associated probability landscapes. In addition, determination of the easiest axes of magnetization, using energy considerations, is done and compared with the results of the present method.
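
    A sketch of such a probability landscape for a cubic magnet, taking the energy density as cubic anisotropy plus a Zeeman term and weighting orientations by the Boltzmann factor; all material parameters are illustrative, not the values for iron, nickel or Co-Fe:

```python
import numpy as np

K1 = 4.8e4                 # cubic anisotropy constant (J/m^3), illustrative
zeeman = 2.0e4             # mu_0 * Ms * H for a field along [001] (J/m^3)
V = (8e-9) ** 3            # particle volume (m^3)
kT = 1.381e-23 * 300.0     # thermal energy at 300 K (J)

theta, phi = np.meshgrid(np.linspace(0.0, np.pi, 181),
                         np.linspace(0.0, 2.0 * np.pi, 361), indexing="ij")
a1 = np.sin(theta) * np.cos(phi)      # direction cosines of the magnetization
a2 = np.sin(theta) * np.sin(phi)
a3 = np.cos(theta)

E = K1 * (a1**2 * a2**2 + a2**2 * a3**2 + a3**2 * a1**2) - zeeman * a3
w = np.exp(-E * V / kT) * np.sin(theta)   # Boltzmann factor x solid-angle weight
p = w / w.sum()                           # discrete probability landscape

i = np.unravel_index(p.argmax(), p.shape)
print("most probable orientation: theta = %.1f deg, phi = %.1f deg"
      % (np.degrees(theta[i]), np.degrees(phi[i])))
```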

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  4. On the uniform convergence of the empirical density of an ergodic diffusion

    NARCIS (Netherlands)

    Zanten, van J.H.

    2000-01-01

    We investigate the uniform convergence of the density of the empirical measure of an ergodic diffusion. It is known that under certain conditions on the drift and diffusion coefficients of the diffusion, the empirical density f_t converges in probability to the invariant density f, uniformly on the

  5. An evaluation method for tornado missile strike probability with stochastic correction

    Energy Technology Data Exchange (ETDEWEB)

    Eguchi, Yuzuru; Murakami, Takahiro; Hirakuchi, Hiromaru; Sugimoto, Soichiro; Hattori, Yasuo [Nuclear Risk Research Center (External Natural Event Research Team), Central Research Institute of Electric Power Industry, Abiko (Japan)

    2017-03-15

    An efficient evaluation method for the probability of a tornado missile strike without using the Monte Carlo method is proposed in this paper. A major part of the proposed probability evaluation is based on numerical results computed using an in-house code, the Tornado-borne missile analysis code, which enables us to evaluate the liftoff and flight behaviors of unconstrained objects on the ground driven by a tornado. Using the Tornado-borne missile analysis code, we can obtain a stochastic correlation between local wind speed and flight distance of each object, and this stochastic correlation is used to evaluate the conditional strike probability, Q_V(r), of a missile located at position r, where the local wind speed is V. In addition, the annual exceedance probability of local wind speed, which can be computed using a tornado hazard analysis code, is used to derive the probability density function, p(V). We then obtain the annual probability of a tornado missile strike on a structure by integrating the product of Q_V(r) and p(V) over V. The evaluation method is applied to a simple problem to qualitatively confirm its validity, and to quantitatively verify the results for two extreme cases in which an object is located either in the immediate vicinity of or far away from the structure.
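
    The final step described above is a one-dimensional integral over wind speed. The sketch below carries it out numerically with stand-in functions: an assumed exponential annual exceedance curve, differentiated to obtain p(V), and an assumed logistic form for the conditional strike probability Q_V(r) at a fixed position. Neither function comes from the paper.

    ```python
    # Hedged sketch of the convolution step: annual strike probability as the
    # integral of Q_V(r) against the wind-speed density p(V). Both functions
    # below are illustrative stand-ins, not the paper's computed tables.
    import numpy as np

    V = np.linspace(20.0, 120.0, 1001)            # local wind speed grid, m/s

    def P_exceed(v):                              # assumed annual exceedance curve
        return 1e-3 * np.exp(-(v - 20.0) / 15.0)

    p = -np.gradient(P_exceed(V), V)              # p(V) = -dP_exc/dV

    def Q(v):                                     # assumed conditional strike probability Q_V(r)
        return 1.0 / (1.0 + np.exp(-(v - 70.0) / 5.0))

    # annual strike probability: integral of Q_V(r) * p(V) over V (Riemann sum)
    annual = float(np.sum(Q(V) * p) * (V[1] - V[0]))
    print(f"annual strike probability ~ {annual:.3e}")
    ```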

  6. An evaluation method for tornado missile strike probability with stochastic correction

    International Nuclear Information System (INIS)

    Eguchi, Yuzuru; Murakami, Takahiro; Hirakuchi, Hiromaru; Sugimoto, Soichiro; Hattori, Yasuo

    2017-01-01

    An efficient evaluation method for the probability of a tornado missile strike without using the Monte Carlo method is proposed in this paper. A major part of the proposed probability evaluation is based on numerical results computed using an in-house code, the Tornado-borne missile analysis code, which enables us to evaluate the liftoff and flight behaviors of unconstrained objects on the ground driven by a tornado. Using the Tornado-borne missile analysis code, we can obtain a stochastic correlation between local wind speed and flight distance of each object, and this stochastic correlation is used to evaluate the conditional strike probability, Q_V(r), of a missile located at position r, where the local wind speed is V. In addition, the annual exceedance probability of local wind speed, which can be computed using a tornado hazard analysis code, is used to derive the probability density function, p(V). We then obtain the annual probability of a tornado missile strike on a structure by integrating the product of Q_V(r) and p(V) over V. The evaluation method is applied to a simple problem to qualitatively confirm its validity, and to quantitatively verify the results for two extreme cases in which an object is located either in the immediate vicinity of or far away from the structure.

  7. Density of American black bears in New Mexico

    Science.gov (United States)

    Gould, Matthew J.; Cain, James W.; Roemer, Gary W.; Gould, William R.; Liley, Stewart

    2018-01-01

    Considering advances in noninvasive genetic sampling and spatially explicit capture–recapture (SECR) models, the New Mexico Department of Game and Fish sought to update their density estimates for American black bear (Ursus americanus) populations in New Mexico, USA, to aid in setting sustainable harvest limits. We estimated black bear density in the Sangre de Cristo, Sandia, and Sacramento Mountains, New Mexico, 2012–2014. We collected hair samples from black bears using hair traps and bear rubs and used a sex marker and a suite of microsatellite loci to individually genotype hair samples. We then estimated density in a SECR framework using sex, elevation, land cover type, and time to model heterogeneity in detection probability and the spatial scale over which detection probability declines. We sampled the populations using 554 hair traps and 117 bear rubs and collected 4,083 hair samples. We identified 725 (367 male, 358 female) individuals. Our density estimates varied from 16.5 bears/100 km² (95% CI = 11.6–23.5) in the southern Sacramento Mountains to 25.7 bears/100 km² (95% CI = 13.2–50.1) in the Sandia Mountains. Overall, detection probability at the activity center (g0) was low across all study areas and ranged from 0.00001 to 0.02. The low values of g0 were primarily a result of half of all hair samples for which genotypes were attempted failing to produce a complete genotype. We speculate that the low success we had genotyping hair samples was due to exceedingly high levels of ultraviolet (UV) radiation that degraded the DNA in the hair. Despite sampling difficulties, we were able to produce density estimates with levels of precision comparable to those estimated for black bears elsewhere in the United States.
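
    A minimal sketch of the detection model that underlies SECR, assuming the common half-normal form: the per-occasion detection probability of a bear whose activity center lies a distance d from a hair trap is g0·exp(−d²/2σ²). The g0 and σ values below are illustrative, with g0 in the low range reported above.

    ```python
    # Hedged sketch of the SECR detection function; parameters are assumed,
    # not fitted values from the study.
    import numpy as np

    def halfnormal_detection(d, g0=0.01, sigma=2000.0):
        """Per-occasion detection probability at distance d (m) from the activity center."""
        return g0 * np.exp(-d**2 / (2.0 * sigma**2))

    def p_caught(d, K=10, **kw):
        """Probability a bear is detected at least once in K occasions at one trap."""
        return 1.0 - (1.0 - halfnormal_detection(d, **kw)) ** K

    for d in (0.0, 1000.0, 3000.0, 6000.0):
        print(d, round(p_caught(d), 5))   # detection decays with distance
    ```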

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
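
    One classic instance, sketched below with no assumptions beyond a uniformly random shuffle: the probability that a random permutation of n items fixes no item (a derangement) tends to 1/e, which a short Monte Carlo run confirms.

    ```python
    # Monte Carlo check that the derangement probability approaches 1/e.
    import math, random

    def no_fixed_point(n):
        p = list(range(n))
        random.shuffle(p)
        return all(p[i] != i for i in range(n))

    n, trials = 52, 100_000
    hits = sum(no_fixed_point(n) for _ in range(trials))
    print(hits / trials, "vs 1/e =", 1 / math.e)
    ```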

  9. A semi-mechanistic approach to calculate the probability of fuel defects

    International Nuclear Information System (INIS)

    Tayal, M.; Millen, E.; Sejnoha, R.

    1992-10-01

    In this paper the authors describe the status of a semi-mechanistic approach to the calculation of the probability of fuel defects. This approach expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The calculations of defect probability continue to reflect the influences of the conventional parameters like power ramp, burnup and CANLUB. In addition, the new approach provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation, for example, pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, and coolant temperature and pressure. The approach has been validated against a previous empirical correlation. An illustrative example shows how the defect thresholds are influenced by changes in the internal design of the element and in the coolant pressure. (Author) (7 figs., tab., 12 refs.)

  10. Probability Density Functions for the CALIPSO Lidar Version 4 Cloud-Aerosol Discrimination (CAD) Algorithm

    Science.gov (United States)

    Liu, Z.; Kar, J.; Zeng, S.; Tackett, J. L.; Vaughan, M.; Trepte, C. R.; Omar, A. H.; Hu, Y.; Winker, D. M.

    2017-12-01

    In the CALIPSO retrieval algorithm, detection of layers in the lidar measurements is followed by their classification as "cloud" or "aerosol" using 5-dimensional probability density functions (PDFs). The five dimensions are the mean attenuated backscatter at 532 nm, the layer-integrated total attenuated color ratio, the mid-layer altitude, the integrated volume depolarization ratio, and latitude. The new version 4 (V4) level 2 (L2) data products, released in November 2016, are the first major revision to the L2 product suite since May 2010. Significant calibration changes in the V4 level 1 data necessitated substantial revisions to the V4 L2 CAD algorithm. Accordingly, a new set of PDFs was generated to derive the V4 L2 data products. The V4 CAD algorithm is now applied to layers detected in the stratosphere, where volcanic layers and occasional cloud and smoke layers are observed. Previously, these layers were designated as "stratospheric" and not further classified. The V4 CAD algorithm is also applied to all layers detected at single-shot (333 m) resolution. In prior data releases, single-shot detections were uniformly classified as clouds. The CAD PDFs used in the earlier releases were generated using a full year (2008) of CALIPSO measurements. Because the CAD algorithm was not applied to stratospheric features, the properties of these layers were not incorporated into the PDFs. When building the V4 PDFs, the 2008 data were augmented with additional data from June 2011, and all stratospheric features were included. The Nabro and Puyehue-Cordon volcanoes erupted in June 2011, and volcanic aerosol layers were observed in the upper troposphere and lower stratosphere in both the northern and southern hemispheres. The June 2011 data thus provide the stratospheric aerosol properties needed for comprehensive PDF generation. In contrast to earlier versions of the PDFs, which were generated based solely on observed distributions, construction of the V4 PDFs considered the
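
    The classification step can be pictured with a one-dimensional stand-in, sketched below: class-conditional PDFs are evaluated at a measured feature value and combined into a normalized difference, the general form of a cloud-aerosol confidence score. The two Gaussian PDFs and their parameters are purely illustrative assumptions, not the operational five-dimensional tables.

    ```python
    # Hedged 1-D stand-in for PDF-based cloud/aerosol classification.
    from scipy.stats import norm

    def cad_score(x, pdf_cloud, pdf_aerosol):
        """Normalized PDF difference: >0 cloud-like, <0 aerosol-like."""
        pc, pa = pdf_cloud(x), pdf_aerosol(x)
        return 100.0 * (pc - pa) / (pc + pa)

    # assumed class-conditional PDFs of one feature (attenuated backscatter)
    pdf_cloud   = norm(loc=0.05, scale=0.02).pdf
    pdf_aerosol = norm(loc=0.01, scale=0.005).pdf

    for x in (0.008, 0.02, 0.06):
        print(x, round(cad_score(x, pdf_cloud, pdf_aerosol), 1))
    ```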

  11. Planar-channeling spatial density under statistical equilibrium

    International Nuclear Information System (INIS)

    Ellison, J.A.; Picraux, S.T.

    1978-01-01

    The phase-space density for planar channeled particles has been derived for the continuum model under statistical equilibrium. This is used to obtain the particle spatial probability density as a function of incident angle. The spatial density is shown to depend on only two parameters, a normalized incident angle and a normalized planar spacing. This normalization is used to obtain, by numerical calculation, a set of universal curves for the spatial density and also for the channeled-particle wavelength as a function of amplitude. Using these universal curves, the statistical-equilibrium spatial density and the channeled-particle wavelength can be easily obtained for any case for which the continuum model can be applied. Also, a new one-parameter analytic approximation to the spatial density is developed. This parabolic approximation is shown to give excellent agreement with the exact calculations

  12. Effects of adult stocking density on egg production and viability in cultures of the calanoid copepod Acartia tonsa (Dana)

    DEFF Research Database (Denmark)

    Jepsen, Per Meyer; Andersen, Nikolaj; Holm, Thue

    2007-01-01

    The effect of stocking density of the calanoid copepod Acartia tonsa was evaluated in a 96 h rearing experiment. Possible density-dependent egg production and egg viability were analysed at stocking densities of 100, 200, 300, 400 and 600 adults L−1. Temperature, oxygen saturation and algal concentration were kept optimal. A non-density-dependent mortality rate of 15–19% day−1 was documented. A non-significant density-dependent egg production was observed between 100 and 600 adults L−1. The average egg production was 22.5±8.8 egg female−1 day−1 in all densities. The average egg hatching success was 84.7±4.8% and was never observed below 76.1%, with no significant differences across the stocking densities. Conclusively, as a practical recommendation for the aquaculture industry, copepod cultures with densities ranging from 100 to 600 adults L−1 and presumably even more dense cultures …

  13. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^PT|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^PT), was applied with k=0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k=1,2,…) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α=1/2), (standard) two-qubit (α=1), and two-quater[nionic]bit (α=2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.

  14. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  15. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  16. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  17. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  18. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
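
    The paper's sample code is in R; as a minimal sketch of the same idea (our choice of Python and scikit-learn, not the authors' code), a regression machine fit to a synthetic 0/1 response estimates P(Y = 1 | X) directly.

    ```python
    # Hedged sketch of a "probability machine": a nonparametric regression
    # machine fit to a binary response estimates conditional probabilities.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 3))
    p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))   # true P(Y=1|X)
    y = rng.binomial(1, p_true)                                  # binary response

    rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=25, random_state=0)
    rf.fit(X, y)
    p_hat = rf.predict(X[:5])        # regression predictions are probability estimates
    print(np.round(p_hat, 2), np.round(p_true[:5], 2))
    ```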

  19. Epidemiology of ocular disorders presumed to be inherited in three large Italian dog breeds in Italy.

    Science.gov (United States)

    Guandalini, Adolfo; Di Girolamo, Nicola; Santillo, Daniele; Andreani, Valentina; Corvi, Roberta; Bandini, Marina; Peruccio, Claudio

    2017-09-01

    To describe the epidemiology and the types of eye disorders that are presumed to be inherited (PIED) in three large Italian dog breeds. Three large Italian dog breeds: Neapolitan Mastiff (FCI code: 197), Maremma Sheepdog (FCI code: 201), and Italian Corso dog (FCI code: 343). All dogs that underwent a complete ophthalmic examination between 1992 and 2012 were included in this prospective observational study. The prevalence of eye disorders with 95% confidence intervals was reported for presumed healthy dogs and for dogs referred to a veterinary center for an ophthalmic consultation. Univariate and multivariate logistic regression techniques were used to generate odds ratios. Of 605 dogs examined during the study period, 351 dogs were affected by at least one PIED (58%; 95% CI: 54-62%). The prevalence of PIED was significantly lower in dogs presented for ophthalmic examination (53.8%) as compared to presumed healthy dogs (62.2%) (OR: 1.4; 95% CI: 1.02-1.9; P = 0.037). Also, after multivariate adjustment for the period of observation, the odds of the Neapolitan Mastiff (92.1%; OR: 21.4; 95% CI: 11.1-41.4) and of the Cane Corso (57.7%; OR: 2.5; 95% CI: 1.7-3.6) suffering a PIED were greater than those of the Maremma Sheepdog (35.4%). The most common PIED in each breed were entropion (24.3% of all the PIED) in the Neapolitan Mastiff, ectropion (36.6%) in the Corso dog, and cataract (27.9%) in the Maremma Sheepdog. Clinicians should be aware that three large Italian dog breeds frequently suffer PIED. Breed standards should be reconsidered, and breeding programs should be directed at limiting such disorders. © 2016 American College of Veterinary Ophthalmologists.

  20. Exact capture probability analysis of GSC receivers over i.n.d. Rayleigh fading channels

    KAUST Repository

    Nam, Sungsik; Yang, Hongchuan; Alouini, Mohamed-Slim; Kim, Dongin

    2013-01-01

    variates. With this motivation in mind, we first provide in this paper some new order statistics results in terms of both moment generating function (MGF) and probability density function (PDF) expressions under an i.n.d. assumption and then derive a new

  1. Estimating diurnal primate densities using distance sampling ...

    African Journals Online (AJOL)

    SARAH

    2016-03-31

    In the second session, we used 10 transects adjusted to transect (Grid 17 … session transect was visited 20 times while at the second session transect … probability, the density of the group and the group size of each species …

  2. Electron densities in planetary nebulae

    International Nuclear Information System (INIS)

    Stanghellini, L.; Kaler, J.B.

    1989-01-01

    Electron densities for 146 planetary nebulae have been obtained from an analysis of a large sample of forbidden lines, by interpolating theoretical curves obtained from solutions of the five-level atom using up-to-date collision strengths and transition probabilities. Electron temperatures were derived from forbidden N II and/or forbidden O III lines or were estimated from the He II 4686 A line strengths. The forbidden O II densities are generally lower than those from forbidden Cl III by an average factor of 0.65. For data sets in which forbidden O II and forbidden S II were observed in common, the forbidden O II values drop to 0.84 that of the forbidden S II, implying that the outermost parts of the nebulae might have elevated densities. The forbidden Cl III and forbidden Ar IV densities show the best correlation, especially where they have been obtained from common data sets. The data give results within 30 percent of one another, assuming homogeneous nebulae. 106 refs

  3. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    Science.gov (United States)

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect and, if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture, Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the highest influence on the detection probability. Averaged detection probability was 0.207 (s.e. 0.033) and based on this the mean number of visits required to determine within 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and
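
    The replication figure quoted above follows from the per-visit detection probability alone: absence can be inferred with 95% confidence after the smallest number of visits n with (1 − p)^n ≤ 0.05. A two-line check:

    ```python
    # Verify the visit count implied by the reported detection probability.
    import math

    p = 0.207
    n = math.ceil(math.log(0.05) / math.log(1.0 - p))
    print(n)   # -> 13, matching the mean estimate quoted in the abstract
    ```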

  4. Joint probability discrimination between stationary tissue and blood velocity signals

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2001-01-01

    This study presents a new statistical discriminator. Investigation of the RF-signals reveals that features can be derived that distinguish the segments of the signal which do and do not carry information on the blood flow. In this study 4 features have been determined: (a) the energy content in the segments before and after echo-canceling, and (b) the amplitude variations between samples in consecutive RF-signals before and after echo-canceling. The statistical discriminator was obtained by computing the probability density functions (PDFs) for each feature through histogram analysis of data. The discrimination is performed by determining the joint probability of the features for the segment under investigation and choosing the segment type that is most likely. The method was tested on simulated data resembling RF-signals from the carotid artery.
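
    A minimal sketch of such a discriminator, assuming independence between features and using synthetic stand-ins for the RF-signal features: per-feature PDFs are estimated by histograms for each class, and a segment is assigned to the class with the larger joint probability.

    ```python
    # Hedged sketch of histogram-PDF joint-probability discrimination.
    import numpy as np

    rng = np.random.default_rng(1)
    train = {"tissue": rng.normal([0.0, 0.0], 0.5, size=(2000, 2)),
             "blood":  rng.normal([1.5, 1.0], 0.7, size=(2000, 2))}

    edges = [np.linspace(-3, 5, 41)] * 2   # shared bin edges per feature
    pdfs = {c: [np.histogram(train[c][:, j], bins=edges[j], density=True)[0] + 1e-9
                for j in range(2)] for c in train}

    def classify(x):
        scores = {}
        for c in pdfs:
            joint = 1.0
            for j, xj in enumerate(x):
                k = np.clip(np.searchsorted(edges[j], xj) - 1, 0, len(edges[j]) - 2)
                joint *= pdfs[c][j][k]     # independence across features assumed
            scores[c] = joint
        return max(scores, key=scores.get)

    print(classify([0.1, -0.2]), classify([1.6, 1.1]))   # -> tissue, blood
    ```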

  5. The effects of forward speed and depth of conservation tillage on soil bulk density

    Directory of Open Access Journals (Sweden)

    A Mahmoudi

    2015-09-01

    …, besides the importance of tillage depth and speed in different tiller performance. Materials and methods: This investigation was carried out using randomized blocks in a split-plot experimental design. The main factor, tillage depth, was 10 and 20 cm (two levels), and the second factor, tillage speed, was 6, 8, 10 and 12 km h−1 (four levels) for Bostan-Abad and 8, 10, 12 and 14 km h−1 for Hashtrood, with four repetitions. The work was carried out using a combined tillage implement made by the Sazeh Keshte Bukan Company, which is mostly used in Eastern Azerbaijan, with Massey Ferguson 285 and 399 tractors in Bostan-Abad and Hashtrood, respectively. In this investigation, soil bulk density was studied at two sampling depths of 7 and 17 centimeters. Bulk density is an indicator of soil compaction. It is calculated as the dry weight of soil divided by its volume. This volume includes the volume of soil particles and the volume of pores among soil particles. Bulk density is typically expressed in g cm−3. Results and Discussion: In this study, the effect of both factors on soil bulk density at the sampling depths of 5-10 and 15-20 cm was examined. In Bostan-Abad, the effect of tillage speed on soil bulk density was significant at the 1% probability level, and the effect of tillage depth was significant at the 5% probability level. The interaction effect of tillage speed and depth on soil bulk density was significant at the 1% probability level, and the effect of sampling depth was significant at the 5% probability level. In Hashtrood, the effect of tillage speed on soil bulk density was significant at the 1% probability level, and the effect of tillage depth was significant at the 5% level. The interaction effect of tillage speed and depth on soil bulk density was significant at the 5% level of probability. Regarding the depth of sampling it was significant on soil bulk

  6. Towards predicting wading bird densities from predicted prey densities in a post-barrage Severn estuary

    International Nuclear Information System (INIS)

    Goss-Custard, J.D.; McGrorty, S.; Clarke, R.T.; Pearson, B.; Rispin, W.E.; Durell, S.E.A. le V. dit; Rose, R.J.; Warwick, R.M.; Kirby, R.

    1991-01-01

    A winter survey of seven species of wading birds in six estuaries in south-west England was made to develop a method for predicting bird densities should a tidal power barrage be built on the Severn estuary. Within most estuaries, bird densities correlated with the densities of widely taken prey species. A barrage would substantially reduce the area of intertidal flats available at low water for the birds to feed, but invertebrate density could increase in the generally more benign post-barrage environmental conditions. Wader densities would have to increase approximately twofold to allow the same overall numbers of birds to remain post-barrage as occur on the Severn at present. Provisional estimates are given of the increases in prey density required to allow bird densities to increase by this amount. With the exception of the prey of dunlin, these fall well within the ranges of densities found in other estuaries, and so could in principle be attained in the post-barrage Severn. An attempt was made to derive equations with which to predict post-barrage densities of invertebrates from easily measured, static environmental variables. The fact that a site was in the Severn had a significant additional effect on invertebrate density in seven cases. This suggests that there is a special feature of the Severn, probably one associated with its highly dynamic nature. This factor must be identified if the post-barrage densities of invertebrates are to be successfully predicted. (author)

  7. Estimation of Extreme Responses and Failure Probability of Wind Turbines under Normal Operation by Controlled Monte Carlo Simulation

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri

    … an alternative approach for estimation of the first excursion probability of any system is based on calculating the evolution of the Probability Density Function (PDF) of the process and integrating it on the specified domain. Clearly this provides the most accurate results among the three classes of the methods. The solution of the Fokker-Planck-Kolmogorov (FPK) equation for systems governed by a stochastic differential equation driven by Gaussian white noise will give the sought time variation of the probability density function. However, the analytical solution of the FPK is available for only a few dynamic systems … The introduced method models the evolution of the PDF of a stochastic process, hence providing an alternative to the FPK. The considerable advantage of the introduced method over the FPK is that its solution does not require high computational cost, which extends its range of applicability to high-order structural dynamic problems.

  8. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  9. Determination of probability density functions for parameters in the Munson-Dawson model for creep behavior of salt

    International Nuclear Information System (INIS)

    Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.

    1992-10-01

    The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
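
    The distribution-fitting step can be sketched generically as below: fit lognormal and Weibull candidates to a set of parameter estimates and compare their Kolmogorov-Smirnov statistics. The sample here is synthetic; the paper fit estimates obtained from WIPP salt creep tests, using the packages named above.

    ```python
    # Hedged sketch of candidate-distribution fitting with a KS comparison.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    estimates = rng.lognormal(mean=0.5, sigma=0.4, size=60)   # stand-in parameter values

    for name, dist in [("lognormal", stats.lognorm), ("Weibull", stats.weibull_min)]:
        params = dist.fit(estimates, floc=0.0)                # fix location at zero
        ks = stats.kstest(estimates, dist.cdf, args=params)
        print(f"{name}: D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
    ```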

  10. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  11. Thermodynamic fluctuations and the monopole density of the early Universe

    International Nuclear Information System (INIS)

    Diosi, L.; Lukacs, B.

    1984-10-01

    The probability of thermodynamic fluctuations is calculated by explicitly using the Riemannian structure of the thermodynamic state space. By means of this probability distribution, a correlation volume can be defined. Identifying this volume with one domain in the GUT continuum at the symmetry breaking phase transition in the early Universe, a prediction can be obtained for the primordial monopole density. (author)

  12. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  13. Toward accurate and precise estimates of lion density.

    Science.gov (United States)

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

    Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and

  14. Non-linear density-dependent effects of an intertidal ecosystem engineer.

    Science.gov (United States)

    Harley, Christopher D G; O'Riley, Jaclyn L

    2011-06-01

    Ecosystem engineering is an important process in a variety of ecosystems. However, the relationship between engineer density and engineering impact remains poorly understood. We used experiments and a mathematical model to examine the role of engineer density in a rocky intertidal community in northern California. In this system, the whelk Nucella ostrina preys on barnacles (Balanus glandula and Chthamalus dalli), leaving empty barnacle tests as a resource (favorable microhabitat) for other species. Field experiments demonstrated that N. ostrina predation increased the availability of empty tests of both barnacle species, reduced the density of the competitively dominant B. glandula, and indirectly increased the density of the competitively inferior C. dalli. Empty barnacle tests altered microhabitat humidity, but not temperature, and presumably provided a refuge from wave action. The herbivorous snail Littorina plena was positively associated with empty test availability in both observational comparisons and experimental manipulations of empty test availability, and L. plena density was elevated in areas with foraging N. ostrina. To explore the effects of variation in N. ostrina predation, we constructed a demographic matrix model for barnacles in which we varied predation intensity. The model predicted that number of available empty tests increases with predation intensity to a point, but declines when predation pressure was strong enough to severely reduce adult barnacle densities. The modeled number of available empty tests therefore peaked at an intermediate level of N. ostrina predation. Non-linear relationships between engineer density and engineer impact may be a generally important attribute of systems in which engineers influence the population dynamics of the species that they manipulate.

  15. A measurement of the turbulence-driven density distribution in a non-star-forming molecular cloud

    Energy Technology Data Exchange (ETDEWEB)

    Ginsburg, Adam; Darling, Jeremy [CASA, University of Colorado, 389-UCB, Boulder, CO 80309 (United States); Federrath, Christoph, E-mail: Adam.G.Ginsburg@gmail.com [Monash Centre for Astrophysics, School of Mathematical Sciences, Monash University, Vic 3800 (Australia)

    2013-12-10

    Molecular clouds are supersonically turbulent. This turbulence governs the initial mass function and the star formation rate. In order to understand the details of star formation, it is therefore essential to understand the properties of turbulence, in particular the probability distribution of density in turbulent clouds. We present H{sub 2}CO volume density measurements of a non-star-forming cloud along the line of sight toward W49A. We use these measurements in conjunction with total mass estimates from {sup 13}CO to infer the shape of the density probability distribution function. This method is complementary to measurements of turbulence via the column density distribution and should be applicable to any molecular cloud with detected CO. We show that turbulence in this cloud is probably compressively driven, with a compressive-to-total Mach number ratio b = M{sub C}/M > 0.4. We measure the standard deviation of the density distribution, constraining it to the range 1.5 < σ{sub s} < 1.9, assuming that the density is lognormally distributed. This measurement represents an essential input into star formation laws. The method of averaging over different excitation conditions to produce a model of emission from a turbulent cloud is generally applicable to optically thin line observations.
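
    For readers wanting to reproduce the arithmetic linking these quantities, the sketch below inverts the standard lognormal variance-Mach relation σ{sub s}² = ln(1 + b²M²) (e.g. Federrath et al.) to obtain b from σ{sub s}; the Mach numbers used are assumed for illustration and are not the paper's measured values.

    ```python
    # Hedged check of the sigma_s -> b inversion under the standard
    # lognormal variance-Mach relation sigma_s^2 = ln(1 + b^2 M^2).
    import numpy as np

    def b_from_sigma(sigma_s, mach):
        return np.sqrt(np.expm1(sigma_s**2)) / mach

    for sigma_s in (1.5, 1.9):
        for mach in (5.0, 10.0):          # assumed Mach numbers
            print(sigma_s, mach, round(b_from_sigma(sigma_s, mach), 2))
    ```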

  16. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    International Nuclear Information System (INIS)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A.; Myers, Adam D.; Hennawi, Joseph F.; McMahon, Richard G.; Schiminovich, David; Sheldon, Erin S.; Brinkmann, Jon; Schneider, Donald P.

    2012-01-01

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  17. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  18. Examining how presumed media influence affects social norms and adolescents' attitudes and drinking behavior intentions in rural Thailand.

    Science.gov (United States)

    Ho, Shirley S; Poorisat, Thanomwong; Neo, Rachel L; Detenber, Benjamin H

    2014-01-01

    This study uses the influence of presumed media influence model as the theoretical framework to examine how perceived social norms (i.e., descriptive, subjective, and injunctive norms) will mediate the influence of pro- and antidrinking media messages on adolescents' intention to consume alcohol in rural Thailand. Data collected from 1,028 high school students indicate that different mechanisms underlie drinking intentions between nondrinkers and those who have consumed alcohol or currently drink. Among nondrinkers, perceived peer attention to prodrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to consume alcohol through all three types of perceived social norms. Among drinkers, perceived peer attention to pro- and antidrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to drink alcohol through perceived subjective norm. The findings provide support for the extended influence of presumed media influence model and have practical implications for how antidrinking campaigns targeted at teenagers in Thailand might be designed.

  19. Density Estimation in Several Populations With Uncertain Population Membership

    KAUST Repository

    Ma, Yanyuan

    2011-09-01

    We devise methods to estimate probability density functions of several populations using observations with uncertain population membership, meaning from which population an observation comes is unknown. The probability of an observation being sampled from any given population can be calculated. We develop general estimation procedures and bandwidth selection methods for our setting. We establish large-sample properties and study finite-sample performance using simulation studies. We illustrate our methods with data from a nutrition study.
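
    The central device can be sketched in a few lines: every observation contributes to each population's kernel density estimate with a weight equal to its membership probability. The data, the logistic membership model and the fixed bandwidth below are all illustrative assumptions (the paper develops its own bandwidth selection methods).

    ```python
    # Hedged sketch of membership-weighted kernel density estimation.
    import numpy as np

    def weighted_kde(x_grid, data, weights, bandwidth):
        """Gaussian KDE in which observation i carries weight weights[i]."""
        w = np.asarray(weights, float)
        w = w / w.sum()
        u = (x_grid[:, None] - data[None, :]) / bandwidth
        k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
        return (k * w[None, :]).sum(axis=1) / bandwidth

    rng = np.random.default_rng(3)
    data = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 300)])
    p_pop1 = 1 / (1 + np.exp(2.0 * (data - 2.0)))   # assumed membership probabilities

    grid = np.linspace(-4, 8, 200)
    f1 = weighted_kde(grid, data, p_pop1, bandwidth=0.5)       # population 1
    f2 = weighted_kde(grid, data, 1 - p_pop1, bandwidth=0.5)   # population 2
    print(grid[np.argmax(f1)], grid[np.argmax(f2)])            # modes near 0 and 4
    ```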

  20. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  1. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. A lot of factors affect the value of the probability. In this article, by using statistical and econometric models, some influencing factors are examined. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
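
    A minimal sketch of the kind of binary logit model described, on synthetic data whose effect signs mirror the reported findings (probability rising with the contract sum, rising with remoteness, falling with later birth month). The variable names and coefficients are ours, not the article's data fields.

    ```python
    # Hedged sketch of a binary logit model for loan-return probability.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    n = 2000
    loan_sum   = rng.uniform(1.0, 20.0, n)     # contract sum, thousands (assumed)
    remoteness = rng.uniform(0.0, 100.0, n)    # distance of borrower, km (assumed)
    birth_mon  = rng.integers(1, 13, n)        # month of birth, 1-12

    logit_p = -1.0 + 0.10 * loan_sum + 0.01 * remoteness - 0.05 * birth_mon
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))   # 1 = loan returned

    X = np.column_stack([loan_sum, remoteness, birth_mon])
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(np.round(model.coef_, 3))                   # signs mirror the simulated effects
    print(np.round(model.predict_proba(X[:3])[:, 1], 2))   # estimated return probabilities
    ```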

  2. Consenting options for posthumous organ donation: presumed consent and incentives are not favored

    Directory of Open Access Journals (Sweden)

    Hammami Muhammad M

    2012-11-01

    Background: Posthumous organ procurement is hindered by the consenting process. Several consenting systems have been proposed. There is limited information on public relative attitudes towards various consenting systems, especially in Middle Eastern/Islamic countries. Methods: We surveyed 698 Saudi adults attending outpatient clinics at a tertiary care hospital. Preference and perception of norm regarding consenting options for posthumous organ donation were explored. Participants ranked the following randomly presented options from 1 (most agreeable) to 11: no organ donation, presumed consent, informed consent by donor only, informed consent by donor or surrogate, and mandated choice; the last three options ± medical or financial incentive. Results: Mean (SD) age was 32 (9) years, 27% were males, 50% were patients' companions, 60% had ≥ college education, and 20% and 32%, respectively, knew an organ donor or recipient. Mandated choice was among the top three choices for preference of 54% of respondents, with an overall median [25%, 75%] ranking score of 3 [2, 6], and was preferred over donor-or-surrogate informed consent (4 [2, 7], p … vs. 11 [6, 11], respectively, p = 0.002). Compared to females, males more often perceived donor-or-surrogate informed consent as the norm (3 [1, 6] vs. 5 [3, 7], p … vs. 8 [4, 9], p … vs. 5 [2, 7], p … Conclusions: We conclude that: (1) most respondents were in favor of posthumous organ donation, (2) the mandated choice system was the most preferred and the presumed consent system was the least preferred, (3) there was no difference between preference and perception of norm in consenting-system ranking, and (4) financial (especially in females) and medical (especially in males) incentives reduced preference.

  3. The presumed central nervous system effects of rocuronium in a neonate and its reversal with sugammadex.

    Science.gov (United States)

    Langley, Ross J; McFadzean, Jillian; McCormack, Jon

    2016-01-01

    We describe a 2-day-old male infant who received rocuronium as part of general anesthesia for a tracheal esophageal fistula repair. Postoperatively, he had prolonged central and peripheral neuromuscular blockade despite cessation of the rocuronium infusion several hours previously. This case discusses the presumed central nervous system effects of rocuronium in a neonate and its effective reversal with sugammadex. © 2015 John Wiley & Sons Ltd.

  4. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  5. The considering of the slowing down effect in the formalism of probability tables. Application to the effective cross section calculation

    International Nuclear Information System (INIS)

    Bouhelal, O.K.A.

    1990-01-01

    The exact determination of effective multigroup cross sections requires the numerical solution of the slowing-down equation on a very fine energy mesh. Given the complexity of these calculations, different approximation methods have been developed, but without a satisfactory treatment of the slowing-down effect. The usual methods are essentially based on interpolations using precalculated tables. The models that use probability tables reduce the amount of data and the computational effort. The various methods, proposed first by Soviet and then by American researchers, and finally the French method based on the ''moments of a probability distribution'', are incontestably valid within the framework of the statistical hypothesis. This hypothesis stipulates that the collision densities do not depend on the cross sections, so that there is no ambiguity in the effective cross-section calculation. The objective of our work is to show that non-statistical phenomena, such as the slowing-down effect, can be taken into account through probability tables that are able to represent both the neutronic quantities and the collision densities. The formalism used in the statistical hypothesis is based on the Gauss quadrature of the cross-section moments. In the non-statistical hypothesis we introduce crossed probability tables, using quadratures of double integrals of cross-section moments. Moreover, a mathematical formalism allowing one to establish a relationship between the crossed probability tables and the collision densities was developed. This method was applied to uranium-238 in the range of resolved resonances, where the slowing-down effect is significant. The validity of the method and the analysis of the obtained results are studied through a reference calculation based on the solution of a discretized slowing-down equation using a very fine mesh, in which each microgroup can be correctly defined via the statistical probability tables. 42 figs., 32 tabs., 49 refs. (author)

  6. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted with some pre-sampling of points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with the combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
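
    A generic sketch of the three steps just described, under simple stand-in assumptions: the limit-state function g, the Gaussian importance density and the sample sizes are all illustrative; the paper couples the idea to a response surface of the AP1000 model.

    ```python
    # Hedged sketch of adaptive importance sampling for a failure probability.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    g = lambda x: 4.0 - x.sum(axis=1)          # assumed limit state: failure when g <= 0

    # step 1: crude pre-sampling to locate points in the failure region
    x0 = rng.standard_normal((200_000, 2))
    fail = x0[g(x0) <= 0.0]

    # step 2: importance density = Gaussian fitted to the failure-region sample
    mu, cov = fail.mean(axis=0), np.cov(fail.T)

    # step 3: reweighted estimate with draws from the importance density
    xs = rng.multivariate_normal(mu, cov, size=50_000)

    def log_pdf(x, m, c):                      # multivariate normal log density
        d = x - m
        quad = np.einsum("ij,jk,ik->i", d, np.linalg.inv(c), d)
        return -0.5 * (quad + np.log(np.linalg.det(2.0 * np.pi * c)))

    w = np.exp(log_pdf(xs, np.zeros(2), np.eye(2)) - log_pdf(xs, mu, cov))
    p_fail = np.mean((g(xs) <= 0.0) * w)
    print(p_fail, "exact:", norm.sf(4.0 / np.sqrt(2.0)))   # P(X1 + X2 >= 4), X ~ N(0, I)
    ```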

  7. THE DENSITY DISTRIBUTION IN TURBULENT BISTABLE FLOWS

    International Nuclear Information System (INIS)

    Gazol, Adriana; Kim, Jongsoo

    2013-01-01

    We numerically study the volume density probability distribution function (n-PDF) and the column density probability distribution function (Σ-PDF) resulting from thermally bistable turbulent flows. We analyze three-dimensional hydrodynamic models in periodic boxes of 100 pc on a side, where turbulence is driven in Fourier space at a wavenumber corresponding to 50 pc. At low densities (n ≲ … cm⁻³), the n-PDF is well described by a lognormal distribution for an average local Mach number ranging from ∼0.2 to ∼5.5. As a consequence of the nonlinear development of thermal instability (TI), the logarithmic variance of the distribution of the diffuse gas increases with M faster than in the well-known isothermal case. The average local Mach number for the dense gas (n ≳ 7.1 cm⁻³) goes from ∼1.1 to ∼16.9, and the shape of the high-density zone of the n-PDF changes from a power law at low Mach numbers to a lognormal at high M values. In the latter case, the width of the distribution is smaller than in the isothermal case and grows more slowly with M. At high column densities, the Σ-PDF is well described by a lognormal for all of the Mach numbers we consider and, due to the presence of TI, the width of the distribution is systematically larger than in the isothermal case but follows a qualitatively similar behavior as M increases. Although a relationship between the width of the distribution and M can be found for each of the cases mentioned above, these relations differ from those of the isothermal case.
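    Departures from the isothermal lognormal can be quantified by fitting the log-density moments. A small sketch follows (synthetic densities stand in for the simulation cells; the isothermal variance relation with b ≈ 0.5 is a standard assumption from the turbulence literature, not a result of this paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for gas densities n (cm^-3); in practice these come
# from the simulation snapshots analyzed in the paper.
n = rng.lognormal(mean=0.0, sigma=1.2, size=100_000)
s = np.log(n)                      # s = ln(n / n0): Gaussian if n is lognormal

mu_s, var_s = s.mean(), s.var()
print(f"logarithmic mean = {mu_s:.3f}, logarithmic variance = {var_s:.3f}")

# For isothermal turbulence the variance is often modeled as
# var_s = ln(1 + b^2 M^2); inverting gives the Mach number implied by the
# isothermal relation (b ~ 0.5 is an assumed driving-dependent factor).
b = 0.5
M_implied = np.sqrt(np.exp(var_s) - 1.0) / b
print(f"Mach number implied by the isothermal relation: {M_implied:.2f}")
```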

  8. Impact of stone density on outcomes in percutaneous nephrolithotomy (PCNL)

    DEFF Research Database (Denmark)

    Anastasiadis, Anastasios; Onal, Bulent; Modi, Pranjal

    2013-01-01

    were assigned to a low stone density [LSD, ≤ 1000 Hounsfield units (HU)] or high stone density (HSD, > 1000 HU) group based on the radiological density of the primary renal stone. Preoperative characteristics and outcomes were compared in the two groups. Results. Retreatment for residual stones...... was more frequent in the LSD group. The overall stone-free rate achieved was higher in the HSD group (79.3% vs 74.8%, p = 0.113). By univariate regression analysis, the probability of achieving a stone-free outcome peaked at approximately 1250 HU. Below or above this density resulted in lower treatment...

  9. Probable alpha and 14C cluster emission from hyper Ac nuclei

    International Nuclear Information System (INIS)

    Santhosh, K.P.

    2013-01-01

    A systematic study of the probability for the emission of ⁴He and ¹⁴C clusters from hyper Λ 207-234 Ac and non-strange normal 207-234 Ac nuclei is performed for the first time using our fission model, the Coulomb and proximity potential model (CPPM). The predicted half-lives show that hyper Λ 207-234 Ac nuclei are unstable against ⁴He emission, and that ¹⁴C emission from hyper Λ 217-228 Ac is favorable for measurement. Our study also shows that hyper Λ 207-234 Ac are stable against hyper Λ ⁴He and Λ ¹⁴C emission. The role of the neutron shell closure (N = 126) in the hyper Λ 214 Fr daughter and of the proton/neutron shell closures (Z ∼ 82, N = 126) in the hyper Λ 210 Bi daughter are also revealed. As hyper-nuclei decay to normal nuclei by mesonic/non-mesonic decay, and since most of the predicted half-lives for ⁴He and ¹⁴C emission from normal Ac nuclei are favourable for measurement, we presume that alpha and ¹⁴C cluster emission from hyper Ac nuclei can be detected in the laboratory in a cascade (two-step) process. (orig.)

  10. Microfluidic engineered high cell density three-dimensional neural cultures

    Science.gov (United States)

    Cullen, D. Kacy; Vukasinovic, Jelena; Glezer, Ari; La Placa, Michelle C.

    2007-06-01

    Three-dimensional (3D) neural cultures with cells distributed throughout a thick, bioactive protein scaffold may better represent neurobiological phenomena than planar correlates lacking matrix support. Neural cells in vivo interact within a complex, multicellular environment with tightly coupled 3D cell-cell/cell-matrix interactions; however, thick 3D neural cultures at cell densities approaching that of brain rapidly decay, presumably due to diffusion-limited interstitial mass transport. To address this issue, we have developed a novel perfusion platform that utilizes forced intercellular convection to enhance mass transport. First, we demonstrated that in thick (>500 µm) 3D neural cultures supported by passive diffusion, cell densities … At high cell densities (≥10⁴ cells mm⁻³), continuous medium perfusion at 2.0-11.0 µL min⁻¹ improved viability compared to non-perfused cultures (p …) … death and matrix degradation. In perfused cultures, survival was dependent on proximity to the perfusion source at 2.00-6.25 µL min⁻¹ (p …), … >90% viability in both neuronal cultures and neuronal-astrocytic co-cultures. This work demonstrates the utility of forced interstitial convection in improving the survival of high cell density 3D engineered neural constructs and may aid in the development of novel tissue-engineered systems reconstituting 3D cell-cell/cell-matrix interactions.

  11. Measurements of excited-state-to-excited-state transition probabilities and photoionization cross-sections using laser-induced fluorescence and photoionization signals

    International Nuclear Information System (INIS)

    Shah, M.L.; Sahoo, A.C.; Pulhani, A.K.; Gupta, G.P.; Dikshit, B.; Bhatia, M.S.; Suri, B.M.

    2014-01-01

    Laser-induced photoionization and fluorescence signals were simultaneously observed in atomic samarium using Nd:YAG-pumped dye lasers. Two-color, three-photon photoionization and two-color fluorescence signals were recorded simultaneously as a function of the second-step laser power for two photoionization pathways. The density matrix formalism has been employed to analyze these signals. The two-color laser-induced fluorescence signal depends on the laser powers used for the first- and second-step transitions as well as on the first- and second-step transition probabilities, whereas the two-color, three-photon photoionization signal depends on the third-step transition cross-section at the second-step laser wavelength along with the laser powers and transition probabilities for the first- and second-step transitions. Two-color laser-induced fluorescence was used to measure the second-step transition probability, and the probability obtained was used to infer the photoionization cross-section. Thus, the methodology combining two-color, three-photon photoionization and two-color fluorescence signals in a single experiment has been established for the first time to measure the second-step transition probability as well as the photoionization cross-section. - Highlights: • Laser-induced photoionization and fluorescence signals have been simultaneously observed. • The density matrix formalism has been employed to analyze these signals. • Two-color laser-induced fluorescence was used to measure the second-step transition probability. • The second-step transition probability obtained was used to infer the photoionization cross-section. • Transition probability and photoionization cross-section have been measured in a single experiment

  12. The dynamics of variable-density turbulence

    International Nuclear Information System (INIS)

    Sandoval, D.L.

    1995-11-01

    The dynamics of variable-density turbulent fluids are studied by direct numerical simulation. The flow is incompressible, so that acoustic waves are decoupled from the problem and density is not a thermodynamic variable. Changes in density occur due to molecular mixing. The velocity field is, in general, divergent. A pseudo-spectral numerical technique is used to solve the equations of motion. Three-dimensional simulations are performed using a grid of 128³ points. Two types of problems are studied: (1) the decay of isotropic, variable-density turbulence, and (2) buoyancy-generated turbulence in a fluid with large density fluctuations. In the case of isotropic, variable-density turbulence, the overall statistical decay behavior, for the cases studied, is relatively unaffected by the presence of density variations when the initial density and velocity fields are statistically independent. The results for this case are in quantitative agreement with previous numerical and laboratory results. In this case, the initial density field has a bimodal probability density function (pdf) which evolves in time towards a Gaussian distribution. The pdf of the density field is symmetric about its mean value throughout its evolution. If the initial velocity and density fields are statistically dependent, however, the decay process is significantly affected by the density fluctuations. For the case of buoyancy-generated turbulence, variable-density departures from the Boussinesq approximation are studied. The results of the buoyancy-generated turbulence are compared with variable-density model predictions. Both a one-point (engineering) model and a two-point (spectral) model are tested against the numerical data. Some deficiencies in these variable-density models are discussed and modifications are suggested

  13. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...... and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard...... rates are based on normalized spacing of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  14. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at time t under the condition that it was down at time t′ (t′ ≤ t). The limitations on the applicability of the method are also discussed. It is concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)

  15. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  16. Consenting options for posthumous organ donation: presumed consent and incentives are not favored

    Science.gov (United States)

    2012-01-01

    Background Posthumous organ procurement is hindered by the consenting process. Several consenting systems have been proposed. There is limited information on the public's relative attitudes towards various consenting systems, especially in Middle Eastern/Islamic countries. Methods We surveyed 698 Saudi adults attending outpatient clinics at a tertiary care hospital. Preference and perception of norm regarding consenting options for posthumous organ donation were explored. Participants ranked (1, most agreeable) the following, randomly presented, options from 1 to 11: no organ donation, presumed consent, informed consent by donor only, informed consent by donor or surrogate, and mandated choice; the last three options ± medical or financial incentive. Results Mean (SD) age was 32 (9) years, 27% were males, 50% were patients' companions, 60% had ≥ college education, and 20% and 32%, respectively, knew an organ donor or recipient. Mandated choice was among the top three choices for preference of 54% of respondents, with an overall median [25%, 75%] ranking score of 3 [2, 6], and was preferred over donor-or-surrogate informed consent (4 [2, 7], p < 0.001), donor-only informed consent (5 [3, 7], p < 0.001), and presumed consent (7 [3, 10], p < 0.001). The addition of a financial or medical incentive, respectively, reduced the ranking of mandated choice to 7 [4, 9], p < 0.001, and 5 [3, 8], p < 0.001; of donor-or-surrogate informed consent to 7 [5, 9], p < 0.001, and 5 [3, 7], p = 0.004; and of donor-only informed consent to 8 [6, 10], p < 0.001, and 5 [3, 7], p = 0.56. Distributions of the ranking scores for perception of norm and preference were similar except for no organ donation (11 [7, 11] vs. 11 [6, 11], respectively, p = 0.002). Compared to females, males more often perceived donor-or-surrogate informed consent as the norm (3 [1, 6] vs. 5 [3, 7], p < 0.001) and more often preferred the mandated-choice-with-financial-incentive option (6 [3, 8] vs. 8 [4, 9], p < 0.001), and …

  17. Proximity approach to study fusion probabilities in heavy-ion collisions

    International Nuclear Information System (INIS)

    Raj Kumari

    2013-01-01

    The fusion cross-sections at sub-barrier energies are found to be enhanced compared to the predictions of the barrier penetration model. The aim is to test the Bass 80, Aage Winther (AW) 95, Denisov DP, Proximity 2010 and Skyrme Energy Density Formalism (SEDF) potentials at energies above as well as below the barrier height. For the present systematic study, the fusion probabilities for the reactions ²⁸Si + ²⁴,²⁶Mg, ³⁰Si + ²⁴Mg and ²⁸,³⁰Si + ⁵⁸,⁶²Ni have been calculated.

  18. Probability of primordial black hole formation and its dependence on the radial profile of initial configurations

    International Nuclear Information System (INIS)

    Hidalgo, J. C.; Polnarev, A. G.

    2009-01-01

    In this paper we derive the probability of the radial profiles of spherically symmetric inhomogeneities in order to provide an improved estimation of the number density of primordial black holes (PBHs). We demonstrate that the probability of PBH formation depends sensitively on the radial profile of the initial configuration. We do this by characterizing this profile with two parameters chosen heuristically: the amplitude of the inhomogeneity and the second radial derivative, both evaluated at the center of the configuration. We calculate the joint probability of initial cosmological inhomogeneities as a function of these two parameters and then find a correspondence between these parameters and those used in numerical computations of PBH formation. Finally, we extend our heuristic study to evaluate the probability of PBH formation taking into account for the first time the radial profile of curvature inhomogeneities.

  19. Improving experimental phases for strong reflections prior to density modification

    International Nuclear Information System (INIS)

    Uervirojnangkoorn, Monarin; Hilgenfeld, Rolf; Terwilliger, Thomas C.; Read, Randy J.

    2013-01-01

    A genetic algorithm has been developed to optimize the phases of the strongest reflections in SIR/SAD data. This is shown to facilitate density modification and model building in several test cases. Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography

  20. The implementation of nuclear methods for density measurements on Romanian roads

    International Nuclear Information System (INIS)

    Tripadus, V.; Craciun, L.; Peticila, M.; Florea, N.

    2000-01-01

    The implementation of nuclear methods in field measurements presumes several steps to fulfill the requirements involved. First of all, the owner of the nuclear equipment must obtain all the documents required by Romanian law. The second step concerns the recalibration of the equipment in order to obtain an improved measurement precision. In the last few years the National Administration of Roads and the Research Institute of Roads, together with the National Institute of Physics and Nuclear Engineering, have made many efforts to implement nuclear methods for determining both the density and the moisture content of asphalt and of compacted soils. The equipment was produced by the American companies CPN and Troxler. On the basis of a comparison between nuclear and core density measurements, the correction factor of the equipment was established. Special attention was paid to the definitions of the different physical quantities occurring in Romanian standards, in order to connect them properly with the American ones. (authors)

  1. Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation

    Energy Technology Data Exchange (ETDEWEB)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    2018-01-01

    We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.

  2. Probability of lek collapse is lower inside sage-grouse Core Areas: Effectiveness of conservation policy for a landscape species.

    Science.gov (United States)

    Spence, Emma Suzuki; Beck, Jeffrey L; Gregory, Andrew J

    2017-01-01

    Greater sage-grouse (Centrocercus urophasianus) occupy sagebrush (Artemisia spp.) habitats in 11 western states and 2 Canadian provinces. In September 2015, the U.S. Fish and Wildlife Service announced that the listing status for sage-grouse had changed from warranted but precluded to not warranted. The primary reason cited for this change of status was that the enactment of new regulatory mechanisms was sufficient to protect sage-grouse populations. One such plan is the 2008 Wyoming Sage Grouse Executive Order (SGEO), enacted by Governor Freudenthal. The SGEO identifies "Core Areas" that are to be protected by keeping them relatively free from further energy development and limiting other forms of anthropogenic disturbance near active sage-grouse leks. Using the Wyoming Game and Fish Department's sage-grouse lek count database and the Wyoming Oil and Gas Conservation Commission database of oil and gas well locations, we investigated the effectiveness of Wyoming's Core Areas, specifically: 1) how well Core Areas encompass the distribution of sage-grouse in Wyoming, 2) whether Core Area leks have a reduced probability of lek collapse, and 3) what, if any, edge effects the intensification of oil and gas development adjacent to Core Areas may be having on Core Area populations. Core Areas contained 77% of male sage-grouse attending leks and 64% of active leks. Using Bayesian binomial probability analysis, we found an average 10.9% probability of lek collapse in Core Areas and an average 20.4% probability of lek collapse outside Core Areas. Using linear regression, we found development density outside Core Areas was related to the probability of lek collapse inside Core Areas. Specifically, the probability of collapse among leks >4.83 km inside Core Area boundaries was significantly related to well density within 1.61 km (1 mi) and 4.83 km (3 mi) outside of Core Area boundaries. Collectively, these data suggest that the Wyoming Core Area Strategy has benefited sage-grouse …
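    The Bayesian binomial comparison reported above can be sketched with conjugate Beta posteriors; the counts below are placeholders rather than the study's data, and the uniform Beta(1, 1) prior is an assumption:

```python
import numpy as np
from scipy import stats

# Hypothetical counts: collapsed leks out of monitored leks,
# inside and outside Core Areas (illustrative numbers only).
collapsed_in, n_in = 44, 400
collapsed_out, n_out = 49, 240

# Beta(1, 1) (uniform) prior; the posterior is Beta(a + k, b + n - k).
post_in = stats.beta(1 + collapsed_in, 1 + n_in - collapsed_in)
post_out = stats.beta(1 + collapsed_out, 1 + n_out - collapsed_out)

print(f"P(collapse | inside):  mean {post_in.mean():.3f}, "
      f"95% CI {post_in.ppf([0.025, 0.975]).round(3)}")
print(f"P(collapse | outside): mean {post_out.mean():.3f}, "
      f"95% CI {post_out.ppf([0.025, 0.975]).round(3)}")

# Posterior probability that collapse risk is lower inside Core Areas,
# by Monte Carlo over the two posteriors.
rng = np.random.default_rng(2)
draws = 100_000
p_lower = np.mean(post_in.rvs(size=draws, random_state=rng)
                  < post_out.rvs(size=draws, random_state=rng))
print(f"P(risk_in < risk_out | data) = {p_lower:.3f}")
```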

  3. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  4. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
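    As an illustration of the three tests compared in the paper, here is a sketch using scipy.stats on synthetic data (the bin count and the fitted family are arbitrary choices; note the usual caveat that the KS p-value is only approximate when the parameters are fitted from the same data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.normal(loc=10.0, scale=2.0, size=500)

# Fit a candidate distribution (here: normal) to the data.
mu, sigma = stats.norm.fit(data)

# Kolmogorov-Smirnov: supremum distance between empirical and fitted CDFs.
ks = stats.kstest(data, 'norm', args=(mu, sigma))

# Anderson-Darling: weighted CDF distance that emphasizes the tails.
ad = stats.anderson(data, dist='norm')

# Chi-squared: compare observed and expected counts over bins.
counts, edges = np.histogram(data, bins=12)
expected = len(data) * np.diff(stats.norm.cdf(edges, mu, sigma))
expected *= counts.sum() / expected.sum()   # match totals for the test
chi2 = stats.chisquare(counts, expected, ddof=2)  # 2 fitted parameters

print(f"KS:   statistic={ks.statistic:.4f}, p={ks.pvalue:.3f}")
print(f"AD:   statistic={ad.statistic:.4f}, critical values={ad.critical_values}")
print(f"Chi2: statistic={chi2.statistic:.2f}, p={chi2.pvalue:.3f}")
```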

  5. Reduction of density-modification bias by β correction

    International Nuclear Information System (INIS)

    Skubák, Pavol; Pannu, Navraj S.

    2011-01-01

    A cross-validation-based method for bias reduction in ‘classical’ iterative density modification of experimental X-ray crystallography maps provides significantly more accurate phase-quality estimates and leads to improved automated model building. Density modification often suffers from an overestimation of phase quality, as seen by escalated figures of merit. A new cross-validation-based method to address this estimation bias by applying a bias-correction parameter ‘β’ to maximum-likelihood phase-combination functions is proposed. In tests on over 100 single-wavelength anomalous diffraction data sets, the method is shown to produce much more reliable figures of merit and improved electron-density maps. Furthermore, significantly better results are obtained in automated model building iterated with phased refinement using the more accurate phase probability parameters from density modification

  6. Multi-objective mixture-based iterated density estimation evolutionary algorithms

    NARCIS (Netherlands)

    Thierens, D.; Bosman, P.A.N.

    2001-01-01

    We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability distributions...

  7. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using Fisher's linear discriminant analysis, a support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, which were superior to the results obtained by either Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrate the merits of bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for the analysis of knee joint VAG signals.
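    A minimal sketch of a kernel-density / maximal-posterior-probability classifier of the kind described above (synthetic bivariate features stand in for the VAG feature vectors, and Gaussian kernels as implemented in scipy are an assumed choice):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Illustrative 2-D features for two signal groups; gaussian_kde expects
# arrays of shape (n_features, n_samples).
normal_feats = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], 200).T
abnormal_feats = rng.multivariate_normal([1.5, 1.2], [[1.0, -0.2], [-0.2, 1.0]], 150).T

# Kernel density estimates of the class-conditional bivariate densities.
kde_normal = gaussian_kde(normal_feats)
kde_abnormal = gaussian_kde(abnormal_feats)

# Class priors from the training proportions.
n0, n1 = normal_feats.shape[1], abnormal_feats.shape[1]
prior0, prior1 = n0 / (n0 + n1), n1 / (n0 + n1)

def classify(x):
    """Maximal posterior probability rule: pick the class whose
    prior-weighted density is largest at x (x has shape (2, m))."""
    p0 = prior0 * kde_normal(x)
    p1 = prior1 * kde_abnormal(x)
    return np.where(p1 > p0, 'abnormal', 'normal')

print(classify(np.array([[0.1, 1.6], [0.2, 1.4]])))  # two test points
```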

  8. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  9. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
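    As a concrete example of reading probabilities off a distribution, the normal CDF gives tail and interval probabilities directly (the numbers below are illustrative, not from the paper):

```python
from scipy import stats

# Probability that a normally distributed variable (mean 120, SD 15,
# e.g. a hypothetical systolic blood pressure model) exceeds 140:
p_above = 1 - stats.norm.cdf(140, loc=120, scale=15)

# Probability of an outcome between 110 and 130:
p_between = stats.norm.cdf(130, 120, 15) - stats.norm.cdf(110, 120, 15)

print(f"P(X > 140)       = {p_above:.3f}")
print(f"P(110 < X < 130) = {p_between:.3f}")
```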

  10. Effect of plant population density on the growth and yield of sorghum ...

    African Journals Online (AJOL)

    Improvement of resource use efficiency and yields is probably possible through the use of appropriate plant densities. Field trials were therefore conducted to study the effects of four plant densities, varying from 2.0 to 12.5 plants m-2 on water and radiation use and performance of two Masakwa sorghum varieties grown on ...

  11. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  12. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  13. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  14. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  15. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  16. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
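    The two-step construction described above, kernel density maps per data set followed by a weighted linear combination, can be sketched as follows; the vent locations, weights, and grid are all assumed values for illustration, not the Somma-Vesuvio data sets (where the weights come from structured expert judgment):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)

# Hypothetical (x, y) locations (km) of past vents from two data sets,
# e.g. mapped eruptive vents and structural features.
vents_a = rng.normal([0.0, 0.0], 1.0, size=(30, 2)).T
vents_b = rng.normal([1.0, 0.5], 2.0, size=(50, 2)).T

kde_a, kde_b = gaussian_kde(vents_a), gaussian_kde(vents_b)

# Assumed weights for the linear combination of the two density maps.
w_a, w_b = 0.6, 0.4

# Evaluate the combined density on a grid over the area of interest.
xs, ys = np.linspace(-5, 5, 200), np.linspace(-5, 5, 200)
X, Y = np.meshgrid(xs, ys)
pts = np.vstack([X.ravel(), Y.ravel()])
density = (w_a * kde_a(pts) + w_b * kde_b(pts)).reshape(X.shape)

# Normalize to a probability map over grid cells.
cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
prob_map = density * cell / (density * cell).sum()
print(f"peak cell probability: {prob_map.max():.4f}")
```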

  17. Exact capture probability analysis of GSC receivers over i.n.d. Rayleigh fading channels

    KAUST Repository

    Nam, Sungsik

    2013-07-01

    A closed-form expression of the capture probability of generalized selection combining (GSC) RAKE receivers was introduced in [1]. The idea behind this new performance metric is to quantify how the remaining set of uncombined paths affects the overall performance both in terms of loss in power and increase in interference levels. In this previous work, the assumption was made that the fading is both independent and identically distributed from path to path. However, the average strength of each path is different in reality. In order to derive a closed-form expression of the capture probability over independent and non-identically distributed (i.n.d.) fading channels, we need to derive the joint statistics of ordered non-identical exponential variates. With this motivation in mind, we first provide in this paper some new order statistics results in terms of both moment generating function (MGF) and probability density function (PDF) expressions under an i.n.d. assumption and then derive a new exact closed-form expression for the capture probability GSC RAKE receivers in this more realistic scenario. © 2013 IEEE.
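    The closed-form expressions in the paper rely on order statistics of non-identically distributed exponential variates; a Monte Carlo cross-check of a capture-probability-style metric is easy to sketch (the path-power profile, the GSC parameters, and the threshold definition of "capture" below are assumptions for illustration, not the paper's exact metric):

```python
import numpy as np

rng = np.random.default_rng(6)

L, Lc = 5, 2                  # total resolvable paths; GSC combines the best Lc
means = np.array([1.0, 0.8, 0.6, 0.4, 0.2])   # non-identical average path powers
trials = 200_000

# Rayleigh fading: instantaneous path powers are exponential with
# path-dependent means (the i.n.d. assumption).
powers = rng.exponential(means, size=(trials, L))

# Fraction of the total received power captured by the Lc strongest paths.
top = np.sort(powers, axis=1)[:, -Lc:]
ratio = top.sum(axis=1) / powers.sum(axis=1)

# "Capture probability" here: chance that the combined fraction exceeds a
# threshold; the uncombined remainder represents lost power/interference.
threshold = 0.8
print(f"P(captured fraction >= {threshold}) ~ {np.mean(ratio >= threshold):.4f}")
print(f"mean captured fraction ~ {ratio.mean():.4f}")
```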

  18. Increased Incidence of Benign Pancreatic Pathology following Pancreaticoduodenectomy for Presumed Malignancy over 10 Years despite Increased Use of Endoscopic Ultrasound

    Directory of Open Access Journals (Sweden)

    Shadi S. Yarandi

    2014-01-01

    Full Text Available Despite the use of imaging studies, tissue sampling, and serologic tests, about 5-10% of surgeries done for presumed pancreatic malignancies will have benign findings on final pathology. Endoscopic ultrasound (EUS) is used with increasing frequency to study pancreatic masses. The aim of this study is to examine the effect of EUS on the prevalence of benign diseases undergoing Whipple over the last decade. Patients who underwent the Whipple procedure for presumed malignancy at Emory University Hospital from 1998 to 2011 were selected. Demographic data, history of smoking and drinking, history of diabetes and pancreatitis, imaging data, pathology reports, and tumor markers were extracted. 878 patients were found. 95 (10.82%) patients had benign disease. The prevalence of benign findings had increased over the recent years despite the greater use of EUS. Logistic regression models showed that abdominal pain (OR: 5.829, 95% CI: 2.681-12.674, P ≤ 0.001) and alcohol abuse (OR: 3.221, 95% CI: 1.362-7.261, P = 0.002) were predictors of benign disease. Jaundice (OR: 0.221, 95% CI: 0.084-0.58, P = 0.002), mass (OR: 0.145, 95% CI: 0.043-0.485, P = 0.008), and ductal dilation (OR: 0.297, 95% CI: 0.134-0.657, P = 0.003) were associated with malignancy. The use of imaging studies, ERCP, and EUS has not decreased the percentage of benign findings after surgery for presumed pancreatic malignancy.

  19. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  20. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  1. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating the core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) The core damage probability is insensitive to changes in surveillance test frequency, since the change in core damage probability is small when the failure probabilities of motor-operated valves and turbine-driven auxiliary feedwater pumps change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generators, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
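    The sensitivity exercise described above can be miniaturized with a toy cut-set model: core damage frequency as a sum over accident sequences of initiating-event frequency times the product of component failure probabilities (all numbers and component names below are invented for illustration, not the NRC model):

```python
import numpy as np

# Assumed initiating-event frequencies (per year).
init_freq = {'loss_of_feedwater': 1e-1, 'small_loca': 5e-4}

# Each minimal cut set: (initiating event, components that must all fail).
cut_sets = [
    ('loss_of_feedwater', ['afw_pump', 'hpi_pump']),
    ('small_loca',        ['hpi_pump', 'rhr_pump']),
]

base_p = {'afw_pump': 1e-3, 'hpi_pump': 1e-3, 'rhr_pump': 5e-4}

def core_damage_freq(p):
    return sum(init_freq[ie] * np.prod([p[c] for c in comps])
               for ie, comps in cut_sets)

base = core_damage_freq(base_p)
for comp in base_p:
    scaled = dict(base_p, **{comp: base_p[comp] * 10})  # one order of magnitude up
    print(f"{comp}: CDF x {core_damage_freq(scaled) / base:.1f} "
          f"when its failure probability rises tenfold")
```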

  2. Optimization of continuous ranked probability score using PSO

    Directory of Open Access Journals (Sweden)

    Seyedeh Atefeh Mohammadi

    2015-07-01

    Full Text Available Weather forecasting has been a major concern in various industries such as agriculture, aviation, maritime, tourism, and transportation. A good weather prediction may reduce the impact of natural disasters and unexpected events. This paper presents an empirical investigation of weather temperature prediction using the continuous ranked probability score (CRPS). The mean and standard deviation of the normal density function are linear combinations of the components of the ensemble system. The resulting optimization model has been solved using particle swarm optimization (PSO) and the results are compared with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. The preliminary results indicate that the proposed PSO provides better results in terms of the root-mean-square deviation criterion than the alternative BFGS method.
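    A sketch of the idea: the CRPS of a normal forecast has a closed form, the normal's mean and standard deviation are built from the ensemble, and a bare-bones PSO searches the coefficients (the ensemble construction, the parameterization, and the PSO constants below are all assumptions, not the paper's setup):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy ensemble (m members) and verifying observations for n forecast cases.
n, m = 300, 5
ens = rng.normal(15, 3, size=(n, m)) + rng.normal(0, 1, size=(n, 1))
obs = ens.mean(axis=1) + rng.normal(0.5, 1.2, size=n)

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast against observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * stats.norm.cdf(z) - 1)
                    + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

def mean_crps(params):
    a = params[:m + 1]          # mean: a0 + a . ensemble members
    b = np.abs(params[m + 1:])  # variance: b0 + b1 * ensemble variance
    mu = a[0] + ens @ a[1:]
    sigma = np.sqrt(b[0] + b[1] * ens.var(axis=1)) + 1e-9
    return crps_normal(mu, sigma, obs).mean()

# Minimal particle swarm over the (m + 3) parameters.
n_part, iters = 40, 200
pos = rng.normal(0, 1, size=(n_part, m + 3))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mean_crps(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_part, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([mean_crps(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
print(f"minimum mean CRPS found: {pbest_f.min():.4f}")
```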

  3. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  4. Change-in-ratio density estimator for feral pigs is less biased than closed mark-recapture estimates

    Science.gov (United States)

    Hanson, L.B.; Grand, J.B.; Mitchell, M.S.; Jolley, D.B.; Sparklin, B.D.; Ditchkoff, S.S.

    2008-01-01

    Closed-population capture-mark-recapture (CMR) methods can produce biased density estimates for species with low or heterogeneous detection probabilities. In an attempt to address such biases, we developed a density-estimation method based on the change in ratio (CIR) of survival between two populations where survival, calculated using an open-population CMR model, is known to differ. We used our method to estimate density for a feral pig (Sus scrofa) population on Fort Benning, Georgia, USA. To assess its validity, we compared it to an estimate of the minimum density of pigs known to be alive and two estimates based on closed-population CMR models. Comparison of the density estimates revealed that the CIR estimator produced a density estimate with low precision that was reasonable with respect to minimum known density. By contrast, density point estimates using the closed-population CMR models were less than the minimum known density, consistent with biases created by low and heterogeneous capture probabilities for species like feral pigs that may occur in low density or are difficult to capture. Our CIR density estimator may be useful for tracking broad-scale, long-term changes in species, such as large cats, for which closed CMR models are unlikely to work. © CSIRO 2008.

  5. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  6. The burden of presumed tuberculosis in hospitalized children in a resource-limited setting in Papua New Guinea: a prospective observational study.

    Science.gov (United States)

    Watch, Villa; Aipit, Jimmy; Kote-Yarong, Tina; Rero, Allanie; Bolnga, John W; Lufele, Elvin; Laman, Moses

    2017-11-01

    In Papua New Guinea, TB is considered to be a major public health problem, but little is known about the prevalence and prognosis of presumed TB in children. As part of a prospective hospital-based surveillance on the northern coast of mainland Papua New Guinea, the authors investigated the admission prevalence and case fatality rate associated with presumed TB over a 6-year period (2011-2016). All children admitted who were diagnosed with TB were followed up until discharge or death. Of 8992 paediatric admissions, 734 patients (8.2%) were diagnosed with presumed TB, and there were 825 deaths, with TB accounting for 102 (12.4%). Extrapulmonary TB was the final diagnosis in 384 admissions {prevalence 4.3% [384/8992 (95% CI 3.9-4.7)]} with a case fatality rate of 21.4% [82/384 (95% CI 17.4-25.9)]. TB meningitis, disseminated TB and pericardial TB had high case fatality rates of 29.0% (53/183), 28.9% (11/38) and 25% (4/16), respectively. Severe malnutrition was more common in patients with pulmonary compared with extrapulmonary TB (25.4% vs 15.6%; p …) … Papua New Guinea. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Integrating spatial, temporal, and size probabilities for the annual landslide hazard maps in the Shihmen watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    C. Y. Wu

    2013-09-01

    Full Text Available Landslide spatial, temporal, and size probabilities were used to perform a landslide hazard assessment in this study. Eleven intrinsic geomorphological and two extrinsic rainfall factors were evaluated as landslide-susceptibility-related factors through success rate curves, landslide ratio plots, frequency distributions of landslide and non-landslide groups, as well as probability-probability plots. Data on landslides caused by Typhoon Aere in the Shihmen watershed were selected to train the susceptibility model. The landslide area probability, based on the power law relationship between landslide area and a noncumulative number, was analyzed using the Pearson type 5 probability density function. The exceedance probabilities of rainfall with various recurrence intervals, including 2, 5, 10, 20, 50, 100 and 200 yr, were used to determine the temporal probabilities of the events. The study was conducted in the Shihmen watershed, which has an area of 760 km² and is one of the main water sources for northern Taiwan. The validation with Typhoon Krosa demonstrated that this landslide hazard model can be used to predict landslide probabilities. The results suggest that integrating spatial, area, and exceedance probabilities to estimate the annual probability of each slope unit is feasible. The advantage of this annual landslide probability model lies in its ability to estimate the annual landslide risk, instead of a scenario-based risk.
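    The integration step, annual probability per slope unit as the product of spatial, temporal, and size probabilities, reduces to a few lines once the component probabilities are in hand (all values below are placeholders; the paper derives them from the susceptibility model, the rainfall exceedance probabilities, and the Pearson type 5 area distribution):

```python
# Annual landslide probability for one slope unit as the product of spatial,
# temporal, and size probabilities; all numbers are placeholders.
p_spatial = 0.35    # susceptibility: P(landslide | triggering rainfall event)
p_size = 0.60       # P(landslide area >= area of concern)
p_temporal = 0.10   # annual exceedance probability of the triggering rainfall
                    # (a 10 yr recurrence-interval event)

annual_p = p_spatial * p_temporal * p_size
print(f"annual landslide probability for this slope unit: {annual_p:.4f}")
```

    In the paper the spatial term comes from the susceptibility model per slope unit, and the calculation is repeated over the rainfall recurrence intervals; the sketch above handles a single interval.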

  8. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  9. Density fluctuation effects on collective neutrino oscillations in O-Ne-Mg core-collapse supernovae

    International Nuclear Information System (INIS)

    Cherry, John F.; Fuller, George M.; Wu Mengru; Qian Yongzhong; Carlson, J.; Duan Huaiyu

    2011-01-01

    We investigate the effect of matter density fluctuations on supernova collective neutrino flavor oscillations. In particular, we use full multiangle, three-flavor, self-consistent simulations of the evolution of the neutrino flavor field in the envelope of an O-Ne-Mg core-collapse supernova at shock breakout (neutronization neutrino burst) to study the effect of the matter density 'bump' left by the He-burning shell. We find a seemingly counterintuitive increase in the overall ν_e survival probability created by this matter density feature. We discuss this behavior in terms of the interplay between the matter density profile and neutrino collective effects. While our results give new insights into this interplay, they also suggest an immediate consequence for supernova neutrino burst detection: it will be difficult to use a burst signal to extract information on fossil burning shells or other fluctuations of this scale in the matter density profile. Consistent with previous studies, our results also show that the interplay of neutrino self-coupling and matter fluctuation could cause a significant increase in the ν_e survival probability at very low energy.

  10. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π in calculating the particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  11. Absolute transition probabilities of 5s-5p transitions of Kr I from interferometric measurements in LTE-plasmas

    International Nuclear Information System (INIS)

    Kaschek, K.; Ernst, G.K.; Boetticher, W.

    1984-01-01

    Absolute transition probabilities of nine 5s-5p transitions of Kr I have been evaluated by using the hook method. The plasma was produced in a shock tube. The population density of the 5s-levels was calculated, under the assumption of LTE, from the electron density and the ground state number measured by means of a dual wavelength interferometer. An evaluation is given which proves the validity of the LTE assumption. (orig.)

  12. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    Science.gov (United States)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin …
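    The core of an N-dimensional PDF transfer can be sketched as iterated random rotation plus univariate quantile mapping; this is a bare-bones reading of the idea, not the published MBCn implementation (which additionally preserves projected climate trends):

```python
import numpy as np

rng = np.random.default_rng(8)

def quantile_map(x, ref):
    """Univariate quantile mapping of x onto the distribution of ref."""
    ranks = np.argsort(np.argsort(x))
    return np.sort(ref)[ranks * len(ref) // len(x)]

def pdf_transfer(model, obs, n_iter=30):
    """Iterate: random orthogonal rotation, per-dimension quantile
    mapping, rotate back (a sketch of the N-dimensional PDF transform)."""
    x = model.copy()
    d = x.shape[1]
    for _ in range(n_iter):
        # Random orthogonal rotation from a QR decomposition.
        q, r = np.linalg.qr(rng.standard_normal((d, d)))
        q *= np.sign(np.diag(r))
        xr, obs_r = x @ q, obs @ q
        xr = np.column_stack([quantile_map(xr[:, j], obs_r[:, j])
                              for j in range(d)])
        x = xr @ q.T
    return x

# Toy example: correct a 2-D "model" sample toward "observations".
model = rng.multivariate_normal([0, 0], [[1.0, 0.0], [0.0, 1.0]], 2000)
obs = rng.multivariate_normal([1, 2], [[1.0, 0.8], [0.8, 1.5]], 2000)
corrected = pdf_transfer(model, obs)
print("corrected correlation:", np.corrcoef(corrected.T)[0, 1].round(2))
print("target correlation:   ", np.corrcoef(obs.T)[0, 1].round(2))
```

    After enough rotations the corrected sample matches the full joint distribution of the reference, which is precisely what distinguishes this family of methods from univariate quantile mapping.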

  13. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  14. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics whose applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  15. Edge Probability and Pixel Relativity-Based Speckle Reducing Anisotropic Diffusion.

    Science.gov (United States)

    Mishra, Deepak; Chaudhury, Santanu; Sarkar, Mukul; Soin, Arvinder Singh; Sharma, Vivek

    2018-02-01

    Anisotropic diffusion filters are one of the best choices for speckle reduction in ultrasound images. These filters control the diffusion flux flow using local image statistics and provide the desired speckle suppression. However, inefficient use of edge characteristics results in either an oversmoothed image or an image containing misinterpreted spurious edges. As a result, the diagnostic quality of the images becomes a concern. To alleviate such problems, a novel anisotropic diffusion-based speckle reducing filter is proposed in this paper. A probability density function of the edges, along with pixel relativity information, is used to control the diffusion flux flow. The probability density function helps in removing the spurious edges and the pixel relativity reduces the oversmoothing effects. Furthermore, the filtering is performed in the superpixel domain to reduce the execution time, wherein a minimum of 15% of the total number of image pixels can be used. For performance evaluation, 31 frames of three synthetic images and 40 real ultrasound images are used. In most of the experiments, the proposed filter shows better performance than the state-of-the-art filters in terms of the speckle region's signal-to-noise ratio and mean square error. It also shows comparable performance on the figure of merit and the structural similarity index measure. Furthermore, in the subjective evaluation performed by expert radiologists, the proposed filter's outputs are preferred for the improved contrast and sharpness of the object boundaries. Hence, the proposed filtering framework is suitable to reduce unwanted speckle and improve the quality of ultrasound images.

  16. Generation of Stationary Non-Gaussian Time Histories with a Specified Cross-spectral Density

    Directory of Open Access Journals (Sweden)

    David O. Smallwood

    1997-01-01

    The paper reviews several methods for the generation of stationary realizations of sampled time histories with non-Gaussian distributions and introduces a new method which can be used to control the cross-spectral density matrix and the probability density functions (pdfs) of the multiple input problem. Discussed first are two methods for the specialized case of matching the auto (power) spectrum, the skewness, and the kurtosis using generalized shot noise and using polynomial functions. It is then shown that the skewness and kurtosis can also be controlled by the phase of a complex frequency domain description of the random process. The general case of matching a target probability density function using a zero memory nonlinear (ZMNL) function is then covered. Next, methods for generating vectors of random variables with a specified covariance matrix for a class of spherically invariant random vectors (SIRV) are discussed. Finally, the general case of matching the cross-spectral density matrix of a vector of inputs with non-Gaussian marginal distributions is presented.
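
    The zero-memory nonlinear (ZMNL) step mentioned above can be sketched as follows, under our own naming and with an arbitrarily chosen gamma target: a correlated Gaussian series is passed through the monotone map formed by the normal CDF followed by the target's inverse CDF, so the output matches the target marginal.

```python
import numpy as np
from scipy import stats, signal

rng = np.random.default_rng(1)

# Correlated Gaussian time history (low-pass filtered white noise)
g = signal.lfilter([1.0], [1.0, -0.9], rng.standard_normal(100_000))
g /= g.std()

# ZMNL transform: normal CDF values mapped through the target's inverse CDF
u = stats.norm.cdf(g)
x = stats.gamma.ppf(u, a=2.0, scale=1.5)   # target marginal: gamma (skewed)

print(stats.skew(x), stats.kurtosis(x))     # non-Gaussian moments of the output
```

    Note that the nonlinear map distorts the spectrum of the series, which is one reason the paper treats matching a specified (cross-)spectral density as a separate problem.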

  17. Ant-inspired density estimation via random walks.

    Science.gov (United States)

    Musco, Cameron; Su, Hsin-Hao; Lynch, Nancy A

    2017-10-03

    Many ant species use distributed population density estimation in applications ranging from quorum sensing, to task allocation, to appraisal of enemy colony strength. It has been shown that ants estimate local population density by tracking encounter rates: the higher the density, the more often the ants bump into each other. We study distributed density estimation from a theoretical perspective. We prove that a group of anonymous agents randomly walking on a grid are able to estimate their density within a small multiplicative error in a small number of steps by measuring their rates of encounter with other agents. Despite dependencies inherent in the fact that nearby agents may collide repeatedly (and, worse, cannot recognize when this happens), our bound nearly matches what would be required to estimate density by independently sampling grid locations. From a biological perspective, our work helps shed light on how ants and other social insects can obtain relatively accurate density estimates via encounter rates. From a technical perspective, our analysis provides tools for understanding complex dependencies in the collision probabilities of multiple random walks. We bound the strength of these dependencies using local mixing properties of the underlying graph. Our results extend beyond the grid to more general graphs, and we discuss applications to size estimation for social networks, density estimation for robot swarms, and random walk-based sampling for sensor networks.
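
    A toy version of the encounter-rate estimator (not the paper's exact model or bounds): agents walk on a torus grid and average the number of co-located agents per step, which approximates the global density.

```python
import numpy as np

rng = np.random.default_rng(2)
N, SIDE, STEPS = 400, 100, 2000          # agents, grid side length, walk length
density = N / SIDE**2

pos = rng.integers(0, SIDE, size=(N, 2))
encounters = np.zeros(N)
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

for _ in range(STEPS):
    pos = (pos + moves[rng.integers(0, 4, N)]) % SIDE   # random step on a torus
    # count, for each agent, how many others share its cell this step
    cells = pos[:, 0] * SIDE + pos[:, 1]
    _, inverse, counts = np.unique(cells, return_inverse=True, return_counts=True)
    encounters += counts[inverse] - 1

est = encounters / STEPS          # per-agent encounter rate, roughly the density
print(density, est.mean())
```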

  18. Charge densities and charge noise in mesoscopic conductors

    Indian Academy of Sciences (India)

    This generalization leads to a local Wigner–Smith life-time matrix. Keywords: density … Of interest is the charge distribution in such a conductor and … is the transmission probability of the scattering problem without absorption if … as a voltage probe which has its potential adjusted in such a way that there is no net current.

  19. Multi-scale evaluation of the environmental controls on burn probability in a southern Sierra Nevada landscape

    Science.gov (United States)

    Sean A. Parks; Marc-Andre Parisien; Carol Miller

    2011-01-01

    We examined the scale-dependent relationship between spatial fire likelihood or burn probability (BP) and some key environmental controls in the southern Sierra Nevada, California, USA. Continuous BP estimates were generated using a fire simulation model. The correspondence between BP (dependent variable) and elevation, ignition density, fuels and aspect was evaluated...

  20. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  1. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions, and to this end we provide a calibration procedure which works for the approximation…

  2. Computing thermal Wigner densities with the phase integration method

    International Nuclear Information System (INIS)

    Beutier, J.; Borgis, D.; Vuilleumier, R.; Bonella, S.

    2014-01-01

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems

  3. Computing thermal Wigner densities with the phase integration method.

    Science.gov (United States)

    Beutier, J; Borgis, D; Vuilleumier, R; Bonella, S

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.

  4. Probability of lek collapse is lower inside sage-grouse Core Areas: Effectiveness of conservation policy for a landscape species.

    Directory of Open Access Journals (Sweden)

    Emma Suzuki Spence

    Greater sage-grouse (Centrocercus urophasianus) occupy sagebrush (Artemisia spp.) habitats in 11 western states and 2 Canadian provinces. In September 2015, the U.S. Fish and Wildlife Service announced that the listing status for sage-grouse had changed from warranted but precluded to not warranted. The primary reason cited for this change of status was that the enactment of new regulatory mechanisms was sufficient to protect sage-grouse populations. One such plan is the 2008 Wyoming Sage Grouse Executive Order (SGEO), enacted by Governor Freudenthal. The SGEO identifies "Core Areas" that are to be protected by keeping them relatively free from further energy development and limiting other forms of anthropogenic disturbance near active sage-grouse leks. Using the Wyoming Game and Fish Department's sage-grouse lek count database and the Wyoming Oil and Gas Conservation Commission database of oil and gas well locations, we investigated the effectiveness of Wyoming's Core Areas, specifically: (1) how well Core Areas encompass the distribution of sage-grouse in Wyoming, (2) whether Core Area leks have a reduced probability of lek collapse, and (3) what, if any, edge effects intensification of oil and gas development adjacent to Core Areas may be having on Core Area populations. Core Areas contained 77% of male sage-grouse attending leks and 64% of active leks. Using Bayesian binomial probability analysis, we found an average 10.9% probability of lek collapse in Core Areas and an average 20.4% probability of lek collapse outside Core Areas. Using linear regression, we found development density outside Core Areas was related to the probability of lek collapse inside Core Areas. Specifically, probability of collapse among leks >4.83 km from inside Core Area boundaries was significantly related to well density within 1.61 km (1 mi) and 4.83 km (3 mi) outside of Core Area boundaries. Collectively, these data suggest that the Wyoming Core Area Strategy has benefited…
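
    The "Bayesian binomial probability analysis" mentioned above can be illustrated with a standard Beta-Binomial posterior; the lek counts below are invented placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical counts: collapsed leks out of monitored leks (placeholders)
collapsed_in, n_in = 22, 200      # inside Core Areas
collapsed_out, n_out = 41, 200    # outside Core Areas

# Beta(1, 1) prior -> Beta(1 + successes, 1 + failures) posterior
post_in = stats.beta(1 + collapsed_in, 1 + n_in - collapsed_in)
post_out = stats.beta(1 + collapsed_out, 1 + n_out - collapsed_out)

print(post_in.mean(), post_out.mean())   # posterior mean collapse probabilities
# Posterior probability that collapse risk is lower inside Core Areas
draws = 100_000
print(np.mean(post_in.rvs(draws) < post_out.rvs(draws)))
```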

  5. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Science.gov (United States)

    O'Connor, Kelly M; Nathan, Lucas R; Liberati, Marjorie R; Tingley, Morgan W; Vokoun, Jason C; Rittenhouse, Tracy A G

    2017-01-01

    Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species, often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori identify…
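
    The benefit of adding cameras follows the complement rule when cameras are assumed independent: a k-camera array detects with probability 1 - (1 - p)^k. A sketch under that independence assumption (the paper's empirical estimates need not obey it):

```python
def array_detection(p_single: float, k: int) -> float:
    """Detection probability of a k-camera array, assuming independent
    cameras that each detect with probability p_single."""
    return 1.0 - (1.0 - p_single) ** k

for p in (0.2, 0.5, 0.75):    # per-camera detectability
    print(p, [round(array_detection(p, k), 3) for k in (1, 2, 4, 8)])
```

    The printed table makes the paper's qualitative point visible: gains from extra cameras are largest when the per-camera detectability p is small.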

  6. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Directory of Open Access Journals (Sweden)

    Kelly M O'Connor

    Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species, often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori…

  7. Density-Based 3D Shape Descriptors

    Directory of Open Access Journals (Sweden)

    Schmitt Francis

    2007-01-01

    We propose a novel probabilistic framework for the extraction of density-based 3D shape descriptors using kernel density estimation. Our descriptors are derived from the probability density functions (pdfs) of local surface features characterizing the 3D object geometry. Assuming that the shape of the 3D object is represented as a mesh consisting of triangles with arbitrary size and shape, we provide efficient means to approximate the moments of geometric features on a triangle basis. Our framework produces a number of 3D shape descriptors that prove to be quite discriminative in retrieval applications. We test our descriptors and compare them with several other histogram-based methods on two 3D model databases, Princeton Shape Benchmark and Sculpteur, which are fundamentally different in semantic content and mesh quality. Experimental results show that our methodology not only improves the performance of existing descriptors, but also provides a rigorous framework to advance and to test new ones.

  8. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important for understanding the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of the vehicles ahead, and his decision-making process is related to that potential. Compared with the traditional cellular automata model, this better captures the driver's random decision-making, which in real traffic is based on the vehicles and traffic conditions in front of him. From the improved model, the fundamental diagram (flow-density relationship) is obtained, and detailed high-density traffic phenomena are reproduced through numerical simulation. (paper)
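
    A minimal sketch in the spirit of this model: a Nagel-Schreckenberg cellular automaton in which each driver's randomization probability grows as the gap to the car ahead shrinks, our simple stand-in for the interactional potential described above (the paper's exact potential is not reproduced).

```python
import numpy as np

rng = np.random.default_rng(3)
L, N, VMAX, STEPS = 200, 60, 5, 500
P_MIN, P_MAX = 0.1, 0.5

x = np.sort(rng.choice(L, N, replace=False))    # car positions on a ring road
v = rng.integers(0, VMAX + 1, N)

for _ in range(STEPS):
    gap = (np.roll(x, -1) - x - 1) % L          # empty cells to the car ahead
    v = np.minimum(v + 1, VMAX)                 # acceleration
    v = np.minimum(v, gap)                      # braking to avoid collision
    # variable randomization: smaller gap -> higher slowdown probability
    p = P_MIN + (P_MAX - P_MIN) * np.exp(-gap / 2.0)
    v = np.maximum(v - (rng.random(N) < p), 0)  # random slowdown
    x = (x + v) % L                             # movement

print("mean velocity:", v.mean())               # flow = density * mean velocity
```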

  9. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability rule for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  10. Fishnet model for failure probability tail of nacre-like imbricated lamellar materials

    Science.gov (United States)

    Luo, Wen; Bažant, Zdeněk P.

    2017-12-01

    Nacre, the iridescent material of the shells of pearl oysters and abalone, consists mostly of aragonite (a form of CaCO3), a brittle constituent of relatively low strength (≈10 MPa). Yet it has astonishing mean tensile strength (≈150 MPa) and fracture energy (≈350 to 1,240 J/m2). The reasons have recently become well understood: (i) the nanoscale thickness (≈300 nm) of nacre's building blocks, the aragonite lamellae (or platelets), and (ii) the imbricated, or staggered, arrangement of these lamellae, bound by biopolymer layers only ≈25 nm thick. In engineering applications, however, a failure probability of ≤10^-6 is generally required. To guarantee it, the type of probability density function (pdf) of strength, including its tail, must be determined. This objective, not pursued previously, is hardly achievable by experiments alone, since >10^8 tests of specimens would be needed. Here we outline a statistical model of strength that resembles a fishnet pulled diagonally, captures the tail of the pdf of strength and, importantly, allows analytical safety assessments of nacreous materials. The analysis shows that, in terms of safety, the imbricated lamellar structure provides a major additional advantage: roughly a 10% strength increase at tail failure probability 10^-6 and a 1 to 2 orders of magnitude decrease in tail probability at fixed stress. Another advantage is that a high scatter of microstructure properties diminishes the strength difference between the mean and the probability tail, compared with the weakest-link model. These advantages of nacre-like materials are here justified analytically and supported by millions of Monte Carlo simulations.

  11. Presuming the influence of the media: teenagers′ constructions of gender identity through sexual/romantic relationships and alcohol consumption

    Science.gov (United States)

    Hartley, Jane E K; Wight, Daniel; Hunt, Kate

    2014-01-01

    Using empirical data from group discussions and in-depth interviews with 13- to 15-year-olds in Scotland, this study explores how teenagers' alcohol drinking and sexual/romantic relationships were shaped by their quest for appropriate gendered identities. In this, they acknowledged the influence of the media, but primarily in relation to others, not to themselves, thereby supporting Milkie's 'presumed media influence' theory. Media portrayals of romantic/sexual relationships appeared to influence teenagers' constructions of gender-appropriate sexual behaviour more than did media portrayals of drinking behaviour, perhaps because the teenagers had more firsthand experience of observing drinking than of observing sexual relationships. Presumed media influence may be less influential if one has experience of the behaviour portrayed. Drinking and sexual behaviour were highly interrelated: sexual negotiation and activities were reportedly often accompanied by drinking. For teenagers, being drunk or, importantly, pretending to be drunk, may be a useful way to try out what they perceived to be gender-appropriate identities. In sum, teenagers' drinking and sexual/romantic relationships are primary ways in which they do gender, and the media's influence on their perceptions of appropriate gendered behaviour is mediated through peer relationships. PMID:24443822

  12. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  13. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly…

  14. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  15. A hierarchical model for estimating density in camera-trap studies

    Science.gov (United States)

    Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.

    2009-01-01

    Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km² during 2004. Synthesis and applications. Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential 'holes' in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based 'captures' of individual animals.

  16. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High-frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), but they also change the micro EDM process in several ways. This paper focuses on the influence of the skin effect under high-frequency pulses on energy distribution and transmission in micro EDM, and on this basis analyses the rules governing the discharge probability on the electrode end face. Starting from the electrical discharge process under high-frequency pulses in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model in the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeability conditions are studied in order to obtain the distribution patterns of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is regarded as the parameter governing the discharge probability on the electrode end. Finally, MATLAB is used to fit the curve and obtain the distribution of discharge probability on the electrode end face.

  17. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple priors. … We combine this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems.
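
    A generic random-walk Metropolis sampler for the posterior of a toy nonlinear inverse problem, to make the sampling step concrete; the forward model, priors, and step size are our own minimal choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def forward(m):
    """Toy nonlinear forward model mapping 2 parameters to 2 data values."""
    return np.array([m[0] * np.exp(-m[1]), m[0] + m[1] ** 2])

m_true = np.array([2.0, 0.5])
d_obs = forward(m_true) + 0.05 * rng.standard_normal(2)

def log_posterior(m):
    # Gaussian likelihood (sigma = 0.05) plus a broad Gaussian prior
    misfit = np.sum((forward(m) - d_obs) ** 2) / (2 * 0.05 ** 2)
    prior = np.sum(m ** 2) / (2 * 10.0 ** 2)
    return -(misfit + prior)

m = np.zeros(2)
samples = []
for _ in range(20_000):
    m_new = m + 0.05 * rng.standard_normal(2)   # random-walk proposal
    if np.log(rng.random()) < log_posterior(m_new) - log_posterior(m):
        m = m_new                               # Metropolis accept
    samples.append(m)

print(np.mean(samples, axis=0))                 # posterior mean estimate
```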

  18. Application of a few orthogonal polynomials to the assessment of the fracture failure probability of a spherical tank

    International Nuclear Information System (INIS)

    Cao Tianjie; Zhou Zegong

    1993-01-01

    This paper presents methods to assess the fracture failure probability of a spherical tank. These methods convert the assessment of the fracture failure probability into the calculation of the moments of cracks and a one-dimensional integral. In the paper, we first derive series formulae for calculating the moments of cracks under fatigue crack growth and the moments of crack opening displacements according to the JWES-2805 code. We then use the first n moments of crack opening displacements, together with a few orthogonal polynomials, to compose the probability density function of the crack opening displacement. Lastly, the fracture failure probability is obtained according to interference theory. An example shows that these methods are simpler, quicker, and more accurate. At the same time, they avoid the disadvantage of Edgeworth's series method. (author)
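
    The moment-based reconstruction step can be sketched generically: a density on [-1, 1] is expanded in Legendre polynomials whose coefficients are fixed by the expectations E[P_k(X)], themselves computable from the first n moments. The target density and integration scheme below are our own illustration; the paper's specific polynomials and the JWES-2805 moment formulae are not reproduced.

```python
import numpy as np
from numpy.polynomial import legendre as L
from scipy import stats

# "Unknown" density: a beta distribution mapped onto [-1, 1]
dist = stats.beta(2.0, 5.0, loc=-1.0, scale=2.0)
n_terms = 8

xs = np.linspace(-1, 1, 2001)
dx = xs[1] - xs[0]
fx = dist.pdf(xs)

# c_k = (2k+1)/2 * E[P_k(X)]; here E[P_k] is integrated against the true
# density for illustration, where the paper would compute it from moments.
coeffs = []
for k in range(n_terms + 1):
    Pk = L.Legendre.basis(k)(xs)
    coeffs.append((2 * k + 1) / 2 * np.sum(Pk * fx) * dx)

f_hat = L.Legendre(coeffs)(xs)          # reconstructed density
print(np.max(np.abs(f_hat - fx)))       # reconstruction error
```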

  19. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist, Prelec (1998) [4], axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), with fixed points w(1/e) = 1/e and w(1) = 1, which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
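
    For reference, the Prelec weighting function transcribes directly into code; the value of alpha below is an arbitrary illustration.

```python
import numpy as np

def prelec_w(p, alpha: float = 0.65):
    """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)^alpha)."""
    p = np.asarray(p, dtype=float)
    return np.exp(-(-np.log(p)) ** alpha)

ps = np.array([0.01, 0.1, 1 / np.e, 0.5, 0.9, 0.99])
print(prelec_w(ps))
```

    The printed values confirm the fixed point w(1/e) = 1/e, which holds for every alpha, alongside the characteristic overweighting of small probabilities and underweighting of large ones.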

  20. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation.

  1. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  2. Bounded Densities and Their Derivatives

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, V.

    2009-01-01

    This paper describes how one can compute interval-valued statistical measures given limited information about the underlying distribution. The particular focus is on a bounded derivative of a probability density function and its combination with other available statistical evidence for computing quantities of interest. To be able to utilise the evidence about the derivative it is suggested to adapt the 'conventional' problem statement to variational calculus, and the way to do so is demonstrated. A number of examples are given throughout the paper.

  3. Estimation of the probability of success in petroleum exploration

    Science.gov (United States)

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum
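
    The first probabilistic component described above reduces to an empirical conditional probability estimated from training-area counts; the well counts below are invented placeholders, not the paper's data.

```python
# Hypothetical training-area history: wells classified by perceived structural
# closure and by outcome (numbers are illustrative placeholders).
wells = {
    ("closure", "producer"): 18,
    ("closure", "dry"): 42,
    ("no_closure", "producer"): 5,
    ("no_closure", "dry"): 135,
}

def p_success_given(condition: str) -> float:
    hits = wells[(condition, "producer")]
    total = hits + wells[(condition, "dry")]
    return hits / total

print(p_success_given("closure"))      # P(oil | closure)     = 0.30
print(p_success_given("no_closure"))   # P(oil | no closure) ~= 0.036
```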

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  5. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  6. A Method to Estimate the Probability That Any Individual Lightning Stroke Contacted the Surface Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.

    2010-01-01

    A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than reporting only the error ellipse centered on the stroke. The process takes the bivariate Gaussian probability density implied by the lightning location error ellipse for the most likely stroke location and integrates it to obtain the probability that the stroke occurred inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
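
    A minimal Monte Carlo version of the integration described above, with a made-up error-ellipse covariance; the operational technique presumably integrates the bivariate Gaussian more carefully, but the quantity estimated is the same.

```python
import numpy as np

rng = np.random.default_rng(5)

def p_within_radius(mu, cov, center, radius, n=1_000_000):
    """Monte Carlo estimate of P(stroke within `radius` of `center`),
    with the stroke location ~ N(mu, cov) from the location error ellipse."""
    pts = rng.multivariate_normal(mu, cov, size=n)
    return np.mean(np.hypot(*(pts - center).T) <= radius)

mu = np.array([1.2, -0.4])                    # most likely stroke location (km)
cov = np.array([[0.9, 0.3], [0.3, 0.4]])      # error-ellipse covariance (km^2)
facility = np.array([0.0, 0.0])
print(p_within_radius(mu, cov, facility, radius=2.0))
```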

  7. Task 4.1: Development of a framework for creating a databank to generate probability density functions for process parameters

    International Nuclear Information System (INIS)

    Burgazzi, Luciano

    2011-01-01

    PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to the passive system performance evaluation, given the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework for creating probability distributions for the parameters relevant to the passive system performance evaluation. To achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems, or which generic databases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement produced by a well-structured procedure could be used to derive sound probability distributions for the parameters of interest.

  8. Self-similar density turbulence in the TCV tokamak scrape-off layer

    International Nuclear Information System (INIS)

    Graves, J P; Horacek, J; Pitts, R A; Hopcraft, K I

    2005-01-01

    Plasma fluctuations in the scrape-off layer (SOL) of the TCV tokamak exhibit statistical properties which are universal across a broad range of discharge conditions. Electron density fluctuations, from just inside the magnetic separatrix to the plasma-wall interface, are described well by a gamma distributed random variable. The density fluctuations exhibit clear evidence of self-similarity in the far SOL, such that the corresponding probability density functions collapse upon renormalization solely by the mean particle density. This constitutes a demonstration that the amplitude of the density fluctuations is simply proportional to the mean density and is consistent with the further observation that the radial particle flux fluctuations scale solely with the mean density over two orders of magnitude. Such findings indicate that it may be possible to improve the prediction of transport in the critical plasma-wall interaction region of future large scale tokamaks. (letter to the editor)
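
    The reported collapse is easy to reproduce with synthetic gamma-distributed fluctuations: rescaling samples by their mean alone sends every case to the same Gamma(k, 1/k) density. The shape parameter and mean densities below are arbitrary, not TCV values.

```python
import numpy as np

rng = np.random.default_rng(6)
shape = 2.5                                   # fixed gamma shape parameter

for mean_density in (0.5, 2.0, 8.0):          # arbitrary units
    n = rng.gamma(shape, mean_density / shape, 200_000)
    rescaled = n / n.mean()                   # renormalize by the mean only
    # every rescaled sample follows the same Gamma(shape, 1/shape) density,
    # so the mean (1.0) and std (1/sqrt(shape)) agree across cases
    print(mean_density, rescaled.mean(), rescaled.std())
```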

  9. Estimating population density and connectivity of American mink using spatial capture-recapture.

    Science.gov (United States)

    Fuller, Angela K; Sutherland, Chris S; Royle, J Andrew; Hare, Matthew P

    2016-06-01

    Estimating the abundance or density of populations is fundamental to the conservation and management of species, and as landscapes become more fragmented, maintaining landscape connectivity has become one of the most important challenges for biodiversity conservation. Yet these two issues have never been formally integrated together in a model that simultaneously models abundance while accounting for connectivity of a landscape. We demonstrate an application of using capture-recapture to develop a model of animal density using a least-cost path model for individual encounter probability that accounts for non-Euclidean connectivity in a highly structured network. We utilized scat detection dogs (Canis lupus familiaris) as a means of collecting non-invasive genetic samples of American mink (Neovison vison) individuals and used spatial capture-recapture models (SCR) to gain inferences about mink population density and connectivity. Density of mink was not constant across the landscape, but rather increased with increasing distance from city, town, or village centers, and mink activity was associated with water. The SCR model allowed us to estimate the density and spatial distribution of individuals across a 388 km² area. The model was used to investigate patterns of space usage and to evaluate covariate effects on encounter probabilities, including differences between sexes. This study provides an application of capture-recapture models based on ecological distance, allowing us to directly estimate landscape connectivity. This approach should be widely applicable to provide simultaneous direct estimates of density, space usage, and landscape connectivity for many species.
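
    A sketch of the non-Euclidean encounter model: least-cost (ecological) distances on a resistance raster, computed with Dijkstra's algorithm, feed a half-normal detection function. The resistance surface, parameter values, and names are our own illustration, not the fitted SCR model.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

rng = np.random.default_rng(7)
SIDE = 20
resistance = np.exp(rng.normal(0, 0.5, (SIDE, SIDE)))  # hypothetical cost surface

# 4-neighbour graph whose edge weights average the two cell resistances
n = SIDE * SIDE
G = lil_matrix((n, n))
for r in range(SIDE):
    for c in range(SIDE):
        i = r * SIDE + c
        for dr, dc in ((0, 1), (1, 0)):
            rr, cc = r + dr, c + dc
            if rr < SIDE and cc < SIDE:
                j = rr * SIDE + cc
                G[i, j] = G[j, i] = 0.5 * (resistance[r, c] + resistance[rr, cc])

# Least-cost distances from one hypothetical activity centre
d_lcp = dijkstra(G.tocsr(), indices=[5 * SIDE + 5])[0]

# Half-normal encounter probability on ecological (least-cost) distance
p0, sigma = 0.6, 4.0
p_enc = p0 * np.exp(-d_lcp**2 / (2 * sigma**2))
print(p_enc.reshape(SIDE, SIDE).round(2)[:3, :3])
```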

  10. Estimating population density and connectivity of American mink using spatial capture-recapture

    Science.gov (United States)

    Fuller, Angela K.; Sutherland, Christopher S.; Royle, Andy; Hare, Matthew P.

    2016-01-01

    Estimating the abundance or density of populations is fundamental to the conservation and management of species, and as landscapes become more fragmented, maintaining landscape connectivity has become one of the most important challenges for biodiversity conservation. Yet these two issues have never been formally integrated together in a model that simultaneously models abundance while accounting for connectivity of a landscape. We demonstrate an application of using capture–recapture to develop a model of animal density using a least-cost path model for individual encounter probability that accounts for non-Euclidean connectivity in a highly structured network. We utilized scat detection dogs (Canis lupus familiaris) as a means of collecting non-invasive genetic samples of American mink (Neovison vison) individuals and used spatial capture–recapture models (SCR) to gain inferences about mink population density and connectivity. Density of mink was not constant across the landscape, but rather increased with increasing distance from city, town, or village centers, and mink activity was associated with water. The SCR model allowed us to estimate the density and spatial distribution of individuals across a 388 km2 area. The model was used to investigate patterns of space usage and to evaluate covariate effects on encounter probabilities, including differences between sexes. This study provides an application of capture–recapture models based on ecological distance, allowing us to directly estimate landscape connectivity. This approach should be widely applicable to provide simultaneous direct estimates of density, space usage, and landscape connectivity for many species.

  11. Do non-gaussian effects decrease tunneling probabilities? Three-loop instanton density for the double-well potential

    International Nuclear Information System (INIS)

    Olejnik, S.

    1989-01-01

    It is shown that the leading and next-to-leading non-gaussian effects have a minor influence on the instanton density for the double-well potential: it is slightly increased, contrary to the claims of other authors. We point out a connection to recent quantitative studies of topological effects in gauge theories. (orig.)

  12. Presumed appendiceal abscess discovered to be ruptured Meckel diverticulum following percutaneous drainage

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jeannie C.; Ostlie, Daniel J. [Children' s Mercy Hospital, Department of Surgery, Kansas City, MO (United States); Rivard, Douglas C.; Morello, Frank P. [Children' s Mercy Hospital, Department of Radiology, Kansas City, MO (United States)

    2008-08-15

    A Meckel diverticulum is an embryonic remnant of the omphalomesenteric duct that occurs in approximately 2% of the population. Most are asymptomatic; however, they are vulnerable to inflammation with subsequent consequences including diverticulitis and perforation. We report an 11-year-old boy who underwent laparoscopic appendectomy for perforated appendicitis at an outside institution. During his convalescence he underwent percutaneous drainage of a presumed postoperative abscess. A follow-up drain study demonstrated an enteric fistula. The drain was slowly removed from the abdomen over a period of 1 week. Three weeks following drain removal the patient reported recurrent nausea and abdominal pain. A CT scan demonstrated a 3.7-cm rim-enhancing air-fluid level with dependent contrast consistent with persistent enteric fistula and abscess. Exploratory laparoscopy was performed, at which time a Meckel diverticulum was identified and resected. This case highlights the diagnostic challenge and limitations of conventional radiology in complicated Meckel diverticulum. (orig.)

  13. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique for estimating small critical probabilities with better accuracy than crude Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than from the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which must generate the rare random events more frequently. The optimisation of this auxiliary distribution is often very difficult in practice. In this article, we propose to approximate the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of the spatial launcher impact position, which has become an increasingly important issue in the field of aeronautics.
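
    The core IS mechanics the record refers to, shown for a scalar rare event; the Gaussian setup and the shifted auxiliary density are our own minimal example, and the adaptive, non-parametric construction of the auxiliary density in NAIS is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
THRESH = 4.0                       # rare event: X > 4 for X ~ N(0, 1)
n = 100_000

# Crude Monte Carlo: almost no samples hit the event
x = rng.standard_normal(n)
print("crude MC:", np.mean(x > THRESH))

# Importance sampling from an auxiliary density shifted onto the event region
aux_mean = THRESH
y = rng.normal(aux_mean, 1.0, n)
weights = stats.norm.pdf(y) / stats.norm.pdf(y, loc=aux_mean)  # likelihood ratio
print("IS estimate:", np.mean((y > THRESH) * weights))
print("exact:", stats.norm.sf(THRESH))
```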

  14. Protein-protein interaction site predictions with three-dimensional probability distributions of interacting atoms on protein surfaces.

    Directory of Open Access Journals (Sweden)

    Ching-Tai Chen

    Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted…

  15. Protein-Protein Interaction Site Predictions with Three-Dimensional Probability Distributions of Interacting Atoms on Protein Surfaces

    Science.gov (United States)

    Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei

    2012-01-01

    Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with

  16. Matter Density Profile Shape Effects at DUNE

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Kevin J. [Northwestern U.; Parke, Stephen J. [Fermilab

    2018-02-19

    Quantum mechanical interactions between neutrinos and matter along the path of propagation, the Wolfenstein matter effect, are of particular importance for the upcoming long-baseline neutrino oscillation experiments, specifically the Deep Underground Neutrino Experiment (DUNE). Here, we explore specifically which features of the matter density profile can be measured by DUNE, considering both the shape and the normalization of the profile between the neutrinos' origin and detection. Additionally, we explore the capability of a perturbative method for calculating neutrino oscillation probabilities and whether this method is suitable for DUNE. We also briefly explore, quantitatively, the ability of DUNE to measure the Earth's matter density, and the impact of performing this measurement on the measurement of the standard neutrino oscillation parameters.

  17. The Impact of the Prior Density on a Minimum Relative Entropy Density: A Case Study with SPX Option Data

    Directory of Open Access Journals (Sweden)

    Cassio Neri

    2014-05-01

    Full Text Available We study the problem of finding probability densities that match given European call option prices. To allow prior information about such a density to be taken into account, we generalise the algorithm presented in Neri and Schneider (Appl. Math. Finance, 2013), which finds the maximum entropy density of an asset price, to the relative entropy case. This is applied to study the impact of the choice of prior density in two market scenarios. In the first scenario, call option prices are prescribed at only a small number of strikes, and we see that the choice of prior, or indeed its omission, yields notably different densities. The second scenario is given by CBOE option price data for S&P500 index options at a large number of strikes. Prior information is now considered to be given by calibrated Heston, Schöbel–Zhu or Variance Gamma models. We find that the resulting digital option prices are essentially the same as those given by the (non-relative) Buchen–Kelly density itself. In other words, in a sufficiently liquid market, the influence of the prior density seems to vanish almost completely. Finally, we study variance swaps and derive a simple formula relating the fair variance swap rate to entropy. Then we show, again, that the prior loses its influence on the fair variance swap rate as the number of strikes increases.

  18. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  19. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
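
    The article's headline claim, that the realised failure frequency under plug-in thresholds exceeds the nominal level yet does not depend on the true parameters, can be checked numerically. A hedged sketch for normal losses (a location-scale family, simpler than the article's examples) follows; sample sizes and parameter values are illustrative.

```python
# Plug-in thresholds from estimated parameters: the realised failure
# frequency exceeds the nominal level and is the same for both "true"
# parameter sets, illustrating parameter-independence.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
nominal = 0.01            # required failure probability
n_data = 30               # size of the available data set
n_trials = 200_000

for mu, sigma in [(0.0, 1.0), (5.0, 3.0)]:     # two "true" parameter sets
    data = rng.normal(mu, sigma, (n_trials, n_data))
    mu_hat = data.mean(axis=1)
    sd_hat = data.std(axis=1, ddof=1)
    # Plug-in threshold targeting the nominal failure probability.
    threshold = mu_hat + stats.norm.ppf(1.0 - nominal) * sd_hat
    loss = rng.normal(mu, sigma, n_trials)     # a fresh loss realisation
    freq = np.mean(loss > threshold)
    print(f"true (mu={mu}, sigma={sigma}): realised failure freq = {freq:.4f}")
```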

  20. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  1. Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence

    International Nuclear Information System (INIS)

    Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del

    2009-01-01

    Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence are presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics of variability have also been observed in other physical systems that are characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and the sea surface temperature fluctuations.

  2. Presuming the influence of the media: teenagers' constructions of gender identity through sexual/romantic relationships and alcohol consumption.

    Science.gov (United States)

    Hartley, Jane E K; Wight, Daniel; Hunt, Kate

    2014-06-01

    Using empirical data from group discussions and in-depth interviews with 13- to 15-year-olds in Scotland, this study explores how teenagers' alcohol drinking and sexual/romantic relationships were shaped by their quest for appropriate gendered identities. In this, they acknowledged the influence of the media, but primarily in relation to others, not to themselves, thereby supporting Milkie's 'presumed media influence' theory. Media portrayals of romantic/sexual relationships appeared to influence teenagers' constructions of gender-appropriate sexual behaviour more than did media portrayals of drinking behaviour, perhaps because the teenagers had more first-hand experience of observing drinking than of observing sexual relationships. Presumed media influence may be weaker if one has experience of the behaviour portrayed. Drinking and sexual behaviour were highly interrelated: sexual negotiation and activities were reportedly often accompanied by drinking. For teenagers, being drunk or, importantly, pretending to be drunk, may be a useful way to try out what they perceived to be gender-appropriate identities. In sum, teenagers' drinking and sexual/romantic relationships are primary ways in which they do gender, and the media's influence on their perceptions of appropriate gendered behaviour is mediated through peer relationships. © 2014 The Authors. Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for SHIL (SHIL).

  3. A study on current density distribution reproduction by bounded-eigenfunction expansion for a tokamak plasma

    International Nuclear Information System (INIS)

    Kurihara, Kenichi

    1997-11-01

    Plasma current density distribution is one of the most important controlled variables determining plasma performance in terms of energy confinement and stability in a tokamak. However, its reproduction using magnetic measurements alone is recognized to be an ill-posed problem. A method that presumes formulas giving the profiles of plasma pressure and current has been adopted to regularize the ill-posedness, and hence it has been reported that the current density distribution can be reproduced as a solution of the Grad-Shafranov equation within a certain accuracy. In order to investigate its strict reproducibility from magnetic measurements in this inverse problem, a new method of 'bounded-eigenfunction expansion' was introduced, and it was found that the reproducibility directly corresponds to the independence of a series of the special functions. The results of various investigations, from the viewpoint of applied mathematics, concerning this inverse problem are presented in detail. (author)

  4. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  5. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  6. Deep convolutional neural network for mammographic density segmentation

    Science.gov (United States)

    Wei, Jun; Li, Songfeng; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir; Samala, Ravi K.

    2018-02-01

    Breast density is one of the most significant factors for cancer risk. In this study, we proposed a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammography (DM). The deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD). PD was calculated as the ratio of the dense area to the breast area based on the probability of each pixel belonging to the dense region or the fatty region at a decision threshold of 0.5. The DCNN estimate was compared to a feature-based statistical learning approach, in which gray level, texture and morphological features were extracted from each ROI and the least absolute shrinkage and selection operator (LASSO) was used to select and combine the useful features to generate the PMD. The reference PD of each image was provided by two experienced MQSA radiologists. With IRB approval, we retrospectively collected 347 DMs from patient files at our institution. The 10-fold cross-validation results showed a strong correlation r=0.96 between the DCNN estimation and interactive segmentation by radiologists, while that of the feature-based statistical learning approach vs radiologists' segmentation had a correlation r=0.78. The difference between the segmentation by DCNN and by radiologists was significantly smaller than that between the feature-based learning approach and radiologists. The DCNN approach has the potential to replace radiologists' interactive thresholding in PD estimation on DMs.
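
    The PD computation itself, as described above, reduces to thresholding the probability map at 0.5 inside the breast mask. A minimal sketch, with stand-in arrays rather than the authors' DCNN output, follows.

```python
import numpy as np

def percent_density(pmd, breast_mask, threshold=0.5):
    """PD = dense area / breast area, thresholding the probability map."""
    dense = (pmd >= threshold) & breast_mask
    return 100.0 * dense.sum() / breast_mask.sum()

rng = np.random.default_rng(2)
pmd = rng.random((256, 256))               # stand-in probability map (PMD)
mask = np.ones((256, 256), dtype=bool)     # stand-in breast segmentation
print(f"PD = {percent_density(pmd, mask):.1f}%")
```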

  7. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  8. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  9. Computational complexity of time-dependent density functional theory

    International Nuclear Information System (INIS)

    Whitfield, J D; Yung, M-H; Tempel, D G; Aspuru-Guzik, A; Boixo, S

    2014-01-01

    Time-dependent density functional theory (TDDFT) is rapidly emerging as a premier method for solving dynamical many-body problems in physics and chemistry. The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn–Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn–Sham system can be efficiently obtained given the time-dependent density. We introduce a V-representability parameter which diverges at the boundary of the existence domain and serves to quantify the numerical difficulty of constructing the Kohn-Sham potential. For bounded values of V-representability, we present a polynomial time quantum algorithm to generate the time-dependent Kohn–Sham potential with controllable error bounds. (paper)

  10. High-Density Signal Interface Electromagnetic Radiation Prediction for Electromagnetic Compatibility Evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Halligan, Matthew

    2017-11-01

    Radiated power calculation approaches for practical scenarios of incomplete high-density interface characterization information and incomplete incident power information are presented. The suggested approaches build upon a method that characterizes power losses through the definition of power loss constant matrices. Potential radiated power estimates include using total power loss information, partial radiated power loss information, worst case analysis, and statistical bounding analysis. A method is also proposed to calculate radiated power when incident power information is not fully known for non-periodic signals at the interface. Incident data signals are modeled from a two-state Markov chain where bit state probabilities are derived. The total spectrum for windowed signals is postulated as the superposition of spectra from individual pulses in a data sequence. Statistical bounding methods are proposed as a basis for the radiated power calculation due to the statistical calculation complexity to find a radiated power probability density function.

  11. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    Full Text Available The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
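
    The record's codes are in Fortran; as a language-neutral illustration of one of the listed objects, here is a short Python sketch of a random probability vector (non-negative entries summing to one). Normalising i.i.d. exponential draws yields the uniform (flat Dirichlet) law on the simplex; the function name is ours, not from the record's code.

```python
import numpy as np

def random_probability_vector(d, rng):
    """Uniform draw from the probability simplex (flat Dirichlet)."""
    e = rng.exponential(1.0, d)
    return e / e.sum()

rng = np.random.default_rng(7)
p = random_probability_vector(4, rng)
print(p, p.sum())   # non-negative entries summing to 1.0
```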

  12. Density profiles of the exclusive queuing process

    Science.gov (United States)

    Arita, Chikashi; Schadschneider, Andreas

    2012-12-01

    The exclusive queuing process (EQP) incorporates the exclusion principle into classic queuing models. It is characterized by, in addition to the entrance probability α and exit probability β, a third parameter: the hopping probability p. The EQP can be interpreted as an exclusion process of variable system length. Its phase diagram in the parameter space (α,β) is divided into a convergent phase and a divergent phase by a critical line which consists of a curved part and a straight part. Here we extend previous studies of this phase diagram. We identify subphases in the divergent phase, which can be distinguished by means of the shape of the density profile, and determine the velocity of the system length growth. This is done for EQPs with different update rules (parallel, backward sequential and continuous time). We also investigate the dynamics of the system length and the number of customers on the critical line. They are diffusive or subdiffusive with non-universal exponents that also depend on the update rules.
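
    A toy simulation helps make the three parameters concrete. The sketch below implements a simplified discrete-time EQP with a front-to-back sweep (backward-sequential flavour); the update details and parameter values are illustrative assumptions rather than the paper's exact rules, and are meant only to show convergent versus divergent behaviour of the system length.

```python
import random

def simulate_eqp(alpha, beta, p, steps, seed=3):
    rng = random.Random(seed)
    lattice = []                            # lattice[0] = front; True = customer
    for _ in range(steps):
        if lattice and lattice[0] and rng.random() < beta:
            lattice[0] = False              # service: front customer exits
        for i in range(1, len(lattice)):    # front-to-back sweep with exclusion
            if lattice[i] and not lattice[i - 1] and rng.random() < p:
                lattice[i - 1], lattice[i] = True, False
        if rng.random() < alpha:
            lattice.append(True)            # arrival at the back extends system
        while lattice and not lattice[-1]:
            lattice.pop()                   # length = position of last customer
    return len(lattice), sum(lattice)

for alpha in (0.2, 0.8):                    # qualitatively below/above critical
    length, customers = simulate_eqp(alpha, beta=0.6, p=0.7, steps=5000)
    print(f"alpha={alpha}: length={length}, customers={customers}")
```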

  13. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  14. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from the simple information of the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model $m(\vec{r})$, via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for $\rho(\vec{r}) = m(\vec{r})$. Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
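
    For reference, the entropy-with-a-model functional alluded to above is usually written in the Shannon-Skilling form below; the paper's exact definition may differ in normalisation conventions, so this is quoted as the standard form rather than as the authors' formula.

```latex
% Shannon-Skilling entropy relative to a model m(\vec{r}):
S[\rho] = \int \left[\, \rho(\vec{r}) - m(\vec{r})
        - \rho(\vec{r}) \, \ln\frac{\rho(\vec{r})}{m(\vec{r})} \,\right] d^{3}r
```

    Since $S[\rho] \le 0$ with equality exactly when $\rho = m$, the MaxEnt map reduces to the model when no data constrain it, which is the behaviour described above.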

  15. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero hyperplane along the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performance with 90% sensitivity and 89% specificity. (author)

  16. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  17. The car parking problem at high densities

    Science.gov (United States)

    Burgos, E.; Bonadeo, H.

    1989-04-01

    The radial distribution functions of random 1-D systems of sequential hard rods have been studied in the range of very high densities. It is found that as the number of samples rejected before completion increases, anomalies in the pairwise distribution functions arise. These are discussed using analytical solutions for systems of three rods and numerical simulations with twelve rods. The probabilities of different spatial orderings with respect to the sequential order are examined.
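
    A minimal random sequential parking simulation illustrates the setting: unit rods are inserted at uniform random positions, overlapping attempts are rejected (and counted), and jammed runs are restarted. The density used here is modest; the record works much closer to full packing, where the rejection counts and the anomalies in the pair distributions become pronounced. All parameter values are illustrative.

```python
import numpy as np

def park(n_rods, L, rng, max_reject=100_000):
    """Sequentially park n_rods unit rods on [0, L]; restart if jammed."""
    while True:
        left_edges, rejected = [], 0
        while len(left_edges) < n_rods and rejected < max_reject:
            x = rng.uniform(0.0, L - 1.0)
            if all(abs(x - y) >= 1.0 for y in left_edges):
                left_edges.append(x)
            else:
                rejected += 1
        if len(left_edges) == n_rods:
            return np.sort(left_edges), rejected

rng = np.random.default_rng(4)
edges, rejected = park(n_rods=5, L=8.0, rng=rng)   # density 0.625
gaps = np.diff(edges) - 1.0                        # surface-to-surface gaps
print(f"{rejected} rejected attempts; mean gap {gaps.mean():.3f}")
```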

  18. Classical many-body theory with retarded interactions: Dynamical irreversibility and determinism without probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zakharov, A.Yu., E-mail: Anatoly.Zakharov@novsu.ru; Zakharov, M.A., E-mail: ma_zakharov@list.ru

    2016-01-28

    The exact equations of motion for the microscopic density of a classical many-body system, taking into account retarded inter-particle interactions, are derived. It is shown that retardation of the interactions leads to irreversible behavior of many-body systems. - Highlights: • A new form of the equation of motion of a classical many-body system is proposed. • Retardation of interactions as one of the mechanisms of many-body system irreversibility. • Irreversibility and determinism without probabilities. • A possible way to a microscopic foundation of thermodynamics.

  19. Transition probabilities of some Si II lines obtained by laser produced plasma emission

    International Nuclear Information System (INIS)

    Blanco, F.; Botho, B.; Campos, J.

    1995-01-01

    The absolute transition probabilities for 28 Si II spectral lines have been determined by measurement of emission line intensities from laser-produced plasmas of Si in Ar and Kr atmospheres. The studied plasma has a temperature of about 2×10⁴ K and an electron density of 10¹⁷ cm⁻³. The local thermodynamic equilibrium conditions and plasma homogeneity have been checked. The results are compared with the available experimental and theoretical data and with present Hartree-Fock calculations in LS coupling. (orig.)

  20. Ligand identification using electron-density map correlations

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Adams, Paul D.; Moriarty, Nigel W.; Cohn, Judith D.

    2007-01-01

    An automated ligand-fitting procedure is applied to (F_o − F_c)exp(iφ_c) difference density for 200 commonly found ligands from macromolecular structures in the Protein Data Bank to identify ligands from density maps. A procedure for the identification of ligands bound in crystal structures of macromolecules is described. Two characteristics of the density corresponding to a ligand are used in the identification procedure. One is the correlation of the ligand density with each of a set of test ligands after optimization of the fit of that ligand to the density. The other is the correlation of a fingerprint of the density with the fingerprint of model density for each possible ligand. The fingerprints consist of an ordered list of correlations of each of the test ligands with the density. The two characteristics are scored using a Z-score approach in which the correlations are normalized to the mean and standard deviation of correlations found for a variety of mismatched ligand-density pairs, so that the Z scores are related to the probability of observing a particular value of the correlation by chance. The procedure was tested with a set of 200 of the most commonly found ligands in the Protein Data Bank, collectively representing 57% of all ligands in the Protein Data Bank. Using a combination of these two characteristics of ligand density, ranked lists of ligand identifications were made for representative (F_o − F_c)exp(iφ_c) difference density from entries in the Protein Data Bank. In 48% of the 200 cases, the correct ligand was at the top of the ranked list of ligands. This approach may be useful in identification of unknown ligands in new macromolecular structures as well as in the identification of which ligands in a mixture have bound to a macromolecule
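
    The Z-scoring step lends itself to a compact sketch. Under the stated assumptions (correlations already computed, and the mismatched-pair correlations available as a reference sample), the normalization looks roughly like this; names and numbers are illustrative, not from the authors' code.

```python
import numpy as np

def z_scores(corrs, mismatched_corrs):
    """Normalize ligand-density correlations by the mean and standard
    deviation of correlations from mismatched ligand-density pairs."""
    mu = np.mean(mismatched_corrs)
    sd = np.std(mismatched_corrs)
    return (np.asarray(corrs) - mu) / sd

# Toy numbers: three candidate ligands scored against one density.
z = z_scores([0.82, 0.55, 0.40], mismatched_corrs=[0.35, 0.45, 0.50, 0.42])
print(z)   # the highest Z identifies the best-supported ligand
```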

  1. Energy density functional analysis of shape coexistence in 44S

    International Nuclear Information System (INIS)

    Li, Z. P.; Yao, J. M.; Vretenar, D.; Nikšić, T.; Meng, J.

    2012-01-01

    The structure of low-energy collective states in the neutron-rich nucleus ⁴⁴S is analyzed using a microscopic collective Hamiltonian model based on energy density functionals (EDFs). The calculated triaxial energy map, low-energy spectrum and corresponding probability distributions indicate a coexistence of prolate and oblate shapes in this nucleus.

  2. Sequence diversities of serine-aspartate repeat genes among Staphylococcus aureus isolates from different hosts presumably by horizontal gene transfer.

    Directory of Open Access Journals (Sweden)

    Huping Xue

    Full Text Available BACKGROUND: Horizontal gene transfer (HGT) is recognized as one of the major forces for bacterial genome evolution. Many clinically important bacteria may acquire virulence factors and antibiotic resistance through HGT. The comparative genomic analysis has become an important tool for identifying HGT in emerging pathogens. In this study, the Serine-Aspartate Repeat (Sdr) family has been compared among different sources of Staphylococcus aureus (S. aureus) to discover sequence diversities within their genomes. METHODOLOGY/PRINCIPAL FINDINGS: Four sdr genes were analyzed for 21 different S. aureus strains and 218 mastitis-associated S. aureus isolates from Canada. Comparative genomic analyses revealed that S. aureus strains from bovine mastitis (RF122 and mastitis isolates in this study), ovine mastitis (ED133), pig (ST398), chicken (ED98), and human methicillin-resistant S. aureus (MRSA) (TCH130, MRSA252, Mu3, Mu50, N315, 04-02981, JH1 and JH9) were highly associated with one another, presumably due to HGT. In addition, several types of insertion and deletion were found in sdr genes of many isolates. A new insertion sequence was found in mastitis isolates, which was presumably responsible for the HGT of the sdrC gene among different strains. Moreover, the sdr genes could be used to type S. aureus. Regional differences in sdr gene distribution were also indicated among the tested S. aureus isolates. Finally, certain associations were found between sdr genes and subclinical or clinical mastitis isolates. CONCLUSIONS: Certain sdr gene sequences were shared in S. aureus strains and isolates from different species, presumably due to HGT. Our results also suggest that the distributional assay of virulence factors should detect the full sequences or full functional regions of these factors. The traditional assay using short conserved regions may not be accurate or credible. These findings have important implications with regard to animal husbandry practices that may

  3. Two-dimensional electron density characterisation of arc interruption phenomenon in current-zero phase

    Science.gov (United States)

    Inada, Yuki; Kamiya, Tomoki; Matsuoka, Shigeyasu; Kumada, Akiko; Ikeda, Hisatoshi; Hidaka, Kunihiko

    2018-01-01

    Two-dimensional electron density imaging over free burning SF6 arcs and SF6 gas-blast arcs was conducted at current zero using highly sensitive Shack-Hartmann type laser wavefront sensors in order to experimentally characterise electron density distributions for the success and failure of arc interruption in the thermal reignition phase. The experimental results under an interruption probability of 50% showed that free burning SF6 arcs with axially asymmetric electron density profiles were interrupted with a success rate of 88%. On the other hand, the current interruption of SF6 gas-blast arcs was reproducibly achieved under locally reduced electron densities and the interruption success rate was 100%.

  4. Probability Density Function Method for Observing Reconstructed Attractor Structure

    Institute of Scientific and Technical Information of China (English)

    陆宏伟; 陈亚珠; 卫青

    2004-01-01

    Probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor in computing the correlation dimensions of RR intervals of ten normal old men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, it is the first time that the PDF method has been put forward for the analysis of the reconstructed attractor structure. Numerical simulations demonstrate that the cardiac systems of healthy old men are about 6-6.5 dimensional complex dynamical systems. It is found that the PDF is not symmetrically distributed when the time delay is small, while the PDF satisfies a Gaussian distribution when the time delay is big enough. A cluster effect mechanism is presented to explain this phenomenon. By studying the shapes of the PDFs, it is clearly indicated that the role played by the time delay is more important than that of the embedding dimension in the reconstruction. The results demonstrate that the PDF method represents a promising numerical approach for the observation of the reconstructed attractor structure and may provide more information and new diagnostic potential for the analyzed cardiac system.
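
    The two ingredients named above, time-delay reconstruction and a PDF over the phase points, can be sketched compactly. The example below embeds a synthetic signal (not RR intervals) and estimates the PDF of inter-point distances in the reconstructed attractor; all parameters are illustrative.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay reconstruction: rows are points (x_t, x_{t+tau}, ...)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

rng = np.random.default_rng(5)
t = np.linspace(0.0, 60.0, 3000)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)   # stand-in scalar series

pts = delay_embed(x, dim=3, tau=20)
idx = rng.choice(len(pts), size=500, replace=False)  # subsample phase points
diff = pts[idx, None, :] - pts[None, idx, :]
dist = np.linalg.norm(diff, axis=-1)[np.triu_indices(500, k=1)]
hist, edges = np.histogram(dist, bins=50, density=True)  # empirical PDF
print("distance-PDF mode near", edges[np.argmax(hist)])
```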

  5. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  6. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models...

  8. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

    The assessment of the probable spatial distribution of new eruptions is useful to manage and reduce volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored, and characterized by infrequent activity, such as El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea", El Hierro had been the least studied volcanic island of the Canaries, with attention historically devoted more to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate, weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The probability of future eruptive events on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.

  9. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  10. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors installed onboard in current vehicles. On the other hand, knowledge of the values of the vehicle's parameters is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
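
    The truncation step can be illustrated in isolation: after a filter update, the Gaussian estimate is replaced by the moments of the normal distribution truncated to the physically meaningful interval. A hedged sketch, with made-up bounds and numbers rather than the paper's values, follows.

```python
import numpy as np
from scipy import stats

def truncate_estimate(mean, var, lower, upper):
    """Mean/variance of N(mean, var) truncated to [lower, upper]."""
    sd = np.sqrt(var)
    a, b = (lower - mean) / sd, (upper - mean) / sd
    tn = stats.truncnorm(a, b, loc=mean, scale=sd)
    return tn.mean(), tn.var()

# Illustrative: a roll-angle estimate of 0.30 rad with variance 0.04,
# constrained to an assumed physical range of +/-0.25 rad.
m, v = truncate_estimate(0.30, 0.04, -0.25, 0.25)
print(f"truncated estimate: mean={m:.3f} rad, var={v:.4f}")
```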

  11. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions, and anthropic intervention. This paper studies landslides triggered by rain, commonly known as "soil-slips", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face, and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in suction when a humid front enters the soil, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s, the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model
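
    In Monte Carlo form, the derived-distributions idea is short to sketch: draw storms from the RPPP marginals and push them through a deterministic failure model to obtain the FOS pdf and its exceedance probability. The failure model below is a made-up monotone placeholder, not the authors' infiltration and stability model, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
intensity = rng.exponential(20.0, n)   # storm mean intensity, mm/h (assumed)
duration = rng.exponential(6.0, n)     # storm duration, h (assumed)

def factor_of_safety(i, d, fos_dry=1.8, k=2.5e-3):
    """Placeholder failure model: FOS decays with total storm depth i*d."""
    return fos_dry / (1.0 + k * i * d)

fos = factor_of_safety(intensity, duration)
hist, edges = np.histogram(fos, bins=60, density=True)  # derived FOS pdf
p_fail = np.mean(fos < 1.0)                             # exceedance prob.
print(f"P(FOS < 1) = {p_fail:.4f}  (return period ~ {1.0 / p_fail:.0f} storms)")
```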

  12. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  13. The trans-generational impact of population density signals on host-parasite interactions.

    Science.gov (United States)

    Michel, Jessica; Ebert, Dieter; Hall, Matthew D

    2016-11-25

    The density of a host population is a key parameter underlying disease transmission, but it also has implications for the expression of disease through its effect on host physiology. In response to higher densities, individuals are predicted either to increase their immune investment in response to the elevated risk of parasitism, or conversely to decrease their immune capacity as a consequence of the stress of a crowded environment. However, an individual's health is shaped by many different factors, including its genetic background, current environmental conditions, and maternal effects. Indeed, population density is often sensed through the presence of info-chemicals in the environment, which may influence a host's interaction with parasites, and also those of its offspring, all of which may alter the expression of disease and potentially uncouple the presumed link between changes in host density and disease outcomes. In this study, we used the water flea Daphnia magna and its obligate bacterial parasite Pasteuria ramosa to investigate how signals of high host density impact host-parasite interactions over two consecutive generations. We found that the chemical signals from crowded treatments induced phenotypic changes in both the parental and offspring generations. In the absence of a pathogen, life-history changes were genotype-specific, but consistent across generations, even when the signal of density was removed. In contrast, the influence of density on infected animals depended on the trait and generation of exposure. When directly exposed to signals of high density, host genotypes responded differently in how they minimised the severity of disease. Yet, in the subsequent generation, the influence of density was rarely genotype-specific and instead related to the ability of the host to minimise the onset of infection. Our findings reveal that population-level correlations between host density and infection capture only part of the complex relationship

  14. Time-averaged probability density functions of soot nanoparticles along the centerline of a piloted turbulent diffusion flame using a scanning mobility particle sizer

    KAUST Repository

    Chowdhury, Snehaunshu

    2017-01-23

    In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating existing soot models are reported at intervals of Δx/D = 5 along the centerline of turbulent, non-premixed, C2H4/N2 flames. The jet exit Reynolds numbers of the flames investigated were 10,000 and 20,000. A simplified burner geometry based on a published design was chosen to aid modelers. Soot was sampled directly from the flame using a sampling probe with a 0.5-mm diameter orifice and diluted with N2 by a two-stage dilution process. The overall dilution ratio was not evaluated. An SMPS system was used to analyze soot particle concentrations in the diluted samples. Sampling conditions were optimized over a wide range of dilution ratios to eliminate the effect of agglomeration in the sampling probe. Two differential mobility analyzers (DMAs) with different size ranges were used separately in the SMPS measurements to characterize the entire size range of particles. In both flames, the PDFs were found to be mono-modal in nature near the jet exit. Further downstream, the profiles were flatter with a fall-off at larger particle diameters. The geometric mean of the soot size distributions was less than 10 nm for all cases and increased monotonically with axial distance in both flames.
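
    The geometric mean quoted above is the standard count-weighted statistic of a binned size distribution; for concreteness (with made-up bin counts, not the SMPS data):

```python
import numpy as np

d = np.array([5.0, 8.0, 12.0, 20.0, 35.0])   # bin midpoint diameters, nm
n = np.array([120, 300, 180, 60, 10])        # particle counts per bin

d_g = np.exp(np.sum(n * np.log(d)) / n.sum())  # geometric mean diameter
print(f"geometric mean diameter = {d_g:.1f} nm")
```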

  15. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisson...

  16. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
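
    Numerically, the two modules evaluate elementary identities; the toy sketch below states them in ordinary code (the DNA-level encoding is the paper's actual contribution and is not reproduced here; all numbers are illustrative).

```python
def total_probability(priors, likelihoods):
    """Total probability: P(A) = sum_i P(B_i) * P(A | B_i)."""
    return sum(p * l for p, l in zip(priors, likelihoods))

def conditional(prior, likelihood, evidence):
    """Bayes-type conditional: P(B | A) = P(B) * P(A | B) / P(A)."""
    return prior * likelihood / evidence

p_b = [0.3, 0.7]               # toy priors P(B_i)
p_a_given_b = [0.9, 0.2]       # toy likelihoods P(A | B_i)
p_a = total_probability(p_b, p_a_given_b)
print(p_a, conditional(p_b[0], p_a_given_b[0], p_a))
```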

  17. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  18. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  19. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can subsequently be used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated by a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  1. Millimeter-wave Line Ratios and Sub-beam Volume Density Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Leroy, Adam K.; Gallagher, Molly [Department of Astronomy, The Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Usero, Antonio [Observatorio Astronómico Nacional (IGN), C/Alfonso XII, 3, E-28014 Madrid (Spain); Schruba, Andreas [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstraße 1, D-85748 Garching (Germany); Bigiel, Frank [Institut für theoretische Astrophysik, Zentrum für Astronomie der Universität Heidelberg, Albert-Ueberle Str. 2, D-69120 Heidelberg (Germany); Kruijssen, J. M. Diederik; Schinnerer, Eva [Astronomisches Rechen-Institut, Zentrum für Astronomie der Universität Heidelberg, Mönchhofstraße 12-14, D-69120 Heidelberg (Germany); Kepley, Amanda [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Blanc, Guillermo A. [Departamento de Astronomía, Universidad de Chile, Casilla 36-D, Santiago (Chile); Bolatto, Alberto D. [Department of Astronomy, Laboratory for Millimeter-wave Astronomy, and Joint Space Institute, University of Maryland, College Park, MD 20742 (United States); Cormier, Diane; Jiménez-Donaire, Maria J. [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117, Heidelberg (Germany); Hughes, Annie [CNRS, IRAP, 9 av. du Colonel Roche, BP 44346, F-31028 Toulouse cedex 4 (France); Rosolowsky, Erik [Department of Physics, University of Alberta, Edmonton, AB (Canada)

    2017-02-01

    We explore the use of mm-wave emission line ratios to trace molecular gas density when observations integrate over a wide range of volume densities within a single telescope beam. For observations targeting external galaxies, this case is unavoidable. Using a framework similar to that of Krumholz and Thompson, we model emission for a set of common extragalactic lines from lognormal and power law density distributions. We consider the median density of gas that produces emission and the ability to predict density variations from observed line ratios. We emphasize line ratio variations because these do not require us to know the absolute abundance of our tracers. Patterns of line ratio variations have the potential to illuminate the high-end shape of the density distribution, and to capture changes in the dense gas fraction and median volume density. Our results with and without a high-density power law tail differ appreciably; we highlight better knowledge of the probability density function (PDF) shape as an important area. We also show the implications of sub-beam density distributions for isotopologue studies targeting dense gas tracers. Differential excitation often implies a significant correction to the naive case. We provide tabulated versions of many of our results, which can be used to interpret changes in mm-wave line ratios in terms of adjustments to the underlying density distributions.
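
    A minimal numerical sketch of the sub-beam idea: draw densities from a lognormal PDF, weight each density by a toy emissivity law for a dense-gas tracer, and compare the mass-weighted median density with the emission-weighted one. The emissivity form and all parameters are assumptions for illustration, not the paper's radiative-transfer model.

      # Compare mass-weighted and emission-weighted median densities for a
      # lognormal sub-beam density PDF. Emissivity law and parameters are
      # illustrative assumptions only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = rng.lognormal(mean=np.log(100.0), sigma=1.5, size=1_000_000)  # cm^-3

      def emissivity(n, n_crit=1e4):
          # toy line emissivity: ~n^2 well below n_crit, ~linear in n above it
          return n**2 / (1.0 + n / n_crit)

      w = emissivity(n)
      order = np.argsort(n)
      cum = np.cumsum(w[order]) / np.sum(w)
      median_emission_density = n[order][np.searchsorted(cum, 0.5)]
      print(np.median(n), median_emission_density)  # emission median is far higher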

  2. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F…

  3. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis.
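
    The competing predictions can be made concrete with a few lines of counting. Assuming transitions are tallied within triplets (the triplets being separated in time, so cross-triplet transitions are not counted), the reversal deviant contains only transitions that are frequent in the standard stream, while the other two deviants each introduce a zero-probability transition. The token encoding is schematic.

      # Transitional probabilities in the H-L-H standard stream, and the
      # rarest transition contained in each deviant triplet. Counting only
      # within-triplet transitions is an assumption of this sketch.
      from collections import Counter

      standards = ["HLH"] * 900                  # frequent standard triplets
      pairs = Counter(p for t in standards for p in zip(t, t[1:]))
      total = sum(pairs.values())
      trans_p = {p: c / total for p, c in pairs.items()}  # H->L and L->H: 0.5 each

      def min_transition_p(triplet):
          """Lowest transition probability contained in a triplet."""
          return min(trans_p.get(p, 0.0) for p in zip(triplet, triplet[1:]))

      for name, deviant in [("proximity", "HHH"), ("reversal", "LHL"),
                            ("first-tone", "LLH")]:
          print(name, deviant, min_transition_p(deviant))
      # reversal -> 0.5 (all transitions frequent); the other two contain a
      # zero-probability H->H or L->L transition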

  4. A discussion supporting presumed consent for posthumous sperm procurement and conception.

    Science.gov (United States)

    Tremellen, Kelton; Savulescu, Julian

    2015-01-01

    Conception of a child using cryopreserved sperm from a deceased man is generally considered ethically sound provided explicit consent for its use has been made, thereby protecting the man's autonomy. When death is sudden (trauma, unexpected illness), explicit consent is not possible, thereby preventing posthumous sperm procurement (PSP) and conception according to current European Society of Human Reproduction and Embryology and the American Society for Reproductive Medicine guidelines. Here, we argue that autonomy of a deceased person should not be considered the paramount ethical concern, but rather consideration of the welfare of the living (widow and prospective child) should be the primary focus. Posthumous conception can bring significant advantages to the widow and her resulting child, with most men supporting such practice. We suggest that a deceased man can benefit from posthumous conception (continuation of his 'bloodline', allowing his widow's wishes for a child to be satisfied), and has a moral duty to allow his widow access to his sperm, if she so wishes, unless he clearly indicated that he did not want children when alive. We outline the arguments favouring presumed consent over implied or proxy consent, plus practical considerations for recording men's wishes to opt-out of posthumous conception. Copyright © 2014 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  5. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  6. The role of demographic compensation theory in incidental take assessments for endangered species

    Science.gov (United States)

    McGowan, Conor P.; Ryan, Mark R.; Runge, Michael C.; Millspaugh, Joshua J.; Cochrane, Jean Fitts

    2011-01-01

    Many endangered species laws provide exceptions to legislated prohibitions through incidental take provisions as long as take is the result of unintended consequences of an otherwise legal activity. These allowances presumably invoke the theory of demographic compensation, commonly applied to harvested species, by allowing limited harm as long as the probability of the species' survival or recovery is not reduced appreciably. Demographic compensation requires some density-dependent limits on survival or reproduction in a species' annual cycle that can be alleviated through incidental take. Using a population model for piping plovers in the Great Plains, we found that when the population is in rapid decline or when there is no density dependence, the probability of quasi-extinction increased linearly with increasing take. However, when the population is near stability and subject to density-dependent survival, there was no relationship between quasi-extinction probability and take rates. We note, however, that a brief examination of piping plover demography and annual cycles suggests little room for compensatory capacity. We argue that a population's capacity for demographic compensation of incidental take should be evaluated when considering incidental allowances because compensation is the only mechanism whereby a population can absorb the negative effects of take without incurring a reduction in the probability of survival in the wild. With many endangered species there is probably little known about density dependence and compensatory capacity. Under these circumstances, using multiple system models (with and without compensation) to predict the population's response to incidental take and implementing follow-up monitoring to assess species response may be valuable in increasing knowledge and improving future decision making.
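
    A toy stage-based model illustrates the buffering mechanism described here: when territory acquisition is density dependent, moderate take is partially absorbed by nonbreeders filling vacated territories. All vital rates, the territory ceiling, and the quasi-extinction threshold below are hypothetical, not the piping plover parameterization.

      # Toy two-stage (breeder/floater) model with density-dependent territory
      # acquisition and a proportional incidental-take rate. Parameters are
      # invented to show the qualitative effect only.
      import numpy as np

      def p_quasi_extinction(take, K=200.0, years=100, reps=500, seed=0):
          """Fraction of replicates whose breeder count falls below 10."""
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(reps):
              breeders, floaters = 100.0, 50.0
              for _ in range(years):
                  recruits = 0.8 * breeders                  # young produced
                  surviving_floaters = 0.6 * floaters
                  p_acquire = max(0.0, 1.0 - breeders / K)   # density-dependent uptake
                  promoted = p_acquire * surviving_floaters
                  breeders = 0.8 * breeders + promoted
                  floaters = surviving_floaters - promoted + 0.5 * recruits
                  breeders *= 1.0 - take                     # incidental take
                  floaters *= 1.0 - take
                  breeders = max(breeders + rng.normal(0.0, 2.0), 0.0)  # noise
                  if breeders < 10.0:
                      hits += 1
                      break
          return hits / reps

      for take in (0.0, 0.02, 0.05, 0.10):
          print(take, p_quasi_extinction(take))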

  7. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
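
    The crux of the problem is which conditioning event is meant; a brute-force enumeration under the usual equal-likelihood assumption makes the two answers explicit.

      # Enumeration of the two scenarios usually conflated in this problem.
      # Assumes boys and girls are equally likely and independent.
      from itertools import product

      families = list(product("BG", repeat=2))   # (elder, younger)

      # Scenario 1: we learn "at least one child is a girl".
      cond1 = [f for f in families if "G" in f]
      p1 = sum(f == ("G", "G") for f in cond1) / len(cond1)   # 1/3

      # Scenario 2: we learn "the elder child is a girl".
      cond2 = [f for f in families if f[0] == "G"]
      p2 = sum(f == ("G", "G") for f in cond2) / len(cond2)   # 1/2

      print(p1, p2)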

  8. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, and thereby to find the optimal sample size; a comparison is then made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases in which the random variate follows a normal law as well as a Bernoullian law.
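
    For context, a minimal sequential probability ratio test for a Bernoulli parameter is sketched below, using Wald's approximate thresholds. Splitting the pre-assigned sum of error probabilities evenly between alpha and beta is an illustrative choice, not the article's optimal allocation.

      # Minimal SPRT sketch with Wald's approximate boundaries
      # A = (1 - beta) / alpha and B = beta / (1 - alpha).
      import math, random

      def sprt_bernoulli(p0, p1, alpha, beta, sample, max_n=10_000):
          upper = math.log((1 - beta) / alpha)   # accept H1 when crossed
          lower = math.log(beta / (1 - alpha))   # accept H0 when crossed
          llr, n = 0.0, 0
          for x in sample():
              n += 1
              llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
              if n >= max_n:
                  return "undecided", n

      random.seed(3)
      true_p = 0.6
      decision, n = sprt_bernoulli(
          p0=0.5, p1=0.6, alpha=0.025, beta=0.025,  # sum of error probs = 0.05
          sample=lambda: iter(lambda: random.random() < true_p, None))
      print(decision, n)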

  9. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, i.e. dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  10. Indexing Density Models for Incremental Learning and Anytime Classification on Data Streams

    DEFF Research Database (Denmark)

    Seidl, Thomas; Assent, Ira; Kranen, Philipp

    2009-01-01

    Classification of streaming data faces three basic challenges: it has to deal with huge amounts of data, the varying time between two stream data items must be used as well as possible (anytime classification), and additional training data must be incrementally learned (anytime learning) so the classifier can be applied at any time. The proposed Bayes tree maintains (adapted efficiently to the individual object to be classified) a hierarchy of mixture densities that represent kernel density estimators at successively coarser levels. Our probability density queries together with novel classification improvement strategies provide the necessary information for very effective classification at any point of interruption. Moreover, we propose a novel evaluation method for anytime classification using Poisson streams and demonstrate the anytime learning performance of the Bayes tree.

  11. The Precise Time Course of Lexical Activation: MEG Measurements of the Effects of Frequency, Probability, and Density in Lexical Decision

    Science.gov (United States)

    Stockall, Linnaea; Stringfellow, Andrew; Marantz, Alec

    2004-01-01

    Visually presented letter strings consistently yield three MEG response components: the M170, associated with letter-string processing (Tarkiainen, Helenius, Hansen, Cornelissen, & Salmelin, 1999); the M250, affected by phonotactic probability, (Pylkkanen, Stringfellow, & Marantz, 2002); and the M350, responsive to lexical frequency (Embick,…

  12. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  13. NASA Lewis Launch Collision Probability Model Developed and Analyzed

    Science.gov (United States)

    Bollenbacher, Gary; Guptill, James D

    1999-01-01

    There are nearly 10,000 tracked objects orbiting the Earth. These objects encompass manned objects, active and decommissioned satellites, spent rocket bodies, and debris. They range from a few centimeters across to the size of the MIR space station. Anytime a new satellite is launched, the launch vehicle with its payload attached passes through an area of space in which these objects orbit. Although the population density of these objects is low, there always is a small but finite probability of collision between the launch vehicle and one or more of these space objects. Even though the probability of collision is very low, for some payloads even this small risk is unacceptable. To mitigate the small risk of collision associated with launching at an arbitrary time within the daily launch window, NASA performs a prelaunch mission assurance Collision Avoidance Analysis (or COLA). For the COLA of the Cassini spacecraft, the NASA Lewis Research Center conducted an in-house development and analysis of a model for launch collision probability. The model allows a minimum clearance criterion to be used with the COLA analysis to ensure an acceptably low probability of collision. If, for any given liftoff time, the nominal launch vehicle trajectory would pass a space object with less than the minimum required clearance, launch would not be attempted at that time. The model assumes that the nominal positions of the orbiting objects and of the launch vehicle can be predicted as a function of time, and therefore, that any tracked object that comes within close proximity of the launch vehicle can be identified. For any such pair, these nominal positions can be used to calculate a nominal miss distance. The actual miss distances may differ substantially from the nominal miss distance, due, in part, to the statistical uncertainty of the knowledge of the objects' positions. The model further assumes that these position uncertainties can be described with position covariance matrices.
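
    The final assumption leads to a standard short-encounter calculation: integrate a two-dimensional Gaussian, centered on the nominal miss vector in the encounter plane, over the combined hard-body disc. The sketch below uses invented covariance and radius values and is not the Lewis model itself.

      # Collision probability for one conjunction: integrate the encounter-
      # plane Gaussian over a circular hard-body area. Covariance and radii
      # are hypothetical illustration values.
      import numpy as np

      def collision_probability(miss_xy, cov_xy, hard_body_radius, n=200):
          """P(collision) = integral of N(miss_xy, cov_xy) over a disc at origin."""
          inv = np.linalg.inv(cov_xy)
          det = np.linalg.det(cov_xy)
          norm = 1.0 / (2.0 * np.pi * np.sqrt(det))
          # simple polar-grid quadrature over the hard-body disc
          r = np.linspace(0.0, hard_body_radius, n)
          th = np.linspace(0.0, 2.0 * np.pi, n)
          R, TH = np.meshgrid(r, th)
          X = R * np.cos(TH) - miss_xy[0]
          Y = R * np.sin(TH) - miss_xy[1]
          pdf = norm * np.exp(-0.5 * (inv[0, 0]*X**2 + 2*inv[0, 1]*X*Y
                                      + inv[1, 1]*Y**2))
          return np.trapz(np.trapz(pdf * R, r, axis=1), th)

      cov = np.array([[200.0**2, 0.0], [0.0, 100.0**2]])  # metres^2
      print(collision_probability(np.array([300.0, 0.0]), cov,
                                  hard_body_radius=20.0))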

  14. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  15. Probability density function of the number of embryos collected from superovulated Nelore breed donors

    Directory of Open Access Journals (Sweden)

    Renato Travassos Beltrame

    2009-08-01

    Full Text Available Several models have been developed to evaluate the reproductive status of cows through the concentration of progesterone in milk, the effect of sex selection on the commercial production of herds, and the bioeconomic performance of the multiple ovulation and embryo transfer system in select herds. However, models describing the production of embryos in superovulated females have yet to be developed. A probability density function of the number of embryos collected from donors of the Nelore breed was determined. Records of 61,928 embryo collections from 26,767 donors from 1991 to 2005 were analyzed. Data were provided by the Brazilian Association of Zebu Breeders (ABCZ) and Controlmax Consultoria e Sistemas Ltda. The probability density function of the number of viable embryos was modeled using exponential and gamma distributions. Parameter fitting was carried out by maximum likelihood using a non-linear gradient method. Both distributions presented a similar level of precision: root mean square error (RMSE) = 0.0072 and 0.0071 for the exponential and gamma distributions, respectively; both distributions are thus deemed suitable for representing the probability density function of embryo production by Nelore females.
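
    The fitting step is easy to reproduce in outline: maximum-likelihood fits of exponential and gamma densities, compared by RMSE against the empirical histogram. The synthetic counts below stand in for the actual collection records, which are not available here.

      # Fit exponential and gamma PDFs by maximum likelihood and compare by
      # RMSE against the empirical histogram. Data are synthetic stand-ins.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      embryos = rng.gamma(shape=1.2, scale=5.0, size=5000)  # hypothetical counts

      expon_params = stats.expon.fit(embryos, floc=0)   # keep support on [0, inf)
      gamma_params = stats.gamma.fit(embryos, floc=0)

      edges = np.arange(0, 40)
      hist, _ = np.histogram(embryos, bins=edges, density=True)
      mid = 0.5 * (edges[:-1] + edges[1:])

      rmse = lambda pdf: np.sqrt(np.mean((pdf(mid) - hist) ** 2))
      print("exponential RMSE:", rmse(lambda x: stats.expon.pdf(x, *expon_params)))
      print("gamma RMSE:      ", rmse(lambda x: stats.gamma.pdf(x, *gamma_params)))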

  16. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
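
    For reference, the standard identities that the paper recasts in truth-table form are the definition of conditional probability and Bayes' rule over a partition of hypotheses (written here in ordinary notation, not the paper's logic-based derivation):

      P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad
      P(H_i \mid E) = \frac{P(E \mid H_i)\,P(H_i)}{\sum_j P(E \mid H_j)\,P(H_j)}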

  17. Probability distribution of wave packet delay time for strong overlapping of resonance levels

    International Nuclear Information System (INIS)

    Lyuboshits, V.L.

    1983-01-01

    The time behaviour of nuclear reactions in the case of high level densities is investigated based on the theory of overlapping resonances. In the framework of a model of n equivalent channels, an analytical expression is obtained for the probability distribution function of the wave-packet delay time in compound nucleus production. It is shown that at strong overlapping of the resonance levels, the relative fluctuation of the delay time is small at the stage of compound nucleus production. A possible increase in the duration of nuclear reactions as the excitation energy rises is discussed.

  18. RADIOGRAPHIC APPEARANCE OF PRESUMED NONCARDIOGENIC PULMONARY EDEMA AND CORRELATION WITH THE UNDERLYING CAUSE IN DOGS AND CATS.

    Science.gov (United States)

    Bouyssou, Sarah; Specchi, Swan; Desquilbet, Loïc; Pey, Pascaline

    2017-05-01

    Noncardiogenic pulmonary edema is an important cause of respiratory disease in dogs and cats but few reports describe its radiographic appearance. The purpose of this retrospective case series study was to describe radiographic findings in a large cohort of dogs and cats with presumed noncardiogenic pulmonary edema and to test associations among radiographic findings versus cause of edema. Medical records were retrieved for dogs and cats with presumed noncardiogenic edema based on history, radiographic findings, and outcome. Radiographs were reviewed to assess lung pattern and distribution of the edema. Correlation with the cause of noncardiogenic pulmonary edema was evaluated with a Fisher's exact test. A total of 49 dogs and 11 cats were included. Causes for the noncardiogenic edema were airway obstruction (n = 23), direct pulmonary injury (n = 13), severe neurologic stimulation (n = 12), systemic disease (n = 6), near-drowning (n = 3), anaphylaxis (n = 2) and blood transfusion (n = 1). Mixed, symmetric, peripheral, multifocal, bilateral, and dorsal lung patterns were observed in 44 (73.3%), 46 (76.7%), 55 (91.7%), 46 (76.7%), 46 (76.7%), and 34 (57.6%) of 60 animals, respectively. When the distribution was unilateral, pulmonary infiltration involved mainly the right lung lobes (12 of 14, 85.7%). Increased pulmonary opacity was more often asymmetric, unilateral, and dorsal for postobstructive pulmonary edema compared to other types of noncardiogenic pulmonary edema, but no other significant correlations could be identified. In conclusion, noncardiogenic pulmonary edema may present with a quite variable radiographic appearance in dogs and cats. © 2016 American College of Veterinary Radiology.

  19. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
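
    A sketch of the construction, with one simplification: the intervals below are pointwise Beta order-statistic intervals under normality, whereas the paper's contribution is calibrating the set so that all points are covered simultaneously with probability 1-α (which would require widening these bands, e.g. by simulation).

      # Normal probability plot positions plus pointwise order-statistic
      # intervals. Simultaneous calibration, the paper's point, is omitted.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      x = np.sort(rng.normal(10.0, 2.0, size=50))
      n = len(x)
      i = np.arange(1, n + 1)

      # plotting positions and sample-based reference line
      q = stats.norm.ppf((i - 0.375) / (n + 0.25))   # Blom positions
      mu, sigma = x.mean(), x.std(ddof=1)

      # pointwise 95% intervals: U_(i) ~ Beta(i, n - i + 1) for uniforms
      alpha = 0.05
      band_lo = mu + sigma * stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n - i + 1))
      band_hi = mu + sigma * stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n - i + 1))

      outside = (x < band_lo) | (x > band_hi)
      print("points outside their pointwise bands:", outside.sum())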

  20. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  1. Preliminary Evaluation of the Effects of Buried Volcanoes on Estimates of Volcano Probability for the Proposed Repository Site at Yucca Mountain, Nevada

    Science.gov (United States)

    Hill, B. E.; La Femina, P. C.; Stamatakos, J.; Connor, C. B.

    2002-12-01

    Probability models that calculate the likelihood of new volcano formation in the Yucca Mountain (YM) area depend on the timing and location of past volcanic activity. Previous spatio-temporal patterns indicated a 10^-4 to 10^-3 probability of volcanic disruption of the proposed radioactive waste repository site at YM during the 10,000 year post-closure performance period (Connor et al. 2000, JGR 105:1). A recent aeromagnetic survey (Blakely et al. 2000, USGS OFR 00-188), however, identified up to 20 anomalies in alluvium-filled basins, which have characteristics indicative of buried basalt (O'Leary et al. 2002, USGS OFR 02-020). Independent evaluation of these data, combined with new ground magnetic surveys, shows that these anomalies may represent at least ten additional buried basaltic volcanoes, which have not been included in previous probability calculations. This interpretation, if true, nearly doubles the number of basaltic volcanoes within 30 km [19 mi] of YM. Moreover, the magnetic signature of about half of the recognized basaltic volcanoes in the YM area cannot be readily identified in areas where bedrock also produces large amplitude magnetic anomalies, suggesting that additional volcanoes may be present but undetected in the YM area. In the absence of direct age information, we evaluate the potential effects of alternative age assumptions on spatio-temporal probability models. Interpreted burial depths of >50 m [164 ft] suggest ages >2 Ma, based on sedimentation rates typical for these alluvial basins (Stamatakos et al., 1997, J. Geol. 105). Defining volcanic events as individual points, previous probability models generally used recurrence rates of 2-5 volcanoes/million years (v/Myr). If the identified anomalies are buried volcanoes that are all >5 Ma or uniformly distributed between 2-10 Ma, calculated probabilities of future volcanic disruption at YM change by <30%. However, a uniform age distribution between 2-5 Ma for the presumed buried volcanoes…

  2. The unsuspected prosthetic joint infection : incidence and consequences of positive intra-operative cultures in presumed aseptic knee and hip revisions.

    Science.gov (United States)

    Jacobs, A M E; Bénard, M; Meis, J F; van Hellemondt, G; Goosen, J H M

    2017-11-01

    Positive cultures are not uncommon in cases of revision total knee and hip arthroplasty (TKA and THA) for presumed aseptic causes. The purpose of this study was to assess the incidence of positive intra-operative cultures in presumed aseptic revision of TKA and THA, and to determine whether the presence of intra-operative positive cultures results in inferior survival in such cases. A retrospective cohort study was assembled with 679 patients undergoing revision knee (340 cases) or hip arthroplasty (339 cases) for presumed aseptic causes. For all patients three or more separate intra-operative cultures were obtained. Patients were diagnosed with a previously unsuspected prosthetic joint infection (PJI) if two or more cultures were positive with the same organism. Records were reviewed for demographic details, pre-operative laboratory results and culture results. The primary outcome measure was infection-free implant survival at two years. The incidence of unsuspected PJI was 27 out of 340 (7.9%) in TKA and 41 out of 339 (12.1%) in THA. Following revision TKA, the rate of infection-free implant survival in patients with an unsuspected PJI was 88% (95% confidence intervals (CI) 60 to 97) at two years compared with 98% (95% CI 94 to 99) in patients without PJI (p = 0.001). After THA, the rate of survival was similar in those with unsuspected PJI (92% (95% CI 73 to 98) at two years) and those without (94% (95% CI 89 to 97), p = 0.31). Following revision of TKA and THA for aseptic diagnoses, around 10% of cases were found to have positive cultures. In the knee, such cases had inferior infection-free survival at two years compared with those with negative cultures; there was no difference between the groups following THA. Cite this article: Bone Joint J 2017;99-B:1482-9. ©2017 The British Editorial Society of Bone & Joint Surgery.

  3. Synergy between pair coupled cluster doubles and pair density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Garza, Alejandro J.; Bulik, Ireneusz W. [Department of Chemistry, Rice University, Houston, Texas 77251-1892 (United States); Henderson, Thomas M. [Department of Chemistry and Department of Physics and Astronomy, Rice University, Houston, Texas 77251-1892 (United States); Scuseria, Gustavo E. [Department of Chemistry and Department of Physics and Astronomy, Rice University, Houston, Texas 77251-1892 (United States); Chemistry Department, Faculty of Science, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2015-01-28

    Pair coupled cluster doubles (pCCD) has been recently studied as a method capable of accounting for static correlation with low polynomial cost. We present three combinations of pCCD with Kohn–Sham functionals of the density and on-top pair density (the probability of finding two electrons on top of each other) to add dynamic correlation to pCCD without double counting. With a negligible increase in computational cost, these pCCD+DFT blends greatly improve upon pCCD in the description of typical problems where static and dynamic correlations are both important. We argue that—as a black-box method with low scaling, size-extensivity, size-consistency, and a simple quasidiagonal two-particle density matrix—pCCD is an excellent match for pair density functionals in this type of fusion of multireference wavefunctions with DFT.

  4. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language…

  5. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application…

  6. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  7. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes…

  8. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  9. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  10. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  11. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
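
    To illustrate the reported shape, here is the common one-parameter inverted-S weighting function of Tversky and Kahneman (1992); the record does not state that this exact parametric form was fitted, so it is shown only as a representative example.

      # A standard inverted-S probability weighting function; gamma < 1
      # overweights small probabilities and underweights large ones.
      import numpy as np

      def w(p, gamma=0.61):
          return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

      for p in (0.01, 0.1, 0.5, 0.9, 0.99):
          print(p, round(w(p), 3))
      # w(0.01) > 0.01 and w(0.99) < 0.99: the classic inverted-S distortion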

  12. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Ren, S [Stanford University, Stanford, CA (United States); Tianjin University, Tianjin (China); Hara, W; Le, Q; Wang, L; Xing, L; Li, R [Stanford University, Stanford, CA (United States)

    2016-06-15

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem still remains, which is the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.

  13. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    International Nuclear Information System (INIS)

    Ren, S; Hara, W; Le, Q; Wang, L; Xing, L; Li, R

    2016-01-01

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem still remains, which is the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.
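
    The fusion rule can be caricatured in one dimension: treat the intensity-based and location-based estimates as Gaussian densities for a voxel's value and combine them by precision weighting; the posterior mean is the mapped density. All numbers below are invented, and the paper's actual conditionals need not be Gaussian.

      # Toy precision-weighted fusion of two Gaussian estimates of a voxel's
      # HU value. Values are hypothetical illustration numbers.
      def fuse(mu_intensity, var_intensity, mu_atlas, var_atlas):
          """Precision-weighted combination of two Gaussian densities."""
          w1, w2 = 1.0 / var_intensity, 1.0 / var_atlas
          mu = (w1 * mu_intensity + w2 * mu_atlas) / (w1 + w2)
          return mu, 1.0 / (w1 + w2)

      # e.g. T1/T2 intensities suggest soft tissue, atlas location suggests bone
      mu, var = fuse(mu_intensity=40.0, var_intensity=80.0**2,
                     mu_atlas=700.0, var_atlas=300.0**2)
      print(mu, var**0.5)   # posterior pulled toward the more certain source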

  14. Cylinders out of a top hat: counts-in-cells for projected densities

    Science.gov (United States)

    Uhlemann, Cora; Pichon, Christophe; Codis, Sandrine; L'Huillier, Benjamin; Kim, Juhan; Bernardeau, Francis; Park, Changbom; Prunet, Simon

    2018-06-01

    Large deviation statistics is implemented to predict the statistics of cosmic densities in cylinders applicable to photometric surveys. It yields few per cent accurate analytical predictions for the one-point probability distribution function (PDF) of densities in concentric or compensated cylinders; and also captures the density dependence of their angular clustering (cylinder bias). All predictions are found to be in excellent agreement with the cosmological simulation Horizon Run 4 in the quasi-linear regime where standard perturbation theory normally breaks down. These results are combined with a simple local bias model that relates dark matter and tracer densities in cylinders and validated on simulated halo catalogues. This formalism can be used to probe cosmology with existing and upcoming photometric surveys like DES, Euclid or WFIRST containing billions of galaxies.

  15. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
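
    The Longuet-Higgins (1963) distribution referenced here is a Gram-Charlier expansion about the Gaussian; a minimal sketch with illustrative skewness and excess-kurtosis coefficients follows.

      # Gram-Charlier-corrected PDF for standardized surface elevation;
      # coefficient values are illustrative assumptions.
      import numpy as np

      def gram_charlier_pdf(x, skew=0.2, ex_kurt=0.1):
          phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
          He3 = x**3 - 3*x                   # probabilists' Hermite polynomials
          He4 = x**4 - 6*x**2 + 3
          return phi * (1 + skew / 6 * He3 + ex_kurt / 24 * He4)

      x = np.linspace(-4, 4, 9)
      print(gram_charlier_pdf(x))  # mild positive skew: more probability at crests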

  16. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory…

  17. Modelling interactions of toxicants and density dependence in wildlife populations

    Science.gov (United States)

    Schipper, Aafke M.; Hendriks, Harrie W.M.; Kauffman, Matthew J.; Hendriks, A. Jan; Huijbregts, Mark A.J.

    2013-01-01

    1. A major challenge in the conservation of threatened and endangered species is to predict population decline and design appropriate recovery measures. However, anthropogenic impacts on wildlife populations are notoriously difficult to predict due to potentially nonlinear responses and interactions with natural ecological processes like density dependence. 2. Here, we incorporated both density dependence and anthropogenic stressors in a stage-based matrix population model and parameterized it for a density-dependent population of peregrine falcons Falco peregrinus exposed to two anthropogenic toxicants [dichlorodiphenyldichloroethylene (DDE) and polybrominated diphenyl ethers (PBDEs)]. Log-logistic exposure–response relationships were used to translate toxicant concentrations in peregrine falcon eggs to effects on fecundity. Density dependence was modelled as the probability of a nonbreeding bird acquiring a breeding territory as a function of the current number of breeders. 3. The equilibrium size of the population, as represented by the number of breeders, responded nonlinearly to increasing toxicant concentrations, showing a gradual decrease followed by a relatively steep decline. Initially, toxicant-induced reductions in population size were mitigated by an alleviation of the density limitation, that is, an increasing probability of territory acquisition. Once population density was no longer limiting, the toxicant impacts were no longer buffered by an increasing proportion of nonbreeders shifting to the breeding stage, resulting in a strong decrease in the equilibrium number of breeders. 4. Median critical exposure concentrations, that is, median toxicant concentrations in eggs corresponding with an equilibrium population size of zero, were 33 and 46 μg g⁻¹ fresh weight for DDE and PBDEs, respectively. 5. Synthesis and applications. Our modelling results showed that particular life stages of a density-limited population may be relatively insensitive to…

  18. Density of trapped gas in heavily-irradiated lithium hydride

    International Nuclear Information System (INIS)

    Bowman, R.C. Jr.; Attalla, A.; Souers, P.C.; Folkers, C.L.; McCreary, T.; Snider, G.D.; Vanderhoofven, F.; Tsugawa, R.T.

    1988-01-01

    We review old gamma-irradiated lithium hydride data and also display much new bulk and gas-displacement density and nuclear magnetic resonance data on Li(D, T) and LiT at 296 to 373 K. We find that: (1) Li(D, T) swells because of the formation of internal D-T and ³He gas bubbles, but probably not because of the precipitation of lithium metal; (2) the gas bubbles are at densities of at least 3 to 4×10⁴ mol/m³, i.e. thousands of atmospheres; (3) outgassing may be largely the result of bubbles rupturing, although diffusion of ³He as atoms may occur at long times. (orig.)

  19. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  20. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the failure probability of an application has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the effectiveness of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application, modeled as a DAG (directed acyclic graph), is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
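
    Under the simplest independence reading of the task-based method, an application fails if any task fails, and a backup strategy lowers each task's failure probability. The DAG topology, probabilities, and backup rule below are invented to show the shape of the computation only.

      # Toy task-based application failure probability: with k standby
      # backups per task, a task fails only if the primary and all backups
      # fail; the application fails if any task fails. All values invented.
      from math import prod

      tasks = {"t1": 0.02, "t2": 0.05, "t3": 0.01, "t4": 0.03}  # per-task failure prob

      def app_failure_probability(tasks, backups=0):
          task_fail = {t: p ** (backups + 1) for t, p in tasks.items()}
          return 1.0 - prod(1.0 - q for q in task_fail.values())

      for k in (0, 1, 2):
          print(k, round(app_failure_probability(tasks, backups=k), 6))
      # adding backups trades extra resource usage for a lower failure probability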