Probability densities and Lévy densities
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Probability densities in strong turbulence
Yakhot, Victor
2006-03-01
In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r) and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions deviate strongly from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from log-normality. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.
Modulation Based on Probability Density Functions
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
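As a rough illustration of the sampling-and-histogram step described above (a generic numpy sketch, not the proposed modulation scheme itself; the bin count and waveform are arbitrary choices):

```python
import numpy as np

def waveform_pdf(samples, bins=16):
    """Histogram-based PDF estimate of a sampled waveform's amplitude values."""
    hist, edges = np.histogram(samples, bins=bins)
    pmf = hist / hist.sum()  # normalize counts so they sum to 1
    return pmf, edges

# Sample one half cycle of a sinusoidal carrier and sort values into a histogram.
t = np.linspace(0.0, np.pi, 1000, endpoint=False)
pmf, edges = waveform_pdf(np.sin(t))
# The sine dwells longest near its peak, so amplitudes near 1 occur most often.
```

The resulting PMF is the arcsine-like shape characteristic of sinusoids, which is the feature the histogram-based scheme exploits.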
Therapeutic High-Density Barium Enema in a Case of Presumed Diverticular Hemorrhage
Directory of Open Access Journals (Sweden)
Nonthalee Pausawasdi
2011-02-01
Full Text Available Many patients with lower gastrointestinal bleeding do not have an identifiable source of bleeding at colonoscopy. A significant percentage of these patients will have recurrent bleeding. In many patients, the presence of multiple diverticula leads to a diagnosis of presumed diverticular bleeding. Current treatment options include therapeutic endoscopy, angiography, or surgical resection, all of which depend on the identification of the diverticular source of bleeding. This report describes a case of recurrent bleeding in an elderly patient with diverticula but no identifiable source treated successfully with barium impaction therapy. This therapeutic modality does not depend on the identification of the bleeding diverticular lesion and was well tolerated by our 86-year-old patient.
On the discretization of probability density functions and the ...
Indian Academy of Sciences (India)
function f (x) with respect to a probability density function (PDF) ρ(x) := |ψ (x) |2, where ψ(x) is the wave function in ... fields of science, the calculation of Rényi and Tsallis entropies [1–3] for probability density function ρ(x) ... on the second mean-value theorem (SMVT) for integrals by postulating that: (i) The PDF ρ(x) can be ...
On Farmer's line, probability density functions, and overall risk
International Nuclear Information System (INIS)
Munera, H.A.; Yadigaroglu, G.
1986-01-01
Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value.
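The distinction drawn in the final two sentences can be written compactly. With f(C) the pdf over consequence magnitude C (Farmer's line interpreted as a pdf) and F̄ the CCDF, a sketch of the two quantities (notation assumed here, not taken from the paper):

```latex
\bar{R} = \mathbb{E}[C] = \int_0^{\infty} C \, f(C)\, \mathrm{d}C ,
\qquad
\bar{F}(c) = P(C > c) = \int_c^{\infty} f(C)\, \mathrm{d}C .
```

The area under f is a probability (it integrates to 1), whereas integrating the CCDF over c recovers the expected value, consistent with the paper's claim about the two kinds of areas.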
Probability density estimation in stochastic environmental models using reverse representations
Van den Berg, E.; Heemink, A.W.; Lin, H.X.; Schoenmakers, J.G.M.
2003-01-01
The estimation of probability densities of variables described by systems of stochastic differential equations has long been done using forward time estimators, which rely on the generation of realizations of the model, forward in time. Recently, an estimator based on the combination of forward and
Visualization techniques for spatial probability density function data
Directory of Open Access Journals (Sweden)
Udeepta D Bordoloi
2006-01-01
Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets, where each pixel is a random variable, and has multiple samples which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets; and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.
Vehicle Detection Based on Probability Hypothesis Density Filter
Directory of Open Access Journals (Sweden)
Feihu Zhang
2016-04-01
Full Text Available In the past decade, vehicle detection has improved significantly. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach using the Probability Hypothesis Density (PHD) filter is proposed. The proposed approach consists of two phases: a hypothesis generation phase to detect potential objects and a hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art.
Continuation of probability density functions using a generalized Lyapunov approach
Energy Technology Data Exchange (ETDEWEB)
Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)
2017-05-01
Techniques from numerical bifurcation theory are very useful for studying transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small-noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
Best Probability Density Function for Random Sampled Data.
Jacobs, Donald J
2009-12-04
The maximum entropy method is a theoretically sound approach to construct an analytical form for the probability density function (pdf) given a sample of random events. In practice, numerical methods employed to determine the appropriate Lagrange multipliers associated with a set of moments are generally unstable in the presence of noise due to limited sampling. A robust method is presented that always returns the best pdf, where tradeoff in smoothing a highly varying function due to noise can be controlled. An unconventional adaptive simulated annealing technique, called funnel diffusion, determines expansion coefficients for Chebyshev polynomials in the exponential function.
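The final sentence describes a pdf of the form p(x) ∝ exp(Σ_k c_k T_k(x)), with T_k the Chebyshev polynomials. A minimal numpy sketch of evaluating such a pdf on [-1, 1]; the coefficients here are illustrative placeholders, not output of the paper's funnel-diffusion annealer:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def maxent_pdf(coeffs, x):
    """Evaluate p(x) ∝ exp(sum_k c_k T_k(x)) on [-1, 1], normalized numerically."""
    grid = np.linspace(-1.0, 1.0, 2001)
    dx = grid[1] - grid[0]
    Z = np.sum(np.exp(C.chebval(grid, coeffs))) * dx  # normalization constant
    return np.exp(C.chebval(x, coeffs)) / Z

x = np.linspace(-1.0, 1.0, 401)
p = maxent_pdf([0.0, 0.0, -2.0], x)  # c2 < 0 gives a single central peak
```

With only the T_2 coefficient negative, the exponent is -2(2x² - 1), i.e. a Gaussian-like bump centred at 0; the annealer's job in the paper is to find coefficients matching sample moments.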
A Balanced Approach to Adaptive Probability Density Estimation.
Kovacs, Julio A; Helmick, Cailee; Wriggers, Willy
2017-01-01
Our development of a Fast (Mutual) Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it on molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.
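The published details of BADE are not reproduced here, but the general idea of adaptive density estimation with a nearest-neighbor-controlled bandwidth can be sketched generically in numpy (a textbook k-NN "balloon" style kernel estimator, assumed for illustration only): sparse regions get wide kernels, dense regions narrow ones.

```python
import numpy as np

def adaptive_kde(data, x, k=15):
    """1-D Gaussian KDE whose bandwidth at each sample is its k-NN distance."""
    data = np.asarray(data)
    # Per-sample bandwidth: distance to the k-th nearest neighbour.
    dists = np.abs(data[:, None] - data[None, :])
    h = np.sort(dists, axis=1)[:, k]          # column 0 is the self-distance (0)
    u = (x[None, :] - data[:, None]) / h[:, None]
    kernels = np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h[:, None])
    return kernels.mean(axis=0)               # average of per-sample kernels

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 400)
x = np.linspace(-5.0, 5.0, 1001)
p = adaptive_kde(data, x)
```

The O(n²) distance matrix is fine at this size; the paper's efficient nearest-neighbor search is what makes the same idea scale to large data.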
INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS
Potter, Kristin
2012-01-01
The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.
Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging
Energy Technology Data Exchange (ETDEWEB)
Clark, G A
2004-09-21
The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector by applying Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or probability of false alarm P_FA, of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P_D. This is summarized by the Receiver Operating Characteristic (ROC) curve [10, 11], which is actually a family of curves depicting P_D vs. P_FA, parameterized by varying levels of signal-to-noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P_FA and develop a ROC curve (P_D vs. decision threshold r_0) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB
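The core CFAR idea can be sketched empirically (a generic illustration, not the report's algorithm): estimate the clutter-only distribution of matched-filter outputs and pick the decision threshold r_0 as its (1 - P_FA) quantile, so that scores above r_0 are declared detections at the desired false-alarm rate.

```python
import numpy as np

def cfar_threshold(background_scores, p_fa):
    """Decision threshold r0 achieving the desired false-alarm probability.

    Takes clutter-only matched-filter outputs and returns their (1 - p_fa)
    empirical quantile: scores above r0 are declared detections.
    """
    return np.quantile(np.asarray(background_scores), 1.0 - p_fa)

rng = np.random.default_rng(1)
bg = rng.normal(0.0, 1.0, 100_000)   # stand-in for clutter-only filter outputs
r0 = cfar_threshold(bg, p_fa=0.01)   # near 2.33 for standard normal clutter
```

Sweeping p_fa and recording the detection rate on plume-present data would trace out the ROC curve described in the abstract.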
Probability density functions for CP-violating rephasing invariants
Fortin, Jean-François; Giasson, Nicolas; Marleau, Luc
2018-05-01
The implications of the anarchy principle on CP violation in the lepton sector are investigated. A systematic method is introduced to compute the probability density functions for the CP-violating rephasing invariants of the PMNS matrix from the Haar measure relevant to the anarchy principle. Contrary to the CKM matrix, which is hierarchical, it is shown that the Haar measure, and hence the anarchy principle, are very likely to lead to the observed PMNS matrix. Predictions on the CP-violating Dirac rephasing invariant |j_D| and Majorana rephasing invariant |j_1| are also obtained. They correspond to ⟨|j_D|⟩_Haar = π/105 ≈ 0.030 and ⟨|j_1|⟩_Haar = 1/(6π) ≈ 0.053 respectively, in agreement with the experimental hint from T2K of |j_D^exp| ≈ 0.032 ± 0.005 (or ≈ 0.033 ± 0.003) for the normal (or inverted) hierarchy.
Structural Reliability Using Probability Density Estimation Methods Within NESSUS
Chamis, Christos C. (Technical Monitor); Godines, Cody Ric
2003-01-01
A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are two of 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and been used to study four different test cases that have been
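Of the two sampling methods named, Latin hypercube sampling is easy to sketch. This is the generic textbook construction (stratify each dimension into n equal intervals, draw one point per stratum, shuffle strata independently per dimension), not the NESSUS module itself:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """LHS on the unit hypercube: each dimension is split into n_samples equal
    strata and exactly one sample lands in each stratum per dimension."""
    u = rng.random((n_samples, n_dims))  # position within each stratum
    strata = np.array([rng.permutation(n_samples) for _ in range(n_dims)]).T
    return (strata + u) / n_samples

rng = np.random.default_rng(3)
pts = latin_hypercube(100, 2, rng)
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each marginal, which is why LHS typically needs fewer model evaluations for the same accuracy in response-density parameters.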
The probability density function (PDF) of Lagrangian Turbulence
Birnir, B.
2012-12-01
The statistical theory of Lagrangian turbulence is derived from the stochastic Navier-Stokes equation. Assuming that the noise in fully-developed turbulence is a generic noise determined by the general theorems in probability, the central limit theorem and the large deviation principle, we are able to formulate and solve the Kolmogorov-Hopf equation for the invariant measure of the stochastic Navier-Stokes equations. The intermittency corrections to the scaling exponents of the structure functions require a multiplicative noise (multiplying the fluid velocity) in the stochastic Navier-Stokes equation. We let this multiplicative noise in the equation consist of a simple (Poisson) jump process and then show how the Feynman-Kac formula produces the log-Poissonian processes found by She and Leveque, Waymire and Dubrulle. These log-Poissonian processes give the intermittency corrections that agree with modern direct Navier-Stokes simulations (DNS) and experiments. The probability density function (PDF) plays a key role when direct Navier-Stokes simulations or experimental results are compared to theory. The statistical theory of turbulence, including the scaling of the structure functions of turbulence, is determined by the invariant measure of the Navier-Stokes equation, and the PDFs for the various statistics (one-point, two-point, N-point) can be obtained by taking the trace of the corresponding invariant measures. Hopf derived in 1952 a functional equation for the characteristic function (Fourier transform) of the invariant measure. In distinction to the nonlinear Navier-Stokes equation, this is a linear functional differential equation. The PDFs obtained from the invariant measures for the velocity differences (two-point statistics) are shown to be the four-parameter generalized hyperbolic distributions found by Barndorff-Nielsen. These PDFs have heavy tails and a convex peak at the origin. A suitable projection of the Kolmogorov-Hopf equations is the
Guo, L. M.; Zhu, H. B.; Zhang, N. X.
The probability density distribution of the traffic density is analyzed based on empirical data. It is found that the beta distribution fits the measured traffic density very well. A modified traffic model is then proposed to simulate microscopic traffic flow, in which the probability density distribution of the traffic density is taken into account. The model also captures drivers' speed adaptation by taking into account differences in driving behavior and the dynamic headway. Alongside the flux-density diagrams, the velocity evolution diagrams and the spatial-temporal profiles of vehicles are also given. The synchronized flow phase and the wide moving jam phase are reproduced, which is a challenge for cellular automata traffic models. Furthermore, the phenomenon of high-speed car-following is exhibited, which has previously been observed in measured data. The results demonstrate the effectiveness of the proposed model in capturing the complicated dynamic phenomena of traffic flow.
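Fitting a beta distribution to normalized traffic densities (occupancies in (0, 1)) can be sketched with the standard method-of-moments estimator; the synthetic data below are an assumption for illustration, not the paper's measurements:

```python
import numpy as np

def beta_moments_fit(x):
    """Method-of-moments estimates of Beta(a, b) parameters for data in (0, 1)."""
    m, v = x.mean(), x.var()
    common = m * (1.0 - m) / v - 1.0  # from mean/variance formulas of the beta law
    return m * common, (1.0 - m) * common

rng = np.random.default_rng(4)
occupancy = rng.beta(2.0, 5.0, 50_000)  # synthetic normalized traffic densities
a_hat, b_hat = beta_moments_fit(occupancy)
```

On real detector data one would compare the fitted beta pdf against the empirical histogram, which is the agreement the abstract reports.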
Interactive design of probability density functions for shape grammars
Dang, Minh
2015-11-02
A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf. First, we propose a user interface that enables a user to quickly provide preference scores for selected shapes and suggest sampling strategies to decide which models to present to the user to evaluate. Second, we propose a novel kernel function to encode the similarity between two procedural models. Third, we propose a framework to interpolate user preference scores by combining multiple techniques: function factorization, Gaussian process regression, autorelevance detection, and l1 regularization. Fourth, we modify the original grammars to generate models with a pdf proportional to the user preference scores. Finally, we provide evaluations of our user interface and framework parameters and a comparison to other exploratory modeling techniques using modeling tasks in five example shape spaces: furniture, low-rise buildings, skyscrapers, airplanes, and vegetation.
Efficiency issues related to probability density function comparison
Energy Technology Data Exchange (ETDEWEB)
Kelly, P.M.; Cannon, M.; Barros, J.E.
1996-03-01
The CANDID project (Comparison Algorithm for Navigating Digital Image Databases) employs probability density functions (PDFs) of localized feature information to represent the content of an image for search and retrieval purposes. A similarity measure between PDFs is used to identify database images that are similar to a user-provided query image. Unfortunately, signature comparison involving PDFs is a very time-consuming operation. In this paper, we look into some efficiency considerations when working with PDFs. Since PDFs can take on many forms, we look into tradeoffs between accurate representation and efficiency of manipulation for several data sets. In particular, we typically represent each PDF as a Gaussian mixture (e.g. as a weighted sum of Gaussian kernels) in the feature space. We find that by constraining all Gaussian kernels to have principal axes that are aligned to the natural axes of the feature space, computations involving these PDFs are simplified. We can also constrain the Gaussian kernels to be hyperspherical rather than hyperellipsoidal, simplifying computations even further, and yielding an order of magnitude speedup in signature comparison. This paper illustrates the tradeoffs encountered when using these constraints.
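The speedup from hyperspherical kernels comes from the closed-form overlap of two spherical Gaussians, ∫ N(x; m1, s1²I) N(x; m2, s2²I) dx = N(m1; m2, (s1² + s2²)I). A numpy sketch of an L2-correlation similarity between two such mixtures (a generic construction in the spirit of the paper, not CANDID's exact measure):

```python
import numpy as np

def mixture_inner_product(w1, mu1, s1, w2, mu2, s2):
    """Closed-form <p, q> = integral of p(x) q(x) for spherical Gaussian mixtures.

    w*: weights (K,), mu*: means (K, d), s*: per-kernel standard deviations (K,).
    """
    d = mu1.shape[1]
    s2sum = s1[:, None] ** 2 + s2[None, :] ** 2                 # (K1, K2)
    dist2 = ((mu1[:, None, :] - mu2[None, :, :]) ** 2).sum(-1)  # squared mean gaps
    cross = (2 * np.pi * s2sum) ** (-d / 2) * np.exp(-dist2 / (2 * s2sum))
    return (w1[:, None] * w2[None, :] * cross).sum()

def mixture_similarity(w1, mu1, s1, w2, mu2, s2):
    """Normalized correlation in L2: 1 for identical mixtures, near 0 for disjoint."""
    pq = mixture_inner_product(w1, mu1, s1, w2, mu2, s2)
    pp = mixture_inner_product(w1, mu1, s1, w1, mu1, s1)
    qq = mixture_inner_product(w2, mu2, s2, w2, mu2, s2)
    return pq / np.sqrt(pp * qq)

w = np.array([0.5, 0.5])
mu = np.array([[0.0, 0.0], [1.0, 1.0]])
s = np.array([0.3, 0.4])
same = mixture_similarity(w, mu, s, w, mu, s)
far = mixture_similarity(w, mu, s, w, mu + 100.0, s)
```

No numerical integration is needed, which is exactly why the spherical constraint buys an order-of-magnitude speedup.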
Accurate photometric redshift probability density estimation - method comparison and application
Rau, Markus Michael; Seitz, Stella; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben
2015-10-01
We introduce an ordinal classification algorithm for photometric redshift estimation, which significantly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single value point estimate of the galaxy redshift, which can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitude less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular neural network code (ANNZ). In our use case, this improvement reaches 50 per cent for high-redshift objects (z ≥ 0.75). We show that using these more accurate photometric redshift PDFs will lead to a reduction in the systematic biases by up to a factor of 4, when compared with less accurate PDFs obtained from commonly used methods. The cosmological analyses we examine and find improvement upon are the following: gravitational lensing cluster mass estimates, modelling of angular correlation functions and modelling of cosmic shear correlation functions.
Probability density functions for use when calculating standardised drought indices
Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie
2015-04-01
Time series of drought indices like the standardised precipitation index (SPI) and standardised flow index (SFI) require a statistical probability density function to be fitted to the observed (generally monthly) precipitation and river flow data. Once fitted, the quantiles are transformed to a Normal distribution with mean = 0 and standard deviation = 1. These transformed data are the SPI/SFI, which are widely used in drought studies, including for drought monitoring and early warning applications. Different distributions were fitted to rainfall and river flow data accumulated over 1, 3, 6 and 12 months for 121 catchments in the United Kingdom. These catchments represent a range of catchment characteristics in a mid-latitude climate. Both rainfall and river flow data have a lower bound at 0, as rains and flows cannot be negative. Their empirical distributions also tend to have positive skewness, and therefore the Gamma distribution has often been a natural and suitable choice for describing the data statistically. However, after transformation of the data to Normal distributions to obtain the SPIs and SFIs for the 121 catchments, the distributions are rejected in 11% and 19% of cases, respectively, by the Shapiro-Wilk test. Three-parameter distributions traditionally used in hydrological applications, such as the Pearson type 3 for rainfall and the Generalised Logistic and Generalised Extreme Value distributions for river flow, tend to make the transformed data fit better, with rejection rates of 5% or less. However, none of these three-parameter distributions have a lower bound at zero. This means that the lower tail of the fitted distribution may potentially go below zero, which would result in a lower limit to the calculated SPI and SFI values (as observations can never reach into this lower tail of the theoretical distribution). The Tweedie distribution can overcome the problems found when using either the Gamma or the above three-parameter distributions.
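The fit-then-transform procedure described in the opening sentences can be sketched with scipy for the Gamma case (a minimal illustration on synthetic data, assuming scipy is available; not the study's exact fitting procedure):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardised precipitation index: fit a Gamma distribution to accumulated
    precipitation, then map its quantiles onto a standard Normal."""
    a, loc, scale = stats.gamma.fit(precip, floc=0.0)  # lower bound fixed at 0
    q = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    return stats.norm.ppf(q)

rng = np.random.default_rng(5)
monthly = rng.gamma(shape=2.0, scale=40.0, size=600)  # synthetic monthly rainfall
z = spi(monthly)  # should be approximately N(0, 1) when the Gamma model fits
```

Swapping `stats.gamma` for a three-parameter family would reproduce the comparison the paper makes, including the lower-bound issue it highlights.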
On the evolution of the density probability density function in strongly self-gravitating systems
International Nuclear Information System (INIS)
Girichidis, Philipp; Konstandin, Lukas; Klessen, Ralf S.; Whitworth, Anthony P.
2014-01-01
The time evolution of the probability density function (PDF) of the mass density is formulated and solved for systems in free-fall using a simple approximate function for the collapse of a sphere. We demonstrate that a pressure-free collapse results in a power-law tail on the high-density side of the PDF. The slope quickly asymptotes to the functional form P_V(ρ) ∝ ρ^(-1.54) for the (volume-weighted) PDF and P_M(ρ) ∝ ρ^(-0.54) for the corresponding mass-weighted distribution. From the simple approximation of the PDF we derive analytic descriptions for mass accretion, finding that dynamically quiet systems with narrow density PDFs lead to retarded star formation and low star formation rates (SFRs). Conversely, strong turbulent motions that broaden the PDF accelerate the collapse, causing a bursting mode of star formation. Finally, we compare our theoretical work with observations. The measured SFRs are consistent with our model during the early phases of the collapse. Comparison of observed column density PDFs with those derived from our model suggests that observed star-forming cores are roughly in free-fall.
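The two quoted exponents are consistent with each other: weighting by mass multiplies the volume-weighted PDF by one power of ρ (up to normalization), so the slopes must differ by exactly one:

```latex
P_M(\rho)\,\mathrm{d}\rho \;\propto\; \rho\, P_V(\rho)\,\mathrm{d}\rho
\quad\Longrightarrow\quad
P_V(\rho) \propto \rho^{-1.54}
\;\Rightarrow\;
P_M(\rho) \propto \rho^{-1.54+1} = \rho^{-0.54}.
```

This is a standard relation between volume- and mass-weighted density PDFs, stated here as a consistency check on the abstract's numbers.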
Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin
2003-01-01
A logistic equation is the basis for a model that predicts the probability of obtaining regeneration at specified densities. The density of regeneration (trees/ha) for which an estimate of probability is desired can be specified by means of independent variables in the model. When estimating parameters, the dependent variable is set to 1 if the regeneration density (...
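The logistic model described above has the standard form P = 1/(1 + exp(-(β₀ + β·x))). A minimal sketch with hypothetical coefficients and predictors (the variable names and values below are placeholders, not the paper's fitted parameters):

```python
import numpy as np

def regeneration_probability(x, beta):
    """Logistic model: probability of regeneration at the specified density,
    given predictor vector x and coefficients beta = [intercept, slopes...]."""
    return 1.0 / (1.0 + np.exp(-(beta[0] + x @ beta[1:])))

# Hypothetical predictors: site basal area (m^2/ha), target density (trees/ha).
beta = np.array([0.8, -0.05, -0.001])
x = np.array([10.0, 500.0])
p = regeneration_probability(x, beta)  # logit = 0.8 - 0.5 - 0.5 = -0.2
```

The dependent variable coding mentioned in the abstract (1 if observed regeneration meets the specified density) is what such coefficients would be fitted against.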
Probability density fittings of corrosion test-data
Indian Academy of Sciences (India)
In this study, corrosion test-data of steel-rebar in concrete were subjected to the fittings of the Normal, Gumbel and the Weibull probability distribution functions. This was done to investigate the suitability of the results of the fitted test-data, by these distributions, for modelling the effectiveness of C6H15NO3, triethanolamine ...
The probability density function of completed length of service (CLS ...
African Journals Online (AJOL)
... functions this paper estimates the functions for some secondary schools in Enugu State. Wastage probabilities are calculated, survivor functions estimated. The accompanying standard errors are also obtained. Key words: Manpower Planning, Length of Service, Modelling, Survivor Functions. [Global Jnl Mathematical Sci ...
What is presumed when we presume consent?
Directory of Open Access Journals (Sweden)
Pierscionek Barbara K
2008-04-01
Full Text Available Abstract Background The organ donor shortfall in the UK has prompted calls to introduce legislation to allow for presumed consent: if there is no explicit objection to donation of an organ, consent should be presumed. The current debate has not taken into account accepted meanings of presumption in law and science and the consequences for rights of ownership that would arise should presumed consent become law. In addition, arguments revolve around the rights of the competent autonomous adult but do not always consider the more serious implications for children or the disabled. Discussion Any action or decision made on a presumption is accepted in law and science as one based on judgement of a provisional situation. It should therefore allow the possibility of reversing the action or decision. Presumed consent to organ donation will not permit such reversal. Placing prime importance on the functionality of body organs and their capacity to sustain life rather than on explicit consent of the individual will lead to further debate about rights of ownership and potentially to questions about financial incentives and to whom benefits should accrue. Factors that influence donor rates are not fully understood and attitudes of the public to presumed consent require further investigation. Presuming consent will also necessitate considering how such a measure would be applied in situations involving children and mentally incompetent adults. Summary The presumption of consent to organ donation cannot be understood in the same way as is presumption when applied to science or law. Consideration should be given to the consequences of presuming consent and to the questions of ownership and organ monetary value as these questions are likely to arise should presumed consent be permitted. In addition, the implications of presumed consent for children and adults who are unable to object to organ donation require serious contemplation if these most vulnerable
Particle number and probability density functional theory and A-representability.
Pan, Xiao-Yin; Sahni, Viraht
2010-04-28
In Hohenberg-Kohn density functional theory, the energy E is expressed as a unique functional of the ground state density rho(r): E = E[rho], with the internal energy component F(HK)[rho] being universal. Knowledge of the functional F(HK)[rho] by itself, however, is insufficient to obtain the energy: the particle number N is primary. By emphasizing this primacy, the energy E is written as a nonuniversal functional of N and probability density p(r): E = E[N,p]. The set of functions p(r) satisfies the constraints of normalization to unity and non-negativity, exists for each N; N = 1, ..., infinity, and defines the probability density or p-space. A particle number N and probability density p(r) functional theory is constructed. Two examples for which the exact energy functionals E[N,p] are known are provided. The concept of A-representability is introduced, by which is meant the set of functions Psi(p) that lead to probability densities p(r) obtained as the quantum-mechanical expectation of the probability density operator, and which satisfy the above constraints. We show that the set of functions p(r) of p-space is equivalent to the A-representable probability density set. We also show via the Harriman and Gilbert constructions that the A-representable and N-representable probability density p(r) sets are equivalent.
Kayser, Hartmut; Nimtz, Manfred; Ringler, Philippe; Müller, Shirley A
2016-01-01
Bilins in complex with specific proteins play key roles in many forms of life. Biliproteins have also been isolated from insects; however, structural details are rare and possible functions largely unknown. Recently, we identified a high-molecular weight biliprotein from a moth, Cerura vinula, as an arylphorin-type hexameric storage protein linked to a novel farnesyl biliverdin IXα; its unusual structure suggests formation by cleavage of mitochondrial heme A. In the present study of another moth, Spodoptera littoralis, we isolated two different biliproteins. These proteins were identified as a very high-density lipoprotein (VHDL) and as vitellin, respectively, by mass spectrometric sequencing. Both proteins are associated with three different farnesyl biliverdins IXα: the one bilin isolated from C. vinula and two new structurally closely related bilins, supposed to be intermediates of heme A degradation. The different bilin composition of the two biliproteins suggests that the presumed oxidations at the farnesyl side-chain take place mainly during egg development. The egg bilins are supposedly transferred from hemolymph VHDL to vitellin in the female. Both biliproteins show strong induced circular dichroism activity compatible with a predominance of the M-conformation of the bilins. This conformation is opposite to that of the arylphorin-type biliprotein from C. vinula. Electron microscopy of the VHDL-type biliprotein from S. littoralis provided a preliminary view of its structure as a homodimer and confirmed the biochemically determined molecular mass of ∼350 kDa. Further, images of S. littoralis hexamerins revealed a 2 × 3 construction identical to that known from the hexamerin from C. vinula. Copyright © 2015 Elsevier Ltd. All rights reserved.
A new method of joint nonparametric estimation of probability density and its support
Moriyama, Taku
2017-01-01
In this paper we propose a new method of joint nonparametric estimation of a probability density and its support. As is well known, the nonparametric kernel density estimator suffers from a "boundary bias problem" when the support of the population density is not the whole real line. To avoid the unknown boundary effects, our estimator detects the boundary and eliminates the boundary bias of the estimator simultaneously. Moreover, we refer to an extension to a simple multivariate case, and propose an improved e...
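The boundary-bias problem mentioned here is easy to see with a standard Gaussian kernel estimator near a known support edge; one classical and much simpler remedy is reflection about the boundary. The sketch below is a generic reflection estimator, not the authors' joint boundary-detection method:

```python
import numpy as np

def reflect_kde(x, data, h, lo=0.0):
    """Gaussian KDE with boundary correction by reflecting the sample
    about a known support edge `lo` (support assumed to be [lo, inf))."""
    data = np.asarray(data, dtype=float)
    aug = np.concatenate([data, 2.0 * lo - data])       # mirrored sample
    z = (x[:, None] - aug[None, :]) / h
    dens = np.exp(-0.5 * z**2).sum(axis=1) / (aug.size * h * np.sqrt(2 * np.pi))
    return 2.0 * dens * (x >= lo)                       # restrict to [lo, inf)

rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0, size=5000)          # true pdf e^-x on [0, inf)
x = np.linspace(0.0, 3.0, 61)
est = reflect_kde(x, sample, h=0.2)
# Without reflection, a plain KDE sees only half a neighbourhood at x = 0
# and estimates roughly half the true value f(0) = 1; the reflected
# estimate stays close to it (up to the usual O(h) smoothing bias).
```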
Vasta, M.; Di Paola, M.
In this paper an approximate explicit probability density function for the analysis of external oscillations of a linear and geometrically nonlinear simply supported beam driven by random pulses is proposed. The adopted impulsive loading model is the Poisson white noise, that is, a process having Dirac-delta occurrences with random intensity distributed in time according to Poisson's law. The response probability density function can be obtained by solving the related Kolmogorov-Feller (KF) integro-differential equation. An approximate solution, using the path integral method, is derived by transforming the KF equation into a first-order partial differential equation. The method of characteristics is then applied to obtain an explicit solution. Different levels of approximation, depending on the physical assumptions on the transition probability density function, are found, and the solution for the response density is obtained as a series expansion using convolution integrals.
Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation
Directory of Open Access Journals (Sweden)
Michal Halas
2012-01-01
This article deals with modelling the probability density function of IPTV traffic packet delay variation. Such a model is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of the network nodes. To separate these (at least) three types of delay variation, we need a way to measure each of them separately. This work focuses on the delay variation caused by queueing systems, which has the main implications for the form of the probability density function.
Keiter, David A; Davis, Amy J; Rhodes, Olin E; Cunningham, Fred L; Kilgo, John C; Pepin, Kim M; Beasley, James C
2017-08-25
Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. In this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.
DEFF Research Database (Denmark)
Falk, Anne Katrine Vinther; Gryning, Sven-Erik
1997-01-01
In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials. ...
Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function
Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.
2011-01-01
In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.
Probability density of wave function of excited photoelectron: understanding XANES features
Czech Academy of Sciences Publication Activity Database
Šipr, Ondřej
2001-01-01
Roč. 8, - (2001), s. 232-234 ISSN 0909-0495 R&D Projects: GA ČR GA202/99/0404 Institutional research plan: CEZ:A02/98:Z1-010-914 Keywords : XANES * PED - probability density of wave function Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.519, year: 2001
Energy Technology Data Exchange (ETDEWEB)
Wampler, William R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Myers, Samuel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Modine, Normand A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-09-01
The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.
Lura, Derek; Wernke, Matthew; Alqasemi, Redwan; Carey, Stephanie; Dubey, Rajiv
2012-01-01
This paper presents the probability-density-based gradient projection (GP) of the null space of the Jacobian for a 25-degree-of-freedom bilateral robotic human body model (RHBM). This method was used to predict the inverse kinematics of the RHBM and maximize the similarity between predicted inverse kinematic poses and recorded data of 10 subjects performing activities of daily living. The density function was created for discrete increments of the workspace. The number of increments in each direction (x, y, and z) was varied from 1 to 20. Performance of the method was evaluated by finding the root mean square (RMS) of the difference between the predicted joint angles and the joint angles recorded from motion capture. The amount of data included in the creation of the probability density function was varied from 1 to 10 subjects, creating sets of subjects included in and excluded from the density function. The performance of the GP method for subjects included in and excluded from the density function was evaluated to test the robustness of the method. Accuracy of the GP method varied with the incremental division of the workspace: increasing the number of increments decreased the RMS error of the method, with the average RMS error for included subjects ranging from 7.7° to 3.7°. However, increasing the number of increments also decreased the robustness of the method.
DEFF Research Database (Denmark)
Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri
2016-01-01
The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these needs to be relatively low. In order to handle this problem an approach is suggested, which...
Unification of field theory and maximum entropy methods for learning probability densities.
Kinney, Justin B
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
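A minimal concrete instance of maximum entropy estimation: constraining the density to match the first two sample moments yields an exponential-family form whose multipliers solve a convex dual problem. The grid-based sketch below is illustrative only (the grid, data and tolerances are assumptions) and is not Kinney's Bayesian field theory estimator:

```python
import numpy as np
from scipy.optimize import minimize

# Matching E[x] and E[x^2] under maximum entropy gives
# p(x) ∝ exp(l1*x + l2*x^2); the multipliers minimize the convex
# dual objective log Z(l) - l1*m1 - l2*m2.
rng = np.random.default_rng(2)
sample = rng.normal(loc=1.0, scale=0.5, size=20_000)
m1, m2 = sample.mean(), (sample**2).mean()

x = np.linspace(-2.0, 4.0, 601)       # grid standing in for the support
dx = x[1] - x[0]

def dual(lam):
    logz = np.log(np.sum(np.exp(lam[0] * x + lam[1] * x**2)) * dx)
    return logz - lam[0] * m1 - lam[1] * m2

res = minimize(dual, x0=np.array([0.0, -0.5]), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-10, "maxiter": 5000})
l1, l2 = res.x
p = np.exp(l1 * x + l2 * x**2)
p /= np.sum(p) * dx                   # normalize on the grid

mean_hat = np.sum(x * p) * dx         # recovers the sample mean (about 1.0)
```

Because the sample here is Gaussian, and a Gaussian is the maximum entropy density for fixed mean and variance, l2 should converge to about −1/(2σ²) = −2 for σ = 0.5, which is a handy sanity check.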
Assessment of alveolar bone mineral density as a predictor of lumbar fracture probability.
Takaishi, Yoshitomo; Arita, Seizaburo; Honda, Mitsugi; Sugishita, Takeshi; Kamada, Aiko; Ikeo, Takashi; Miki, Takami; Fujita, Takuo
2013-05-01
Osteoporosis and tooth loss have been linked with advancing age, but no clear relationship between these conditions has been proven. Several studies of bone mineral density measurements of the jaw and spine have shown similarities in their rate of age-related deterioration. Thus, measurements of jawbone density may predict lumbar vertebral bone density. Using jawbone density as a proxy marker would circumvent the need for lumbar bone measurements and facilitate prediction of osteoporotic spinal fracture susceptibility at dental clinics. We aimed to characterize the correlation between bone density in the jaw and spine and the incidence of osteoporotic spinal fractures. We used computerized radiogrammetry to measure alveolar bone mineral density (al-BMD) and dual-energy X-ray absorptiometry to measure lumbar bone mineral density (L-BMD). L-BMD and al-BMD in 30 female patients (average age: 59 ± 5 years) were correlated with various patient attributes. Statistical analysis included area under the curve (AUC) and probability of asymptotic significance (PAS) in a receiver operating characteristic curve. The predictive strength of L-BMD T-scores (L-BMD[T]) and al-BMD measurements for fracture occurrence was then compared using multivariate analysis with category weight scoring. L-BMD and al-BMD were significantly correlated with age, years since menopause, and alveolar bone thickness. Both were also negatively correlated with fracture incidence. Category weight scores were -0.275 for L-BMD(T); al-BMD showed greater predictive strength for fracture occurrence than L-BMD. Our results suggest a possible association between al-BMD and vertebral fracture risk. Assessment of alveolar bone density may be useful in patients receiving routine dental exams to monitor the clinical picture and the potential course of osteoporosis in patients who may be at a higher risk of developing osteoporosis.
Górska, K.; Horzela, A.; Bratek, Ł.; Dattoli, G.; Penson, K. A.
2018-04-01
We study functions related to the experimentally observed Havriliak–Negami dielectric relaxation pattern, proportional in the frequency domain to [1 + (iωτ0)^α]^(−β) with τ0 > 0 being some characteristic time. For α = l/k (l, k positive integers) and β > 0 we furnish exact and explicit expressions for response and relaxation functions in the time domain and suitable probability densities in their domain dual in the sense of the inverse Laplace transform. All these functions are expressed as finite sums of generalized hypergeometric functions, convenient to handle analytically and numerically. Introducing a reparameterization β = (2−q)/(q−1) and τ0 = (q−1)^(1/α) (1 < q < 2), we show that for 0 < α < 1 the response functions f_α,β(t/τ0) go to the one-sided Lévy stable distributions as q tends to one. Moreover, applying the self-similarity property of the probability densities g_α,β(u), we introduce two-variable densities and show that they satisfy the integral form of the evolution equation.
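The frequency-domain pattern itself is straightforward to evaluate numerically; a small sketch (parameter values chosen arbitrarily for illustration) of the Havriliak–Negami susceptibility and its classic limits:

```python
import numpy as np

def hn_susceptibility(omega, tau0, alpha, beta):
    """Havriliak-Negami dielectric response in the frequency domain:
    chi(omega) = 1 / (1 + (i*omega*tau0)**alpha)**beta."""
    return 1.0 / (1.0 + (1j * omega * tau0) ** alpha) ** beta

omega = np.logspace(-3, 3, 7)
chi = hn_susceptibility(omega, tau0=1.0, alpha=0.8, beta=0.6)

# Limiting cases recover the classic relaxation patterns:
# beta = 1 -> Cole-Cole; alpha = 1 -> Cole-Davidson;
# alpha = beta = 1 -> simple Debye relaxation 1/(1 + i*omega*tau0).
debye = hn_susceptibility(omega, 1.0, 1.0, 1.0)
print(np.allclose(debye, 1.0 / (1.0 + 1j * omega)))  # True
```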
Spectral Discrete Probability Density Function of Measured Wind Turbine Noise in the Far Field
Ashtiani, Payam; Denison, Adelaide
2015-01-01
Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper, a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for one-third octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low-frequency noise sources. PMID:25905097
Taylor, John G.; Husmeier, Dirk
1998-01-01
Predicting conditional probability densities with neural networks requires complex (at least two-hidden-layer) architectures, which normally leads to rather long training times. By adopting the RVFL concept and constraining a subset of the parameters to randomly chosen initial values (such that the EM-algorithm can be applied), the training process can be accelerated by about two orders of magnitude. This allows training of a whole ensemble of networks at the same computational costs as would be required otherwise for training a single model. The simulations performed suggest that in this way a significant improvement of the generalization performance can be achieved. Copyright 1997 Elsevier Science Ltd.
Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex
2016-06-15
The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
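The K distribution mentioned here arises by compounding exponential speckle statistics with a gamma-distributed mean intensity. A standard form of its intensity PDF (shape and mean values below are arbitrary, not fitted to the paper's OCT data) can be checked numerically for normalization and mean:

```python
import numpy as np
from scipy.special import gamma, kv
from scipy.integrate import quad

def k_dist_pdf(i, nu, mu):
    """K-distribution PDF for intensity `i` with shape `nu` and mean `mu`:
    the compound of exponential speckle with a gamma-distributed mean."""
    arg = 2.0 * np.sqrt(nu * i / mu)
    return (2.0 / gamma(nu)) * (nu / mu) ** ((nu + 1) / 2) \
        * i ** ((nu - 1) / 2) * kv(nu - 1, arg)

nu, mu = 1.5, 2.0
norm, _ = quad(lambda i: k_dist_pdf(i, nu, mu), 0, np.inf)
mean, _ = quad(lambda i: i * k_dist_pdf(i, nu, mu), 0, np.inf)
print(round(norm, 3), round(mean, 3))  # 1.0 2.0
```

Small `nu` gives heavy, strongly non-Gaussian tails (the low-scatterer-density regime reported in the Letter); as `nu` grows, the distribution approaches the exponential intensity statistics of fully developed speckle.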
Multiple Streaming and the Probability Distribution of Density in Redshift Space
Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
2000-07-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated.
A Priori Knowledge and Probability Density Based Segmentation Method for Medical CT Image Sequences
Directory of Open Access Journals (Sweden)
Huiyan Jiang
2014-01-01
This paper briefly introduces a novel segmentation strategy for CT image sequences. As the first step of our strategy, we extract a priori intensity statistical information from the object region, which is manually segmented by radiologists. Then we define a search scope for the object and calculate a probability density for each pixel in the scope using a voting mechanism. Moreover, we generate an optimal initial level-set contour based on the a priori shape of the object in the previous slice. Finally, the modified distance-regularized level-set method utilizes boundary features and probability density to determine the final object contour. The main contributions of this paper are as follows: a priori knowledge is effectively used to guide the determination of objects, and a modified distance-regularized level-set method can accurately extract the actual contour of the object in a short time. The proposed method is compared to seven other state-of-the-art medical image segmentation methods on abdominal CT image sequence datasets. The evaluation results demonstrate that our method performs better and has potential for segmentation of CT image sequences.
Audio Query by Example Using Similarity Measures between Probability Density Functions of Features
Directory of Open Access Journals (Sweden)
Marko Helén
2010-01-01
This paper proposes a query-by-example system for generic audio. We estimate the similarity of the example signal and the samples in the queried database by calculating the distance between the probability density functions (pdfs) of their frame-wise acoustic features. Since the features are continuous-valued, we propose to model them using Gaussian mixture models (GMMs) or hidden Markov models (HMMs). The models parametrize each sample efficiently and retain sufficient information for similarity measurement. To measure the distance between the models, we apply a novel Euclidean distance, approximations of the Kullback-Leibler divergence, and a cross-likelihood ratio test. The performance of the measures was tested in simulations where audio samples are automatically retrieved from a general audio database, based on the estimated similarity to a user-provided example. The simulations show that the distance between probability density functions is an accurate measure of similarity. Measures based on GMMs or HMMs are shown to produce better results than the existing methods based on simpler statistics or histograms of the features. A good performance with low computational cost is obtained with the proposed Euclidean distance.
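For single Gaussians the Kullback-Leibler divergence has a closed form, which is the building block behind many GMM distance approximations (for full GMMs no closed form exists, so sampling or pairing approximations are used). A numpy-only sketch with synthetic "frame-wise features"; the feature dimensionality and data here are invented for illustration:

```python
import numpy as np

def gauss_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL(N0 || N1) between two multivariate Gaussians,
    a single-Gaussian stand-in for the GMM-based distances above."""
    d = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

rng = np.random.default_rng(3)
# Frame-wise "features" (e.g. MFCC-like vectors) for two audio samples.
feats_a = rng.normal(0.0, 1.0, size=(500, 4))
feats_b = rng.normal(0.5, 1.2, size=(500, 4))

kl_ab = gauss_kl(feats_a.mean(0), np.cov(feats_a.T),
                 feats_b.mean(0), np.cov(feats_b.T))
kl_aa = gauss_kl(feats_a.mean(0), np.cov(feats_a.T),
                 feats_a.mean(0), np.cov(feats_a.T))
print(kl_ab > kl_aa)  # True: distinct samples are farther apart than identical ones
```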
Directory of Open Access Journals (Sweden)
Han Liwei
2014-07-01
Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for the cloud probability distribution density, based on a backward cloud generator, was then proposed. This was used to effectively convert certain parcels of accurate data into concepts that can be described by proper qualitative linguistic values. Such a qualitative description is expressed by the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyse the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible and can reveal the changing regularity of the piezometric tube's water level, so that seepage damage in the dam body can be detected.
International Nuclear Information System (INIS)
Moriya, Netzer
2010-01-01
A method, based on binomial filtering, to estimate the noise level of an arbitrary, smoothed pure signal, contaminated with an additive, uncorrelated noise component is presented. If the noise characteristics of the experimental spectrum are known, as for instance the type of the corresponding probability density function (e.g., Gaussian), the noise properties can be extracted. In such cases, both the noise level, as may arbitrarily be defined, and a simulated white noise component can be generated, such that the simulated noise component is statistically indistinguishable from the true noise component present in the original signal. In this paper we present a detailed analysis of the noise level extraction when the additive noise is Gaussian or Lorentzian. We show that the statistical parameters in these cases (mainly the variance and the half width at half maximum, respectively) can directly be obtained from the experimental spectrum even when the pure signal is erratic. Further discussion is given for cases where the noise probability density function is initially unknown.
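A simpler cousin of this idea estimates the noise level of a smooth signal from its first differences: differencing nearly cancels the slowly varying pure signal while doubling the white-noise variance. The sketch below is a generic difference-based estimator in the same spirit, not the binomial-filtering procedure of the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 2 * np.pi, 2000)
sigma_true = 0.3
y = np.sin(x) + rng.normal(0.0, sigma_true, x.size)   # smooth signal + white noise

# For a slowly varying signal contaminated with uncorrelated Gaussian
# noise, Var(diff(y)) ≈ 2*sigma^2, so dividing the std of the first
# differences by sqrt(2) recovers the noise level.
sigma_est = np.std(np.diff(y)) / np.sqrt(2.0)
print(round(sigma_est, 2))  # ≈ 0.3
```

The estimate degrades when the pure signal varies on the scale of the sampling step, which is exactly the regime where the smoothing-filter approach of the paper becomes necessary.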
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates prices, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
Riggs, Peter J.
2013-01-01
Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…
Ballesteros-Paredes, Javier; Vázquez-Semadeni, Enrique; Gazol, Adriana; Hartmann, Lee W.; Heitsch, Fabian; Colín, Pedro
2011-09-01
It has been recently shown that molecular clouds do not exhibit a unique shape for the column density probability distribution function (N-PDF). Instead, clouds without star formation seem to possess a lognormal distribution, while clouds with active star formation develop a power-law tail at high column densities. The lognormal behaviour of the N-PDF has been interpreted in terms of turbulent motions dominating the dynamics of the clouds, while the power-law behaviour occurs when the cloud is dominated by gravity. In the present contribution, we use thermally bi-stable numerical simulations of cloud formation and evolution to show that, indeed, these two regimes can be understood in terms of the formation and evolution of molecular clouds: a very narrow lognormal regime appears when the cloud is being assembled. However, as the global gravitational contraction occurs, the initial density fluctuations are enhanced, resulting, first, in a wider lognormal N-PDF, and later, in a power-law N-PDF. We thus suggest that the observed N-PDFs of molecular clouds are a manifestation of their global gravitationally contracting state. We also show that, contrary to recent suggestions, the exact value of the power-law slope is not unique, as it depends on the projection in which the cloud is being observed.
A study of shear sprays using probability density function techniques and laser-based diagnostics
Energy Technology Data Exchange (ETDEWEB)
Gitahi, A.; Kioni, P.N. [Jomo Kenyatta University of Agriculture and Technology (Kenya). Department of Mechanical Engineering
2009-07-01
Presented in this paper are preliminary experimental results from investigations carried out on a two-dimensional shear spray. These results are part of ongoing research on combustion in shear flows. Among the objectives is the inclusion of the effects of droplet-droplet interactions and turbulent dispersion. In the numerical work, use is made of Probability Density Function (pdf) techniques owing to the large dimensionality of the spray problem. For the experimental work, a burner has been developed and laser-based experiments carried out on it to characterize the spray. The results capture velocity evolution and droplet size distributions. At this stage a water spray is used to demonstrate the quality of the burner, as a precursor to the spray combustion investigations of the ongoing research. (orig.)
Directory of Open Access Journals (Sweden)
Osmar Abílio de Carvalho Júnior
2014-04-01
Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR) and causes the characteristic noise-like granular aspect that complicates image classification. In SAR image analysis, spatial information can be of particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities arising from a complex and heterogeneous spectral response. This paper proposes Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. The method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land-use patterns. The proposed algorithm moves a window over the image, estimating the probability density curve in different image components; a single input image therefore generates a multi-component output. Initially, the multi-component data should be treated by noise-reduction methods such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPCs); both methods reduce noise and order the multi-component data in terms of image quality. In this paper, NAPC applied to the multi-component data provided large reductions in noise levels, and color composites based on the first NAPCs enhanced the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance classifiers were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.
Development and evaluation of probability density functions for a set of human exposure factors
International Nuclear Information System (INIS)
Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.
1999-01-01
The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors
International Nuclear Information System (INIS)
Williams, M.M.R.
2003-01-01
An analytical expression is obtained for the probability density function of the multiplication factor of an array of spheres when each sphere is displaced in a random fashion from its initial position. Two cases are considered: (1) spheres in an infinite background medium in which the total cross section in spheres and medium is the same, and (2) spheres in a void. In all cases we use integral transport theory and cast the problem into one involving average fluxes in the spheres which interact via collision probabilities. The statistical aspects of the problem are treated by first-order perturbation theory, and the general conclusion is that, when the number of spheres exceeds about 5, the reduced multiplication factor ξ = (k − k₀)/k₀, where k₀ is the unperturbed value, is given accurately by the Gaussian distribution P(ξ) = [1/(√(2π) σ D_T)] exp[−ξ²/(2σ² D_T²)]. The partial standard deviation is σ = 2δ/√3, δ being the maximum movement of a sphere from its equilibrium position, and D_T is a function of the system properties and geometry. Some numerical results are given to illustrate the magnitude of the effects, and the accuracy of diffusion theory for this type of problem is also assessed. The overall accuracy of the perturbation method is assessed against an essentially exact result obtained using simulation, thereby enabling the range of validity of perturbation theory to be investigated.
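A minimal numeric check of the Gaussian form quoted above, P(ξ) = exp(−ξ²/(2σ²D_T²))/(√(2π) σ D_T) with σ = 2δ/√3, can be sketched as follows; the values of δ and D_T are illustrative assumptions, not taken from the paper:

```python
import math

def multiplication_pdf(xi, delta, d_t):
    """Gaussian PDF of the reduced multiplication factor xi = (k - k0)/k0,
    with partial standard deviation sigma = 2*delta/sqrt(3) scaled by the
    system factor D_T."""
    sigma = 2.0 * delta / math.sqrt(3.0)
    s = sigma * d_t
    return math.exp(-xi ** 2 / (2.0 * s * s)) / (math.sqrt(2.0 * math.pi) * s)

# Illustrative values (delta and D_T are assumptions, not from the paper):
delta, d_t = 0.1, 0.5

# Rectangle-rule check that the density integrates to ~1 over a wide range:
xs = [i * 0.001 - 0.5 for i in range(1001)]
area = sum(multiplication_pdf(x, delta, d_t) * 0.001 for x in xs)
print(round(area, 2))
```

The normalization check is a quick sanity test that the reconstructed formula is a proper density for any choice of δ and D_T.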
International Nuclear Information System (INIS)
Watterson, Ian G.
2007-01-01
The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be obtained simply. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections.
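The direct integration of the product across the joint probability space can be sketched as follows; the beta(2, 2) factors and their ranges are hypothetical stand-ins for the fitted distributions, not the study's actual PDFs:

```python
import math

def beta_pdf(x, a, b, lo, hi):
    """Four-parameter beta density on [lo, hi] with shape parameters a, b."""
    if not (lo < x < hi):
        return 0.0
    u = (x - lo) / (hi - lo)
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return u ** (a - 1) * (1 - u) ** (b - 1) / (norm * (hi - lo))

def cdf(pdf, lo, x, n=400):
    """Midpoint-rule CDF of a bounded density."""
    if x <= lo:
        return 0.0
    h = (x - lo) / n
    return min(1.0, sum(pdf(lo + (i + 0.5) * h) * h for i in range(n)))

# Hypothetical factors: global warming G on [1, 3] degC and scaled local
# change S on [0.5, 1.5] degC per degC of warming, both beta(2, 2) shaped.
pg = lambda g: beta_pdf(g, 2, 2, 1.0, 3.0)
ps = lambda s: beta_pdf(s, 2, 2, 0.5, 1.5)

def net_change_cdf(t, n=400):
    """F_T(t) for the net change T = G*S by direct integration over the
    joint probability space: integral of p_G(g) * F_S(t/g) dg for g > 0."""
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        g = 1.0 + (i + 0.5) * h
        total += pg(g) * cdf(ps, 0.5, t / g) * h
    return total

# Percentiles and threshold probabilities can then be read off the CDF:
print(round(net_change_cdf(0.6), 2), round(net_change_cdf(4.4), 2))
```

The median is the t where the CDF crosses 0.5, and the probability of exceeding a threshold is one minus the CDF there, exactly as described in the abstract.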
Robust functional statistics applied to Probability Density Function shape screening of sEMG data.
Boudaoud, S; Rix, H; Al Harrach, M; Marin, F
2014-01-01
Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters must be estimated from small sample sizes, which induces errors in the estimated HOS parameters and limits real-time, precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics appear more robust than HOS parameters to small-sample-size effects and more accurate in sEMG PDF shape screening applications.
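The small-sample sensitivity of HOS parameters that motivates this work can be illustrated with a Monte Carlo sketch; the log-normal model and sample sizes below are illustrative, not the paper's sEMG data:

```python
import random
import statistics

def sample_skewness(xs):
    """Standard moment-based (HOS) skewness estimator."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def skewness_spread(n_samples, trials=200, seed=1):
    """Spread (std across trials) of the skewness estimate for a fixed
    log-normal population, as a function of the sample size."""
    rng = random.Random(seed)
    ests = []
    for _ in range(trials):
        xs = [rng.lognormvariate(0.0, 0.5) for _ in range(n_samples)]
        ests.append(sample_skewness(xs))
    return statistics.pstdev(ests)

small = skewness_spread(30)    # small sample, as in experimental conditions
large = skewness_spread(1000)  # large sample, for reference
print(small > 2 * large)
```

The estimator scatter shrinks roughly as 1/sqrt(n), so the 30-sample skewness is far noisier than the 1000-sample one, which is the effect the proposed functional statistics are designed to mitigate.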
Beghein, Caroline; Trampert, Jeannot
2004-01-01
The presence of radial anisotropy in the upper mantle, transition zone and top of the lower mantle is investigated by applying a model space search technique to Rayleigh and Love wave phase velocity models. Probability density functions are obtained independently for S-wave anisotropy, P-wave anisotropy, intermediate parameter η, Vp, Vs and density anomalies. The likelihoods for P-wave and S-wave anisotropy beneath continents cannot be explained by a dry olivine-rich upper mantle at depths larger than 220 km. Indeed, while shear-wave anisotropy tends to disappear below 220 km depth in continental areas, P-wave anisotropy is still present but its sign changes compared to the uppermost mantle. This could be due to an increase with depth of the amount of pyroxene relative to olivine in these regions, although the presence of water, partial melt or a change in the deformation mechanism cannot be ruled out as yet. A similar observation is made for old oceans, but not for young ones where VSH > VSV appears likely down to 670 km depth and VPH > VPV down to 400 km depth. The change of sign in P-wave anisotropy seems to be qualitatively correlated with the presence of the Lehmann discontinuity, generally observed beneath continents and some oceans but not beneath ridges. Parameter η shows a similar age-related depth pattern as shear-wave anisotropy in the uppermost mantle and it undergoes the same change of sign as P-wave anisotropy at 220 km depth. The ratio between dln Vs and dln Vp suggests that a chemical component is needed to explain the anomalies in most places at depths greater than 220 km. More tests are needed to infer the robustness of the results for density, but they do not affect the results for anisotropy.
Watterson, I. G.
2008-06-01
There remains uncertainty in the projected climate change over the 21st century, in part because of the range of responses to rising greenhouse gas concentrations in current global climate models (GCMs). The representation of potential changes in the form of a probability density function (PDF) is increasingly sought for applications. This article presents a method of estimating PDFs for projections based on the "pattern scaling" technique, which separates the uncertainty in the global mean warming from that in the standardized regional change. A mathematical framework for the problem is developed, which includes a joint probability distribution for the product of these two factors. Several simple approaches are considered for representing the factors by PDFs using GCM results, allowing model weighting. The four-parameter beta distribution is found to provide a smooth PDF that can match the mean and range of GCM results, allowing skewness when appropriate. A beta representation of the range in global warming consistent with the Intergovernmental Panel on Climate Change Fourth Assessment Report is presented. The method is applied to changes in Australian temperature and precipitation, under the A1B scenario of concentrations, using results from 23 GCMs in the CMIP3 database. Statistical results, including percentiles and threshold exceedances, are compared for the case of southern Australian temperature change in summer. For the precipitation example, central Australian winter rainfall, the usual linear scaling assumption produces a net change PDF that extends to unphysically large decreases. This is avoided by assuming an exponential relationship between percentage decreases in rainfall and warming.
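A moment-matching sketch of a four-parameter beta representation is given below; the ensemble values and the padding rule for the bounds are assumptions for illustration, not the article's actual fitting procedure:

```python
import statistics

def fit_beta4(results, pad=0.1):
    """Moment-match a four-parameter beta to a set of model results.
    Bounds are placed pad * range beyond the min/max (an assumption;
    the article's bound choice may differ). Returns (lo, hi, alpha, beta)."""
    lo, hi = min(results), max(results)
    span = hi - lo
    lo, hi = lo - pad * span, hi + pad * span
    # Map to [0, 1] and match mean/variance of the standard beta.
    u = [(x - lo) / (hi - lo) for x in results]
    m, v = statistics.fmean(u), statistics.pvariance(u)
    common = m * (1 - m) / v - 1
    return lo, hi, m * common, (1 - m) * common

# Warming results (degC) from a hypothetical model ensemble:
results = [1.6, 1.9, 2.0, 2.1, 2.2, 2.4, 2.5, 2.8, 3.3]
lo, hi, alpha, beta = fit_beta4(results)

# By construction the fitted beta reproduces the ensemble mean exactly:
mean_fit = lo + (hi - lo) * alpha / (alpha + beta)
print(round(mean_fit, 2), round(statistics.fmean(results), 2))
```

Unequal alpha and beta give the skewness mentioned in the abstract; the bounded support prevents the unphysical tails that an unbounded fit can produce.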
Dictionary-based probability density function estimation for high-resolution SAR data
Krylov, Vladimir; Moser, Gabriele; Serpico, Sebastiano B.; Zerubia, Josiane
2009-02-01
In the context of remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of pixel intensities. In this work, we develop a parametric finite mixture model for the statistics of pixel intensities in high-resolution synthetic aperture radar (SAR) images, extending a previously developed method for lower-resolution images. The method integrates the stochastic expectation maximization (SEM) scheme and the method of log-cumulants (MoLC) with an automatic technique to select, for each mixture component, an optimal parametric model taken from a predefined dictionary of parametric probability density functions (pdf). The proposed dictionary consists of eight state-of-the-art SAR-specific pdfs: Nakagami, log-normal, generalized Gaussian Rayleigh, heavy-tailed Rayleigh, Weibull, K-root, Fisher and generalized Gamma. The designed scheme is endowed with a novel initialization procedure and an algorithm that automatically estimates the optimal number of mixture components. The experimental results with a set of several high-resolution COSMO-SkyMed images demonstrate the high accuracy of the designed algorithm, both from the viewpoint of a visual comparison of the histograms and from the viewpoint of quantitative accuracy measures such as the correlation coefficient (above 99.5%). The method proves to be effective on all the considered images, remaining accurate for multimodal and highly heterogeneous scenes.
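The dictionary-selection idea, stripped down to a two-entry dictionary with plain maximum-likelihood fits standing in for the SEM/MoLC machinery of the paper, can be sketched as:

```python
import math
import random
import statistics

def normal_loglik(xs):
    """Log-likelihood of a normal fit with MLE parameters; with the MLE
    mean and std, the sum of log-densities reduces to this closed form."""
    s = statistics.pstdev(xs)
    n = len(xs)
    return -n * (math.log(s * math.sqrt(2 * math.pi)) + 0.5)

def lognormal_loglik(xs):
    """Log-likelihood of a lognormal fit: log N(log x; m, s) - log x
    summed over the sample, with MLE parameters on the log data."""
    ls = [math.log(x) for x in xs]
    s = statistics.pstdev(ls)
    n = len(ls)
    return -n * (math.log(s * math.sqrt(2 * math.pi)) + 0.5) - sum(ls)

# A tiny stand-in for the paper's eight-entry SAR-specific pdf dictionary:
DICTIONARY = {"normal": normal_loglik, "lognormal": lognormal_loglik}

def best_model(xs):
    """Pick the dictionary entry with the highest log-likelihood."""
    return max(DICTIONARY, key=lambda name: DICTIONARY[name](xs))

rng = random.Random(0)
skewed = [rng.lognormvariate(0.0, 0.8) for _ in range(500)]
print(best_model(skewed))
```

The real method scores candidates per mixture component and uses MoLC rather than direct MLE, but the selection principle (fit every dictionary entry, keep the best-scoring one) is the same.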
McKean, Cristina; Letts, Carolyn; Howard, David
2014-11-01
The effect of phonotactic probability (PP) and neighbourhood density (ND) on triggering word learning was examined in children with Language Impairment (3;04-6;09) and compared to Typically Developing children. Nonwords, varying PP and ND orthogonally, were presented in a story context and their learning tested using a referent identification task. Group comparisons with receptive vocabulary as a covariate found no group differences in overall scores or in the influence of PP or ND. Therefore, there was no evidence of atypical lexical or phonological processing. 'Convergent' PP/ND (High PP/High ND; Low PP/Low ND) was optimal for word learning in both groups. This bias interacted with vocabulary knowledge. 'Divergent' PP/ND word scores (High PP/Low ND; Low PP/High ND) were positively correlated with vocabulary so the 'divergence disadvantage' reduced as vocabulary knowledge grew; an interaction hypothesized to represent developmental changes in lexical-phonological processing linked to the emergence of phonological representations.
Energy Technology Data Exchange (ETDEWEB)
Ovchinnikov, Mikhail [Pacific Northwest National Laboratory, Richland Washington USA; Lim, Kyo-Sun Sunny [Pacific Northwest National Laboratory, Richland Washington USA; Korea Atomic Energy Research Institute, Daejeon Republic of Korea; Larson, Vincent E. [Department of Mathematical Sciences, University of Wisconsin-Milwaukee, Milwaukee Wisconsin USA; Wong, May [Pacific Northwest National Laboratory, Richland Washington USA; National Center for Atmospheric Research, Boulder Colorado USA; Thayer-Calder, Katherine [National Center for Atmospheric Research, Boulder Colorado USA; Ghan, Steven J. [Pacific Northwest National Laboratory, Richland Washington USA
2016-11-05
Coarse-resolution climate models increasingly rely on probability density functions (PDFs) to represent subgrid-scale variability of prognostic variables. While PDFs characterize the horizontal variability, a separate treatment is needed to account for the vertical structure of clouds and precipitation. When sub-columns are drawn from these PDFs for microphysics or radiation parameterizations, appropriate vertical correlations must be enforced via PDF overlap specifications. This study evaluates the representation of PDF overlap in the Subgrid Importance Latin Hypercube Sampler (SILHS) employed in the assumed-PDF turbulence and cloud scheme called Cloud Layers Unified By Binormals (CLUBB). PDF overlap in CLUBB-SILHS simulations of continental and tropical oceanic deep convection is compared with the overlap of PDFs of various microphysics variables in cloud-resolving model (CRM) simulations of the same cases, which explicitly predict the 3D structure of cloud and precipitation fields. CRM results show that PDF overlap varies significantly between different hydrometeor types, as well as between PDFs of mass and number mixing ratios for each species, a distinction that the current SILHS implementation does not make. In CRM simulations that explicitly resolve cloud and precipitation structures, faster-falling species, such as rain and graupel, exhibit significantly higher coherence in their vertical distributions than slow-falling cloud liquid and ice. These results suggest that to improve the overlap treatment in the sub-column generator, the PDF correlations need to depend on hydrometeor properties, such as fall speeds, in addition to the currently implemented dependency on the turbulent convective length scale.
Hayward, Thomas J; Oba, Roger M
2013-07-01
Numerical methods are presented for approximating the probability density functions (pdf's) of acoustic fields and receiver-array responses induced by a given joint pdf of a set of acoustic environmental parameters. An approximation to the characteristic function of the random acoustic field (the inverse Fourier transform of the field pdf) is first obtained either by construction of the empirical characteristic function (ECF) from a random sample of the acoustic parameters, or by application of generalized Gaussian quadrature to approximate the integral defining the characteristic function. The Fourier transform is then applied to obtain an approximation of the pdf by a continuous function of the field variables. Application of both the ECF and generalized Gaussian quadrature is demonstrated in an example of a shallow-water ocean waveguide with two-dimensional uncertainty of sound speed and attenuation coefficient in the ocean bottom. Both approximations lead to a smoother estimate of the field pdf than that provided by a histogram, with generalized Gaussian quadrature providing a smoother estimate at the tails of the pdf. Potential applications to acoustic system performance quantification and to nonparametric acoustic signal processing are discussed.
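A minimal sketch of the ECF route, building the empirical characteristic function from a random sample and inverting a truncated Fourier integral (the truncation limit t_max acts as a smoothing assumption), could look like:

```python
import cmath
import math
import random

def ecf(sample, t):
    """Empirical characteristic function at frequency t."""
    return sum(cmath.exp(1j * t * x) for x in sample) / len(sample)

def pdf_from_ecf(sample, x, t_max=10.0, n=400):
    """Approximate the pdf via the inverse Fourier integral
    f(x) = (1/2pi) * integral of phi(t) * exp(-i t x) dt,
    truncated to |t| <= t_max and evaluated by the midpoint rule."""
    dt = 2.0 * t_max / n
    total = 0.0
    for i in range(n):
        t = -t_max + (i + 0.5) * dt
        total += (ecf(sample, t) * cmath.exp(-1j * t * x)).real * dt
    return total / (2.0 * math.pi)

rng = random.Random(2)
sample = [rng.gauss(0.0, 1.0) for _ in range(400)]
# The continuous estimate should peak near the mode of the standard normal:
print(pdf_from_ecf(sample, 0.0) > pdf_from_ecf(sample, 2.0))
```

As the abstract notes, this yields a smoother, continuous estimate than a histogram; the generalized Gaussian quadrature variant replaces the random sample in the ECF with quadrature nodes and weights.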
Representation of Probability Density Functions from Orbit Determination using the Particle Filter
Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell
2012-01-01
Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher-order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining the higher-order statistical information obtained using the PF. Methods such as Principal Component Analysis (PCA) are based on utilizing up to second-order statistics, and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios, a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
Power probability density function control and performance assessment of a nuclear research reactor
International Nuclear Information System (INIS)
Abharian, Amir Esmaeili; Fadaei, Amir Hosein
2014-01-01
Highlights: • In this paper, the performance assessment of a static PDF control system is discussed. • The reactor PDF model is set up based on B-spline functions. • The neutronics and thermal-hydraulics equations are solved concurrently by a reformed Hansen's method. • A principle of performance assessment is put forward for PDF control of the nuclear reactor. - Abstract: One of the main issues in controlling a system is to keep track of the condition of the system function. The performance condition of the system should be inspected continuously to keep the system in a reliable working state. In this study, the nuclear reactor is considered as a complicated system, and a principle of performance assessment is used to analyze the performance of the power probability density function (PDF) control of a nuclear research reactor. First, the model of the power PDF is set up; then the controller is designed to make the power PDF track a given shape, which makes the reactor a closed-loop system. The operating data of the closed-loop reactor are used to assess the control performance against the performance assessment criteria. The modeling, controller design and performance assessment of the power PDF are all applied to the control of Tehran Research Reactor (TRR) power in a nuclear process. In this paper, the performance assessment of the static PDF control system is discussed, the efficacy and efficiency of the proposed method are investigated, and finally its reliability is proven.
Diverging probability-density functions for flat-top solitary waves
Peleg, Avner; Chung, Yeojin; Dohnal, Tomáš; Nguyen, Quan M.
2009-08-01
We investigate the statistics of flat-top solitary wave parameters in the presence of weak multiplicative dissipative disorder. We consider first propagation of solitary waves of the cubic-quintic nonlinear Schrödinger equation (CQNLSE) in the presence of disorder in the cubic nonlinear gain. We show by a perturbative analytic calculation and by Monte Carlo simulations that the probability-density function (PDF) of the amplitude η exhibits loglognormal divergence near the maximum possible amplitude ηm , a behavior that is similar to the one observed earlier for disorder in the linear gain [A. Peleg , Phys. Rev. E 72, 027203 (2005)]. We relate the loglognormal divergence of the amplitude PDF to the superexponential approach of η to ηm in the corresponding deterministic model with linear/nonlinear gain. Furthermore, for solitary waves of the derivative CQNLSE with weak disorder in the linear gain both the amplitude and the group velocity β become random. We therefore study analytically and by Monte Carlo simulations the PDF of the parameter p , where p=η/(1-ɛsβ/2) and ɛs is the self-steepening coefficient. Our analytic calculations and numerical simulations show that the PDF of p is loglognormally divergent near the maximum p value.
A measurement-driven adaptive probability hypothesis density filter for multitarget tracking
Directory of Open Access Journals (Sweden)
Si Weijian
2015-12-01
This paper studies the dynamic estimation problem for multitarget tracking. A novel gating strategy that is based on the measurement likelihood of the target state space is proposed to improve the overall effectiveness of the probability hypothesis density (PHD) filter. Firstly, a measurement-driven mechanism based on this gating technique is designed to classify the measurements. In this mechanism, only the measurements for the existing targets are considered in the update step of the existing targets while the measurements of newborn targets are used for exploring newborn targets. Secondly, the gating strategy enables the development of a heuristic state estimation algorithm when sequential Monte Carlo (SMC) implementation of the PHD filter is investigated, where the measurements are used to drive the particle clustering within the space gate. The resulting PHD filter can achieve a more robust and accurate estimation of the existing targets by reducing the interference from clutter. Moreover, the target birth intensity can be adaptive to detect newborn targets, which is in accordance with the birth measurements. Simulation results demonstrate the computational efficiency and tracking performance of the proposed algorithm.
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
A joint probability density function of wind speed and direction for wind energy analysis
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Bueno, Celia
2008-01-01
A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a Normal-Weibull mixture distribution singly truncated from below. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R². The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions; (b) takes into account the frequency of null winds; (c) represents the wind direction regimes in zones with several modes or prevailing wind directions; (d) takes into account the correlation between wind speed and direction. It can therefore be used in several tasks involved in the evaluation of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised wind energy literature.
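A finite mixture of von Mises densities for wind direction, as used for the angular marginal, can be evaluated as follows; the weights and parameters below are purely illustrative:

```python
import math

def bessel_i0(k, terms=30):
    """Modified Bessel function I0 via its power series."""
    total, term = 1.0, 1.0
    for m in range(1, terms):
        term *= (k * k / 4.0) / (m * m)
        total += term
    return total

def von_mises_pdf(theta, mu, kappa):
    """von Mises density on the circle with mean direction mu
    and concentration kappa."""
    return math.exp(kappa * math.cos(theta - mu)) / (2 * math.pi * bessel_i0(kappa))

def direction_mixture_pdf(theta, components):
    """Finite mixture of von Mises densities; components is a list of
    (weight, mu, kappa) with weights summing to 1."""
    return sum(w * von_mises_pdf(theta, mu, kappa) for w, mu, kappa in components)

# Two prevailing wind directions (weights and parameters are illustrative):
trade_winds = [(0.7, math.radians(45), 4.0), (0.3, math.radians(200), 2.0)]
d = direction_mixture_pdf
print(d(math.radians(45), trade_winds) > d(math.radians(120), trade_winds))  # → True
```

Each component captures one prevailing direction, which is how the model represents direction regimes with several modes; the full model couples this angular marginal to the wind speed marginal through an angular-linear construction.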
Tenkès, Lucille-Marie; Hollerbach, Rainer; Kim, Eun-jin
2017-12-01
A probabilistic description is essential for understanding growth processes in non-stationary states. In this paper, we compute time-dependent probability density functions (PDFs) in order to investigate stochastic logistic and Gompertz models, which are two of the most popular growth models. We consider different types of short-correlated multiplicative and additive noise sources and compare the time-dependent PDFs in the two models, elucidating the effects of the additive and multiplicative noises on the form of PDFs. We demonstrate an interesting transition from a unimodal to a bimodal PDF as the multiplicative noise increases for a fixed value of the additive noise. A much weaker (leaky) attractor in the Gompertz model leads to a significant (singular) growth of the population of a very small size. We point out the limitation of using stationary PDFs, mean value and variance in understanding statistical properties of the growth in non-stationary states, highlighting the importance of time-dependent PDFs. We further compare these two models from the perspective of information change that occurs during the growth process. Specifically, we define an infinitesimal distance at any time by comparing two PDFs at times infinitesimally apart and sum these distances in time. The total distance along the trajectory quantifies the total number of different states that the system undergoes in time, and is called the information length. We show that the time-evolution of the two models become more similar when measured in units of the information length and point out the merit of using the information length in unifying and understanding the dynamic evolution of different growth processes.
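A time-dependent view of the stochastic logistic model can be obtained by simulating many Euler-Maruyama sample paths and inspecting the ensemble at a given time; the parameter values below are illustrative, not those of the paper:

```python
import math
import random

def simulate_logistic(gamma=1.0, eps=0.5, d_noise=0.1, x0=0.1,
                      dt=1e-3, steps=5000, rng=None):
    """Euler-Maruyama path of a stochastic logistic model
    dx = (gamma*x - eps*x**2) dt + sqrt(2*d_noise)*x dW
    with multiplicative short-correlated (white) noise; returns the
    state at t = steps*dt. Parameter values are illustrative only."""
    rng = rng or random.Random()
    x = x0
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        x += (gamma * x - eps * x * x) * dt + math.sqrt(2.0 * d_noise) * x * dw
        x = max(x, 1e-12)  # keep the population non-negative
    return x

rng = random.Random(3)
final = [simulate_logistic(rng=rng) for _ in range(200)]
# By t = 5 the ensemble mean sits near the deterministic carrying
# capacity gamma/eps = 2 (shifted slightly downward by the noise):
mean_final = sum(final) / len(final)
print(1.0 < mean_final < 3.0)
```

Histogramming `final` at successive times yields exactly the time-dependent PDFs the paper studies; varying `d_noise` relative to an added additive term reproduces the unimodal-to-bimodal transition it describes.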
Diop, C. A.
2009-09-01
In many studies discussing the statistical characterization of the rain rate, most of the authors have found that the probability density function (PDF) of the rain rate follows a lognormal law. However, a more careful analysis of the PDF of the radar reflectivity Z suggests that it is a question of a mixture of distributions. The purpose of this work is to identify precipitation types that can coexist in a continental thunderstorm from the PDF of the radar reflectivity. The data used come from the NEXRAD S-band radar network, notably the level II database. From reflectivity ranging from -10 dBZ to 70 dBZ, we compute the PDF. We find that the total distribution is a mixture of several populations adjusted by several Gaussian distributions with known parameters: the mean, standard deviation and proportion of each one in the mixture. Since rainfall is a sum of its parts and is composed of hydrometeors of various sizes, these statistical findings are in accordance with the physical properties of the precipitation. Each component of the mixed distribution is then tentatively attributed to a physical character of the precipitation. The first distribution, with low reflectivities, is assumed to represent the background of small-sized particles. The second component, centred around medium Z, corresponds to stratiform rain; the third population, located at larger Z, is due to heavy rain. Finally, a fourth population is present for hail. *Observatoire Midi-Pyrénées, Laboratoire d'Aérologie, CNRS/Université Paul Sabatier, Toulouse, France **Université des Sciences et Technologies de Lille, UFR de Physique Fondamentale, Laboratoire d'Optique Atmosphérique, Lille, France
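The decomposition described, a reflectivity distribution fitted as a sum of Gaussian populations, can be sketched with a small EM fit. The component means, spreads and proportions below are synthetic stand-ins, not NEXRAD values.

```python
import numpy as np

def em_gmm_1d(z, k, n_iter=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture by EM; returns (weights, means, stds)."""
    z = np.asarray(z, float)
    w = np.full(k, 1.0 / k)
    mu = np.quantile(z, np.linspace(0.1, 0.9, k))   # spread initial means over data
    sd = np.full(k, z.std())
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        d = w * np.exp(-(z[:, None] - mu)**2 / (2 * sd**2)) / (sd * np.sqrt(2 * np.pi))
        r = d / d.sum(axis=1, keepdims=True)
        # M-step: re-estimate proportions, means, standard deviations
        nk = r.sum(axis=0)
        w = nk / len(z)
        mu = (r * z[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (z[:, None] - mu)**2).sum(axis=0) / nk)
    return w, mu, sd

# Synthetic "reflectivity" sample with three populations (dBZ)
rng = np.random.default_rng(1)
z = np.concatenate([rng.normal(0, 4, 4000),     # background small particles
                    rng.normal(25, 5, 4000),    # stratiform rain
                    rng.normal(50, 4, 2000)])   # heavy rain
w, mu, sd = em_gmm_1d(z, k=3)
print(np.round(np.sort(mu), 0))
```

With well-separated populations the recovered means land close to the generating values, mirroring how the physical precipitation types are read off the fitted components.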
Liu, Z.; Kar, J.; Zeng, S.; Tackett, J. L.; Vaughan, M.; Trepte, C. R.; Omar, A. H.; Hu, Y.; Winker, D. M.
2017-12-01
In the CALIPSO retrieval algorithm, detection of layers in the lidar measurements is followed by their classification as "cloud" or "aerosol" using 5-dimensional probability density functions (PDFs). The five dimensions are the mean attenuated backscatter at 532 nm, the layer-integrated total attenuated color ratio, the mid-layer altitude, the integrated volume depolarization ratio and latitude. The new version 4 (V4) level 2 (L2) data products, released in November 2016, are the first major revision to the L2 product suite since May 2010. Significant calibration changes in the V4 level 1 data necessitated substantial revisions to the V4 L2 CAD algorithm. Accordingly, a new set of PDFs was generated to derive the V4 L2 data products. The V4 CAD algorithm is now applied to layers detected in the stratosphere, where volcanic layers and occasional cloud and smoke layers are observed. Previously, these layers were designated as "stratospheric" and not further classified. The V4 CAD algorithm is also applied to all layers detected at single-shot (333 m) resolution. In prior data releases, single-shot detections were uniformly classified as clouds. The CAD PDFs used in the earlier releases were generated using a full year (2008) of CALIPSO measurements. Because the CAD algorithm was not applied to stratospheric features, the properties of these layers were not incorporated into the PDFs. When building the V4 PDFs, the 2008 data were augmented with additional data from June 2011, and all stratospheric features were included. The Nabro and Puyehue-Cordon volcanoes erupted in June 2011, and volcanic aerosol layers were observed in the upper troposphere and lower stratosphere in both the northern and southern hemispheres. The June 2011 data thus provide the stratospheric aerosol properties needed for comprehensive PDF generation. In contrast to earlier versions of the PDFs, which were generated based solely on observed distributions, construction of the V4 PDFs considered the
Rispens, Judith; Baker, Anne; Duinmeijer, Iris
2015-01-01
Purpose: The effects of neighborhood density (ND) and lexical frequency on word recognition and the effects of phonotactic probability (PP) on nonword repetition (NWR) were examined to gain insight into processing at the lexical and sublexical levels in typically developing (TD) children and children with developmental language problems. Method:…
van der Kleij, S.W.; Rispens, J.E.; Scheper, A.R.
2016-01-01
The aim of this study was to examine the influence of phonotactic probability (PP) and neighbourhood density (ND) on pseudoword learning in 17 Dutch-speaking typically developing children (mean age 7;2). They were familiarized with 16 one-syllable pseudowords varying in PP (high vs low) and ND (high
Conjunctivitis presumably due to Acanthamoeba
Ruthes, Ana Cristina de Carvalho; Wahab, Sâmia; Wahab, Najua; Moreira, Hamilton; Moreira, Luciane
2004-01-01
PURPOSE: To describe four cases of presumed Acanthamoeba conjunctivitis, covering the diagnosis, the signs and symptoms considered, and the treatment given. METHODS: Cases of presumed Acanthamoeba conjunctivitis diagnosed at the Hospital de Olhos do Paraná (HOP) between September 1998 and January 2002 were studied. All eyes studied underwent an investigation protocol that included a complete ophthalmologic examination, microbiology and culture of conjunctival secretions...
PDE-Foam - a probability-density estimation method using self-adapting phase-space binning
Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter
2009-01-01
Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space into a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multi-dimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...
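A greatly simplified sketch of the self-adapting idea: recursively split the most-populated hyper-rectangle at the median of its widest dimension. PDE-Foam proper instead minimizes the variance of the signal and background densities inside cells; this toy version only illustrates the cell-splitting machinery on synthetic points.

```python
import numpy as np

def adaptive_bins(points, n_cells):
    """Recursively split cells until n_cells remain (simplified PDE-Foam-style
    binning: split the most populated cell at the median of its widest axis)."""
    cells = [points]
    while len(cells) < n_cells:
        i = max(range(len(cells)), key=lambda j: len(cells[j]))
        cell = cells.pop(i)
        dim = np.argmax(cell.max(axis=0) - cell.min(axis=0))  # widest dimension
        cut = np.median(cell[:, dim])
        left, right = cell[cell[:, dim] <= cut], cell[cell[:, dim] > cut]
        cells += [left, right]
    return cells

rng = np.random.default_rng(0)
pts = rng.normal(size=(1000, 2))           # synthetic 2-D "event sample"
cells = adaptive_bins(pts, 8)
print(len(cells), sum(len(c) for c in cells))  # → 8 1000
```

Each resulting cell would then carry a density estimate (count / cell volume); dense regions end up covered by smaller cells, which is the adaptive behaviour the paper exploits.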
International Nuclear Information System (INIS)
Zhang Yumin; Lum, Kai-Yew; Wang Qingguo
2009-01-01
In this paper, an H-infinity fault detection and diagnosis (FDD) scheme, using output probability density estimation, is presented for a class of discrete nonlinear systems with faults. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process and its square-root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model that includes nonlinearities and uncertainties. A weighted mean value is defined as an integral of the square-root PDF along the space direction, which leads to a function of time only that can be used to construct a residual signal. Thus, the classical nonlinear filter approach can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is further investigated to estimate the fault. A simulation example is given to demonstrate the effectiveness of the proposed approaches.
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
Energy Technology Data Exchange (ETDEWEB)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
2018-01-01
We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
Complications of presumed ocular tuberculosis.
Hamade, Issam H; Tabbara, Khalid F
2010-12-01
To determine the effect of steroid treatment on visual outcome and ocular complications in patients with presumed ocular tuberculosis. Retrospective review of patients with presumptive ocular tuberculosis. The clinical diagnosis was made based on ocular findings, positive purified protein derivative (PPD) testing of more than 15 mm induration, exclusion of other causes of uveitis and positive ocular response to anti-tuberculous therapy (ATT) within 4 weeks. Group 1 included patients who had received oral prednisone or subtenon injection of triamcinolone acetonide prior to ATT. Group 2 included patients who did not receive corticosteroid therapy prior to administration of ATT. Among 500 consecutive new cases of uveitis encountered in 1997-2007 there were 49 (10%) patients with presumed ocular tuberculosis. These comprised 28 (57%) male and 21 (43%) female patients with a mean age of 45 years (range 12-76 years). Four (20%) patients in group 1 had initial visual acuity of 20/40 or better, in comparison to eight (28%) patients in group 2. At 1-year follow-up, six (30%) patients in group 1 had a visual acuity of 20/40 or better compared with 20 (69%) patients in group 2 (p = 0.007). Of 20 eyes (26%) in group 1 that had visual acuity of < 20/50 at 1-year follow up, 14 (70%) eyes developed severe chorioretinal lesion (p = 0.019). Early administration of corticosteroids without anti-tuberculous therapy in presumed ocular tuberculosis may lead to poor visual outcome compared with patients who did not receive corticosteroids prior to presentation. Furthermore, the severity of chorioretinitis lesion in the group of patients given corticosteroid prior to ATT may account for the poor visual outcome. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.
Probability density functions for the variable solar wind near the solar cycle minimum
Vörös, Z.; Leitner, M.; Narita, Y.; Consolini, G.; Kovács, P.; Tóth, A.; Lichtenberger, J.
2015-08-01
Unconditional and conditional statistics are used for studying the histograms of magnetic field multiscale fluctuations in the solar wind near the solar cycle minimum in 2008. The unconditional statistics involves the magnetic data during the whole year in 2008. The conditional statistics involves the magnetic field time series split into concatenated subsets of data according to a threshold in dynamic pressure. The threshold separates fast-stream leading edge compressional and trailing edge uncompressional fluctuations. The histograms obtained from these data sets are associated with both multiscale (B) and small-scale (δB) magnetic fluctuations, the latter corresponding to time-delayed differences. It is shown here that, by keeping flexibility but avoiding the unnecessary redundancy in modeling, the histograms can be effectively described by a limited set of theoretical probability distribution functions (PDFs), such as the normal, lognormal, kappa, and log-kappa functions. In a statistical sense the model PDFs correspond to additive and multiplicative processes exhibiting correlations. It is demonstrated here that the skewed small-scale histograms inherent in turbulent cascades are better described by the skewed log-kappa than by the symmetric kappa model. Nevertheless, the observed skewness is rather small, resulting in potential difficulties of estimation of the third-order moments. This paper also investigates the dependence of the statistical convergence of PDF model parameters, goodness of fit, and skewness on the data sample size. It is shown that the minimum lengths of data intervals required for the robust estimation of parameters are scale-, process-, and model-dependent.
International Nuclear Information System (INIS)
Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.
2010-01-01
Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
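The tabulated quantities above (0.16/0.50/0.84 quantiles, lognormal μ and σ, and an Anderson-Darling check of the lognormal approximation) can be recovered from Monte Carlo rate samples as in the Python sketch below. The sample here is drawn from a lognormal by construction, purely to exercise the estimators; real samples would come from the Monte Carlo procedure of Paper I.

```python
import numpy as np
from scipy import stats

# Placeholder Monte Carlo reaction-rate sample (genuinely lognormal by construction)
rng = np.random.default_rng(42)
rates = rng.lognormal(mean=-46.0, sigma=0.35, size=5000)

# Lognormal parameters as tabulated: mu, sigma of log(rate)
lograte = np.log(rates)
mu, sigma = lograte.mean(), lograte.std(ddof=1)

# Low/median/high rates = 0.16, 0.50, 0.84 quantiles of the rate distribution
low, median, high = np.quantile(rates, [0.16, 0.50, 0.84])

# Anderson-Darling statistic on log(rate): small values support the
# lognormal approximation of the rate PDF
ad = stats.anderson(lograte, dist='norm').statistic

print(round(mu, 1), round(sigma, 2))
```

For a true lognormal, the median rate equals exp(μ) and low/high are roughly exp(μ ∓ σ), which is why the (μ, σ) pair suffices for stellar-model codes.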
WASP-17b: AN ULTRA-LOW DENSITY PLANET IN A PROBABLE RETROGRADE ORBIT
International Nuclear Information System (INIS)
Anderson, D. R.; Hellier, C.; Smalley, B.; Maxted, P. F. L.; Bentley, S. J.; Gillon, M.; Triaud, A. H. M. J.; Queloz, D.; Mayor, M.; Pepe, F.; Segransan, D.; Udry, S.; Hebb, L.; Cameron, A. Collier; Enoch, B.; Horne, K.; Parley, N. R.; West, R. G.; Lister, T. A.; Pollacco, D.
2010-01-01
We report the discovery of the transiting giant planet WASP-17b, the least-dense planet currently known. It is 1.6 Saturn masses, but 1.5-2 Jupiter radii, giving a density of 6%-14% that of Jupiter. WASP-17b is in a 3.7 day orbit around a sub-solar metallicity, V = 11.6, F6 star. Preliminary detection of the Rossiter-McLaughlin effect suggests that WASP-17b is in a retrograde orbit (λ ∼ -150°), indicative of a violent history involving planet-planet or star-planet scattering. WASP-17b's bloated radius could be due to tidal heating resulting from recent or ongoing tidal circularization of an eccentric orbit, such as the highly eccentric orbits that typically result from scattering interactions. It will thus be important to determine more precisely the current orbital eccentricity by further high-precision radial velocity measurements or by timing the secondary eclipse, both to reduce the uncertainty on the planet's radius and to test tidal-heating models. Owing to its low surface gravity, WASP-17b's atmosphere has the largest scale height of any known planet, making it a good target for transmission spectroscopy.
Liang, Yingjie; Chen, Wen
2018-04-01
The mean squared displacement (MSD) of the traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model is employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotical waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, whose special case includes the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function, and is observed to increase even more slowly than that of the logarithmic function model. The occurrence of very long waiting time in the case of the inverse Mittag-Leffler function has the largest probability compared with the power law model and the logarithmic function model. The Monte Carlo simulations of one dimensional sample path of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting the general ultraslow random motion.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
International Nuclear Information System (INIS)
Varella, Marcio Teixeira do Nascimento
2001-12-01
We have calculated annihilation probability densities (APD) for positron collisions against the He atom and the H₂ molecule. It was found that direct annihilation prevails at low energies, while annihilation following virtual positronium (Ps) formation is the dominant mechanism at higher energies. In room-temperature collisions (10⁻² eV) the APD spread over a considerable extension, being quite similar to the electronic densities of the targets. The capture of the positron in an electronic Feshbach resonance strongly enhanced the annihilation rate in e⁺-H₂ collisions. We also discuss strategies to improve the calculation of the annihilation parameter (Z_eff), after debugging the computational codes of the Schwinger Multichannel Method (SMC). Finally, we consider the inclusion of the Ps formation channel in the SMC and show that effective configurations (pseudo-eigenstates of the Hamiltonian of the collision) are able to significantly reduce the computational effort in positron scattering calculations. Cross sections for electron scattering by polyatomic molecules were obtained in three different approximations: static-exchange (SE); static-exchange-plus-polarization (SEP); and multichannel coupling. The calculations for polar targets were improved through the rotational resolution of scattering amplitudes, in which the SMC was combined with the first Born approximation (FBA). In general, elastic cross sections (SE and SEP approximations) showed good agreement with available experimental data for several targets. Multichannel calculations for e⁻-H₂O scattering, on the other hand, presented spurious structures at the electronic excitation thresholds (author)
Storkel, Holly L; Hoover, Jill R
2011-06-01
The goal of this study was to examine the influence of part-word phonotactic probability/neighborhood density on word learning by preschool children with normal vocabularies that varied in size. Ninety-eight children (ages 2;11-6;0) were taught consonant-vowel-consonant (CVC) nonwords orthogonally varying in the probability/density of the CV (i.e. body) and VC (i.e. rhyme). Learning was measured via picture naming. Children with the lowest expressive vocabulary scores showed no effect of either CV or VC probability/density, although floor effects could not be ruled out. In contrast, children with low or high expressive vocabulary scores demonstrated sensitivity to part-word probability/density with the nature of the effect varying by group. Children with the highest expressive vocabulary scores displayed yet a third pattern of part-word probability/density effects. Taken together, word learning by preschool children was influenced by part-word probability/density but the nature of this influence appeared to depend on the size of the lexicon.
de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander
2017-11-01
To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
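Pareto (rank-order cumulative) analysis of diagnosis counts can be sketched in a few lines; the counts below are hypothetical, chosen only to loosely echo the reported clinic percentages, not the study's actual tabulation.

```python
import numpy as np

# Hypothetical clinic diagnosis counts by category (illustrative only)
counts = {"dysphonia": 1321, "UVFP": 290, "cough": 226, "stenosis": 180,
          "papilloma": 95, "other": 1111}

# Pareto analysis: sort categories by frequency and accumulate their share
items = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(counts.values())
cum = np.cumsum([v for _, v in items]) / total
for (name, v), c in zip(items, cum):
    print(f"{name:10s} {v:5d} {c:6.1%}")
```

The resulting table shows how quickly the cumulative share approaches 100%, i.e. how few categories account for most of the clinical volume, which is the resource-allocation argument the abstract makes.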
Energy Technology Data Exchange (ETDEWEB)
Kikuchi, Ryota; Misaka, Takashi; Obayashi, Shigeru, E-mail: rkikuchi@edge.ifs.tohoku.ac.jp [Institute of Fluid Science, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai, Miyagi 980-8577 (Japan)
2015-10-15
An integrated method of a proper orthogonal decomposition based reduced-order model (ROM) and data assimilation is proposed for the real-time prediction of an unsteady flow field. In this paper, a particle filter (PF) and an ensemble Kalman filter (EnKF) are compared for data assimilation and the difference in the predicted flow fields is evaluated, focusing on the probability density function (PDF) of the model variables. The proposed method is demonstrated using identical twin experiments of an unsteady flow field around a circular cylinder at a Reynolds number of 1000. The PF and EnKF are employed to estimate temporal coefficients of the ROM based on the observed velocity components in the wake of the circular cylinder. The prediction accuracy of ROM-PF is significantly better than that of ROM-EnKF due to the flexibility of the PF for representing a PDF compared to the EnKF. Furthermore, the proposed method reproduces the unsteady flow field several orders of magnitude faster than the reference numerical simulation based on the Navier–Stokes equations. (paper)
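A minimal bootstrap particle filter conveys the PF half of the comparison. The twin experiment below tracks a damped oscillation through a simple random-walk state model rather than actual ROM coefficients, and all noise levels are invented for illustration.

```python
import numpy as np

def bootstrap_pf(obs, n_part, f, h, q, r, seed=0):
    """Minimal bootstrap particle filter: propagate, weight by the observation
    likelihood, resample. Stands in for the PF used on ROM coefficients."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_part)                   # initial particle cloud
    means = []
    for y in obs:
        x = f(x) + rng.normal(0.0, q, n_part)          # propagate particles
        w = np.exp(-(y - h(x))**2 / (2 * r**2))        # Gaussian likelihood
        w /= w.sum()
        x = rng.choice(x, size=n_part, p=w)            # resample by weight
        means.append(x.mean())
    return np.array(means)

# Twin experiment: noisy observations of a damped oscillation
t = np.linspace(0, 10, 200)
truth = np.exp(-0.1 * t) * np.cos(t)
rng = np.random.default_rng(1)
obs = truth + rng.normal(0, 0.1, t.size)
est = bootstrap_pf(obs, 2000, f=lambda x: x, h=lambda x: x, q=0.1, r=0.1)
rmse = np.sqrt(np.mean((est - truth)**2))
print(round(rmse, 3))
```

Unlike an EnKF, nothing here assumes Gaussian posteriors: the particle cloud can represent arbitrary PDF shapes, which is the flexibility credited for ROM-PF's better accuracy.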
Afsar, Ozgur; Tirnakli, Ugur
2010-10-01
We investigate the probability density of rescaled sums of iterates of the sine-circle map within the quasiperiodic route to chaos. When the dynamical system is strongly mixing (i.e., ergodic), the standard central limit theorem (CLT) is expected to be valid, but at the edge of chaos where iterates have strong correlations, the standard CLT is not necessarily valid anymore. We discuss here the main characteristics of the probability densities for the sums of iterates of deterministic dynamical systems which exhibit the quasiperiodic route to chaos. At the golden-mean onset of chaos for the sine-circle map, we numerically verify that the probability density appears to converge to a q-Gaussian with q < 1 as the golden mean value is approached.
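The rescaled-sum construction can be reproduced with a few lines of numpy. Note the assumptions: Ω is set to the golden mean itself for simplicity, whereas the study tunes Ω so that the winding number equals the golden mean at K = 1; the snippet only demonstrates how the centered, rescaled sums whose histogram is studied are formed.

```python
import numpy as np

def sine_circle_sums(K, Omega, n_iter, n_init, seed=0):
    """Centered sums y = Σ_i (θ_i - <θ>) of sine-circle map iterates,
    over an ensemble of random initial conditions."""
    rng = np.random.default_rng(seed)
    theta = rng.random(n_init)
    total = np.zeros(n_init)
    grand = 0.0                                   # accumulates n_iter * <θ>
    for _ in range(n_iter):
        theta = (theta + Omega - K / (2 * np.pi) * np.sin(2 * np.pi * theta)) % 1.0
        total += theta
        grand += theta.mean()
    return total - grand

Omega = (np.sqrt(5) - 1) / 2                      # golden mean (illustrative choice)
y = sine_circle_sums(K=1.0, Omega=Omega, n_iter=2000, n_init=5000)
y_rescaled = y / y.std()                          # histogram this to study the PDF
print(y_rescaled.shape, round(float(y_rescaled.std()), 1))
```

A histogram of `y_rescaled` is then compared against Gaussian and q-Gaussian shapes; at the properly tuned critical point the paper reports convergence toward a q-Gaussian with q < 1.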
Energy Technology Data Exchange (ETDEWEB)
Rullaud, M.
2004-06-01
A new modelization of turbulent combustion is proposed with detailed chemistry and probability density functions (PDFs). The objective is to capture temperature and species concentrations, mainly the CO. The PCM-FTC model, Presumed Conditional Moment - Flame Tabulated Chemistry, is based on the tabulation of laminar premixed and diffusion flames to capture partial pre-mixing present in aeronautical engines. The presumed PDFs is introduced to predict averaged values. The tabulation method is based on the analysis of the chemical structure of laminar premixed and diffusion flames. Hypothesis are presented, tested and validated with Sandia experimental data jet flames. Then, the model is introduced in a turbulent flow simulation software. Three configurations are retained to quantify the level of prediction of this formulation: the D and F-Flames of Sandia and lifted jet flames of methane/air of Stanford. A good agreement is observed between experiments and simulations. The validity of this method is then demonstrated. (author)
Cannon, Alex J.
2018-01-01
Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another, the N-dimensional probability density function transform, is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin.
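A univariate quantile-mapping step, the building block that MBCn generalizes to many dimensions, can be sketched as follows; the gamma-distributed "observations" and "model" samples are synthetic stand-ins for a climate variable such as precipitation.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_proj):
    """Univariate quantile mapping: push each projected value through the
    model-historical empirical CDF, then through the inverse observed CDF.
    MBCn iterates this idea over random rotations of the variables to
    correct the full multivariate distribution."""
    q = np.interp(model_proj,
                  np.sort(model_hist),
                  np.linspace(0.0, 1.0, len(model_hist)))   # model CDF
    return np.interp(q,
                     np.linspace(0.0, 1.0, len(obs_hist)),
                     np.sort(obs_hist))                     # inverse obs CDF

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, 10000)       # "observed" variable
model = rng.gamma(2.0, 4.0, 10000)     # biased model version of the same variable
corrected = quantile_map(model, obs, model)
print(round(float(obs.mean()), 1), round(float(model.mean()), 1),
      round(float(corrected.mean()), 1))
```

After correction, the model sample's quantiles (and hence its mean) match the observed distribution, while the rank ordering of the model values is preserved.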
Chevalier, M.; Cheddadi, R.; Chase, B. M.
2014-11-01
Several methods currently exist to quantitatively reconstruct palaeoclimatic variables from fossil botanical data. Of these, probability density function (PDF)-based methods have proven valuable as they can be applied to a wide range of plant assemblages. Most commonly applied to fossil pollen data, their performance, however, can be limited by the taxonomic resolution of the pollen data, as many species may belong to a given pollen type. Consequently, the climate information associated with different species cannot always be precisely identified, resulting in less-accurate reconstructions. This can become particularly problematic in regions of high biodiversity. In this paper, we propose a novel PDF-based method that takes into account the different climatic requirements of each species constituting the broader pollen type. PDFs are fitted in two successive steps, with parametric PDFs fitted first for each species and then a combination of those individual species PDFs into a broader single PDF to represent the pollen type as a unit. A climate value for the pollen assemblage is estimated from the likelihood function obtained after the multiplication of the pollen-type PDFs, with each being weighted according to its pollen percentage. To test its performance, we have applied the method to southern Africa as a regional case study and reconstructed a suite of climatic variables (e.g. winter and summer temperature and precipitation, mean annual aridity, rainfall seasonality). The reconstructions are shown to be accurate for both temperature and precipitation. Predictable exceptions were areas that experience conditions at the extremes of the regional climatic spectra. Importantly, the accuracy of the reconstructed values is independent of the vegetation type where the method is applied or the number of species used. The method used in this study is publicly available in a software package entitled CREST (Climate REconstruction SofTware) and will provide the
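The weighted multiplication of pollen-type PDFs described above can be sketched directly. The Gaussian climate responses and pollen percentages below are hypothetical, not CREST's fitted species/type PDFs, and the single variable reconstructed here stands in for, e.g., mean annual temperature.

```python
import numpy as np

def reconstruct_climate(temp_grid, type_pdfs, percents):
    """Likelihood-based reconstruction: multiply each pollen type's climate PDF
    raised to its pollen-percentage weight, then take the most likely value."""
    log_lik = np.zeros_like(temp_grid)
    for pdf, pct in zip(type_pdfs, percents):
        log_lik += pct * np.log(np.maximum(pdf, 1e-300))  # weighted product in log space
    return temp_grid[np.argmax(log_lik)]

# Hypothetical Gaussian climate responses of three pollen types
temp = np.linspace(0.0, 30.0, 3001)
gauss = lambda m, s: np.exp(-(temp - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
pdfs = [gauss(12, 3), gauss(16, 4), gauss(20, 5)]
percents = [0.5, 0.3, 0.2]    # pollen percentages used as weights
best = reconstruct_climate(temp, pdfs, percents)
print(round(float(best), 1))  # → 13.7
```

For Gaussian type PDFs the optimum is a precision-weighted mean of the type optima, so types with narrow climate ranges pull the reconstruction hardest, which is the intended behaviour of the method.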
Gray, Shelley; Pittman, Andrea; Weinhold, Juliet
2014-06-01
In this study, the authors assessed the effects of phonotactic probability and neighborhood density on word-learning configuration by preschoolers with specific language impairment (SLI) and typical language development (TD). One hundred thirty-one children participated: 48 with SLI, 44 with TD matched on age and gender, and 39 with TD matched on vocabulary and gender. Referent identification and naming were assessed in a computer-based learning context. For referent identification, preschoolers with TD benefited from high phonotactic probability, and the younger group also benefited from low neighborhood density. In contrast, the SLI group benefited only from high neighborhood density. For naming, older preschoolers with TD benefited most from low-density words, younger preschoolers with TD benefited most from words with high phonotactic probability, and the SLI group showed no advantage. Phonotactic probability and neighborhood density had different effects on each group that may be related to children's ability to store well-specified word forms and to the size of their extant lexicon. The authors argue that cross-study comparisons of word learning are needed; therefore, researchers should describe word, referent, and learner characteristics and the learning context and should situate their studies in a triggering → configuration + engagement model of word learning.
Tsai, Keng-Chang; Jian, Jhih-Wei; Yang, Ei-Wen; Hsu, Po-Chiang; Peng, Hung-Pin; Chen, Ching-Tai; Chen, Jun-Bo; Chang, Jeng-Yih; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Non-covalent protein-carbohydrate interactions mediate molecular targeting in many biological processes. Prediction of non-covalent carbohydrate binding sites on protein surfaces not only provides insights into the functions of the query proteins; information on key carbohydrate-binding residues could suggest site-directed mutagenesis experiments, design therapeutics targeting carbohydrate-binding proteins, and provide guidance in engineering protein-carbohydrate interactions. In this work, we show that non-covalent carbohydrate binding sites on protein surfaces can be predicted with relatively high accuracy when the query protein structures are known. The prediction capabilities were based on a novel encoding scheme of the three-dimensional probability density maps describing the distributions of 36 non-covalent interacting atom types around protein surfaces. One machine learning model was trained for each of the 30 protein atom types. The machine learning algorithms predicted tentative carbohydrate binding sites on query proteins by recognizing the characteristic interacting atom distribution patterns specific for carbohydrate binding sites from known protein structures. The prediction results for all protein atom types were integrated into surface patches as tentative carbohydrate binding sites based on normalized prediction confidence level. The prediction capabilities of the predictors were benchmarked by a 10-fold cross validation on 497 non-redundant proteins with known carbohydrate binding sites. The predictors were further tested on an independent test set with 108 proteins. The residue-based Matthews correlation coefficient (MCC) for the independent test was 0.45, with prediction precision and sensitivity (or recall) of 0.45 and 0.49 respectively. In addition, 111 unbound carbohydrate-binding protein structures for which the structures were determined in the absence of the carbohydrate ligands were predicted with the trained predictors. The overall
Keller, B U; Montal, M S; Hartshorne, R P; Montal, M
1990-01-01
Two-dimensional probability density analysis of single-channel current recordings was applied to two purified channel proteins reconstituted in planar lipid bilayers: Torpedo acetylcholine receptors and voltage-sensitive sodium channels from rat brain. The information contained in the dynamic history of the gating process, i.e., the time sequence of opening and closing events, was extracted from two-dimensional distributions of transitions between identifiable states. This approach allows one to identify kinetic models consistent with the observables. Gating of acetylcholine receptors expresses "memory" of the transition history: the receptor has two channel open (O) states, and the residence time in each of them depends strongly on both the preceding open time and the intervening closed interval. Correspondingly, the residence time in the closed (C) states depends on both the preceding open time and the preceding closed time. This result confirms the scheme that considers at least two transition pathways between the open and closed states, and it extends the model by showing that the short-lived open state is entered primarily from long-lived closed states, while the long-lived open state is accessed mainly through short-lived closed states. Since ligand binding to the acetylcholine-binding sites is a reaction with channel closed states, we infer that the longest closed state (approximately 19 ms) is unliganded, the intermediate closed state (approximately 2 ms) is singly liganded and makes transitions to the short open state (approximately 0.5 ms), and the shortest closed state (approximately 0.4 ms) is doubly liganded and isomerizes to long open states (approximately 5 ms). This is the simplest interpretation consistent with the available data. In contrast, sodium channels modified with batrachotoxin to eliminate inactivation show no correlation in the sequence of channel opening and closing events, i.e., they have no memory of the transition history. This
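The core of the two-dimensional analysis is a joint histogram pairing each open dwell time with the adjacent closed interval. A sketch with numpy on synthetic dwell times; the kinetics below are invented, loosely echoing the time constants quoted in the abstract, and are not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic gating with "memory": short openings follow long closures,
# long openings follow short closures (illustrative time constants in ms)
short_open = rng.random(n) < 0.5
open_dwell = np.where(short_open,
                      rng.exponential(0.5, n),    # ~0.5 ms open
                      rng.exponential(5.0, n))    # ~5 ms open
closed_dwell = np.where(short_open,
                        rng.exponential(19.0, n), # long closure before short opening
                        rng.exponential(0.4, n))  # short closure before long opening

# Two-dimensional dwell-time distribution on logarithmic bins
bins = np.logspace(-2, 2, 30)
hist2d, xedges, yedges = np.histogram2d(open_dwell, closed_dwell,
                                        bins=[bins, bins], density=True)
```

In a memoryless scheme the joint histogram would factor into the product of the two one-dimensional dwell-time distributions; the anti-correlation built in above is exactly the kind of structure the 2-D analysis detects.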
Yan, Xu; Guo, Lixin; Cheng, Mingjian; Li, Jiangting; Huang, Qingqing; Sun, Ridong
2017-06-26
The probability densities of the orbital angular momentum (OAM) modes of an autofocusing Airy beam (AAB) carrying a power-exponent-phase vortex (PEPV), after passing through weak anisotropic non-Kolmogorov atmospheric turbulence, are theoretically formulated. It is found that the AAB carrying a PEPV is a weighted superposition of multiple OAM modes at differing positions within the beam cross-section, and that the mutual crosstalk among OAM modes compensates for the distortion of each individual mode, helping to boost the anti-jamming performance of the communication link. Based on numerical calculations, the roles of the wavelength, waist width, topological charge and power order of the PEPV in the variation of the probability density distributions of the OAM modes are explored. The analysis shows that a relatively small beam waist and a longer wavelength help separate the detection regions of the signal OAM mode and the crosstalk OAM modes. The probability density distribution of the signal OAM mode does not change appreciably with the topological charge, but it is greatly enhanced as the power order increases. Furthermore, the center position of the detection region of a crosstalk OAM mode is found to be determined jointly by the power order and the topological charge. The power order can therefore be introduced as an extra steering parameter to modulate the probability density distributions of the OAM modes. These results provide guidelines for the design of an optimal detector, with potential application in optical vortex communication systems.
Yu, N.; Delrieu, G.; Boudevillain, Brice; Hazenberg, P.; Uijlenhoet, R.
2014-01-01
This study offers a unified formulation of single- and multimoment normalizations of the raindrop size distribution (DSD), which have been proposed in the framework of scaling analyses in the literature. The key point is to consider a well-defined “general distribution” g(x) as the probability
Ballestra, Luca Vincenzo; Pacelli, Graziella; Radi, Davide
2016-12-01
We propose a numerical method to compute the first-passage probability density function in a time-changed Brownian model. In particular, we derive an integral representation of this density function in which the integrand functions must be obtained by solving a system of Volterra equations of the first kind. In addition, we develop an ad hoc numerical procedure to regularize and solve this system of integral equations. The proposed method is tested on three application problems of interest in mathematical finance, namely the calculation of the survival probability of an indebted firm, the pricing of a single-knock-out put option, and the pricing of a double-knock-out put option. The results reveal that the novel approach is very accurate and fast, and performs significantly better than the finite difference method.
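A useful sanity check for solvers of this kind is the case with no time change, where the first-passage density of drifted Brownian motion through a constant barrier is known in closed form (an inverse Gaussian density). The sketch below uses illustrative parameters and is not the paper's Volterra-based method:

```python
import math

def fp_density(t, b=1.0, mu=0.5, sigma=1.0):
    """Closed-form first-passage-time density of Brownian motion with
    drift mu through a level b > 0 (inverse Gaussian distribution)."""
    return (b / (sigma * math.sqrt(2.0 * math.pi * t ** 3))
            * math.exp(-((b - mu * t) ** 2) / (2.0 * sigma ** 2 * t)))

# With positive drift the barrier is hit almost surely, so the density
# should integrate to ~1; midpoint rule on (0, 50]
dt = 1e-3
mass = sum(fp_density((i + 0.5) * dt) * dt for i in range(50000))
```

Any numerical first-passage scheme should reproduce this density when the time change is switched off, which makes it a convenient regression test.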
Conjunctivitis presumably due to Acanthamoeba
Ruthes,Ana Cristina de Carvalho; Wahab,Sâmia; Wahab,Najua; Moreira,Hamilton; Moreira,Luciane
2004-01-01
PURPOSE: To report four cases of conjunctivitis presumably due to Acanthamoeba, describing the diagnosis, the signs and symptoms, and the treatment given. METHODS: Cases of presumed Acanthamoeba conjunctivitis diagnosed at the Hospital de Olhos do Paraná (HOP) between September 1998 and January 2002 were studied. All eyes studied were submitted to an investigation protocol that included complete ophthalmologic examination, microbiology, and culture of conjunctival secretions...
International Nuclear Information System (INIS)
Burgazzi, Luciano
2011-01-01
PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to passive system performance evaluation, given the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, making the results strongly dependent on the expert elicitation process. This prompts the need for a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of this task is to develop a consistent framework for creating probability distributions for the parameters relevant to passive system performance evaluation. To achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems, or which generic databases or models provide the best information for the system design. Finally, in the absence of documented specific reliability data, documented expert judgement produced by a well-structured procedure can be used to devise sound probability distributions for the parameters of interest
International Nuclear Information System (INIS)
Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.
1992-10-01
The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
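The fit-and-test loop described here can be sketched with scipy.stats; the data below are synthetic stand-ins for the M-D parameter estimates, and the candidate families are the two skewed distributions named in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic positive-valued estimates standing in for the M-D parameter fits
data = rng.lognormal(mean=0.0, sigma=0.4, size=200)

# Fit the two candidate skewed distributions (location fixed at zero)
ln_params = stats.lognorm.fit(data, floc=0)
wb_params = stats.weibull_min.fit(data, floc=0)

# Kolmogorov-Smirnov statistic for each fitted candidate
ks_ln = stats.kstest(data, 'lognorm', args=ln_params).statistic
ks_wb = stats.kstest(data, 'weibull_min', args=wb_params).statistic
```

Strictly speaking, estimating the parameters from the same data biases the KS p-values; the statistic itself remains a useful relative goodness-of-fit measure, which is how the abstract uses it.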
Presumed hereditary retinal degenerations: Ibadan experience ...
African Journals Online (AJOL)
This study describes the clinical presentation of RP, the prevalence of associated treatable disorders, and the characteristics of patients with severe visual impairment and blindness. Method: A retrospective review of 52 cases presumed or diagnosed to have RP was performed on patients who presented at the Eye Clinic, ...
2018-01-30
stressors, explain the distribution of a keystone species. Biological Invasions 18:3309-3318. Steen, D. A. 2010. Snakes in the grass: secretive natural... Carolina Sandhills and the invasive Burmese python (Python molurus bivittatus) in Everglades National Park, Florida. For both species, existing... radiotelemetry and extensive road survey data are used to generate the first density estimates available for the species. The results show that southern
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio
2008-01-01
Static methods based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature on wind energy. In the static method used in this paper, for a given wind-regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by evaluating the integral, usually with numerical techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in estimating the mean annual power output of a WECS. The mean power output calculated through a quasi-dynamic or chronological method, that is to say, using time series of wind speed data and the power-versus-wind-speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Ra²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models, and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Ra² statistic may be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, following a trend line defined by a second-order polynomial, as Ra² increases
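The static method reduces to one numerical integral: mean power = ∫ f(v) P(v) dv, with f the wind-speed pdf and P the power curve. A numpy sketch with a Weibull wind model and an idealized 800 kW power curve; all parameter values here are hypothetical, not those of the paper:

```python
import numpy as np

# Weibull wind-speed pdf (hypothetical scale c in m/s and shape k)
c, k = 8.0, 2.0
v = np.linspace(0.0, 30.0, 3001)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

# Idealized power curve: cut-in 4 m/s, rated at 14 m/s, cut-out 25 m/s
rated = 800.0  # kW
power = np.where((v >= 4) & (v < 14), rated * ((v - 4) / 10.0) ** 3,
                 np.where((v >= 14) & (v <= 25), rated, 0.0))

# Static estimate of the mean power output (kW): integral of pdf * power
dv = v[1] - v[0]
mean_power = float(np.sum(pdf * power) * dv)
```

The quasi-dynamic reference described in the abstract would instead average P(v_t) over an hourly wind-speed time series; the relative error ε compares the two results.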
International Nuclear Information System (INIS)
Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.
2004-01-01
The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness
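The crudest member of that Bonferroni family is the union bound P_FA ≤ N·p₁(γ), with p₁ the per-template false-alarm probability at threshold γ. For idealized independent correlator outputs (real template banks are correlated, which is exactly what the Gaussian-correlation inequality addresses) the bound can be checked by Monte Carlo:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
N, gamma = 50, 3.0          # bank size and detection threshold (illustrative)

# Per-template false-alarm probability for a standard Gaussian statistic
p_single = 0.5 * math.erfc(gamma / math.sqrt(2))
union_bound = N * p_single  # first Bonferroni (upper) bound on the bank P_FA

# Monte Carlo estimate of the whole-bank false-alarm probability
trials = 100000
maxima = rng.standard_normal((trials, N)).max(axis=1)
p_mc = float(np.mean(maxima > gamma))

# Exact value for independent correlators, for comparison
p_exact = 1.0 - (1.0 - p_single) ** N
```

Higher-order Bonferroni inequalities tighten the union bound by subtracting pairwise joint exceedance probabilities, which is where the template-bank correlations enter.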
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage of the 380 V bus of an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A stochastic Monte Carlo simulation (SMC) study was performed to obtain, at each level of DG, the probability density curves of sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program - ANAFAS. In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
Presumed oculoglandular syndrome from Bartonella quintana.
Borboli, Sheila; Afshari, Natalie A; Watkins, Lynnette; Foster, C Stephen
2007-01-01
To describe a case of clinically diagnosed oculoglandular syndrome in a 17-year-old patient that was presumed to be due to Bartonella quintana, as suggested by a positive serologic titer. The patient presented to the Massachusetts Eye and Ear Infirmary emergency room with signs and symptoms suggestive of oculoglandular syndrome. He had a follicular conjunctivitis with a conjunctival granuloma of the right eye and an ipsilateral large, tender submandibular lymph node. He had recently acquired a kitten and a clinical diagnosis of cat-scratch disease was made. A laboratory workup was initiated to determine the cause of this clinical presentation and empirical treatment with antibiotics was started. All laboratory results were negative or normal except for the IgM titer to Bartonella quintana, which was elevated. The patient responded well to treatment and his symptoms resolved within a few weeks. Bartonella quintana infection, a pathogen prevalent in HIV-infected, homeless, or alcoholic patients, is a possible etiologic agent of cat-scratch disease and the associated condition of oculoglandular syndrome.
Venturi, D.; Karniadakis, G. E.
2012-08-01
By using functional integral methods we determine new evolution equations satisfied by the joint response-excitation probability density function (PDF) associated with the stochastic solution to first-order nonlinear partial differential equations (PDEs). The theory is presented for both fully nonlinear and quasilinear scalar PDEs subject to random boundary conditions, random initial conditions or random forcing terms. Particular applications are discussed for the classical linear and nonlinear advection equations and for the advection-reaction equation. By using a Fourier-Galerkin spectral method we obtain numerical solutions of the proposed response-excitation PDF equations. These numerical solutions are compared against those obtained by using more conventional statistical approaches such as probabilistic collocation and multi-element probabilistic collocation methods. It is found that the response-excitation approach yields accurate predictions of the statistical properties of the system. In addition, it allows one to directly ascertain the tails of probabilistic distributions, thus facilitating the assessment of rare events and associated risks. The computational cost of the response-excitation method is orders of magnitude smaller than that of more conventional statistical approaches if the PDE is subject to high-dimensional random boundary or initial conditions. The question of high dimensionality for evolution equations involving multidimensional joint response-excitation PDFs is also addressed.
Durrieu, G; Ciffroy, P; Garnier, J-M
2006-11-01
The objective of the study was to provide global probability density functions (PDFs) representing the uncertainty of distribution coefficients (Kds) in freshwater for radioisotopes of Co, Cs, Sr and I. A comprehensive database containing Kd values referenced in 61 articles was first built, and quality scores were assigned to each data point according to various criteria (e.g. presentation of data, contact times, pH, solid-to-liquid ratio, expert judgement). A weighted bootstrapping procedure was then set up to build the PDFs, in such a way that more importance is given to the most relevant data points (i.e. those corresponding to typical natural environments). However, it was also found that the relevance and robustness of the PDFs determined by this procedure depend on the number of Kd values in the database. Owing to the large database, conditional PDFs were also proposed for site studies where some parametric information is known (e.g. pH, contact time between radionuclides and particles, solid-to-liquid ratio). Such conditional PDFs reduce the uncertainty in the Kd values. These global and conditional PDFs are useful for end-users of dose models because the uncertainty and sensitivity of Kd values are taken into account.
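The weighted bootstrap step can be sketched with numpy: each Kd value is resampled with probability proportional to its quality score, so high-score points shape the resulting PDF more strongly. The values and scores below are hypothetical, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical log10(Kd) values with quality scores (higher = more relevant)
log_kd = np.array([2.1, 2.8, 3.0, 3.4, 2.5, 3.9, 2.9, 3.1])
scores = np.array([1.0, 3.0, 2.0, 1.0, 2.0, 0.5, 3.0, 2.0])
weights = scores / scores.sum()

# Weighted bootstrap: high-score data points are resampled more often
resamples = rng.choice(log_kd, size=2000, p=weights, replace=True)

# Empirical PDF built from the bootstrap sample
hist, edges = np.histogram(resamples, bins=20, density=True)
```

A conditional PDF, as described in the abstract, would simply restrict `log_kd` and `scores` to the subset of data points matching the known site conditions before resampling.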
Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente
2017-04-29
Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles incorporate roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be measured directly by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors already installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous online estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters remain within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
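The PDF-based truncation idea can be illustrated in one dimension: when an estimate violates a physical bound, its Gaussian density is truncated to the feasible interval, and the truncated mean and variance replace the original ones. A sketch with standard truncated-normal formulas; the bound values are hypothetical, not those of the paper:

```python
import math

def truncate_gaussian(mean, var, lo, hi):
    """Mean and variance of a Gaussian N(mean, var) truncated to [lo, hi]."""
    sd = math.sqrt(var)
    a, b = (lo - mean) / sd, (hi - mean) / sd
    phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    Z = Phi(b) - Phi(a)                       # probability mass inside [lo, hi]
    m = mean + sd * (phi(a) - phi(b)) / Z
    v = var * (1 + (a * phi(a) - b * phi(b)) / Z
               - ((phi(a) - phi(b)) / Z) ** 2)
    return m, v

# Out-of-bounds roll-angle estimate (12 deg, variance 9) with a
# hypothetical physical bound of [-10, 10] deg
m, v = truncate_gaussian(12.0, 9.0, -10.0, 10.0)
```

In the DKF context this projection would be applied after each measurement update, to both the state and the parameter filters, whenever an estimate leaves its physically meaningful interval.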
Chowdhury, Snehaunshu
2017-01-23
In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating existing soot models are reported at intervals of Δx/D = 5 along the centerline of turbulent, non-premixed, C2H4/N2 flames. The jet exit Reynolds numbers of the flames investigated were 10,000 and 20,000. A simplified burner geometry based on a published design was chosen to aid modelers. Soot was sampled directly from the flame using a sampling probe with a 0.5-mm diameter orifice and diluted with N2 by a two-stage dilution process. The overall dilution ratio was not evaluated. An SMPS system was used to analyze soot particle concentrations in the diluted samples. Sampling conditions were optimized over a wide range of dilution ratios to eliminate the effect of agglomeration in the sampling probe. Two differential mobility analyzers (DMAs) with different size ranges were used separately in the SMPS measurements to characterize the entire size range of particles. In both flames, the PDFs were found to be mono-modal in nature near the jet exit. Further downstream, the profiles were flatter with a fall-off at larger particle diameters. The geometric mean of the soot size distributions was less than 10 nm for all cases and increased monotonically with axial distance in both flames.
Conjunctivitis presumably due to Acanthamoeba
Directory of Open Access Journals (Sweden)
Ana Cristina de Carvalho Ruthes
2004-12-01
PURPOSE: To report four cases of conjunctivitis presumably due to Acanthamoeba, considering diagnosis, signs, symptoms and treatment. METHODS: We reviewed the medical records of all patients with a clinical diagnosis of Acanthamoeba conjunctivitis seen between September 1998 and January 2002 at the Hospital de Olhos do Paraná (HOP). All eyes were submitted to an investigation protocol that included ophthalmologic examination, microscopic examination and culture of conjunctival smears for adequate treatment. RESULTS: The laboratory examination of conjunctival smears revealed contamination with Acanthamoeba by direct examination, thereafter confirmed by culture. The authors observed cysts and trophozoites of Acanthamoeba. Most patients reported long-standing red eyes and ocular irritation. CONCLUSION: This is the first report of conjunctivitis probably due to Acanthamoeba in the reviewed literature. Selected patients refractory to the usual treatment of external ocular infection should be considered for appropriate laboratory investigation of the etiology of the disease.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.
Gotovac, Hrvoje; Cvetkovic, Vladimir; Andricevic, Roko
2010-05-01
The travel time formulation of advective transport in heterogeneous porous media is of interest both conceptually, e.g., for incorporating retention processes, and in applications where typically the travel time peak and the early and late arrivals of contaminants are of major concern in a regulatory or remediation context. Furthermore, the travel time moments are of interest for quantifying uncertainty in advective transport of tracers released from point sources in heterogeneous aquifers. In view of this interest, the travel time distribution has been studied in the literature; however, the link to the hydraulic conductivity statistics has typically been restricted to the first two moments. Here we investigate the influence of higher travel time moments on the travel time probability density function (pdf) in heterogeneous porous media, combining Monte Carlo simulations with the maximum entropy principle. The Monte Carlo experimental pdf is obtained by the adaptive Fup Monte Carlo method (AFMCM) for advective transport characterized by a multi-Gaussian structure with exponential covariance, considering two injection modes (in-flux and resident) and ln K variance up to 8. A maximum entropy (MaxEnt) algorithm based on Fup basis functions is used for the complete characterization of the travel time pdf. All travel time moments become linear with distance. Initial nonlinearity is found mainly for the resident injection mode, which exhibits a strong nonlinearity within the first 5 integral scales (IY) for high heterogeneity. For the resident injection mode, the form of the variance and all higher moments changes from the familiar concave form predicted by the first-order theory to a convex form; for the in-flux mode, linearity is preserved even for high heterogeneity. The number of moments sufficient for a complete characterization of the travel time pdf mainly depends on the heterogeneity level. Mean and variance completely describe the travel time pdf for low and mild heterogeneity, skewness is dominant
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov modulation, periodicity, change-of-measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
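For the simplest case treated by this theory, the compound Poisson model with exponential claim sizes, the ruin probability has the explicit form ψ(u) = exp(-Ru)/(1+θ), which satisfies Lundberg's inequality ψ(u) ≤ exp(-Ru) with adjustment coefficient R = θ/((1+θ)μ). A sketch with illustrative parameter values:

```python
import math

# Cramér-Lundberg model with exponential claims (illustrative values)
mu = 1.0       # mean claim size
theta = 0.2    # safety loading
R = theta / ((1 + theta) * mu)   # adjustment coefficient for exponential claims

def psi(u):
    """Exact ruin probability for initial reserve u (exponential claims)."""
    return math.exp(-R * u) / (1 + theta)

us = [0.0, 1.0, 5.0, 20.0]
vals = [psi(u) for u in us]
```

For heavy-tailed claim distributions no adjustment coefficient exists and the exponential decay fails, which is why the book devotes separate approximations to that case.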
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
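A classical one-sample P-P plot compares the empirical CDF with the hypothesized model CDF evaluated at the data points; under a correct model the plotted points lie close to the diagonal. A minimal numpy sketch:

```python
import math
import numpy as np

rng = np.random.default_rng(4)
x = np.sort(rng.standard_normal(500))

# Empirical CDF at the order statistics (midpoint plotting positions)
ecdf = (np.arange(1, x.size + 1) - 0.5) / x.size

# Hypothesized model CDF (standard normal) at the same points
model_cdf = np.array([0.5 * (1 + math.erf(xi / math.sqrt(2))) for xi in x])

# On a P-P plot the points (model_cdf, ecdf) hug the diagonal under a good fit
max_dev = float(np.max(np.abs(ecdf - model_cdf)))
```

The maximum vertical deviation from the diagonal is essentially the Kolmogorov-Smirnov statistic; the generalization in the abstract replaces the indexing by half-lines with indexing by the class of closed intervals.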
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are the subject of intense research nowadays. To understand their relevance one just needs to think … By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as the extension theorem, construction of measures, integration, product spaces, the Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables are studied.
DEFF Research Database (Denmark)
Ekelund, Flemming; Christensen, Søren; Rønn, Regin
1999-01-01
An automated modification of the most-probable-number (MPN) technique has been developed for enumeration of phagotrophic protozoa. The method is based on detection of prey depletion in microtitre plates rather than on the presence of protozoa. A transconjugant Pseudomonas fluorescens DR54 labelled … into microtitre plates amended with a suspension of the transconjugant. After 45 days, measurement of light emission allowed detection of the individual wells in the titre plates where protozoan grazing had removed the inoculated bacteria.
Directory of Open Access Journals (Sweden)
Farnoosh Basaligheh
2015-12-01
One of the conventional methods for temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel and the collected data are used to suggest a proper probability distribution function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated and three common goodness-of-fit tests were used to evaluate each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for the spacing between the steel sets. It is also noted that, although the probability distribution function for the two tunnel sections is the same, the parameters of the PDF for the individual sections differ from each other.
DEFF Research Database (Denmark)
Ekelund, Flemming; Christensen, Søren; Rønn, Regin
1999-01-01
An automated modification of the most-probable-number (MPN) technique has been developed for enumeration of phagotrophic protozoa. The method is based on detection of prey depletion in microtitre plates rather than on the presence of protozoa. A transconjugant Pseudomonas fluorescens DR54 labelled with a luxAB gene cassette was constructed and used as growth medium for the protozoa in the microtitre plates. The transconjugant produced high amounts of luciferase, which was stable and allowed detection for at least 8 weeks. Dilution series of protozoan cultures and soil suspensions were inoculated …
International Nuclear Information System (INIS)
Kumar, A.; Rao, K.S.; Srinivasan, M.
1983-01-01
The Trombay criticality formula (TCF) has been derived by incorporating a number of well-known concepts of criticality physics to enable prediction of changes in critical size or k_eff following alterations in geometrical and physical parameters of uniformly reflected small reactor assemblies characterized by large neutron leakage from the core. The variant parameters considered are size, shape, density and diluent concentration of the core, and density and thickness of the reflector. The effect of these changes (except core size) manifests through σ_c, the critical surface mass density of the "corresponding critical core"; σ, the mass-to-surface-area ratio of the core, is essentially a measure of the product ρ extended to nonspherical systems and plays a dominant role in the TCF. The functional dependence of k_eff on σ/σ_c, the system size relative to critical, is expressed in the TCF through two alternative representations, namely the modified Wigner rational form and an exponential form, which is given
Directory of Open Access Journals (Sweden)
Thomas S Churcher
2017-01-01
Full Text Available Over a century since Ronald Ross discovered that malaria is caused by the bite of an infectious mosquito, it is still unclear how the number of parasites injected influences disease transmission. Currently it is assumed that all mosquitoes with salivary gland sporozoites are equally infectious irrespective of the number of parasites they harbour, though this has never been rigorously tested. Here we analyse >1000 experimental infections of humans and mice and demonstrate a dose-dependency for probability of infection and the length of the host pre-patent period. Mosquitoes with higher numbers of sporozoites in their salivary glands following blood-feeding are more likely to have caused infection (and to have done so more quickly) than mosquitoes with fewer parasites. A similar dose response for the probability of infection was seen for humans given a pre-erythrocytic vaccine candidate targeting circumsporozoite protein (CSP), and in mice with and without transfusion of anti-CSP antibodies. These interventions prevented infection more efficiently from bites made by mosquitoes with fewer parasites. The importance of parasite number has widespread implications across malariology, ranging from our basic understanding of the parasite and how vaccines are evaluated to the way in which transmission should be measured in the field. It also provides direct evidence for why the only registered malaria vaccine, RTS,S, was partially effective in recent clinical trials.
Probability and complex quantum trajectories
International Nuclear Information System (INIS)
John, Moncy V.
2009-01-01
It is shown that in the complex trajectory representation of quantum mechanics, Born's Ψ*Ψ probability density can be obtained from the imaginary part of the velocity field of particles on the real axis. Extending this probability axiom to the complex plane, we first attempt to find a probability density by solving an appropriate conservation equation. The characteristic curves of this conservation equation are found to be the same as the complex paths of particles in the new representation. The boundary condition in this case is that the extended probability density should agree with the quantum probability rule along the real line. For the simple, time-independent, one-dimensional problems worked out here, we find that a conserved probability density can be derived from the velocity field of particles, except in regions where the trajectories were previously suspected to be nonviable. An alternative method to find this probability density in terms of a trajectory integral, which is easier to implement on a computer and useful for single-particle solutions, is also presented. Most importantly, we show, by using the complex extension of the Schrödinger equation, that the desired conservation equation can be derived from this definition of probability density.
Leptospirosis in a dog with uveitis and presumed cholecystitis.
Gallagher, Alexander
2011-01-01
A 7 yr old castrated male Australian shepherd dog was examined for acute change in iris color, lethargy, and anorexia. Uveitis, acute renal failure, and presumed cholecystitis were diagnosed. Based on clinical findings, leptospirosis was suspected, and the dog was treated with antibiotics and supportive care. The dog made a complete recovery, and leptospirosis was confirmed on convalescent titers. Due to the zoonotic potential, leptospirosis should be considered in cases of uveitis, as well as possible cholecystitis.
Presumed Group B Streptococcal Meningitis After Epidural Blood Patch.
Beilin, Yaakov; Spitzer, Yelena
2015-06-15
Bacterial meningitis after epidural catheter placement is rare. We describe a case in which a parturient received labor epidural analgesia for vaginal delivery complicated by dural puncture. The patient developed postdural puncture headache and underwent 2 separate epidural blood patch procedures. She subsequently developed a headache with fever and focal neurologic deficits. She was treated with broad spectrum antibiotics for presumed meningitis, and she made a full recovery. Blood cultures subsequently grew group B streptococcus.
Presumed consent for organ donation: is Romania prepared for it?
Grigoras, I; Condac, C; Cartes, C; Blaj, M; Florin, G
2010-01-01
In November 2007, a legislative initiative regarding presumed consent for organ donation was proposed for parliamentary debate in Romania and was followed by public debate. The study aimed to assess public opinions expressed in the Romanian media. An Internet search was made. The pro and con reasons, the affiliation of the parties involved in the debate, and suggested future directions of action were identified. The Internet search returned 8572 results. The parties involved in the pro and con debate consisted of governmental structures, physicians, ethicists, politicians, media, religious authorities, nongovernmental associations, and lay persons. The main pros were the low rate of organ donation and the long waiting lists, enhancement of organ procurement, avoidance of wasting valuable organs, avoiding the responsibility and stress imposed on the family in giving donation consent, humanitarian purposes (saving lives), keeping up with scientific progress, and less bureaucracy. The main cons were that it is an unethical issue, violation of human rights, denial of brain death, unethical advantage taken of public ignorance, unethical use of underprivileged people, little result in terms of organ procurement but huge negative effects on public opinion, public mistrust in transplant programs, and the impossibility of identifying refusals due to particularities of the Romanian medical system. The con opinions prevailed. For the moment, Romania seems to be unprepared to accept presumed consent. A future change in public perception regarding organ transplantation may modify the terms of a public debate.
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
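A minimal sketch of such a generalized pair, assuming the common one-parameter forms ln_q(x) = (x^q − 1)/q and its inverse exp_q(x) = (1 + qx)^{1/q}, which recover the ordinary logarithm and exponential as q → 0; the paper's exact parameterization may differ:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; recovers ln(x) as q -> 0."""
    if q == 0:
        return math.log(x)
    return (x ** q - 1.0) / q

def gen_exp(x, q):
    """Generalized exponential, the inverse of gen_log; recovers e**x as q -> 0."""
    if q == 0:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)

# Round trip: gen_exp inverts gen_log for a sample value and parameter.
print(round(gen_exp(gen_log(2.5, 0.3), 0.3), 6))
```

Stretched-exponential and Zipf-Mandelbrot-like pdfs then follow by substituting gen_exp for the ordinary exponential in the familiar density formulas.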
Presumed symbolic use of diurnal raptors by Neanderthals.
Directory of Open Access Journals (Sweden)
Eugène Morin
Full Text Available In Africa and western Eurasia, occurrences of burials and utilized ocher fragments during the late Middle and early Late Pleistocene are often considered evidence for the emergence of symbolically-mediated behavior. Perhaps less controversial for the study of human cognitive evolution are finds of marine shell beads and complex designs on organic and mineral artifacts in early modern human (EMH) assemblages conservatively dated to ≈ 100-60 kilo-years (ka) ago. Here we show that, in France, Neanderthals used skeletal parts of large diurnal raptors presumably for symbolic purposes at Combe-Grenal in a layer dated to marine isotope stage (MIS) 5b (≈ 90 ka) and at Les Fieux in stratigraphic units dated to the early/middle phase of MIS 3 (60-40 ka). The presence of similar objects in other Middle Paleolithic contexts in France and Italy suggests that raptors were used as means of symbolic expression by Neanderthals in these regions.
Reactivation of presumed adenoviral keratitis after laser in situ keratomileusis.
Safak, Nilgün; Bilgihan, Kamil; Gürelik, Gökhan; Ozdek, Sengül; Hasanreisoğlu, Berati
2002-04-01
We report a patient with reactivation of presumed adenoviral keratoconjunctivitis after laser in situ keratomileusis (LASIK) to correct high myopia. The preoperative refraction was -13.00 diopters (D) in the right eye and -14.00 D in the left eye, and the best corrected visual acuity was 20/20 in both eyes. On the first postoperative day, mild conjunctival hyperemia and multiple subepithelial infiltrations localized in the flap zone consistent with adenoviral keratoconjunctivitis were seen. After prompt treatment, the lesions resolved. As a consequence, LASIK successfully corrected the high myopia. Adenoviral keratoconjunctivitis can be reactivated after LASIK, unlike after photorefractive keratectomy, despite the absence of symptomatic and clinical findings before the procedure.
Siebor, E; Llanes, C; Lafon, I; Ogier-Desserrey, A; Duez, J M; Pechinot, A; Caillot, D; Grandjean, M; Sixt, N; Neuwirth, C
2007-03-01
Reported here are the microbiological and epidemiological details of a presumed outbreak of aerobic gram-negative bacilli infections affecting 19 hematological patients, which was traced to contaminated disinfectant. Over a 5-month period, the following organisms were isolated from the blood cultures of 19 neutropenic patients: Pseudomonas fluorescens (n = 13), Achromobacter xylosoxidans (n = 12), Comamonas testosteroni (n = 2) or Stenotrophomonas maltophilia (n = 1). The affected patients were all treated with an expensive regimen of broad-spectrum antibiotic therapy. The same bacteria were recovered from environmental samples as well as from the water pipes of an apparatus for dispensing disinfectant (didecyldimethylammonium chloride). Genotyping results indicated that many of the clinical strains were identical to strains isolated from the apparatus. It was eventually discovered that the night staff was in the habit of disinfecting the blood-culture bottles before use, thereby contaminating the bottles with bacteria contained in the disinfectant. Contamination of the apparatus resulted from faulty maintenance.
Wang, Xiao-Feng; Yu, Jing-Jia; Wang, Xiao-Jing; Jing, Yi-Xuan; Sun, Li-Hao; Tao, Bei; Wang, Wei-Qing; Ning, Guang; Liu, Jian-Min; Zhao, Hong-Yan
2018-04-01
In the current study, we investigated the vitamin D status, and its relationships with parathyroid hormone (PTH) levels, bone mineral density (BMD), and the 10-year probability of fractures in Chinese patients with type 2 diabetes mellitus (T2DM). This was a cross-sectional study of 785 patients. BMDs at the lumbar spine (L2-4), femoral neck (FN), and total hip (TH) were measured by dual-energy X-ray absorptiometry (DXA). Serum levels of 25-hydroxyvitamin D (25(OH)D) and intact PTH were also quantified. The 10-year probability of fracture risk (major osteoporotic fracture [MOF] and hip fracture [HF]) was assessed using the fracture risk assessment tool (FRAX). The prevalence of vitamin D deficiency was 82.3%, and the mean 25(OH)D level was 36.9 ± 15.2 nmol/L. The adequate group had higher BMDs at the FN and TH and lower MOF risk than the inadequate groups. Lower 25(OH)D was associated with higher PTH (r = -0.126). PTH was negatively correlated with BMDs at 3 sites and positively correlated with MOF and HF, but this relationship disappeared in the adequate subgroup. Multivariate stepwise regression analysis revealed that PTH was the determinant of MOF (standard β = 0.073, P = .010) and HF (standard β = 0.094, P = .004). Our results identified a significantly high rate of vitamin D deficiency among Chinese patients with T2DM. PTH is an important risk factor responsible for the higher 10-year probability of osteoporotic fractures in diabetic patients, especially in those with lower vitamin D levels. AKP = alkaline phosphatase; ALB = serum albumin; BMD = bone mineral density; BMI = body mass index; Ca = calcium; CKD = chronic kidney disease; Cr = creatinine; FN = femoral neck; FRAX = fracture risk assessment tool; HbA1c = glycated hemoglobin A1c; HF = hip fracture; L2-4 = lumbar spine; MOF = major osteoporotic fracture; 25(OH)D = 25-hydroxyvitamin D; P = phosphorus; PTH = parathyroid hormone; T2DM = type 2 diabetes mellitus; TH = total hip; UA = uric acid.
The spectrum of presumed tubercular uveitis in Tunisia, North Africa.
Khochtali, Sana; Gargouri, Salma; Abroug, Nesrine; Ksiaa, Imen; Attia, Sonia; Sellami, Dorra; Feki, Jamel; Khairallah, Moncef
2015-10-01
The purpose of this study was to analyze the spectrum of presumed tubercular uveitis in Tunisia, North Africa. We retrospectively reviewed the clinical records of 38 patients (65 eyes) diagnosed with presumed tubercular uveitis at two referral centers in Tunisia, between January 2009 and December 2011. Mean age at presentation was 42.7 years. Twenty-four patients were women (63.2%) and 14 (36.8%) were men. Twenty-three eyes (35.4%) had posterior uveitis, 21 eyes (32.3%) had intermediate uveitis, 13 eyes (20%) had panuveitis, and 8 eyes (12.3%) had anterior uveitis. Ocular findings included vitritis in 67.7% of eyes, posterior synechiae in 47.7%, multifocal non-serpiginoid choroiditis in 23.1%, multifocal serpiginoid choroiditis in 21.5%, periphlebitis in 21.5%, and mutton-fat keratic precipitates in 20%. Anti-tubercular treatment was prescribed in 33 patients (86.8%) and was associated with systemic corticosteroids in 20 patients (52.6%) and periocular injections of corticosteroids in four patients (10.5%). After a mean follow-up of 14.2 months (range, 10-58), inflammation was controlled, with a significant improvement in visual acuity (VA) (p = 0.028). However, recurrences developed in two patients (5.3%). Final VA was better than 20/40 in 27 eyes (41.5%) and less than 20/200 in five eyes (7.7%). In Tunisia, all anatomic types are possible in tuberculosis-associated uveitis, but posterior and intermediate uveitis are more frequent. Vitritis, posterior synechiae, multifocal serpiginoid or non-serpiginoid choroiditis, and periphlebitis are the most common manifestations.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Infectious endotheliitis: a rare case of presumed mycotic origin
Zapata, Luis Fernando; Paulo, José David; Restrepo, Carlos A; Velásquez, Luis Fernando; Montoya, Andrés E Toro; Zapata, Melissa A
2013-01-01
Purpose To report an interesting case of infectious endotheliitis of presumed mycotic origin. Methods A case report of a 56-year-old male farmer who sought medical attention after a month-long evolution of irritative symptoms in his right eye, accompanied by visual acuity (VA) impairment. The patient received topical and oral broad-spectrum antibiotic treatment with no improvement before being referred to a cornea specialist, where he was found to have VA of 20/150 and was noted on biomicroscopy to have endothelial feathery coalescent lesions. The patient was admitted to the hospital for an aqueous humor sample and intravenous voriconazole. Results The microbiological studies did not isolate any micro-organisms. However, clinical evidence of improvement was confirmed after 5 days of antimycotic intravenous therapy. Complete clinical resolution was achieved at 1 month after treatment completion with oral voriconazole, as evidenced by VA of 20/20 and disappearance of endothelial lesions. Conclusion Endothelial involvement by fungi is a rare condition. In this case, no microbes were isolated, but the characteristic morphology of the lesions, the history of onychomycosis, and the spectacular response to voriconazole turn this case into a valid presumptive diagnosis. PMID:23901253
Outbreak of keratitis presumed to be caused by Acanthamoeba.
Mathers, W D; Sutphin, J E; Folberg, R; Meier, P A; Wenzel, R P; Elgin, R G
1996-02-01
A sharp increase of Acanthamoeba keratitis from two cases per year to 30 cases per year at our institution prompted this study to determine whether there was a change in the clinical characteristics, basic epidemiology, and outcome of this disease. We reviewed all cases of Acanthamoeba keratitis diagnosed at the University of Iowa Hospitals and Clinics from mid-1993 through 1994. We screened 217 patients with keratitis by tandem scanning confocal microscopy and suspected Acanthamoeba in 51 patients. Diagnosis was confirmed by cytology in 43 patients (48 eyes). There were no positive cultures. Patients examined within four weeks of onset of symptoms were younger (mean age, 32.6 +/- 15.4 years), often wore contact lenses (11 of 18 patients), and were infrequently diagnosed with herpes simplex keratitis (four of 18 patients). Patients examined after four weeks were older (mean age, 54.0 +/- 19.5 years), infrequently wore contact lenses (six of 25 patients), and often had herpes simplex keratitis (12 of 25 patients). Corneal examination with tandem scanning confocal microscopy was associated with a marked increase in the detection of Acanthamoeba, strongly suggesting that the disease is more prevalent than suspected. Acanthamoeba may account for many cases of clinically presumed herpes simplex keratitis, the leading cause of corneal blindness in the United States. Acanthamoeba should be considered in the differential diagnosis of any unexplained keratitis, even those of short duration.
Energy Technology Data Exchange (ETDEWEB)
Varella, Marcio Teixeira do Nascimento
2001-12-15
We have calculated annihilation probability densities (APD) for positron collisions against the He atom and the H₂ molecule. It was found that direct annihilation prevails at low energies, while annihilation following virtual positronium (Ps) formation is the dominant mechanism at higher energies. In room-temperature collisions (10⁻² eV) the APD spread over a considerable extension, being quite similar to the electronic densities of the targets. The capture of the positron in an electronic Feshbach resonance strongly enhanced the annihilation rate in e⁺-H₂ collisions. We also discuss strategies to improve the calculation of the annihilation parameter (Z_eff), after debugging the computational codes of the Schwinger Multichannel Method (SMC). Finally, we consider the inclusion of the Ps formation channel in the SMC and show that effective configurations (pseudo eigenstates of the Hamiltonian of the collision) are able to significantly reduce the computational effort in positron scattering calculations. Cross sections for electron scattering by polyatomic molecules were obtained in three different approximations: static-exchange (SE); static-exchange-plus-polarization (SEP); and multichannel coupling. The calculations for polar targets were improved through the rotational resolution of scattering amplitudes, in which the SMC was combined with the first Born approximation (FBA). In general, elastic cross sections (SE and SEP approximations) showed good agreement with available experimental data for several targets. Multichannel calculations for e⁻-H₂O scattering, on the other hand, presented spurious structures at the electronic excitation thresholds. (author)
Infectious endotheliitis: a rare case of presumed mycotic origin
Directory of Open Access Journals (Sweden)
Zapata LF
2013-07-01
Full Text Available Luis Fernando Zapata,1 José David Paulo,1 Carlos A Restrepo,1 Luis Fernando Velásquez,2 Andrés E Toro Montoya,2 Melissa A Zapata2 1Department of Ophthalmology, Hospital Pablo Tobón Uribe; 2School of Medicine, Universidad Pontificia Bolivariana, Medellín, Colombia. Purpose: To report an interesting case of infectious endotheliitis of presumed mycotic origin. Methods: A case report of a 56-year-old male farmer who sought medical attention after a month-long evolution of irritative symptoms in his right eye, accompanied by visual acuity (VA) impairment. The patient received topical and oral broad-spectrum antibiotic treatment with no improvement before being referred to a cornea specialist, where he was found to have VA of 20/150 and was noted on biomicroscopy to have endothelial feathery coalescent lesions. The patient was admitted to the hospital for an aqueous humor sample and intravenous voriconazole. Results: The microbiological studies did not isolate any micro-organisms. However, clinical evidence of improvement was confirmed after 5 days of antimycotic intravenous therapy. Complete clinical resolution was achieved at 1 month after treatment completion with oral voriconazole, as evidenced by VA of 20/20 and disappearance of endothelial lesions. Conclusion: Endothelial involvement by fungi is a rare condition. In this case, no microbes were isolated, but the characteristic morphology of the lesions, the history of onychomycosis, and the spectacular response to voriconazole turn this case into a valid presumptive diagnosis. Keywords: endotheliitis, mycotic, keratitis, voriconazole
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, the probability function of the binomial distribution, and more. Includes 360 problems with answers for half.
[Allergy and autoimmunity: Molecular diagnostics, therapy, and presumable pathogenesis].
Arefieva, A S; Smoldovskaya, O V; Tikhonov, A A; Rubina, A Yu
2017-01-01
Allergic and autoimmune diseases represent immunopathological reactions of an organism to antigens. Although allergy results from an exaggerated immune response to foreign antigens (allergens) and autoimmune diseases are characterized by a pathological response to internal antigens (autoantigens), the underlying mechanisms of these diseases are probably common. Thus, both types of diseases represent variations in the hypersensitivity reaction. A large percentage of both the adult and pediatric population is in need of early diagnostics of these pathologies of the immune system. Considering the diversity of antibodies produced in allergic and autoimmune disease and the difficulties accompanying clinical diagnosing, molecular diagnostics of these pathological processes should be carried out in several stages, including screening and confirmatory studies. In this review, we summarize the available data on the molecular diagnostics and therapy of allergic and autoimmune diseases and discuss the basic similarities and differences in the mechanisms of their development.
Stochastic Modeling of Climatic Probabilities.
1979-11-01
The students who contributed in a major way to the success of the project are Sarah Autrey, Jeff Emerson, Karl Grammel, Tom Licknor and Debbie Waite. ... The sophistication and cost of weapons systems and the recognition that the environment degrades or offers opportunities has led to the requirement for... First, make a histogram of the data, and then "smooth" the histogram to obtain a frequency distribution (probability density function).
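The two-step recipe quoted above (histogram, then smoothing) can be sketched as follows; a minimal version using a moving-average smoother, with synthetic data and arbitrary bin and window choices:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=20.0, scale=5.0, size=2000)   # stand-in climatic series

# Step 1: histogram, normalized so that its area integrates to one.
counts, edges = np.histogram(data, bins=30, density=True)

# Step 2: "smooth" the histogram with a simple 5-bin moving average.
window = np.ones(5) / 5.0
pdf = np.convolve(counts, window, mode="same")

# The smoothed estimate should still integrate to roughly one.
width = float(edges[1] - edges[0])
print(round(float(np.sum(pdf)) * width, 2))
```

A kernel density estimator would be the modern refinement of the same idea, replacing the fixed bins with a smoothing kernel applied directly to the data.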
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Feline dry eye syndrome of presumed neurogenic origin: a case report.
Sebbag, Lionel; Pesavento, Patricia A; Carrasco, Sebastian E; Reilly, Christopher M; Maggs, David J
2018-01-01
A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tearfilm testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...
Aetiological study of the presumed ocular histoplasmosis syndrome in the Netherlands
Ongkosuwito, J.V.; Kortbeek, L.M.; Lelij, van der A.; Molicka, E.; Kijlstra, A.; Smet, de M.D.; Suttrop-Schulten, M.S.A.
1999-01-01
Aim. To investigate whether presumed ocular histoplasmosis syndrome in the Netherlands is caused by Histoplasma capsulatum and whether other risk factors might play a role in the pathogenesis of this syndrome. Methods. 23 patients were clinically diagnosed as having presumed ocular histoplasmosis
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Directory of Open Access Journals (Sweden)
Laktineh Imad
2010-04-01
Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
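The central limit theorem mentioned above is easy to illustrate numerically; a minimal sketch summing uniform variates (sample sizes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 50, 20000

# Each row: the sum of n iid Uniform(0,1) variates (mean n/2, variance n/12).
sums = rng.random((trials, n)).sum(axis=1)
z = (sums - n / 2.0) / np.sqrt(n / 12.0)   # standardize

# By the CLT, z is approximately standard normal: about 68.3% of the
# standardized sums should fall within one standard deviation.
frac = float(np.mean(np.abs(z) < 1.0))
print(round(frac, 2))
```

The same mechanism underlies the Monte Carlo methods the course describes for producing different kinds of probability distributions.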
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Paroxysmal atrial fibrillation in seven dogs with presumed neurally-mediated syncope.
Porteiro Vázquez, D M; Perego, M; Santos, L; Gerou-Ferriani, M; Martin, M W S; Santilli, R A
2016-03-01
To document the electrocardiographic findings of vagally-induced paroxysmal atrial fibrillation following a presumed reflex syncopal episode in the dog. Seven dogs with a syncopal episode followed by a paroxysm of atrial fibrillation recorded on a 24-hour Holter monitor were studied. Twenty-four-hour Holter recordings were retrospectively reviewed, analysing the cardiac rhythm associated with syncopal events. Each recording was analysed from 10 min before the syncopal episode until 10 min after a normal sinus rhythm had returned. Nine episodes were recorded in seven dogs, with one patient experiencing three events during one Holter recording. Five of the seven dogs presented with underlying structural heart disease. In two dogs the syncopal episodes occurred following exercise, in two they were associated with coughing, and in three they occurred during a period of rest. All dogs had a rhythm abnormality documented on the Holter recording during syncope. The most common finding leading up to the syncopal event was development of a progressive sinus bradycardia, followed by sinus arrest interrupted by a ventricular escape rhythm and then ventricular arrest. This was then followed by atrial fibrillation. The atrial fibrillation was paroxysmal in seven recordings and persistent in two. In two dogs, the atrial fibrillation reorganised into self-limiting runs of atypical atrial flutter. This combination of electrocardiographic arrhythmias is probably caused by an inappropriate parasympathetic stimulation initiating a reflex or neurally-mediated syncope, with abnormal automaticity of the sinus node and of the subsidiary pacemaker cells and changes in the electrophysiological properties of the atrial muscle, which promoted the paroxysmal atrial fibrillation. Copyright © 2015 Elsevier B.V. All rights reserved.
Nonstationary envelope process and first excursion probability.
Yang, J.-N.
1972-01-01
The definition of the stationary random envelope proposed by Cramer and Leadbetter is extended to the envelope of nonstationary random processes possessing evolutionary power spectral densities. The density function, the joint density function, the moment function, and the level-crossing rate of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
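The guessing strategy described above, trying words in decreasing order of probability, can be sketched by brute force for tiny cases. The function name and the i.i.d. first-order letter model are illustrative assumptions, not the paper's implementation, which relies on approximations precisely because this enumeration is infeasible for realistic sizes:

```python
from itertools import product

def average_guesses(letter_probs, word_len):
    """Expected number of guesses when words are tried in decreasing
    order of probability (brute-force enumeration; tiny cases only)."""
    # Probability of each word under an i.i.d. letter model.
    word_probs = []
    for letters in product(range(len(letter_probs)), repeat=word_len):
        p = 1.0
        for i in letters:
            p *= letter_probs[i]
        word_probs.append(p)
    word_probs.sort(reverse=True)  # optimal guessing order
    # The k-th guess (1-based) succeeds with the k-th largest probability.
    return sum(k * p for k, p in enumerate(word_probs, start=1))

# Example: two-letter alphabet with probabilities 0.7 and 0.3, words of length 3.
print(round(average_guesses([0.7, 0.3], 3), 4))  # 3.016
```

For a uniform alphabet the answer is the midpoint of the word count, while skewed letter probabilities pull it sharply lower, which is the effect the abstract quantifies.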
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Teachers' Understandings of Probability
Liu, Yan; Thompson, Patrick
2007-01-01
Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Probability on compact Lie groups
Applebaum, David
2014-01-01
Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own right but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Presumed consent in organ donation: the devil is in the detail
Hutchinson, Odette
2008-01-01
This article follows the recent publication of the Organs for Donation Task Force report, "Organs for Transplants", and considers the debate surrounding a change in the law in favour of presumed consent in organ donation.
Feline dry eye syndrome of presumed neurogenic origin: a case report
Directory of Open Access Journals (Sweden)
Lionel Sebbag
2017-12-01
Full Text Available Case summary A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values, suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. Relevance and novel information The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tear film testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.
Data-driven probability concentration and sampling on manifold
International Nuclear Information System (INIS)
Soize, C.; Ghanem, R.
2016-01-01
A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
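As a rough illustration of ingredient (i), sampling new points consistent with a dataset concentrated near a manifold can be done directly from a Gaussian kernel-density estimate, since a KDE is a mixture of Gaussians. This sketch omits the MCMC and diffusion-maps reduction that are central to the paper; all names, the bandwidth value, and the toy circle dataset are illustrative assumptions:

```python
import numpy as np

def sample_from_kde(data, n_samples, bandwidth=0.2, rng=None):
    """Draw samples from a Gaussian kernel-density estimate of `data`
    (shape: n_points x dim).  Sampling a Gaussian-mixture KDE is direct:
    pick a data point uniformly at random, then add kernel noise."""
    rng = np.random.default_rng(rng)
    data = np.asarray(data, dtype=float)
    idx = rng.integers(0, len(data), size=n_samples)
    noise = rng.normal(scale=bandwidth, size=(n_samples, data.shape[1]))
    return data[idx] + noise

# Toy dataset concentrated near the unit circle (a 1-D manifold in R^2).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
data = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(500, 2))

new_points = sample_from_kde(data, 1000, bandwidth=0.1, rng=1)
radii = np.linalg.norm(new_points, axis=1)
print(radii.mean())  # close to 1: generated samples stay near the circle
```

The paper's contribution is precisely to go beyond this naive step: plain KDE sampling smears probability mass off the manifold, which the diffusion-maps-based reduced representation is designed to prevent.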
Data-driven probability concentration and sampling on manifold
Energy Technology Data Exchange (ETDEWEB)
Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)
2016-09-15
Probability and measure
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians: a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers were included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
2014-06-30
set of methods, many of which have their origin in probability in Banach spaces, that arise across a broad range of contemporary problems in different...salesman problem, ... • Probability in Banach spaces: probabilistic limit theorems for Banach-valued random variables, empirical processes, local...theory of Banach spaces, geometric functional analysis, convex geometry. • Mixing times and other phenomena in high-dimensional Markov chains. At
Wang, Xing M.
2009-01-01
After a brief introduction to Probability Bracket Notation (PBN), indicator operator and conditional density operator (CDO), we investigate probability spaces associated with various quantum systems: system with one observable (discrete or continuous), system with two commutative observables (independent or dependent) and a system of indistinguishable non-interacting many-particles. In each case, we derive unified expressions of conditional expectation (CE), conditional probability (CP), and ...
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
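For contrast with the intermittent, change-point-based updating the model describes, the standard trial-by-trial Bayesian estimator of a hidden Bernoulli parameter (conjugate Beta updating) can be sketched as below. This is a textbook baseline, not the authors' model; the function name and the uniform Beta(1, 1) prior are illustrative assumptions:

```python
def beta_bernoulli_estimates(outcomes, a=1.0, b=1.0):
    """Posterior-mean estimate of a Bernoulli parameter after each
    outcome, under a Beta(a, b) prior (conjugate updating).
    Unlike the subjects described above, this estimator moves
    slightly after every single outcome."""
    estimates = []
    for x in outcomes:  # x is 0 or 1
        a += x          # count of successes
        b += 1 - x      # count of failures
        estimates.append(a / (a + b))  # posterior mean
    return estimates

# After observing 1, 1, 0, 1 the running estimates are 2/3, 3/4, 3/5, 2/3.
est = beta_bernoulli_estimates([1, 1, 0, 1])
```

The experimental finding (a) above is exactly that human estimates do not follow this smooth per-outcome trajectory but instead jump at irregular intervals.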
Spataru, Aurel
2013-01-01
Probability theory is a rapidly expanding field and is used in many areas of science and technology. Beginning from a basis of abstract analysis, this mathematics book develops the knowledge needed for advanced students to develop a complex understanding of probability. The first part of the book systematically presents concepts and results from analysis before embarking on the study of probability theory. The initial section will also be useful for those interested in topology, measure theory, real analysis and functional analysis. The second part of the book presents the concepts, methodology and fundamental results of probability theory. Exercises are included throughout the text, not just at the end, to teach each concept fully as it is explained, including presentations of interesting extensions of the theory. The complete and detailed nature of the book makes it ideal as a reference book or for self-study in probability and related fields. It covers a wide range of subjects including f-expansions, Fuk-N...
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
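The CPGF idea is easiest to see in the multinomial logit special case, where the generating function is the log-sum-exp of the utilities and its gradient recovers the familiar softmax choice probabilities. This is a standard illustration of the gradient property, not the paper's general construction; function names are illustrative:

```python
import math

def logit_cpgf(u):
    """CPGF of multinomial logit: G(u) = log(sum_i exp(u_i)).
    Computed with the max subtracted for numerical stability."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probabilities(u, eps=1e-6):
    """Forward-difference gradient of the CPGF.  For logit this
    reproduces the softmax choice probabilities."""
    probs = []
    for i in range(len(u)):
        bumped = list(u)
        bumped[i] += eps
        probs.append((logit_cpgf(bumped) - logit_cpgf(u)) / eps)
    return probs

# Utilities 1, 2, 3: the gradient sums to 1 and favours the highest utility.
p = choice_probabilities([1.0, 2.0, 3.0])
```

That the gradient of a single scalar function yields a full vector of well-formed choice probabilities is exactly the structural property the CPGF framework generalizes beyond logit.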
2010-07-01
§ 301-72.1 Why is common carrier presumed to be the most advantageous method of transportation? Travel by common carrier is presumed to be the most advantageous method of transportation because it...
African Journals Online (AJOL)
Willem Scholtz
“quite probably, also the end of Angola's existence as an independent country”. It went on: “The victory at Cuito Cuanavale for the liberation forces and their Cuban compatriots was therefore decisive in consolidating Angola's independence and achieving that of Namibia.” Therefore, when reflecting on the events, “it is not ...
Indian Academy of Sciences (India)
IAS Admin
He spends several months in India visiting schools, colleges and universities. He enjoys teaching mathematics and statistics at all levels. He loves Indian classical and folk music. This issue of Resonance features Joseph Leonard Doob, who played a critical role in the development of probability theory in the world from...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 3; Issue 4. The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics Volume 3 Issue 4 April 1998 pp 103-112. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112. Author Affiliations.
Probability Theory Without Tears!
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. Probability Theory Without Tears! S Ramasubramanian. Book Review Volume 1 Issue 2 February 1996 pp 115-116. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116 ...
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject with regard to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
Indian Academy of Sciences (India)
OF PROBABILITY *. The simplest laws of natural science are those that state the conditions under which some event of interest to us will either certainly occur or certainly not occur; i.e., these conditions may be expressed in one of the following two forms: 1. If a complex (i.e., a set or collection) of conditions S is realized, then.
Stochastic response of nonlinear system in probability domain
Indian Academy of Sciences (India)
Keywords. Stochastic average procedure; nonlinear single-DOF system; probability density function. Abstract. A stochastic averaging procedure for obtaining the probability density function (PDF) of the response for a strongly nonlinear single-degree-of-freedom system, subjected to both multiplicative and additive random ...
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Energy Technology Data Exchange (ETDEWEB)
Duong, Hong Phuoc; Janssen, Francoise; Hall, Michelle; Ismaili, Khalid [Universite Libre de Bruxelles (ULB), Department of Pediatric Nephrology, Hopital Universitaire des Enfants Reine Fabiola, Brussels (Belgium); Piepsz, Amy [Hopital Universitaire Saint-Pierre, Department of Radioisotopes, Ghent (Belgium); Khelif, Karim; Collier, Frank [Universite Libre de Bruxelles (ULB), Department of Pediatric Urology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium); Man, Kathia de [University Hospital Ghent, Department of Nuclear Medicine, Ghent (Belgium); Damry, Nash [Universite Libre de Bruxelles (ULB), Department of Pediatric Radiology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium)
2015-05-01
The main criteria used for deciding on surgery in children with presumed antenatally detected pelviureteric junction obstruction (PPUJO) are the level of hydronephrosis (ultrasonography), the level of differential renal function (DRF) and the quality of renal drainage after a furosemide challenge (renography), though the relative importance of each factor is far from generally agreed. Can we predict, on the basis of ultrasound parameters, the patient in whom radionuclide renography can be avoided? We retrospectively analysed the medical charts of 81 consecutive children with presumed unilateral PPUJO detected antenatally. Ultrasound and renographic studies performed at the same time were compared. Anteroposterior pelvic diameter (APD) and calyceal size were both divided into three levels of dilatation. Parenchymal thickness was considered either normal or significantly decreased. Acquisition of renograms under furosemide stimulation provided quantification of DRF, quality of renal drainage and cortical transit. The percentages of patients with low DRF and poor drainage were significantly higher among those with major hydronephrosis, severe calyceal dilatation or parenchymal thinning. Moreover, impaired cortical transit, which is a major risk factor for functional decline, was seen more frequently among those with very severe calyceal dilatation. However, none of the structural parameters obtained by ultrasound examination was able to predict whether the level of renal function or the quality of drainage was normal or abnormal. Conversely, an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness were associated with a low probability of decreased renal function or poor renal drainage. In the management strategy of patients with prenatally detected PPUJO, nuclear medicine examinations may be postponed in those with an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness. On the contrary, precise estimation of DRF and renal
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
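The variance-superposition idea can be illustrated with a toy simulation. This is a sketch only: the gamma smearing density below is an arbitrary illustrative choice, not the semigroup-preserving form derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Superposition of Gaussians: draw a variance v from a smearing
# distribution, then draw x ~ N(0, v).  The gamma density for v is an
# illustrative stand-in, not the paper's semigroup-preserving form.
v = rng.gamma(shape=1.5, scale=1.0, size=n)
x = rng.normal(0.0, np.sqrt(v))

def excess_kurtosis(s):
    """Sample excess kurtosis; zero for a pure Gaussian."""
    s = s - s.mean()
    return (s**4).mean() / (s**2).mean() ** 2 - 3.0

# The mixture is heavier-tailed than any single Gaussian: its excess
# kurtosis is strictly positive, reflecting the memory-like departure
# from a single-variance process.
print(excess_kurtosis(x) > 1.0)                        # True
print(abs(excess_kurtosis(rng.normal(size=n))) < 0.2)  # True
```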
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
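The probability-of-occurrence step described here reduces to a frequency estimate: categorized event counts divided by the total number of FA movements. A sketch with hypothetical counts (the actual event data come from the Framatome ANP report and are not reproduced here):

```python
# Hypothetical counts for illustration only -- not the Framatome data.
events = {"misload": 4, "damage": 11}   # categorized fuel-handling events
total_fa_movements = 250_000            # total fuel assembly (FA) moves

# Probability of occurrence per FA movement, by event category.
prob = {kind: n / total_fa_movements for kind, n in events.items()}
print(prob["misload"])   # 1.6e-05
print(prob["damage"])    # 4.4e-05
```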
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Measurement uncertainty and probability
National Research Council Canada - National Science Library
Willink, Robin
2013-01-01
... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors: The Working Group of 1980; From classical repetition to practica...
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two counts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Laboratory-tutorial activities for teaching probability
Directory of Open Access Journals (Sweden)
Michael C. Wittmann
2006-08-01
We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
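The "ratio of time in a region to total time" concept that students struggle with can be sketched numerically. The assumed setup below (a classical harmonic oscillator sampled at uniform times) is one of the standard touchstone examples, chosen here for illustration:

```python
import numpy as np

# Sample x(t) = cos(t) at uniformly spaced times; the fraction of
# samples landing in a position bin estimates the classical probability
# density, i.e. the ratio of time spent in that region to total time.
t = np.linspace(0.0, 1000.0, 2_000_001)
x = np.cos(t)

density, edges = np.histogram(x, bins=20, range=(-1.0, 1.0), density=True)

# The oscillator moves slowly near the turning points x = ±1, so those
# bins collect more time than the fast-moving region near x = 0.
print(density[0] > density[10])   # True
```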
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
A. Algra (Ale); P.J. Koudstaal (Peter Jan); J. van Gijn (Jan)
1999-01-01
textabstractPatients who have had a transient ischaemic attack or nondisabling ischaemic stroke of presumed arterial origin have an annual risk of death from all vascular causes, non-fatal stroke, or non-fatal myocardial infarction that ranges between 4% and 11% without treatment. In the
Franke, CL; Koehler, PJJ; Gorter, JW; Kappelle, LJ; Rinkel, GJE; Tjeerdsma, HC; van Gijn, J; Dammers, JWHH; Straatman, HJS; ten Houten, R; Veering, MM; Bakker, SLM; Dippel, D; Koudstaal, PJ; van Gemert, HMA; van Swieten, JC; Horn, J; Kwa, IH; Limburg, M; Stam, J; Boon, AM; Lieuwens, WHG; Visscher, F; Bouwsma, C; Rutgers, AWF; Snoek, JW; Brouwers, PJAM; Nihom, J; Solleveld, H; Carbaat, PAT; Hertzberger, LI; Kleijweg, RP; Nanninga-van den Neste, VMH; van Diepen, AJH; Linssen, WHJP; Vanneste, JAL; Vos, J; Weinstein, HC; Schipper, JP; Berntsen, PJIM; de Vries-Leenders, EM; Geervliet, JP; Tans, RJJ; Feikema, WJ; Lohmann, HJHM; van Kasteel, [No Value; Jongebloed, FA; Leyten, QH; van Wensen, PJM; Jansen, C; Driesen, JJM; van Oudenaarden, WF; Verhey, JCB; Bottger, HRF; Driessen-Kletter, MF; Zwols, F; van der Gaast, JB; Wittebol, MC; van Oostenbrugge, RJ; Beintema, KD; Hilbers, J; van der Weil, HL; van Lieshout, HBM; Weststrate, W; Bernsen, PLJA; Frenken, CWGM; Poels, EFJ; Lindeboom, SF; van der Steen, A; Glimmerveen, WF; Martens, EIF; Bulens, C; de Vries-Bos, LHP; Venables, GS; Koster, JG; Sinnige, LGF; Klaver, MM; Koetsveld-Baart, JC; Mauser, HW; van Geusau, RBA; Dijkman, MH; Hoppenbrouwers, WJJF; Banford, WJJF; Briet, PE; Eekhof, JLA; Witjes, R; Hamburger, HL; van der Sande, JJ; Bath, P; Hankey, GJ; Koning, E; Ricci, S; Berendes, JN; Hooff, LJMA; van Spreeken, ACGA; Kuhler, AR; Mallo, GN; van Walbeek, HK; Gauw, JC; Vermeij, AJ; Verheij, JCB; Swen, JWA; Canhao, P; Keyser, A; Holscher, RS; de Jong, GJ; Kraaier, [No Value; Algra, A; Briet, E; deVries-Goldschemdingi, J; Eikelboom, BC; Greebe, P; Hauer, RNW; Hermsen, MG; Loeliger, EA; Pop, GAM; Rosendaal, FR; Schobben, AFAM; Sixma, FF; Slabbers, DCV; Tijssen, JCP; van Creval, H; van Es, GA; Verheugt, FWA; Vermeulin, M; Wulfsen, EKM; van der Meer, W.K.; Wever, Eric F. D.; Don, J
1997-01-01
Aspirin is only modestly effective in secondary prevention after cerebral ischemia. Studies in other vascular disorders suggest that anticoagulant drugs in patients with cerebral ischemia of presumed arterial (noncardiac) origin might be more effective. The aim of the Stroke Prevention in
Presumed Perinatal Stroke in a Child with Down Syndrome and Moyamoya Disease
Pysden, Karen; Fallon, Penny; Moorthy, Bhagavatheswaran; Ganesan, Vijeya
2010-01-01
Moyamoya disease describes a cerebral arteriopathy characterized by stenosis or occlusion of the terminal internal carotid and/or the proximal middle cerebral arteries. We report a female child with trisomy 21 and bilateral moyamoya disease who presented, unusually, with a presumed perinatal cerebral infarct. The clinical, radiological, and…
28 CFR 104.44 - Determination of presumed noneconomic losses for decedents.
2010-07-01
28 CFR, Judicial Administration (2010-07-01). Determination of presumed noneconomic losses for decedents. Section 104.44, Judicial Administration, DEPARTMENT OF JUSTICE (CONTINUED), SEPTEMBER 11TH VICTIM COMPENSATION FUND OF 2001, Amount of Compensation for Eligible Claimants. § 104.44...
2010-07-01
28 CFR, Judicial Administration (2010-07-01). Determination of presumed noneconomic losses for claimants who suffered physical harm. Section 104.46, Judicial Administration, DEPARTMENT OF JUSTICE (CONTINUED), SEPTEMBER 11TH VICTIM COMPENSATION FUND OF 2001, Amount of Compensation for...
International Nuclear Information System (INIS)
Zhou, J; Ding, X; Liang, J; Zhang, J; Wang, Y; Yan, D
2016-01-01
Purpose: With energy repainting in lung IMPT, the delivered dose is approximately the convolution of the dose in each phase with the corresponding breathing PDF. The aim of this study is to compute the breathing-PDF-weighted 4D dose in lung IMPT treatment and compare it to the initial robust plan. Methods: Six lung patients were evaluated in this study. Amsterdam shroud images were generated from pre-treatment 4D cone-beam projections. The diaphragm motion curve was extracted from the shroud image and the breathing PDF was generated. Each patient was planned to 60 Gy (12 Gy x 5). In the initial plans, the ITV density on the average CT was overridden with its maximum value for planning, using two IMPT beams with robust optimization (5 mm uncertainty in patient position and 3.5% range uncertainty). The plan was applied to all 4D CT phases. The dose in each phase was deformed to a reference phase. The 4D dose was reconstructed by summing all these doses with the corresponding weighting from the PDF. Plan parameters, including maximum dose (Dmax), ITV V100, homogeneity index (HI = D2/D98), R50 (50%IDL/ITV), and the lung-GTV V12.5 and V5, were compared between the reconstructed 4D dose and the initial plans. Results: Dmax was significantly lower in the reconstructed 4D dose, 68.12±3.5 Gy vs. 70.1±4.3 Gy in the initial plans (p=0.015). No significant difference was found for ITV V100, HI, and R50: 92.2%±15.4% vs. 96.3%±2.5% (p=0.565), 1.033±0.016 vs. 1.038±0.017 (p=0.548), and 19.2±12.1 vs. 18.1±11.6 (p=0.265), for the 4D dose and initial plans, respectively. The lung-GTV V12.5 and V5 were significantly higher in the 4D dose: 13.9%±4.8% vs. 13.0%±4.6% (p=0.021) and 17.6%±5.4% vs. 16.9%±5.2% (p=0.011), respectively. Conclusion: 4D dose reconstruction based on the phase PDF can be used to evaluate the dose received by the patient. A robust optimization based on the phase PDF may further improve patient care.
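The reconstruction step in this abstract amounts to a weighted sum of the deformed phase doses. A minimal sketch, with synthetic stand-ins for the per-phase dose grids and the breathing PDF (not patient data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_phases = 10

# Synthetic per-phase dose grids (Gy, already deformed to the reference
# phase) and a synthetic breathing PDF -- stand-ins, not patient data.
phase_dose = rng.uniform(55.0, 65.0, size=(n_phases, 4, 4))
pdf = rng.uniform(size=n_phases)
pdf /= pdf.sum()                  # weights over breathing phases sum to 1

# 4D dose: sum of phase doses weighted by the breathing PDF.
dose_4d = np.tensordot(pdf, phase_dose, axes=1)

print(dose_4d.shape)              # (4, 4)
# The weighted sum stays within the range of the per-phase doses.
print(bool(phase_dose.min() <= dose_4d.min() <= dose_4d.max() <= phase_dose.max()))  # True
```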
Probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.
1994-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
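The post-processing step described above (converting many equally likely simulations into a map of exceedance probabilities) can be sketched as follows. The lognormal fields and threshold below are synthetic placeholders, not the Fernald soil data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, ny, nx = 500, 8, 8

# Synthetic equally likely simulations of contamination on a grid --
# stand-ins for the geostatistical realizations, not the Fernald data.
sims = rng.lognormal(mean=3.0, sigma=1.0, size=(n_sims, ny, nx))

threshold = 50.0   # hypothetical clean-up / personnel-hazard threshold

# Per-cell probability of exceeding the threshold: the fraction of
# simulations in which that cell exceeds it.
prob_map = (sims > threshold).mean(axis=0)

print(prob_map.shape)                                             # (8, 8)
print(bool((prob_map >= 0.0).all() and (prob_map <= 1.0).all()))  # True
```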
Probably Almost Bayes Decisions
DEFF Research Database (Denmark)
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...
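The empirical-estimation approach referred to here can be sketched in its lowest order: a naive product estimate with Laplace smoothing on synthetic Boolean data. The paper's Bahadur-Lazarsfeld and Chow expansions add higher-order correlation terms not shown in this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit(X, y):
    """Empirical per-class feature probabilities with Laplace smoothing."""
    model = {}
    for c in (0, 1):
        Xc = X[y == c]
        p = (Xc.sum(axis=0) + 1) / (len(Xc) + 2)
        model[c] = (p, len(Xc) / len(X))
    return model

def predict(model, x):
    """First-order (independence) estimate of the Bayes discriminant."""
    scores = {
        c: np.log(prior) + np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
        for c, (p, prior) in model.items()
    }
    return max(scores, key=scores.get)

# Synthetic Boolean feature vectors: class 0 features are mostly 0,
# class 1 features mostly 1.
X = np.vstack([rng.random((200, 5)) < 0.2,
               rng.random((200, 5)) < 0.8]).astype(int)
y = np.array([0] * 200 + [1] * 200)

model = fit(X, y)
print(predict(model, np.array([1, 1, 1, 1, 0])))   # 1
```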
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Identification of probabilities.
Vitányi, Paul M B; Chater, Nick
2017-02-01
Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, to infer a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is the problem of inferring a probabilistic model from a sample possible even in principle? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.
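The role of the strong law of large numbers in the first result can be illustrated in the simplest setting, a distribution on a finite set: empirical frequencies of an i.i.d. sample converge almost surely to the true probabilities, so a long enough sample identifies the distribution. A toy sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

# True distribution on a finite set, and an i.i.d. sample from it.
true_p = np.array([0.5, 0.3, 0.2])
sample = rng.choice(3, size=100_000, p=true_p)

# Empirical frequencies converge almost surely to true_p (SLLN).
emp_p = np.bincount(sample, minlength=3) / len(sample)

print(bool(np.max(np.abs(emp_p - true_p)) < 0.01))   # True
```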
Directory of Open Access Journals (Sweden)
Adriana Gondim de Moura Campos
This case report describes the retinal optical coherence tomography (OCT) findings in a microcephalic infant with macular atrophy presumably caused by intrauterine Zika virus infection. OCT demonstrated atrophy of the outer retinal layers and choriocapillaris, including the outer nuclear layer and ellipsoid zone, associated with retinal pigment epithelium hyper-reflectivity and increased OCT penetration into deeper layers of the choroid and sclera. A major concern associated with this infection is the apparent increased incidence of microcephaly in fetuses born to mothers infected with the Zika virus. It is becoming increasingly difficult to ignore the upsurge in congenital microcephaly observed in Brazil. Recently, ocular findings in infants with microcephaly associated with intrauterine Zika virus infection have been described. This is the first report of OCT imaging of macular atrophy in a child with presumed Zika virus infection-associated microcephaly.
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Del Brutto, Oscar H; Mera, Robertino M; Del Brutto, Victor J; Zambrano, Mauricio; Lama, Julio
2015-04-01
Cerebral small vessel disease is probably one of the most common pathogenetic mechanisms underlying stroke in Latin America. However, the importance of silent markers of small vessel disease, including white matter hyperintensities of presumed vascular origin, has not been assessed so far. The study aims to evaluate the prevalence and correlates of white matter hyperintensities in community-dwelling elders living in Atahualpa (rural Ecuador). Atahualpa residents aged ≥ 60 years were identified during a door-to-door survey and invited to undergo brain magnetic resonance imaging for identification and grading of white matter hyperintensities and other markers of small vessel disease. Using multivariate logistic regression models, we evaluated whether white matter hyperintensities are associated with demographics, cardiovascular health status, stroke, cerebral microbleeds, and cortical atrophy, after adjusting for the other variables. Out of 258 enrolled persons (mean age, 70 ± 8 years; 59% women), 172 (67%) had white matter hyperintensities, which were moderate to severe in 63. Analyses showed significant associations of white matter hyperintensity presence and severity with age and cardiovascular health status, as well as with overt and silent strokes, and a trend for association with cerebral microbleeds and cortical atrophy. The prevalence and correlates of white matter hyperintensities in elders living in rural Ecuador are comparable with those reported from industrialized nations, reinforcing the concept that the burden of small vessel disease is on the rise in underserved Latin American populations. © 2014 World Stroke Organization.
Presumed Cases of Mumps in Pregnancy: Clinical and Infection Control Implications
Directory of Open Access Journals (Sweden)
Svjetlana Lozo
2012-01-01
Full Text Available Recently, a mumps outbreak in New York and New Jersey was reported by the Centers for Disease Control and Prevention (CDC). Subsequently, the dissemination of the disease was rapid, and, from June 28th 2009 through January 29th 2010, a total of 1,521 cases of mumps were reported in New York and New Jersey. Seven presumed cases occurred in pregnant women cared for at our institution. Mumps diagnosis, as per the NYC Department of Health and Mental Hygiene, was based on clinical manifestations, particularly parotitis. Prior immunization with mumps vaccine and negative IgM were not adequate to rule out mumps infection. All of our seven patients had exposure to mumps in either their household or their community, and some of them had symptoms of mumps. Due to the difficulties in interpreting the serologies of these patients, their cases led to a presumed diagnosis of mumps. The diagnosis of mumps led to the isolation of patients and of the health care personnel who were in contact with them. In this paper, we detail the presenting findings, diagnostic dilemmas and infection control challenges associated with presumed cases of mumps in pregnancy.
Walia, Harpreet S; Shah, Gaurav K; Blinder, Kevin J
2016-04-01
To assess the efficacy and safety of intravitreal aflibercept injection in the treatment of CNV secondary to presumed ocular histoplasmosis syndrome (POHS). Masked, open-label, prospective study. Five subjects received a 2.0 mg aflibercept injection every 8 weeks, with 3 initial monthly doses, over a 12-month period. No adverse systemic or ocular events were reported. At month six, the mean visual acuity improved by 7.8 ETDRS letters, mean central subfoveal thickness decreased by 38.8 microns and mean OCT volume decreased by 0.076 mm3. At month twelve, the mean visual acuity improved by 12.4 ETDRS letters, mean central subfoveal thickness decreased by 34.6 microns and mean OCT volume decreased by 0.576 mm3. The use of intravitreal 2.0 mg aflibercept injection for the treatment of CNV secondary to presumed ocular histoplasmosis syndrome yielded no systemic or ocular adverse events and produced improvement in visual acuity and reduction of OCT thickness and volume. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Path probabilities of continuous time random walks
International Nuclear Information System (INIS)
Eule, Stephan; Friedrich, Rudolf
2014-01-01
Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
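The paper's result is analytical, but the CTRW process it treats is easy to illustrate numerically. The sketch below (parameters entirely my own, for illustration only) simulates walkers with heavy-tailed Pareto waiting times and symmetric unit jumps — the standard renewal setup whose propagator the Dyson-equation solution describes — and reads off the empirical position distribution at a fixed time:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ctrw(n_walkers=20000, t_max=100.0, alpha=1.5):
    """Positions at time t_max of CTRW walkers with Pareto(alpha) waiting
    times and +/-1 jumps (illustrative only, not the paper's analytical
    Dyson-equation solution)."""
    pos = np.zeros(n_walkers)
    t = np.zeros(n_walkers)
    active = np.ones(n_walkers, dtype=bool)
    while active.any():
        # Pareto waiting times: P(tau > s) = s**(-alpha) for s >= 1
        tau = (1.0 - rng.random(active.sum())) ** (-1.0 / alpha)
        idx = np.flatnonzero(active)
        t_new = t[idx] + tau
        jump = rng.choice([-1.0, 1.0], size=idx.size)
        # only walkers whose next renewal falls before t_max actually jump
        ok = t_new <= t_max
        pos[idx[ok]] += jump[ok]
        t[idx] = t_new
        active[idx[~ok]] = False
    return pos

x = simulate_ctrw()
print(round(x.mean(), 2), round(x.std(), 2))
```

With finite-mean waiting times (alpha above 1) the empirical propagator is symmetric about zero with diffusive spread; choosing alpha below 1 instead produces the subdiffusive regime the anomalous-diffusion framework targets.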
Path probabilities of continuous time random walks
Eule, Stephan; Friedrich, Rudolf
2014-12-01
Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman-Kac formulae.
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
1997-01-01
To collect background information for formulating a description of the expected soil properties along the tunnel line, in 1987 Storebælt initiated a statistical investigation of the occurrence and size of boulders in the Great Belt area. The data for the boulder size distribution were obtained by use of aerial photographing of cliff beaches with subsequent stereo metric measurement on the photographs. To get information about the density of occurrence, a series of continuous seismic scannings were also made along parallel lines in a corridor with the tunnel line as approximately the central line. The data collection part of the investigation was made on the basis of geological expert advice (Gunnar Larsen, Århus) by the Danish Geotechnical Institute (DGI). The statistical data analysis combined with stochastic modeling based on geometry and sound wave diffraction theory gave a point estimate...
Probability density fittings of corrosion test-data: Implications on ...
Indian Academy of Sciences (India)
the use of corrosion inhibitor admixture in concrete had been identified as an easy, effective and economical ... for acceptable reduction in corrosion rate had been recommended by others. The American .... immersed, longitudinally, in plastic bowls containing respective test solution of aggressive agent. Each of the first ...
METAPHOR: Probability density estimation for machine learning based photometric redshifts
Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2017-06-01
We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results of a validation test of the workflow on the galaxies from SDSS-DR9, showing also the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template fitting method (Le Phare).
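METAPHOR's actual engine is the MLPQNA network, but the abstract notes that any predictor (e.g. KNN) can be slotted in. The following self-contained sketch — on entirely synthetic data with an invented magnitude–redshift relation, not SDSS photometry — shows the simplest version of the underlying idea: build a per-object redshift PDF from the empirical redshift distribution of its nearest neighbours in feature space:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "catalogue": one magnitude-like feature correlated with redshift.
# Entirely synthetic -- METAPHOR itself uses the MLPQNA network on real data.
z_train = rng.uniform(0.0, 1.0, 5000)
mag_train = 20.0 + 3.0 * z_train + rng.normal(0.0, 0.3, 5000)

def knn_photoz_pdf(mag, k=100, bins=20):
    """Empirical redshift PDF for one object: a normalised histogram of the
    redshifts of its k nearest neighbours in feature space (a stand-in for
    METAPHOR's perturbation-based PDF machinery)."""
    nn = np.argsort(np.abs(mag_train - mag))[:k]
    hist, edges = np.histogram(z_train[nn], bins=bins, range=(0.0, 1.0),
                               density=True)
    return hist, edges

pdf, edges = knn_photoz_pdf(21.5)   # object whose true z is near 0.5
widths = np.diff(edges)
print(round((pdf * widths).sum(), 3))   # PDF integrates to 1
```

The histogram peak lands near the redshift implied by the synthetic relation, and the spread of the neighbours' redshifts is what gives the PDF its width.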
Indoor Localization with Probability Density Functions based on Bluetooth
Wendlandt, Kai; Robertson, Patrick; Berbig, Marcus
2005-01-01
We present a simple system to help people navigate inside of buildings or even in outside areas close to buildings. It is based on the “RSSI” and “Transmit power” data of an established Bluetooth link. The system is in principle sufficient for the intended application (pedestrian, indoor), but it is certainly not a high resolution indoor location system. The achievable accuracy is dependent on the setup (number of access points and their constellation and available Bluetooth devices) but will...
Classical-Quantum Correspondence by Means of Probability Densities
Vegas, Gabino Torres; Morales-Guzman, J. D.
1996-01-01
Within the framework of the recently introduced phase space representation of nonrelativistic quantum mechanics, we propose a Lagrangian from which the phase space Schrödinger equation can be derived. From that Lagrangian, the associated conservation equations, according to Noether's theorem, are obtained. This shows that one can analyze quantum systems completely in phase space as is done in coordinate space, without additional complications.
METAPHOR: Probability density estimation for machine learning based photometric redshifts
Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2016-01-01
We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as internal engine to
Probability density fittings of corrosion test-data: Implications on ...
Indian Academy of Sciences (India)
half-circuit potential test data of other inhibitors in reported studies that had been submitted else- where (Okeniyi et al .... with the concrete to prevent leaching of the admixture solution into the wooden mould. Each of ... this connection, the instrument, through a true potentiostatic circuit, null any residual poten- tial difference ...
On the discretization of probability density functions and the ...
Indian Academy of Sciences (India)
1), e.g., approximation formulae for the mean and variance or methods based on Gaussian quadrature with N points which can handle 2N moments [8]. Instead of trying to improve the existing discretization methods, the focus of this paper is ...
On the Probability Density Functions of Forster-Greer-Thorbecke ...
African Journals Online (AJOL)
Distributional properties of poverty indices are generally unknown due to the fact that statistical inference for poverty measures are mostly ignored in the field of poverty analysis where attention is usually based on identification and aggregation problems. This study considers the possibility of using Pearson system of ...
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
Stotler, D.P.; Goldston, R.J.
1989-09-01
A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
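The Monte Carlo approach described here — sampling uncertain physics parameters from estimated distributions and counting the fraction of draws that ignite — can be sketched generically. The toy model below is not the CIT power balance code: the distributions, the Q formula, and the ignition threshold are all invented purely to illustrate uncertainty propagation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a global power balance: sample uncertain inputs, compute a
# crude energy multiplication factor Q, and count draws exceeding a threshold.
# Model and numbers are illustrative only, not the CIT code's physics.
n_trials = 100_000
H = rng.lognormal(mean=0.0, sigma=0.2, size=n_trials)    # confinement multiplier
peaking = rng.uniform(1.0, 2.0, size=n_trials)           # density profile factor
Q = 5.0 * H * peaking                                    # invented Q model
p_ignite = np.mean(Q > 10.0)                             # "ignition" fraction
print(round(p_ignite, 2))
```

The same pattern — draw parameters, run the deterministic model, tally the outcome — is what turns a point prediction into a "probability of ignition".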
Geometric modeling in probability and statistics
Calin, Ovidiu
2014-01-01
This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
Mutations in btk in patients with presumed X-linked agammaglobulinemia.
Conley, M E; Mathias, D; Treadaway, J; Minegishi, Y; Rohrer, J
1998-01-01
In 1993, two groups showed that X-linked agammaglobulinemia (XLA) was due to mutations in a tyrosine kinase now called Btk. Most laboratories have been able to detect mutations in Btk in 80%-90% of males with presumed XLA. The remaining patients may have mutations in Btk that are difficult to identify, or they may have defects that are phenotypically similar to XLA but genotypically different. We analyzed 101 families in which affected males were diagnosed as having XLA. Mutations in Btk were identified in 38 of 40 families with more than one affected family member and in 56 of 61 families with sporadic disease. Excluding the patients in whom the marked decrease in B cell numbers characteristic of XLA could not be confirmed by immunofluorescence studies, mutations in Btk were identified in 43 of 46 patients with presumed sporadic XLA. Two of the three remaining patients had defects in other genes required for normal B cell development, and the third patient was unlikely to have XLA, on the basis of results of extensive Btk analysis. Our techniques were unable to identify a mutation in Btk in one male with both a family history and laboratory findings suggestive of XLA. DNA samples from 41 of 49 of the mothers of males with sporadic disease and proven mutations in Btk were positive for the mutation found in their son. In the other 8 families, the mutation appeared to arise in the maternal germ line. In 20 families, haplotype analysis showed that the new mutation originated in the maternal grandfather or great-grandfather. These studies indicate that 90%-95% of males with presumed XLA have mutations in Btk. The other patients are likely to have defects in other genes. PMID:9545398
Physical Constructivism and Quantum Probability
Ozhigov, Yu. I.
2009-03-01
I describe the main ideas of constructive physics and its role for the probability interpretation of quantum theory. It is shown how the explicit probability space for quantum systems gives the formal representation of entanglement and decoherence.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Noize, Pernelle; Bazin, Fabienne; Pariente, Antoine; Dufouil, Carole; Ancelin, Marie-Laure; Helmer, Catherine; Moore, Nicholas; Fourrier-Réglat, Annie
2012-10-01
To evaluate the validity of chronic drug exposure presumed from cross-sectional interviews taking reimbursement data as reference. The study concerned 2,985 elderly persons of the French Three-City cohort (1) who were interviewed on current drug use 2 and 4 years after inclusion and (2) whose reimbursement data were obtained from the main health care insurance system. Validity (sensitivity, Se; specificity, Sp; positive predictive value, PPV; negative predictive value, NPV) of chronic exposure presumed from follow-up interviews was investigated taking two exposure definitions from reimbursements as reference for the period between interviews: at least 80% coverage with and without a maximal time between reimbursements of 60 days. Using 80% coverage as reference, validity of interview data was substantial for cardiovascular and antithrombotic drugs (Se, 85.3-95.4%; Sp, 67.1-97.6%; PPV, 65.9-86.6%; NPV, 93.3-99.3%). For benzodiazepines, nonsteroidal anti-inflammatory drugs, or analgesics, validity was low especially owing to PPVs (15.8-51.4%). Using reported use at cross-sectional interviews as a proxy for chronic exposure between interviews was valid for drugs used regularly but not so for drugs used more irregularly. Copyright © 2012 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Danielle Di Cavalcanti
Full Text Available To report the echocardiographic evaluation of 103 infants with presumed congenital Zika syndrome. An observational retrospective study was performed at Instituto de Medicina Integral Prof. Fernando Figueira (IMIP), Recife, Brazil, on 103 infants with presumed congenital Zika syndrome. All infants had microcephaly and head computed tomography findings compatible with congenital Zika syndrome. Zika IgM antibody was detected in cerebrospinal fluid samples of 23 infants. In 80 infants, the test was not performed because it was not available at that time. All infants had negative serology for HIV, syphilis, rubella, cytomegalovirus and toxoplasmosis. A complete transthoracic two-dimensional, M-mode, continuous wave and pulsed wave Doppler and color Doppler echocardiographic (PHILIPS HD11XE or HD15) examination was performed on all infants. 14/103 (13.5%) echocardiograms were compatible with congenital heart disease: 5 had an ostium secundum atrial septal defect, 8 had a hemodynamically insignificant small apical muscular ventricular septal defect and one infant with dyspnea had a large membranous ventricular septal defect. The echocardiograms considered normal included 45 infants with a persistent foramen ovale and 16 with a minimal patent ductus arteriosus. Preliminarily, this study suggests that congenital Zika syndrome may be associated with an increased prevalence of congenital heart disease. However, the types of defects noted were septal defects, a proportion of which would not be hemodynamically significant.
A Case of Presumed Tuberculosis Uveitis with Occlusive Vasculitis from an Endemic Region
Directory of Open Access Journals (Sweden)
Berna Başarır
2017-06-01
Full Text Available In this report, we present a case with presumed unilateral tuberculosis uveitis from an endemic region. A 23-year-old male presented with decreased vision in his left eye for 15 days. Visual acuities were 1.0 in his right eye and 0.3 in his left eye. Ophthalmologic examination was normal for the right eye. Slit-lamp examination revealed 2+ cells in the vitreous without anterior chamber reaction in his left eye. Fundus examination revealed occlusive vasculitis and granuloma. His history revealed that he had a respiratory infection with fever 3 months ago while visiting his native country, Rwanda, and was treated with non-specific antibiotic therapy. His visual symptom started 2 weeks after his systemic symptoms resolved. Laboratory findings included 15 mm induration in purified protein derivative tuberculin skin test, HIV negativity, and parenchymal lesions in chest X-ray. Bronchoalveolar lavage was negative for acid-fast bacillus. A pulmonary disease consultant reported presumed tuberculosis because of the patient’s history. Anti-tuberculosis treatment was initiated. The patient’s visual acuity improved rapidly and his signs regressed. A careful history should be taken from patients with uveitis. Travel to tuberculosis-endemic areas may be important for diagnosis and should be asked about directly.
Directory of Open Access Journals (Sweden)
Dipika V Patel
2013-01-01
Full Text Available A 28-year-old female with a history of contact lens wear presented with a 1 week history of pain and photophobia in her left eye. In vivo confocal microscopy (IVCM and corneal scrape confirmed the diagnosis of Acanthamoeba keratitis (AK which was treated with intensive topical propamidine isethionate (0.1% and chlorhexidine (0.02% with tapering dosage over 11 months. Five years after complete resolution of AK and cessation of all contact lens wear, the subject presented to her optometrist with a history of ocular discomfort and mild photophobia. Without further investigation she was prescribed topical corticosteroids. Three weeks later she presented with pain and reduced vision in the left eye. Slit-lamp examination revealed focal, inferior corneal stromal edema. IVCM confirmed widespread Acanthamoeba cysts. Treatment with topical polyhexamethylene biguanide (PHMB 0.02% and propamidine isethionate 0.1% resulted in resolution of the AK. Despite an initially mild AK, this subject presumably retained viable Acanthamoeba cysts in her cornea 5 years after the initial episode. This report highlights the importance of caution when using corticosteroids in patients with a previous history of AK, even in the relatively distant past. Patients with AK should be warned regarding the risks of recurrence following presumed resolution.
The probability of a tornado missile hitting a target
International Nuclear Information System (INIS)
Goodman, J.; Koch, J.E.
1983-01-01
It is shown that tornado missile transportation is a diffusion Markovian process. Therefore, the Green's function method is applied for the estimation of the probability of hitting a unit target area. This probability is expressed through a joint density of tornado intensity and path area, a probability of tornado missile injection and a tornado missile height distribution. (orig.)
Effects of Potential Lane-Changing Probability on Uniform Flow
International Nuclear Information System (INIS)
Tang Tieqiao; Huang Haijun; Shang Huayan
2010-01-01
In this paper, we use the car-following model with the anticipation effect of the potential lane-changing probability (Acta Mech. Sin. 24 (2008) 399) to investigate the effects of the potential lane-changing probability on uniform flow. The analytical and numerical results show that the potential lane-changing probability can enhance the speed and flow of uniform flow and that their increments are related to the density.
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
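The "probability distribution" notion in this abstract can be made concrete with two textbook cases (standard formulas, not taken from the paper): a discrete pmf whose values sum to 1, and a continuous density whose integral is 1:

```python
from math import comb, exp, pi, sqrt

# Discrete case: binomial pmf for 10 fair-coin tosses.
def binom_pmf(k, n=10, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

total_pmf = sum(binom_pmf(k) for k in range(11))
print(round(total_pmf, 6))   # a pmf sums to 1 over all outcomes

# Continuous case: standard normal density; a Riemann sum over [-6, 6]
# approximates the total probability (tails beyond 6 sigma are negligible).
def normal_pdf(x, mu=0.0, sigma=1.0):
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

dx = 0.01
area = sum(normal_pdf(-6 + i * dx) for i in range(1200)) * dx
print(round(area, 4))        # close to 1
```

The distinction the paper draws — probabilities of outcomes for a categorical variable versus a density for a continuous one — is exactly the difference between summing the pmf and integrating the pdf.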
Estimating loblolly pine size-density trajectories across a range of planting densities
Curtis L. VanderSchaaf; Harold E. Burkhart
2013-01-01
Size-density trajectories on the logarithmic (ln) scale are generally thought to consist of two major stages. The first is often referred to as the density-independent mortality stage where the probability of mortality is independent of stand density; in the second, often referred to as the density-dependent mortality or self-thinning stage, the probability of...
Probability measures, Lévy measures and analyticity in time
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich
2008-01-01
We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators, we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...
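The first of the three methods — approximating the subordinator's law by compound Poisson distributions — can be sketched numerically. In the example below the Gamma-subordinator choice, the parameters, and the quadrature grid are all mine, not the paper's: the Lévy density is truncated at a small epsilon, jumps above epsilon arrive at a finite Poisson rate, and the Monte Carlo mean of X_t is checked against the closed-form truncated mean.

```python
import numpy as np

rng = np.random.default_rng(2)

# Gamma subordinator: Levy density u(x) = a*exp(-b*x)/x, infinite activity at 0.
a, b, t, eps = 1.0, 1.0, 2.0, 0.01

# Jump rate of the compound-Poisson approximation: integral of u over
# [eps, inf), done here by trapezoidal quadrature on a log-spaced grid.
xg = np.logspace(np.log10(eps), 2, 4000)
ug = a * np.exp(-b * xg) / xg
lam = float(np.sum(0.5 * (ug[1:] + ug[:-1]) * np.diff(xg)))

def sample_jumps(n):
    """Rejection-sample n jumps from the normalised density u(x)/lam on
    [eps, inf): propose x = eps + Exp(b), accept with probability eps/x."""
    out = np.empty(0)
    while out.size < n:
        cand = eps + rng.exponential(1.0 / b, size=max(4 * n, 1000))
        out = np.concatenate([out, cand[rng.random(cand.size) < eps / cand]])
    return out[:n]

# Monte Carlo draws of X_t: Poisson number of jumps, then sum the jump sizes.
n_paths = 2000
counts = rng.poisson(lam * t, size=n_paths)
jumps = sample_jumps(counts.sum())
owner = np.repeat(np.arange(n_paths), counts)
samples = np.bincount(owner, weights=jumps, minlength=n_paths)

# Truncated mean has the closed form t*a*exp(-b*eps)/b (about 1.98 here).
print(round(samples.mean(), 1))
```

Shrinking eps makes the compound Poisson law converge to the subordinator's semigroup density, at the cost of a diverging jump rate — which is why the approximation is stated for the truncated Lévy measure.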
Probability Measures, Lévy Measures, and Analyticity in Time
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich
We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Energy Technology Data Exchange (ETDEWEB)
Yang, Jeannie C.; Ostlie, Daniel J. [Children' s Mercy Hospital, Department of Surgery, Kansas City, MO (United States); Rivard, Douglas C.; Morello, Frank P. [Children' s Mercy Hospital, Department of Radiology, Kansas City, MO (United States)
2008-08-15
A Meckel diverticulum is an embryonic remnant of the omphalomesenteric duct that occurs in approximately 2% of the population. Most are asymptomatic; however, they are vulnerable to inflammation with subsequent consequences including diverticulitis and perforation. We report an 11-year-old boy who underwent laparoscopic appendectomy for perforated appendicitis at an outside institution. During his convalescence he underwent percutaneous drainage of a presumed postoperative abscess. A follow-up drain study demonstrated an enteric fistula. The drain was slowly removed from the abdomen over a period of 1 week. Three weeks following drain removal the patient reported recurrent nausea and abdominal pain. A CT scan demonstrated a 3.7-cm rim-enhancing air-fluid level with dependent contrast consistent with persistent enteric fistula and abscess. Exploratory laparoscopy was performed, at which time a Meckel diverticulum was identified and resected. This case highlights the diagnostic challenge and limitations of conventional radiology in complicated Meckel diverticulum. (orig.)
Can "presumed consent" justify the duty to treat infectious diseases? An analysis
Directory of Open Access Journals (Sweden)
Arda Berna
2008-03-01
Full Text Available Abstract. Background: AIDS, SARS, and the recent epidemics of avian flu have all served to remind us of the debate over the limits of the moral duty to care. It is important to first consider the question of whether or not the "duty to treat" might be subject to contextual constraints. The purpose of this study was to investigate the opinions and beliefs held by both physicians and dentists regarding the occupational risks of infectious diseases, and to analyze the argument that the notion of "presumed consent" on the part of professionals may be grounds for supporting the duty to treat. Methods: For this cross-sectional survey, the study population was selected from among physicians and dentists in Ankara. All of the 373 participants were given a self-administered questionnaire. Results: In total, 79.6% of the participants said that they either had some degree of knowledge about the risks when they chose their profession or that they learned of the risks later during their education and training. Of the participants, 5.2% said that they would not have chosen this profession if they had been informed of the risks. It was found that 57% of the participants believed that there is a standard level of risk, and 52% of the participants stated that certain diseases would exceed the level of acceptable risk unless specific protective measures were implemented. Conclusion: If we use the presumed consent argument to establish the duty of the HCW to provide care, we are confronted with problems ranging over the difficulty of choosing a profession autonomously, the constant level of uncertainty present in the medical profession, the near-impossibility of being able to evaluate retrospectively whether every individual was informed, and the seemingly inescapable problem that this practice would legitimize, and perhaps even foster, discrimination against patients with certain diseases. Our findings suggest that another problem can be added to the list: one
Radia, Meera; Gilhooley, Michael James; Panos, Chris; Claoué, Charles
2015-07-16
Keratosis follicularis (Darier's disease) is an autosomal dominant dermatological disorder characterised by abnormal epidermal differentiation and loss of normal cell-to-cell adhesion. Cardinal features include diffuse hyperkeratotic warty papules with scaly plaques in seborrhoeic regions with associated mucous membrane changes. Darier's disease is rare (prevalence 2.7 in 100,000), with few ocular sequelae reported: commonly dry eye with or without Sjögren's syndrome. This is the first report, to the best of our knowledge, to describe a case of recurrent herpes simplex virus (HSV) keratitis and episcleritis in a 47-year-old man suffering from Darier's disease. The patient's condition predisposed him towards developing ocular complications due to several factors: impaired desmosome function leading to poor cell-to-cell adhesion in the corneal epithelium, dry eye and HSV invasion of inflamed periocular skin presumably combining to allow viral colonisation of a poorly protected cornea. 2015 BMJ Publishing Group Ltd.
Coverage Probability of Random Intervals
Chen, Xinjia
2007-01-01
In this paper, we develop a general theory on the coverage probability of random intervals defined in terms of discrete random variables with continuous parameter spaces. The theory shows that the minimum coverage probabilities of random intervals with respect to corresponding parameters are achieved at discrete finite sets and that the coverage probabilities are continuous and unimodal when parameters are varying in between interval endpoints. The theory applies to common important discrete ...
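The phenomenon this abstract describes — coverage probability as a function of a continuous parameter, oscillating between interval endpoints and attaining its minimum at particular points — is easy to observe numerically. A standard illustration (my choice, not the paper's construction) is the Wald interval for a binomial proportion:

```python
from math import comb, sqrt

def wald_coverage(n, p, z=1.96):
    """Exact coverage probability of the Wald interval for Binomial(n, p):
    sum the pmf over the outcomes k whose interval contains the true p."""
    cov = 0.0
    for k in range(n + 1):
        ph = k / n
        half = z * sqrt(ph * (1 - ph) / n)
        if ph - half <= p <= ph + half:
            cov += comb(n, k) * p**k * (1 - p)**(n - k)
    return cov

# Coverage oscillates in p and can fall far below the nominal 95%,
# especially near the boundary of the parameter space.
grid = [i / 200 for i in range(1, 200)]
cmin = min(wald_coverage(50, p) for p in grid)
print(round(cmin, 3), round(wald_coverage(50, 0.5), 3))
```

Because the interval endpoints move only at the finitely many values of k, the coverage function is piecewise smooth in p — the structure that makes minimum-coverage statements over discrete finite sets possible.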
M. van Heerde (Marc); K. Biermann (Katharina); P.E. Zondervan (Pieter); G. Kazemier (Geert); C.H.J. van Eijck (Casper); C.J. Pek (Chulja); E.J. Kuipers (Ernst); H.R. van Buuren (Henk)
2012-01-01
Background: Occasionally patients undergoing resection for presumed malignancy of the pancreatic head are diagnosed postoperatively with benign disease. Autoimmune pancreatitis (AIP) is a rare disease that mimics pancreatic cancer. We aimed to determine the prevalence of benign disease
Monica Shukla; J P Srivastava; V K Srivastava; S C Saxena; Seema Nigam
2004-01-01
Research Question: What is the attitude of young females towards their husband or sex partners, presuming them infected with HIV? Objectives: Attitude of young slum-dwelling females towards husband or sex partner, presuming them HIV infected: according to age of respondents; according to marital status of respondents; according to occupation of respondents; according to literacy status of respondents. Study Design: Cross-sectional study. Study Area: 10% of the slums of Kanpur City having population les...
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
Forbidden Transition Probabilities of Astrophysical Interest among ...
Indian Academy of Sciences (India)
astrophysical plasma densities are very low, the probability of collisions is small and many states decay by M1 or E2 ... The differences were computed according to the formula [gf(l) − gf(v)] × 100/max{gf(l), gf(v)}. Sample table entries (transition: gf(l), gf(v), diff. %): 3d a 2G − 3d a 2D: 1.518E−11, 1.683E−11, 9.8%; 3d a 2G − 3d a 2G: 5.290E−07.
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
Probability machines: consistent probability estimation using nonparametric learning machines.
Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A
2012-01-01
Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
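The "probability machine" idea (any learner consistent for nonparametric regression of a 0/1 response estimates the conditional class probability) can be sketched with a minimal k-nearest-neighbour estimator in Python. This is an illustration of the principle only, not the authors' R code; the one-dimensional data-generating process is invented for the example:

```python
import random

def knn_probability(train, x, k):
    """Estimate P(Y=1 | X=x) as the mean label of the k nearest
    training points: regressing a 0/1 label on X with a consistent
    nonparametric method estimates the conditional probability."""
    neighbours = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return sum(y for _, y in neighbours) / k

random.seed(0)
# Synthetic data with known truth: P(Y=1 | X=x) = x on [0, 1].
train = []
for _ in range(5000):
    x = random.random()
    train.append((x, 1 if random.random() < x else 0))

est = knn_probability(train, 0.7, k=200)
print(round(est, 2))  # should land close to the true value 0.7
```

The same averaging happens inside a random forest, where the "neighbourhood" of a query point is defined by the trees' leaves rather than by a fixed metric.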
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
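The central effect, that setting a threshold from estimated parameters raises the expected failure frequency above the nominal level, can be reproduced with a short Monte Carlo sketch. The setting here (normal losses, threshold at the estimated 95th percentile from n = 10 observations) is an illustrative assumption, not the paper's exact construction:

```python
import random
import statistics

random.seed(1)
Z95 = 1.6449                      # standard normal 95% quantile
n, reps = 10, 20_000
exceed = 0
for _ in range(reps):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    mu_hat = statistics.fmean(sample)
    sd_hat = statistics.stdev(sample)
    # plug-in threshold: nominally a 5% failure probability
    threshold = mu_hat + Z95 * sd_hat
    new_loss = random.gauss(0.0, 1.0)
    if new_loss > threshold:
        exceed += 1

freq = exceed / reps
print(freq)  # expected failure frequency exceeds the nominal 0.05
```

Because the normal family is location-scale, this frequency does not depend on the true mean and standard deviation, only on the sample size n, which mirrors the paper's point that such failure probabilities can be calculated exactly without knowing the true parameters.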
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Logic with a Probability Semantics
Hailperin, Theodore
2010-01-01
The present study is an extension of the topic introduced in Dr. Hailperin's Sentential Probability Logic, where the usual true-false semantics for logic is replaced with one based more on probability, and where values ranging from 0 to 1 are subject to probability axioms. Moreover, as the word "sentential" in the title of that work indicates, the language there under consideration was limited to sentences constructed from atomic sentences (those with no inner logical components) by use of sentential connectives ("not," "and," "or," etc.) but not including quantifiers ("for all," "there is"). An initial
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Presumed congenital infection by Zika virus: findings on psychomotor development - a case report
Directory of Open Access Journals (Sweden)
Ana Carla Gomes Botelho
Introduction: The identification of Zika virus (ZikV) in the amniotic fluid, in the placenta and in newborns' brains suggests a neurotropism of this agent during brain development, resulting in neuro-psycho-motor alterations. Thus, the present study reports the assessment of children diagnosed with a congenital infection, presumably by ZikV, followed up at the Rehabilitation Center Prof. Ruy Neves Baptist at the Instituto de Medicina Integral Prof. Fernando Figueira (IMIP). Description: As proposed by the Ministry of Health, the following instruments were used to evaluate the neuro-motor functions of four children with microcephaly aged between three and four months: the Test of Infant Motor Performance (TIMP); the functional vision assessment; the manual function development scale; and the clinical evaluation protocol on pediatric dysphagia (PAD-PED). Discussion: The children evaluated presented atypical motor performance; muscle tone and spontaneous motricity, including the symmetry and range of motion of the upper and lower limbs, were shown to be altered. Functional vision showed alterations that can limit the performance of functional activities and learning. Regarding the functions of the speech articulators, the maturation and coordination of sucking, swallowing and breathing had not yet reached the level expected for the children's age.
The syndrome of presumed ocular histoplasmosis in Mexico: a preliminary study.
Pedroza-Seres, M; Quiroz-Mercado, H; Granados, J; Taylor, M L
1994-01-01
A study to screen for the syndrome of presumed ocular histoplasmosis (SPOH) among native populations from three Mexican states was performed. Two of these states, Guerrero and Querétaro, were selected because histoplasmosis is endemic there, whereas Tlaxcala was considered a control, due to the absence of reported cases. A total of 253 individuals underwent ocular fundus examination to obtain evidence of SPOH. A high percentage of positive reactors to the histoplasmin skin test (ST) was observed in Guerrero (83%) and Querétaro (53%), whereas in Tlaxcala positive STs were almost absent (2.04%). Only five individuals had retinal lesions, although these lesions were not characteristic of the syndrome. Stimulation of these individuals' cells showed different patterns in the histoplasmin-induced lymphocyte transformation response, and two out of five individuals with retinal lesions presented a stimulated response, as did three controls without lesions. Histocompatibility antigens (HLA) were determined in a sample of each population and no particular allele, including HLA-B7, was found to be related to SPOH as reported in the USA; however, HLA-B22 was found in three individuals who developed pulmonary histoplasmosis. The results do not provide clinical evidence, or data on specific HLA risk factors, for the presence of SPOH in the population studied.
Clinical Course of a Presumed Metastatic Uveal Melanoma to the Contralateral Choroid.
Caminal Mitjana, Josep M; Vilà Grané, Natàlia; Adán Civera, Alfredo; Sabater, Noelia; Arias Barquet, Lluis; Rubio Caso, Marcos J
2015-01-01
We present the ultrasound and optical coherence tomography follow-up of a presumed choroidal metastasis from a contralateral melanoma. A 53-year-old male was diagnosed with uveal melanoma with extrascleral extension in his left eye. A year later, fundus examination revealed a flat, gray-green, pigmented choroidal lesion in the right eye. Ultrasonography showed an almost flat mass, and all these findings were compatible with a choroidal melanocytic lesion with risk factors for growth. One month later, melanocytic skin lesions appeared on the scalp, as well as small tumors. Three months later, B-scan ultrasonography showed growth in tumor size. The patient deteriorated progressively and died. Three possibilities can explain the occurrence of a choroidal pigmented tumor in the contralateral eye: first, bilateral primary choroidal melanomas; second, both choroidal tumors are metastatic in origin from an unknown primary melanoma; and third, the contralateral tumor is a metastasis from the primary choroidal melanoma.
Presumable incipient hybrid speciation of door snails in previously glaciated areas in the Caucasus.
Koch, Eva L; Neiber, Marco T; Walther, Frank; Hausdorf, Bernhard
2016-04-01
Homoploid hybrid speciation, speciation by hybridization without a change in chromosome number, may be the result of an encounter of closely related species in a habitat that is different from that usually occupied by these species. In the northwestern Caucasus the land snail species Micropontica caucasica and M. circassica form two distinct entities with little admixture at low and intermediate altitudes. However, at higher altitudes in the Lagonaki plateau, which were repeatedly glaciated, Micropontica populations with intermediate characters occur. Admixture analyses based on AFLP data demonstrated that the populations from the Lagonaki plateau are homoploid hybrids that now form a cluster separate from the parental species. The Lagonaki populations are characterized by a mtDNA haplotype clade that has been found in the parental species only once. The fixation of this haplotype clade in most hybrid populations suggests that these haplotypes are better adapted to the cooler conditions in high altitude habitats and have replaced the haplotypes of the parental species in a selective sweep. The fixation of a presumably adaptive mitochondrial haplotype clade in the Lagonaki populations is an important step towards speciation under the differential fitness species concept. Copyright © 2015 Elsevier Inc. All rights reserved.
Excimer Laser Phototherapeutic Keratectomy for the Treatment of Clinically Presumed Fungal Keratitis
Directory of Open Access Journals (Sweden)
Liang-Mao Li
2014-01-01
This retrospective study evaluated the treatment outcomes of excimer laser phototherapeutic keratectomy (PTK) for clinically presumed fungal keratitis. Forty-seven eyes of 47 consecutive patients underwent manual superficial debridement and PTK. All corneal lesions were located in the anterior stroma and had been resistant to medication therapy for at least one week. Data were collected by a retrospective chart review, with at least six months of follow-up data available. After PTK, infected corneal lesions were completely removed and the clinical symptoms resolved in 41 cases (87.2%). The mean ablation depth was 114.39±45.51 μm and the mean ablation diameter was 4.06±1.07 mm. The mean time for healing of the epithelial defect was 8.8±5.6 days. Thirty-four eyes (82.9%) showed an improvement in best spectacle-corrected visual acuity of two or more lines. PTK complications included mild to moderate corneal haze, hyperopic shift, irregular astigmatism, and corneal thinning. Six eyes (12.8%) showed progression of the infection and required conjunctival flap covering, amniotic membrane transplantation, or penetrating keratoplasty. PTK is a valuable therapeutic alternative for superficial infectious keratitis. It can effectively eradicate lesions, hasten reepithelialization, and restore and preserve useful visual function. However, the selection of surgical candidates should be conducted carefully.
ISC origin times for announced and presumed underground nuclear explosions at several test sites
International Nuclear Information System (INIS)
Rodean, H.C.
1979-01-01
Announced data for US and French underground nuclear explosions indicate that nearly all detonations have occurred within one or two tenths of a second after the minute. This report contains ISC origin-time data for announced explosions at two US test sites and one French test site, and includes similar data for presumed underground nuclear explosions at five Soviet sites. Origin-time distributions for these sites are analyzed for those events that appeared to be detonated very close to the minute. Particular attention is given to the origin times for the principal US and Soviet test sites in Nevada and Eastern Kazakhstan. The mean origin times for events at the several test sites range from 0.4 s to 2.8 s before the minute, with the earlier mean times associated with the Soviet sites and the later times with the US and French sites. These times indicate lower seismic velocities beneath the US and French sites, and higher velocities beneath the sites in the USSR. (9 figures, 8 tables)
High-density limit of quantum chromodynamics
International Nuclear Information System (INIS)
Alvarez, E.
1983-01-01
By means of a formal expansion of the partition function presumably valid at large baryon densities, the propagator of the quarks is expressed in terms of the gluon propagator. This result is interpreted as implying that correlations between quarks and gluons are unimportant at high enough density, so that a kind of mean-field approximation gives a very accurate description of the physical system
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Probably not: future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...
Clinical Features and Risk Factors of Patients with Presumed Ocular Toxoplasmosis
Fuh, Ukamaka Celestina; Omoti, Afekhide E.; Enock, Malachi E.
2016-01-01
Purpose: To determine the clinical features and risk factors of presumed ocular toxoplasmosis (POT) in patients affected with the condition at Irrua, Nigeria. Methods: The study included 69 patients with POT, and 69 age and sex matched subjects who served as the control group. Data was obtained using interviewer administered questionnaires. Examination included measurement of visual acuity (VA), intraocular pressure (IOP), slit lamp examination, gonioscopy and dilated fundus examination. Results: Mean age of cases and control subjects was 57.16 ± 18.69 and 56.09 ± 16.01 years respectively. The peak age group in patients with POT was 60 years and above. The most common presenting complaint was blurred vision occurring in 100% of cases. Drinking unfiltered water in 58 (84.1%) patients was the most common risk factor. Other risk factors included post cataract surgery status in 32 (46.4%) subjects, ingestion of poorly cooked meat in 30 (43.5%) cases and exposure to cats in 9 (13.0%) patients. All risk factors were more common in POT patients (P < 0.05). Out of 69 patients, 62 (89.9%) had unilateral while 7 (10.1%) had bilateral involvement. Out of 76 eyes with uveitis, 53 (69.7%) were blind. Active disease was significantly more common with increasing age (P < 0.05). Conclusion: Patients with POT were rather old and some risk factors were modifiable, therefore health education for preventing the transmission of toxoplasmosis and provision of sanitary water may help reduce the incidence of ocular toxoplasmosis. PMID:27195085
Joint maximum-likelihood magnitudes of presumed underground nuclear test explosions
Peacock, Sheila; Douglas, Alan; Bowers, David
2017-08-01
Body-wave magnitudes (mb) of 606 seismic disturbances caused by presumed underground nuclear test explosions at specific test sites between 1964 and 1996 have been derived from station amplitudes collected by the International Seismological Centre (ISC), by a joint inversion for mb and station-specific magnitude corrections. A maximum-likelihood method was used to reduce the upward bias of network mean magnitudes caused by data censoring, where arrivals at stations that do not report arrivals are assumed to be hidden by the ambient noise at the time. Threshold noise levels at each station were derived from the ISC amplitudes using the method of Kelly and Lacoss, which fits to the observed magnitude-frequency distribution a Gutenberg-Richter exponential decay truncated at low magnitudes by an error function representing the low-magnitude threshold of the station. The joint maximum-likelihood inversion is applied to arrivals from the sites: Semipalatinsk (Kazakhstan) and Novaya Zemlya, former Soviet Union; Singer (Lop Nor), China; Mururoa and Fangataufa, French Polynesia; and Nevada, USA. At sites where eight or more arrivals could be used to derive magnitudes and station terms for 25 or more explosions (Nevada, Semipalatinsk and Mururoa), the resulting magnitudes and station terms were fixed and a second inversion carried out to derive magnitudes for additional explosions with three or more arrivals. 93 more magnitudes were thus derived. During processing for station thresholds, many stations were rejected for sparsity of data, obvious errors in reported amplitude, or great departure of the reported amplitude-frequency distribution from the expected left-truncated exponential decay. Abrupt changes in monthly mean amplitude at a station apparently coincide with changes in recording equipment and/or analysis method at the station.
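The censoring bias that the joint maximum-likelihood inversion corrects can be sketched in miniature. Assuming Gaussian station scatter and a single common detection threshold (simplifying assumptions, not the ISC processing chain), the naive mean of detected station magnitudes is biased upward, while a likelihood that adds a normal-CDF term for each silent station largely removes the bias; a grid search stands in for a proper optimizer:

```python
import math
import random

def log_phi(z):
    """log of the standard normal CDF, via erfc for numerical safety."""
    return math.log(max(0.5 * math.erfc(-z / math.sqrt(2.0)), 1e-300))

def censored_ml(detections, n_missed, threshold, sigma):
    """Grid-search ML magnitude when stations whose value falls below
    `threshold` report nothing (left-censored observations)."""
    best_m, best_ll = threshold, -math.inf
    steps = 800
    for i in range(steps + 1):
        m = threshold - 2.0 + 4.0 * i / steps
        ll = sum(-0.5 * ((x - m) / sigma) ** 2 for x in detections)
        # each silent station contributes log P(value < threshold | m)
        ll += n_missed * log_phi((threshold - m) / sigma)
        if ll > best_ll:
            best_m, best_ll = m, ll
    return best_m

random.seed(3)
true_m, sigma, thr, n_sta = 4.9, 0.4, 5.0, 30
naive_err = ml_err = 0.0
trials = 200
for _ in range(trials):
    obs = [random.gauss(true_m, sigma) for _ in range(n_sta)]
    det = [x for x in obs if x >= thr]          # only above-threshold reports
    naive_err += sum(det) / len(det) - true_m
    ml_err += censored_ml(det, n_sta - len(det), thr, sigma) - true_m
print(naive_err / trials, ml_err / trials)  # naive mean biased high; ML nearly unbiased
```

The full method additionally estimates per-station corrections and per-station thresholds from the data (the Kelly and Lacoss fit), but the bias mechanism and its correction are the same as in this toy version.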
Kobayashi, Atsushi; Parchi, Piero; Yamada, Masahito; Mohri, Shirou; Kitamoto, Tetsuyuki
2016-06-01
As an experimental model of acquired Creutzfeldt-Jakob disease (CJD), we performed transmission studies of sporadic CJD using knock-in mice expressing human prion protein (PrP). In this model, the inoculation of the sporadic CJD strain V2 into animals homozygous for methionine at polymorphic codon 129 (129M/M) of the PRNP gene produced quite distinctive neuropathological and biochemical features, that is, widespread kuru plaques and intermediate-type abnormal PrP (PrPSc). Interestingly, this distinctive combination of molecular and pathological features has been, to date, observed in acquired CJD but not in sporadic CJD. Assuming that these distinctive phenotypic traits are specific for acquired CJD, we revisited the literature and found two cases showing widespread kuru plaques despite the 129M/M genotype, in a neurosurgeon and in a patient with a medical history of neurosurgery without dura mater grafting. By Western blot analysis of brain homogenates, we revealed the intermediate type of PrPSc in both cases. Furthermore, the transmission properties of brain extracts from these two cases were indistinguishable from those of a subgroup of dura mater graft-associated iatrogenic CJD caused by infection with the sporadic CJD strain V2. These data strongly suggest that the two atypical CJD cases, previously thought to represent sporadic CJD, very likely acquired the disease through exposure to prion-contaminated brain tissues. Thus, we propose that the distinctive combination of the 129M/M genotype, kuru plaques, and intermediate-type PrPSc represents a reliable criterion for the identification of acquired CJD cases among presumed sporadic cases. © 2015 Japanese Society of Neuropathology.
[Childhood vaccinations anno 2004. II. The real and presumed side effects of vaccination].
Rümke, H C; Visser, H K
2004-02-21
Vaccinations protect to a high degree against infectious diseases, but may cause side effects. In the Netherlands, adverse events following immunization have been registered and analysed since 1962 by the National Institute of Health and Environment (RIVM). Since 1983 a permanent committee of the Dutch Health Council has reviewed adverse events reported to the RIVM. With the so-called killed vaccines the side effects are mainly local (redness, swelling, pain) or general (fever, listlessness, irritability, sleep and eating problems). They are seen mainly after DPT-IPV vaccination against diphtheria, pertussis, tetanus and poliomyelitis. Some side effects occur rarely (collapse reactions, discoloured legs, persistent screaming and convulsions), and very rarely serious neurological events are reported. After MMR vaccination against measles, mumps and rubella, cases of arthritis, thrombocytopenia and ataxia are reported sporadically; they usually resolve spontaneously. In recent years a range of diseases and symptoms have been associated with vaccination (presumed side effects). Careful and extensive investigations have shown that such hypotheses could not be supported. Examples are allergic diseases such as asthma, diabetes mellitus, multiple sclerosis (after hepatitis B vaccination), autism and inflammatory bowel disease (after MMR vaccination), and sudden infant death syndrome. Cases in which at least a possible relation between side effects and vaccination is observed (apart from local reactions and moderate general symptoms) are very rare (about 0.25 per 1000 vaccinations), and these risks do not outweigh the benefits of vaccination. There appears to be increasing doubt about the usefulness and safety of vaccinations. More research is needed into people's motives for choosing for or against vaccination. Education about vaccination for parents and for the professionals involved in vaccination has to be improved. The internet can play an important role.
Chorioretinal Lesions Presumed Secondary to Zika Virus Infection in an Immunocompromised Adult.
Henry, Christopher R; Al-Attar, Luma; Cruz-Chacón, Alexis M; Davis, Janet L
2017-04-01
Zika virus has spread rapidly throughout the Americas since 2015. The public health implications of Zika virus infection lend special importance to identifying the virus in unsuspected hosts. To describe relevant imaging studies and clinical features of chorioretinal lesions that are presumably associated with Zika virus and that share analogous features with chorioretinal lesions reported in cases of Dengue fever and West Nile virus. This is a case report from an academic referral center in Miami, Florida, of a woman in her 60s from Guaynabo, Puerto Rico, who presented with reduced visual acuity and bilateral diffuse, subretinal, confluent, placoid, and multifocal chorioretinal lesions. The patient was observed over a 5-month period. Visual acuity, clinical course, and multimodal imaging study results. Fluorescein angiography revealed early hypofluorescence and late staining of the chorioretinal lesions. Optical coherence tomography demonstrated outer retinal disruption in the placoid macular lesions. Zika RNA was detected in a plasma sample by real-time reverse transcription polymerase chain reaction testing and was suspected to be the cause of chorioretinal lesions after other viral and infectious causes were ruled out. Three weeks after the onset of symptoms, the patient's visual acuity had improved to 20/60 OD and 20/25 OS, with intraocular pressures of 18 mm Hg OD and 19 mm Hg OS. In 6 weeks, the chorioretinal lesions had healed and visual acuity had improved to 20/25 OD and 20/20 OS. Follow-up optical coherence tomography demonstrated interval recovery of the outer retina and photoreceptors. Acute-onset, self-resolving, placoid, or multifocal nonnecrotizing chorioretinal lesions may be a feature of active Zika virus chorioretinitis, as reported in other Flavivirus infections in adults. Similar findings in potentially exposed adults suggest that clinicians should consider IgM antibody or polymerase chain reaction testing for Zika virus as well as diagnostic
Directory of Open Access Journals (Sweden)
Nguyen Tien Huy
BACKGROUND AND PURPOSE: Successful outcomes from bacterial meningitis require rapid antibiotic treatment; however, unnecessary treatment of viral meningitis may lead to increased toxicity and expense. Thus, improved diagnostics are required to maximize treatment benefit and minimize side effects and cost. Thirteen clinical decision rules have been reported to distinguish bacterial from viral meningitis. However, few rules have been tested and compared in a single study, and several rules are yet to be tested by independent researchers or in pediatric populations. Thus, simultaneous testing and comparison of these rules are required to enable clinicians to select an optimal diagnostic rule for bacterial meningitis in settings and populations similar to ours. METHODS: A retrospective cross-sectional study was conducted at the Infectious Department of Pediatric Hospital Number 1, Ho Chi Minh City, Vietnam. The performance of the clinical rules was evaluated by the area under the receiver operating characteristic curve (ROC-AUC) using the method of DeLong, with the McNemar test for specificity comparison. RESULTS: Our study included 129 patients, of whom 80 had bacterial meningitis and 49 had presumed viral meningitis. Spanos's rule had the highest AUC at 0.938 but was not significantly greater than the other rules. No rule provided 100% sensitivity with a specificity higher than 50%. Based on our calculation of theoretical sensitivity and specificity, we suggest that a perfect rule requires at least four independent variables that each possess both sensitivity and specificity higher than 85-90%. CONCLUSIONS: No clinical decision rule provided an acceptable specificity (>50%) with 100% sensitivity when applied to our data set in children. More studies in Vietnam and other developing countries are required to develop and/or validate clinical rules, and very good biomarkers are required to develop such a perfect rule.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The user, however, has to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
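The idea of banding a normal probability plot can be sketched numerically. The function below, using only NumPy and the standard library, builds pointwise (not the paper's simultaneous, which are wider) 1-α Monte Carlo bands for a Q-Q plot of a standardized sample; the function name and defaults are illustrative.

```python
import numpy as np
from statistics import NormalDist

def qq_bands(n, alpha=0.05, sims=2000, seed=0):
    """Pointwise (1 - alpha) Monte Carlo bands for a normal Q-Q plot of a
    standardized sample of size n. Illustration only: the paper constructs
    *simultaneous* 1 - alpha intervals, which are necessarily wider."""
    rng = np.random.default_rng(seed)
    nd = NormalDist()
    # Theoretical normal quantiles at plotting positions (i - 0.5) / n.
    q = np.array([nd.inv_cdf((i - 0.5) / n) for i in range(1, n + 1)])
    # Empirical distribution of each order statistic under normality.
    sorted_sims = np.sort(rng.normal(size=(sims, n)), axis=1)
    lo = np.quantile(sorted_sims, alpha / 2, axis=0)
    hi = np.quantile(sorted_sims, 1 - alpha / 2, axis=0)
    return q, lo, hi

# Usage: plot the sorted, standardized sample against q; points falling
# outside [lo, hi] cast doubt on normality.
q, lo, hi = qq_bands(100)
```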
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book follows the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory, and establishes this relation early. Probability measures in product spaces are introduced right at the start, by way of laying the groundwork needed later to claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
Flood hazard probability mapping method
Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart
2015-04-01
In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil-type data and flood-risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling at a small scale, whereas only a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects at a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use and soil data, among other PCDs, were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method was developed to extract spatial data for multiple catchments and to produce soft data for statistical analysis. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features of the landscape. By using PCD data, realistic representations of high-probability flood regions were made, irrespective of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
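The weighted flood-probability index described above can be sketched as a simple raster overlay. The layer names and weights below are illustrative placeholders, not the PCDs or weights calibrated in the study, and the layers are assumed oriented so that larger values mean more flood-prone.

```python
import numpy as np

def flood_index(layers, weights):
    """Composite flood-probability index per grid cell: each catchment
    descriptor layer is min-max normalised to [0, 1] and combined as a
    weighted sum. Layers are assumed oriented so that larger values mean
    more flood-prone; names and weights here are illustrative only."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalise weights to sum 1
    idx = np.zeros_like(next(iter(layers.values())), dtype=float)
    for wi, layer in zip(w, layers.values()):
        lo, hi = layer.min(), layer.max()
        idx += wi * (layer - lo) / (hi - lo)     # min-max scaling per layer
    return idx                                   # index values in [0, 1]

wetness = np.array([[0.0, 1.0], [1.0, 0.0]])     # hypothetical PCD layers
flatness = np.array([[1.0, 2.0], [3.0, 4.0]])
idx = flood_index({"wetness": wetness, "flatness": flatness}, [0.5, 0.5])
```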
Lindeboom, J. A.; Mathura, K. R.; Harkisoen, S.; van den Akker, H. P.; Ince, C.
2005-01-01
Microvascular changes due to smoking are frequently presumed in models because of the known negative effect of smoking on the microcirculation. We hypothesized that cigarette smoke might lead to a decrease in gingival capillary density. Capillary density was assessed with orthogonal
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell.
Probability measures on metric spaces
Parthasarathy, K R
2005-01-01
In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self-study book by the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper explores how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The probability tables serve two purposes: to obtain a dense data representation, and to calculate integrals by quadratures. They are mainly used in the USA for Monte Carlo calculations, and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
2016-06-01
Reports an error in "Presumed fair: Ironic effects of organizational diversity structures" by Cheryl R. Kaiser, Brenda Major, Ines Jurcevic, Tessa L. Dover, Laura M. Brady and Jenessa R. Shapiro (Journal of Personality and Social Psychology, 2013[Mar], Vol 104[3], 504-519). In the article, a raw data merging error in one racial discrimination claim condition from Experiment 6 inadvertently resulted in data analyses on an inaccurate data set. When the error was discovered by the authors and corrected, all analyses reported in Experiment 6 for claim validity, seriousness of the claim, and support for the claimant were inaccurate and none were statistically significant. The conclusions should be altered to indicate that participants with management experience who reflected on their own workplace diversity policies did not show the predicted effects. The literature review, remaining five studies, and remaining conclusions in the article are unaffected by this error. Experiment 6 should also report that 26.4% (not 26.4.7%) of participants had a graduate degree and eight participants (not 8%) did not provide educational data. Experiment 5 should have referred to the claim validity measure as a six-item measure (α = .92) rather than a five-item measure; analyses on claim validity are accurate in text. Table 2's note should have said standard errors, not standard deviations. (The following abstract of the original article appeared in record 2012-31077-001.) This research tests the hypothesis that the presence (vs. absence) of organizational diversity structures causes high-status group members (Whites, men) to perceive organizations with diversity structures as procedurally fairer environments for underrepresented groups (racial minorities, women), even when it is clear that underrepresented groups have been unfairly disadvantaged within these organizations. Furthermore, this illusory sense of fairness derived from the mere presence of diversity structures causes high
U.S. Environmental Protection Agency — Road density is generally highly correlated with amount of developed land cover. High road densities usually indicate high levels of ecological disturbance.
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest known earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as of interest. This probability gives ground accelerations in the range 5-20% g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake, and assuming the focal zone to be in the lower crust, we find the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)
Kolmogorov complexity and probability measures
Czech Academy of Sciences Publication Activity Database
Šindelář, Jan; Boček, Pavel
2002-01-01
Vol. 38, No. 6 (2002), pp. 729-745. ISSN 0023-5954. R&D Projects: GA ČR GA102/99/1564. Institutional research plan: CEZ:AV0Z1075907. Keywords: Kolmogorov complexity; probability measure. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.341, year: 2002
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
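A minimal sketch of such a decaying forecast, assuming (purely for illustration) an exponential onset-delay distribution rather than the empirically fitted, longitude-dependent delays used by the authors: the forecast is the Bayesian posterior probability that an event will still occur, given that none has been observed after a given elapsed time.

```python
import math

def decayed_event_probability(p0, hours_elapsed, mean_delay_h=12.0):
    """Probability that an SEP event will still occur given that none has
    been observed `hours_elapsed` after the flare. Bayesian update with an
    ASSUMED exponential onset-delay distribution; the paper instead derives
    delay times empirically as a function of source longitude."""
    survival = math.exp(-hours_elapsed / mean_delay_h)  # P(onset later than t)
    return p0 * survival / (p0 * survival + (1.0 - p0))

# Example: an initial 40% forecast decays as no onset is observed.
forecast = [decayed_event_probability(0.4, t) for t in (0, 6, 12, 24, 48)]
```

The update has the right limiting behaviour: at zero elapsed time it returns the initial forecast, and it decays toward zero as the plausible onset window passes without an event.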
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
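The idea can be illustrated with a toy nonparametric probability machine. Here a k-nearest-neighbour estimate of P(Y = 1 | X = x) stands in for the random forest used in the paper, and the counterfactual effect size of a binary predictor is the average change in predicted probability when that predictor is switched, holding the other features fixed. All data and parameters below are simulated and illustrative.

```python
import numpy as np

def knn_prob(X, y, x0, k=101):
    """Nonparametric estimate of P(Y = 1 | X = x0): the fraction of
    positives among the k nearest training points. A stand-in for the
    random-forest probability machines used in the paper."""
    d = np.linalg.norm(X - x0, axis=1)
    return y[np.argsort(d)[:k]].mean()

# Simulated logistic data: a binary predictor (column 0) and a covariate.
rng = np.random.default_rng(7)
n = 4000
X = rng.normal(size=(n, 2))
X[:, 0] = (X[:, 0] > 0).astype(float)
logit = -0.5 + 1.2 * X[:, 0] + 0.8 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Counterfactual effect size: average change in predicted probability when
# the binary predictor is switched from 0 to 1, other features held fixed.
sample = X[rng.choice(n, 200, replace=False)]
x_on, x_off = sample.copy(), sample.copy()
x_on[:, 0], x_off[:, 0] = 1.0, 0.0
effect = float(np.mean([knn_prob(X, y, a) - knn_prob(X, y, b)
                        for a, b in zip(x_on, x_off)]))
```

Because the data really are logistic here, the estimated risk difference should land near the value implied by the logistic coefficients, without the estimator ever assuming that model form.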
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure for estimating the Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test-case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha...
Measurement Invariance, Entropy, and Probability
Directory of Open Access Journals (Sweden)
D. Eric Smith
2010-02-01
We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes, grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises, in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information and probability.
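The superstatistics connection mentioned at the end can be checked numerically: a Gaussian whose precision is gamma-distributed has a Student-t marginal. A quick NumPy simulation (the degrees of freedom and sample size are chosen arbitrarily for illustration):

```python
import numpy as np

# Superstatistics check: a Gaussian whose precision is gamma-distributed has
# a Student-t marginal. With precision ~ Gamma(shape nu/2, rate nu/2), the
# marginal is t with nu degrees of freedom: variance nu/(nu - 2) and excess
# kurtosis 6/(nu - 4).
rng = np.random.default_rng(0)
nu, n = 10.0, 400_000
precision = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
x = rng.normal(size=n) / np.sqrt(precision)

var = x.var()                                                  # theory: 1.25
excess_kurt = np.mean((x - x.mean()) ** 4) / var ** 2 - 3.0    # theory: 1.0
```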
DEFF Research Database (Denmark)
Garnett, E S; Webber, C E; Coates, G
1977-01-01
The density of a defined volume of the human lung can be measured in vivo by a new noninvasive technique. A beam of gamma-rays is directed at the lung and, by measuring the scattered gamma-rays, lung density is calculated. The density in the lower lobe of the right lung in normal man during quiet...
Surgery for stress urinary incontinence due to presumed sphincter deficiency after prostate surgery.
Silva, Laercio A; Andriolo, Régis B; Atallah, Álvaro N; da Silva, Edina M K
2014-09-27
Incontinence after prostatectomy for benign or malignant disease is a well-known and often feared outcome. Although small degrees of incidental incontinence may go virtually unnoticed, larger degrees of incontinence can have a major impact on a man's quality of life. Conceptually, post-prostatectomy incontinence may be caused by sphincter malfunction or bladder dysfunction, or both. Most men with post-prostatectomy incontinence (60% to 100%) have stress urinary incontinence, which is involuntary urinary leakage on effort or exertion, or on sneezing or coughing. This may be due to intrinsic sphincter deficiency and may be treated with surgery for optimal management of incontinence. Detrusor dysfunction is more common after surgery for benign prostatic disease. To determine the effects of surgical treatment for urinary incontinence related to presumed sphincter deficiency after prostate surgery, for: men with lower urinary tract symptoms (LUTS) secondary to benign prostatic hyperplasia (BPH) treated by transurethral resection of the prostate (TURP), photovaporisation of the prostate, laser enucleation of the prostate, or open prostatectomy; and men with prostate cancer treated by radical prostatectomy (retropubic, perineal, laparoscopic, or robotic). We searched the Cochrane Incontinence Group Specialised Register, which contains trials identified from the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, MEDLINE In-Process, ClinicalTrials.gov, and handsearching of journals and conference proceedings (searched 31 March 2014); MEDLINE (January 1966 to April 2014); EMBASE (January 1988 to April 2014); and LILACS (January 1982 to April 2014). We handsearched the reference lists of relevant articles and conference proceedings. We contacted investigators to locate studies. Randomised or quasi-randomised trials that include surgical treatments of urinary incontinence after prostate surgery were eligible. Two authors independently screened the trials identified and appraised the quality of papers
International Nuclear Information System (INIS)
Coleman, J.H.
1980-10-01
A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor as a result of several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of the doses from the individual releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used in the calculation of the convolution.
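The convolution step can be sketched as follows, with two gamma densities standing in for the single-release dose densities (the grid and shape parameters are illustrative; the technique is simply zero-padded FFT convolution of discretized densities):

```python
import numpy as np
from math import gamma

# The accumulated-dose density is the convolution of the single-release
# dose densities; two gamma densities stand in for them here, since the
# abstract does not specify the release densities.
dx = 0.01
x = np.arange(0.0, 20.0, dx)

def gamma_pdf(x, k, theta):
    """Gamma density with shape k and scale theta."""
    return x ** (k - 1) * np.exp(-x / theta) / (gamma(k) * theta ** k)

f1 = gamma_pdf(x, 2.0, 1.0)   # dose density of release 1 (mean 2.0)
f2 = gamma_pdf(x, 3.0, 0.5)   # dose density of release 2 (mean 1.5)

# Linear convolution via the FFT, zero-padded to avoid circular wrap-around.
m = 2 * x.size
acc = np.fft.irfft(np.fft.rfft(f1, m) * np.fft.rfft(f2, m), m)[: x.size] * dx
# `acc` approximates the accumulated-dose density on the same grid; its mean
# is the sum of the single-release means (2.0 + 1.5 = 3.5).
```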
The Probabilities of Unique Events
2012-08-30
social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... yields an analog magnitude monotonically related to the proportion of possibilities in the mental model in which Obama is re-elected. We refer to this... internal representation that corresponds to a simple line within two boundaries: |−−−−−−| The left vertical represents impossibility, the right
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Learning Grasp Affordance Densities
DEFF Research Database (Denmark)
Detry, Renaud; Kraft, Dirk; Kroemer, Oliver
2011-01-01
these and records their outcomes. When a satisfactory number of grasp data is available, an importance-sampling algorithm turns these into a grasp density. We evaluate our method in a largely autonomous learning experiment run on three objects of distinct shapes. The experiment shows how learning increases success rates. It also measures the success rate of grasps chosen to maximize the probability of success given reaching constraints.
Stochastic response of nonlinear system in probability domain
Indian Academy of Sciences (India)
Stochastic average procedure; nonlinear single-DOF system; probability density function. 1. Introduction. Stochastic response analysis of nonlinear systems has been extensively studied in the frequency, time and probability domains. In the frequency domain, the stochastic linearization technique is generally used for ...
Negative probability of random multiplier in turbulence
Bai, Xuan; Su, Weidong
2017-11-01
The random multiplicative process (RMP), which was proposed over 50 years ago, is a convenient phenomenological ansatz for the turbulence cascade. In the RMP, the fluctuation at a large scale is statistically mapped to the one at a small scale by the linear action of an independent random multiplier (RM). Simple as it is, the RMP is powerful enough, since all of the known scaling laws can be included in this model. So far as we know, however, a direct extraction of the probability density function (PDF) of the RM has been absent. The reason is that the deconvolution involved in the process is ill-posed. Nevertheless, with progress in the study of inverse problems, the situation can be changed. By using some new regularization techniques, we recover, for the first time, the PDFs of the RMs in some turbulent flows. All the consistent results from various methods point to an amazing observation: the PDFs can attain negative values in some intervals, and this can also be justified by some properties of infinitely divisible distributions. Despite the conceptual unconventionality, the present study illustrates the implications of negative probability in turbulence in several aspects, with emphasis on its role in describing the interaction between fluctuations at different scales. This work is supported by the NSFC (No. 11221062 and No. 11521091).
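A toy forward simulation of the RMP shows the intermittency it generates, even before any deconvolution is attempted: multiplying a Gaussian large-scale fluctuation by a few independent random multipliers drives the flatness of the resulting PDF well above the Gaussian value of 3. The lognormal multipliers below are an illustrative choice, not normalised, and not the PDFs extracted in the paper.

```python
import numpy as np

# Toy random multiplicative cascade: the small-scale fluctuation is the
# Gaussian large-scale one times several independent random multipliers.
rng = np.random.default_rng(0)
N = 200_000
u_large = rng.normal(size=N)
multipliers = rng.lognormal(mean=0.0, sigma=0.2, size=(8, N))
u_small = u_large * multipliers.prod(axis=0)

def flatness(u):
    """Fourth moment over squared second moment; equals 3 for a Gaussian."""
    return np.mean(u ** 4) / np.mean(u ** 2) ** 2
```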
Maximum entropy principle and partial probability weighted moments
Deng, Jian; Pandey, M. D.; Xie, W. C.
2012-05-01
The maximum entropy principle (MaxEnt) is usually used for estimating the probability density function under specified moment constraints. The density function is then integrated to obtain the cumulative distribution function, which needs to be inverted to obtain the quantile corresponding to some specified probability. In such analysis, consideration of higher-order moments is important for accurate modelling of the distribution tail. There are three drawbacks to this conventional methodology: (1) estimates of higher-order (>2) moments from a small sample of data tend to be highly biased; (2) it can only cope with complete (noncensored) samples; and (3) only probability weighted moments of integer order have been utilized. These difficulties inevitably induce bias and inaccuracy in the resulting quantile estimates and have therefore been the main impediments to the application of the MaxEnt principle in extreme quantile estimation. This paper attempts to overcome these problems and presents a distribution-free method for estimating the quantile function of a non-negative random variable using the principle of maximum partial entropy subject to constraints on the partial probability weighted moments estimated from a censored sample. The main contributions include: (1) new concepts, i.e., partial entropy, fractional partial probability weighted moments, and the partial Kullback-Leibler measure, are elegantly defined; (2) the maximum entropy principle is re-formulated to be constrained by fractional partial probability weighted moments; (3) new distribution-free quantile functions are derived. Numerical analyses are performed to assess the accuracy of extreme value estimates computed from censored samples.
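For contrast with the partial/fractional machinery of the paper, the classical single-moment MaxEnt problem is easy to sketch: on a finite support with a fixed mean, the maximum-entropy pmf is exponentially tilted, p_i ∝ exp(λ x_i), with λ found by a one-dimensional search. This is a minimal illustration, not the authors' method.

```python
import numpy as np

def maxent_given_mean(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a finite support with a prescribed mean:
    p_i is proportional to exp(lam * x_i), with lam found by bisection
    (the mean is monotonically increasing in lam). `target_mean` must lie
    strictly between min(support) and max(support)."""
    x = np.asarray(support, dtype=float)

    def mean_for(lam):
        w = np.exp(lam * (x - x.max()))      # shift exponent for stability
        p = w / w.sum()
        return (p * x).sum(), p

    lo, hi = -50.0, 50.0
    p = None
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        m, p = mean_for(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return p

p = maxent_given_mean(np.arange(7), 2.0)     # support {0, ..., 6}, mean 2
```

When the prescribed mean equals the mean of the uniform distribution on the support, λ = 0 and the uniform distribution is recovered, as expected of an unconstrained entropy maximum.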
ORF virus infection in a hunter in Western Austria, presumably transmitted by game.
Kitchen, Maria; Müller, Hansgeorg; Zobl, Alexandra; Windisch, Andrea; Romani, Nikolaus; Huemer, Hartwig
2014-03-01
A variety of animals host parapoxviruses. Orf virus is prevalent in sheep and goats in the Tyrol region of Austria and Northern Italy. Zoonotic infections in humans mostly occur after occupational exposure. We report here a case of a hunter with a typical Orf lesion (contagious ecthyma) on the finger, with no history of direct contact with domestic animals. Three weeks previously he had been hunting chamois (Rupicapra rupicapra) and cut his finger while handling a carcass. Parapoxvirus infection was confirmed by electron microscopy and PCR, and the species was identified by DNA sequencing. The sequence was highly homologous with prevalent sheep Orf virus and rather distant from parapoxviruses found in red deer in Northern Italy. As this case indicated that the infection was acquired via game, we performed spot testing in the suspected area and detected several seropositive animals. This is a strong indication that Orf virus has been introduced into chamois in Western Austria. This probably occurred via roaming domestic sheep sharing the high alpine areas during the summer months.
Ho, Shirley S; Poorisat, Thanomwong; Neo, Rachel L; Detenber, Benjamin H
2014-01-01
This study uses the influence of presumed media influence model as the theoretical framework to examine how perceived social norms (i.e., descriptive, subjective, and injunctive norms) will mediate the influence of pro- and antidrinking media messages on adolescents' intention to consume alcohol in rural Thailand. Data collected from 1,028 high school students indicate that different mechanisms underlie drinking intentions between nondrinkers and those who have consumed alcohol or currently drink. Among nondrinkers, perceived peer attention to prodrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to consume alcohol through all three types of perceived social norms. Among drinkers, perceived peer attention to pro- and antidrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to drink alcohol through perceived subjective norm. The findings provide support for the extended influence of presumed media influence model and have practical implications for how antidrinking campaigns targeted at teenagers in Thailand might be designed.
Tamhankar, Madhura A; Biousse, Valerie; Ying, Gui-Shuang; Prasad, Sashank; Subramanian, Prem S; Lee, Michael S; Eggenberger, Eric; Moss, Heather E; Pineles, Stacy; Bennett, Jeffrey; Osborne, Benjamin; Volpe, Nicholas J; Liu, Grant T; Bruce, Beau B; Newman, Nancy J; Galetta, Steven L; Balcer, Laura J
2013-11-01
To estimate the proportion of patients presenting with isolated third, fourth, or sixth cranial nerve palsy of presumed microvascular origin versus other causes. Prospective, multicenter, observational case series. A total of 109 patients aged 50 years or older with acute isolated ocular motor nerve palsy. Magnetic resonance imaging (MRI) of the brain. Causes of acute isolated ocular motor nerve palsy (presumed microvascular or other) as determined with early MRI and clinical assessment. Among 109 patients enrolled in the study, 22 had cranial nerve III palsy, 25 had cranial nerve IV palsy, and 62 had cranial nerve VI palsy. A cause other than presumed microvascular ischemia was identified in 18 patients (16.5%; 95% confidence interval, 10.7-24.6). The presence of 1 or more vasculopathic risk factors (diabetes, hypertension, hypercholesterolemia, coronary artery disease, myocardial infarction, stroke, and smoking) was significantly associated with a presumed microvascular cause (P = 0.003, Fisher exact test). Vasculopathic risk factors were also present in 61% of patients (11/18) with other causes. In the group of patients who had vasculopathic risk factors only, with no other significant medical condition, 10% of patients (8/80) were found to have other causes, including midbrain infarction, neoplasms, inflammation, pituitary apoplexy, and giant cell arteritis (GCA). By excluding patients with third cranial nerve palsies and those with GCA, the incidence of other causes for isolated fourth and sixth cranial nerve palsies was 4.7% (3/64). In our series of patients with acute isolated ocular motor nerve palsies, a substantial proportion of patients had other causes, including neoplasm, GCA, and brain stem infarction. Brain MRI and laboratory workup have a role in the initial evaluation of older patients with isolated acute ocular motor nerve palsies regardless of whether vascular risk factors are present. Copyright © 2013 American Academy of Ophthalmology
Wax, Joseph R; Conroy, Kelley; Pinette, Michael G; Litton, Christian; Cartin, Angelina
2017-12-28
When administered inappropriately, first-trimester misoprostol management of induced or spontaneous abortion can result in loss or damage of a continuing pregnancy. Despite these serious consequences, such misoprostol exposures continue to occur. Unfortunately, contributing factors and preventive measures receive little attention. We describe the cases of 4 women in whom misoprostol was inappropriately administered during management of induced and presumed spontaneous abortion. In each case, careful adherence to published clinical guidance could have avoided the exposures. © 2017 Wiley Periodicals, Inc.
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Can an ensemble give anything more than Gaussian probabilities?
Directory of Open Access Journals (Sweden)
J. C. W. Denholm-Price
2003-01-01
Full Text Available Can a relatively small numerical weather prediction ensemble produce any more forecast information than can be reproduced by a Gaussian probability density function (PDF)? This question is examined using site-specific probability forecasts from the UK Met Office. These forecasts are based on the 51-member Ensemble Prediction System of the European Centre for Medium-range Weather Forecasts. Verification using Brier skill scores suggests that there can be statistically-significant skill in the ensemble forecast PDF compared with a Gaussian fit to the ensemble. The most significant increases in skill were achieved from bias-corrected, calibrated forecasts and for probability forecasts of thresholds that are located well inside the climatological limits at the examined sites. Forecast probabilities for more climatologically-extreme thresholds, where the verification more often lies within the tails or outside of the PDF, showed little difference in skill between the forecast PDF and the Gaussian forecast.
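A minimal sketch of the comparison the abstract describes: derive an exceedance probability both from the raw ensemble and from a Gaussian fitted to the ensemble, then score either forecast against 0/1 outcomes with the Brier score. The functions are illustrative assumptions, not the Met Office verification code.

```python
import math
import statistics

def ensemble_prob_exceed(members, threshold):
    """Raw ensemble probability: fraction of members above the threshold."""
    return sum(m > threshold for m in members) / len(members)

def gaussian_prob_exceed(members, threshold):
    """Probability of exceedance from a Gaussian fitted to the ensemble
    mean and (sample) standard deviation."""
    mu = statistics.fmean(members)
    sd = statistics.stdev(members)
    z = (threshold - mu) / sd
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes;
    lower is better, 0 is a perfect deterministic forecast."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

The Brier skill score in the abstract is then just 1 minus the ratio of the forecast's Brier score to a reference forecast's Brier score.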
Lectures on probability and statistics
Energy Technology Data Exchange (ETDEWEB)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
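The two directions described in the lectures, a priori probability from a known mechanism and statistical inference back from observed data, can be shown with a dice example in the spirit of the notes (the code itself is our own sketch):

```python
from collections import Counter
from fractions import Fraction

def prob_sum_two_dice(s):
    """Forward problem: a priori probability that two fair six-sided
    dice sum to s, by enumerating the 36 equally likely outcomes."""
    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
    hits = sum(1 for a, b in outcomes if a + b == s)
    return Fraction(hits, len(outcomes))

def mle_side_probs(rolls):
    """Inverse problem: maximum-likelihood estimate of each side's
    probability from observed rolls, which for a multinomial model is
    simply the relative frequency of each side."""
    counts = Counter(rolls)
    n = len(rolls)
    return {side: counts[side] / n for side in range(1, 7)}
```

For example, a sum of 7 can occur in 6 of the 36 outcomes, so its a priori probability is 1/6, while the inverse problem only ever yields estimates whose quality depends on the amount of data.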
Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen
2002-01-01
This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.
The Inductive Applications of Probability Calculus
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.
Does probability of occurrence relate to population dynamics?
Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M
2014-12-01
Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence
Domestic wells have high probability of pumping septic tank leachate
Directory of Open Access Journals (Sweden)
J. E. Bremer
2012-08-01
Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
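The stochastic question in the abstract, how likely a well's source area is to intersect at least one drainfield at a given septic system density, can be caricatured with a Monte Carlo sketch that replaces the groundwater model with simple discs. All geometry and parameter choices below are assumptions for illustration only, not the authors' flow-and-transport model.

```python
import random

def overlap_probability(density, r_source, r_drain, trials=2000,
                        side=1000.0, seed=1):
    """Monte Carlo probability that a well's source area (disc of radius
    r_source) overlaps at least one drainfield (disc of radius r_drain),
    with drainfields placed uniformly at the given density (systems per
    unit area) on a side x side domain and the well at the centre."""
    rng = random.Random(seed)
    n_sys = max(0, round(density * side * side))
    reach = r_source + r_drain  # discs overlap iff centres are closer than this
    hits = 0
    for _ in range(trials):
        for _ in range(n_sys):
            dx = rng.uniform(-side / 2, side / 2)
            dy = rng.uniform(-side / 2, side / 2)
            if dx * dx + dy * dy < reach * reach:
                hits += 1
                break  # one overlapping drainfield is enough
    return hits / trials
```

Even this toy reproduces the qualitative finding: the intersection probability rises quickly with septic system density.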
Cross Check of NOvA Oscillation Probabilities
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics
2018-01-12
In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities with an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
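The spirit of such a cross check can be shown in the simpler two-flavour vacuum case: compute the oscillation probability from the standard closed form and again from explicit mixing-matrix amplitudes, and verify that the two independent routes agree to near machine precision. This is a toy, not NOvA's 3-flavor matter-effect code; the 1.267 factor assumes L in km, E in GeV and Δm² in eV².

```python
import cmath
import math

def prob_formula(theta, dm2, L, E):
    """Two-flavour vacuum oscillation probability from the closed form
    sin^2(2 theta) * sin^2(1.267 dm2 L / E)."""
    delta = 1.267 * dm2 * L / E
    return math.sin(2.0 * theta) ** 2 * math.sin(delta) ** 2

def prob_amplitude(theta, dm2, L, E):
    """Same probability via explicit amplitudes, as an independent
    cross check: amp = sum_i U_e,i U_mu,i exp(-i phase_i)."""
    delta = 1.267 * dm2 * L / E
    u_e = (math.cos(theta), math.sin(theta))
    u_mu = (-math.sin(theta), math.cos(theta))
    phases = (0.0, 2.0 * delta)
    amp = sum(u_e[i] * u_mu[i] * cmath.exp(-1j * phases[i]) for i in range(2))
    return abs(amp) ** 2
```

Agreement between the two routes at the $10^{-12}$ level (limited only by floating point) is the kind of consistency the note establishes for the full 3-flavor programs.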
Directory of Open Access Journals (Sweden)
Manuel Uribe-Alcocer
1999-12-01
these regions, and/or, less probably, by a recent historical replacement of one population by the other. The absence of karyotype differences might also be attributed to characteristics inherent to the genome organization in the genus Cichlasoma still to be identified and understood. Cichlasoma istlanum (Jordan & Snyder, 1900) is a freshwater cichlid found in the Río Balsas province of the Mexican Pacific basin. De Buen (1946), based on meristic characters of the species, proposed its division into two subspecies: C. istlana istlana, from the Río Ixtla in the state of Morelos, and C. istlana fusca, from the Río Huámito, Michoacán. In this work the karyotype of the species was established by conventional cytogenetic and G-banding procedures, and a comparative analysis was made of the karyotypes from samples of the two populations previously proposed as subspecies. Ten females were collected in the Río Amacuzac, in the state of Morelos, and nine specimens in the Río Huámito: two females and seven males. By counting 264 mitotic fields from the first sample and 203 from the second, a modal number of 2n=48 was established in each, a mode considered to correspond to the diploid number of the species. The karyotypic analysis was based on ten karyograms prepared from the Morelos population and eight from the Michoacán one, including three from females and five from males. The chromosomal formula found was 8sm+40stt. The G-band pattern of both populations was similar, and statistical comparisons of the average lengths of the chromosome pairs showed no conclusive significant differences between the two populations. The existence of a practically identical karyotype agrees with the lack of subspecific diversification between the populations studied. Morphometric analyses carried out by other authors who found...
Poage, J. L.
1975-01-01
A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
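A minimal sketch of the classical Wald SPRT that the paper's procedure estimates, here with exact log-densities supplied by the caller rather than the paper's nonparametric density estimates:

```python
import math

def sprt(samples, logpdf0, logpdf1, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test: accumulate the
    log-likelihood ratio sample by sample until it crosses the upper
    boundary log((1-beta)/alpha) (accept H1) or the lower boundary
    log(beta/(1-alpha)) (accept H0). Returns (decision, n_used), with
    decision 'undecided' if the data run out first."""
    a = math.log((1.0 - beta) / alpha)   # upper boundary -> accept H1
    b = math.log(beta / (1.0 - alpha))   # lower boundary -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += logpdf1(x) - logpdf0(x)
        if llr >= a:
            return "H1", n
        if llr <= b:
            return "H0", n
    return "undecided", len(samples)
```

For two unit-variance Gaussian hypotheses (mean 0 versus mean 2), the shared normalizing constant cancels in the log-likelihood ratio, so `-0.5 * (x - mu)**2` suffices as the log-density; the test typically terminates after only a handful of clearly separated samples, which is the appeal of the sequential approach.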
Directory of Open Access Journals (Sweden)
Huping Xue
Full Text Available BACKGROUND: Horizontal gene transfer (HGT) is recognized as one of the major forces for bacterial genome evolution. Many clinically important bacteria may acquire virulence factors and antibiotic resistance through HGT. Comparative genomic analysis has become an important tool for identifying HGT in emerging pathogens. In this study, the Serine-Aspartate Repeat (Sdr) family has been compared among different sources of Staphylococcus aureus (S. aureus) to discover sequence diversities within their genomes. METHODOLOGY/PRINCIPAL FINDINGS: Four sdr genes were analyzed for 21 different S. aureus strains and 218 mastitis-associated S. aureus isolates from Canada. Comparative genomic analyses revealed that S. aureus strains from bovine mastitis (RF122 and the mastitis isolates in this study), ovine mastitis (ED133), pig (ST398), chicken (ED98), and human methicillin-resistant S. aureus (MRSA) (TCH130, MRSA252, Mu3, Mu50, N315, 04-02981, JH1 and JH9) were highly associated with one another, presumably due to HGT. In addition, several types of insertion and deletion were found in sdr genes of many isolates. A new insertion sequence was found in mastitis isolates, which was presumably responsible for the HGT of the sdrC gene among different strains. Moreover, the sdr genes could be used to type S. aureus. Regional differences in sdr gene distribution were also indicated among the tested S. aureus isolates. Finally, certain associations were found between sdr genes and subclinical or clinical mastitis isolates. CONCLUSIONS: Certain sdr gene sequences were shared in S. aureus strains and isolates from different species, presumably due to HGT. Our results also suggest that the distributional assay of virulence factors should detect the full sequences or full functional regions of these factors. The traditional assay using short conserved regions may not be accurate or credible. These findings have important implications with regard to animal husbandry practices that may
Probable Linezolid-Induced Pancytopenia
Directory of Open Access Journals (Sweden)
Nita Lakhani
2005-01-01
Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5 × 10^12/L, leukocytes 2.9 × 10^9/L, platelets 59 × 10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A≈180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A break down of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
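The post-processing step described above, turning a stack of equally likely simulated contamination fields into a map of exceedance probabilities, reduces to an indicator average over realizations. A minimal sketch (our own, with NumPy):

```python
import numpy as np

def exceedance_probability_map(realizations, threshold):
    """Given a stack of equally likely simulated contamination fields
    (shape: n_realizations x ny x nx), return the per-cell probability
    of exceeding the cleanup threshold: the fraction of realizations
    in which each cell's simulated value exceeds it."""
    sims = np.asarray(realizations, dtype=float)
    return (sims > threshold).mean(axis=0)
```

Cells with probability near 1 are confidently contaminated, cells near 0 confidently clean, and intermediate values flag where additional sampling or conservative remediation decisions are warranted.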
The Black Hole Formation Probability
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
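How such a P_BH(M_ZAMS) could feed a population synthesis study can be sketched with a toy monotone form; the 15 and 40 solar-mass break points below are arbitrary placeholders for illustration, not values from the paper.

```python
import random

def p_bh(m_zams, m_lo=15.0, m_hi=40.0):
    """Toy probability that a star of the given ZAMS mass (solar masses)
    collapses to a BH rather than an NS: 0 below m_lo, 1 above m_hi,
    linear in between. Purely illustrative, not a fitted P_BH."""
    if m_zams <= m_lo:
        return 0.0
    if m_zams >= m_hi:
        return 1.0
    return (m_zams - m_lo) / (m_hi - m_lo)

def sample_remnants(masses, seed=0):
    """Draw a BH/NS outcome for each star by comparing a uniform deviate
    with P_BH(M_ZAMS), the kind of draw a population synthesis code
    would make per star."""
    rng = random.Random(seed)
    return ["BH" if rng.random() < p_bh(m) else "NS" for m in masses]
```

Any refined, data-driven P_BH(M_ZAMS) would slot into `sample_remnants` unchanged, which is what makes the probabilistic framework convenient as an input to population synthesis.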
Directory of Open Access Journals (Sweden)
Linde Hans-Jörg
2009-08-01
Full Text Available Abstract Background Timely identification of pathogens is crucial to minimize mortality in patients with severe infections. Detection of bacterial and fungal pathogens in blood by nucleic acid amplification promises to yield results faster than blood cultures (BC). We analyzed the clinical impact of a commercially available multiplex PCR system in patients with suspected sepsis. Methods Blood samples from patients with presumed sepsis were cultured with the Bactec 9240™ system (Becton Dickinson, Heidelberg, Germany) and aliquots subjected to analysis with the LightCycler® SeptiFast® (SF) Test (Roche Diagnostics, Mannheim, Germany) at a tertiary care centre. For samples with PCR-detected pathogens, the actual impact on clinical management was determined by chart review. Furthermore, the time to a positive blood culture result was compared with the time to an SF result under the fictive assumption that SF testing was performed either once or twice daily. Results Of 101 blood samples from 77 patients, 63 (62%) yielded concordant negative results, 14 (13%) concordant positive results, and 9 (9%) were BC positive only. In 14 (13%) samples pathogens were detected by SF only, resulting in adjustment of antibiotic therapy in 5 patients (7.7% of patients). In 3 samples a treatment adjustment would have been made earlier, resulting in a total of 8 adjustments across all 101 samples (8%). Conclusion The addition of multiplex PCR to conventional blood cultures had a relevant impact on clinical management for a subset of patients with presumed sepsis.
Directory of Open Access Journals (Sweden)
Monica Shukla
2004-06-01
Full Text Available Research Question: What is the attitude of young females towards their husbands or sex partners, presuming them infected with HIV? Objectives: To assess the attitude of young slum-dwelling females towards a husband or sex partner presumed to be HIV infected, according to the age, marital status, occupation, and literacy status of respondents. Study Design: Cross-sectional study. Study Area: 10% of the slums of Kanpur City having a population of less than 3000. Participants: Females aged 13 to 25 years of the selected slums. Study Variables: Age, marital status, occupation, literacy status. Statistical Analysis: Percentage. Results: 12.3% expressed that they would not disclose the disease, 25.7% were indecisive, 65.2% would pursue treatment, 32.7% would continue the sexual relationship and 31.2% would continue the social relationship. Respondents employed as teachers showed a greater degree of concern for more care (45.5%) and also for continuation of the social and sexual relationship. Continuation of the social and sexual relationship, along with persuasion for treatment and more care of the victim (husband/sex partner), was observed in the highest percentage at the graduate-and-above level of education, with a gradual decrease in prevalence with decreasing level of education.
Hartley, Jane E K; Wight, Daniel; Hunt, Kate
2014-01-01
Using empirical data from group discussions and in-depth interviews with 13 to 15-year olds in Scotland, this study explores how teenagers’ alcohol drinking and sexual/romantic relationships were shaped by their quest for appropriate gendered identities. In this, they acknowledged the influence of the media, but primarily in relation to others, not to themselves, thereby supporting Milkie's ‘presumed media influence’ theory. Media portrayals of romantic/sexual relationships appeared to influence teenagers’ constructions of gender-appropriate sexual behaviour more than did media portrayals of drinking behaviour, perhaps because the teenagers had more firsthand experience of observing drinking than of observing sexual relationships. Presumed media influence may be less influential if one has experience of the behaviour portrayed. Drinking and sexual behaviour were highly interrelated: sexual negotiation and activities were reportedly often accompanied by drinking. For teenagers, being drunk or, importantly, pretending to be drunk, may be a useful way to try out what they perceived to be gender-appropriate identities. In sum, teenagers’ drinking and sexual/romantic relationships are primary ways in which they do gender and the media's influence on their perceptions of appropriate gendered behaviour is mediated through peer relationships. PMID:24443822
Do we need to change the legislation to a system of presumed consent to address organ shortage?
Simillis, Constantinos
2010-04-01
Organ transplantation significantly improves the health, quality of life and life-expectancy of people whose organs have failed. Most patients in the UK cannot enjoy the benefits of a transplant because of an extreme shortage of organs. This paper demonstrates the magnitude of the problem of organ shortage and identifies possible causes. The current UK legislation regarding consent to organ transplantation is analysed and compared with other jurisdictions. The hypothesis of changing the legislation to a system of presumed consent in order to address the organ shortage is explored. The main issues surrounding a change in the legislation are considered, and the effects on society and the individual are discussed. This paper argues that there is not enough convincing evidence to support a change in the legislation to a system of presumed consent at this time. Instead, an increase in organ donations could be achieved by improving the effectiveness of the current system of organ donation, and by improving the public's awareness and understanding of organ transplantation issues.
dell'Omo, R; Konstantopoulou, K; Wong, R; Pavesio, C
2009-11-01
To examine fundus autofluorescence (FAF) findings in eyes with presumed idiopathic outer lamellar defects (OLD) at the fovea and to discuss their pathogenesis. Prospective observational case series of five eyes of four patients presenting with OLD at the fovea, defined as discrete lesions of 50-100 μm in size located at the level of the outer retina on biomicroscopy and imaged on optical coherence tomography (OCT) as cylindrical, well-demarcated interruptions of the hyper-reflective bands corresponding to the inner/outer segment junction of photoreceptors and to the retinal pigment epithelium-choriocapillaris complex; none of the enrolled patients had any positive history of direct sungazing, welding-arc or sunbed exposure, whiplash injury, ocular trauma, macular oedema/detachment or evidence of vitreomacular traction. The corresponding FAF images were evaluated. In eyes with OLD, the neuroretina in the foveal region appeared to be thinner than in fellow, unaffected eyes. FAF revealed well-demarcated, hypoautofluorescent areas (corresponding in location to the OLD observed clinically and on OCT), surrounded by an irregular halo of relatively increased autofluorescence in the context of the greater hypoautofluorescent macular region. Biomicroscopy, OCT and FAF findings of presumed idiopathic OLD of the fovea strongly resemble those observed in association with chronic solar retinopathy. In association with OCT, FAF might represent a useful technique with which to detect subtle solar-induced injuries of the retina.
Premaratna, R; Halambarachchige, L P; Nanayakkara, D M; Chandrasena, T G A N; Rajapakse, R P V J; Bandara, N K B K R G W; de Silva, H J
2011-12-01
Chikungunya fever (CGF) and rickettsioses are known to cause acute onset febrile illnesses associated with severe arthritis. Rickettsial arthritis is curable with the use of appropriate anti-rickettsial antibiotics, however the arthritis of CGF tends to have a prolonged course leading to protracted disability. The aim of this study was to investigate the contribution of CGF and rickettsioses to cases of fever and arthritis during a presumed CGF outbreak in Sri Lanka. Fifty-eight consecutive patients with presumed CGF were further investigated to determine the occurrence of rickettsioses among them, and to identify differences in clinical, hematological, and biochemical parameters between the two diseases. Nearly a third of the patients had serological evidence of rickettsioses accounting for their illness. The presence of a late onset major joint arthropathy sparing the small joints of the hands and feet, and the occurrence of a late onset discrete maculopapular rash over the trunk and extremities, suggested rickettsioses over CGF. White blood cell count, erythrocyte sedimentation rate, C-reactive protein, and liver function tests were not helpful in differentiating rickettsioses from CGF. Patients with rickettsioses and arthritis who received an empirical course of doxycycline recovered faster than those who did not receive specific treatment. The establishment of rapid diagnostic methods able to differentiate the etiological agents of fever and arthritis, such as CGF and rickettsioses, would be beneficial in endemic settings. Copyright © 2011 International Society for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
How Life History Can Sway the Fixation Probability of Mutants
Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne
2016-01-01
In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
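As a baseline for the age-structured results above, the classical well-mixed Moran process admits a closed-form fixation probability; a minimal sketch of that standard textbook formula (not the paper's age-structured model):

```python
def fixation_probability(r, n):
    """Fixation probability of a single mutant with constant relative
    fitness r in a well-mixed Moran process of fixed population size n;
    the neutral case r = 1 reduces to 1/n."""
    if abs(r - 1.0) < 1e-12:      # neutral mutant
        return 1.0 / n
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** n)
```

The paper's point is that once age structure is added, a constant advantage r no longer maps monotonically onto this quantity.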
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.
2013-01-01
distribution instead of any other approximate schemes of fitted distributions currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...
Bounded Densities and Their Derivatives
DEFF Research Database (Denmark)
Kozine, Igor; Krymsky, V.
2009-01-01
This paper describes how one can compute interval-valued statistical measures given limited information about the underlying distribution. The particular focus is on a bounded derivative of a probability density function and its combination with other available statistical evidence for computing ...
Stochastic self-propagating star formation with anisotropic probability distribution
Jungwiert, B.; Palous, J.
1994-07-01
We present a 2D computer code for stochastic self-propagating star formation (SSPSF) in differentially rotating galaxies. The isotropic probability distribution, used in previous models of Seiden, Gerola and Schulman (Seiden & Schulman, 1990, and references therein), is replaced by an anisotropic one. The motivation is provided by models of expanding large-scale supernova remnants (SNR) in disks with shear (Palous et al. 1990): the distortion of the SNR leads to uneven density distribution along its periphery and, consequently, to uneven distribution of new star forming sites. To model anisotropic SSPSF, we proceed in two steps: first, we eliminate artificial anisotropies inherent to the technique used by Seiden, Gerola and Schulman and, second, we define the probability ellipse on each star forming site. The anisotropy is characterized by its axes ratio and inclination with respect to the galactic center. We show that anisotropic SSPSF is able to produce highly organized spiral structures. Depending on the character of the probability ellipse, we can obtain continuous spiral arms of different length, thickness and pitch angle. The relation of the probability ellipse to rotation curves, interstellar medium (ISM) density and metallicity is discussed, as well as its variation along the Hubble sequence and van den Bergh's luminosity classification of galaxies. To demonstrate applications, we compare our results with two different classes of galaxies: M 101-type grand-design spirals with open and robust arms and NGC 2841-type flocculent galaxies with thin and tightly wound arms.
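The probability ellipse can be sketched as a direction-dependent trigger probability for neighboring star-forming sites; the parameterization below (an ellipse radius scaling a base probability p0) is an illustrative assumption, not the paper's exact scheme.

```python
import math

def ellipse_probability(theta, p0, axis_ratio, inclination):
    """Direction-dependent star-formation trigger probability.

    Scales a base probability p0 by the radius of an ellipse with the
    given major/minor axis ratio and inclination (radians) relative to
    the galactic center; axis_ratio = 1 recovers the isotropic SSPSF
    rule. Illustrative parameterization only."""
    phi = theta - inclination
    a, b = 1.0, 1.0 / axis_ratio
    r = (a * b) / math.sqrt((b * math.cos(phi)) ** 2 + (a * math.sin(phi)) ** 2)
    return p0 * r
```

Along the major axis the trigger probability equals p0, while perpendicular to it the probability drops by the axis ratio, which is what elongates the resulting arm segments.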
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Multiple decomposability of probabilities on contractible locally ...
Indian Academy of Sciences (India)
1970) (Berlin-Heidelberg-New York: Springer). [10] Heyer H, Probability Measures on Locally Compact Groups (1977) (Berlin-Heidelberg-New York: Springer). [11] Jurek Z and Mason D, Operator Limit Distributions in Probability Theory (1993).
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
Bosch, Thijs; Witteveen, Sandra; Haenen, Anja; Landman, Fabian; Schouls, Leo M
2016-07-15
Livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) was detected in 2003 and rapidly became the predominant MRSA clade in the Netherlands. Studies have shown that transmissions are difficult to identify, since this MRSA variant represents a genetically homogenous clade when current typing techniques are used. Here, next-generation sequencing was performed on 206 LA-MRSA isolates to assess the capability of LA-MRSA to be transmitted between humans. The usefulness of single nucleotide variants (SNVs), the composition of the SCCmec region, and the presence of plasmids to identify transmission of LA-MRSA were assessed. In total, 30 presumed putative nosocomial transmission events and 2 LA-MRSA outbreaks were studied; in most cases, SNV analysis revealed that the isolates of the index patient and the contact(s) clustered closely together. In three presumed events, the isolates did not cluster together, indicating that transmission was unlikely. The composition of the SCCmec region corroborated these findings. However, plasmid identification did not support our SNV analysis, since different plasmids were present in several cases where SNV and SCCmec analysis suggested that transmission was likely. Next-generation sequencing shows that transmission of LA-MRSA does occur in Dutch health care settings. Transmission was identified based on SNV analysis combined with epidemiological data and in the context of epidemiologically related and unrelated isolates. Analysis of the SCCmec region provided limited, albeit useful, information to corroborate conclusions on transmissions, but plasmid identification did not. In 2003, a variant of methicillin-resistant Staphylococcus aureus (MRSA) isolated from pigs was also found in pig farmers in France and the Netherlands. Soon thereafter, this livestock-associated MRSA (LA-MRSA) was identified in many other countries. Transmission of LA-MRSA between humans, particularly in the health care setting, is regarded to
Probable Cause: A Decision Making Framework.
1984-08-01
primacy more likely than recency, and vice versa? Third, the updating of causal beliefs depends on positive as well as negative evidence. Therefore, a full...order, contiguity in time and space, and similarity of cause and effect. In doing so, we show how these cues can conflict with probabilistic ideas. A...causal chain between an effect and its presumed cause. The model is used to discuss a wide range of studies on causal judgments and explicates
Improving Density Estimation by Incorporating Spatial Information
Directory of Open Access Journals (Sweden)
Andrea L. Bertozzi
2010-01-01
Full Text Available Given discrete event data, we wish to produce a probability density that can model the relative probability of events occurring in a spatial region. Common methods of density estimation, such as Kernel Density Estimation, do not incorporate geographical information. Using these methods could result in nonnegligible portions of the support of the density in unrealistic geographic locations. For example, crime density estimation models that do not take geographic information into account may predict events in unlikely places such as oceans, mountains, and so forth. We propose a set of Maximum Penalized Likelihood Estimation methods based on Total Variation and H1 Sobolev norm regularizers in conjunction with a priori high resolution spatial data to obtain more geographically accurate density estimates. We apply this method to a residential burglary data set of the San Fernando Valley using geographic features obtained from satellite images of the region and housing density information.
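The role of the spatial validity mask can be illustrated with a crude stand-in for the paper's penalized-likelihood estimators: a plain Gaussian kernel sum on a grid whose invalid cells (e.g. ocean) are zeroed before renormalization. The helper name and the toy grid below are hypothetical.

```python
import math

def masked_kde(events, grid, mask, bandwidth):
    """Gaussian kernel sum evaluated on grid cells, with density forced
    to zero on invalid cells and the remainder renormalized to sum to
    one. A crude surrogate for the TV/H1-penalized estimators in the
    paper, shown only to illustrate the effect of the spatial mask."""
    dens = []
    for (gx, gy), valid in zip(grid, mask):
        if not valid:
            dens.append(0.0)
            continue
        s = sum(math.exp(-((gx - ex) ** 2 + (gy - ey) ** 2)
                         / (2.0 * bandwidth ** 2))
                for ex, ey in events)
        dens.append(s)
    total = sum(dens)
    return [d / total for d in dens]

# Toy 2x2 grid: cell (1, 1) is "ocean" and cannot host events.
grid = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
mask = [True, True, True, False]
events = [(0.1, 0.1), (0.2, 0.0)]
density = masked_kde(events, grid, mask, bandwidth=0.5)
```

Unlike this hard mask, the paper's TV and H1 regularizers let the geographic prior shape the estimate smoothly rather than by truncation.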
Presumed chloroquine retinopathy
African Journals Online (AJOL)
A follow-up national survey would be desirable to determine the actual magnitude of the problem. Key words: blindness, chloroquine, retinopathy, irreversible ... the treatment of lupus erythematosus and rheumatoid arthritis. The effective doses commonly used exceed those used in treating malaria, as the drug is administered.
Zhang, Jiaxin; Shields, Michael D.
2018-01-01
This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation are retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probability that each candidate is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
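The multimodel-inference step can be sketched with Akaike weights over two closed-form candidate models (normal and exponential). This shows only the model-probability piece, under the assumption of plain AIC rather than the paper's exact information criterion, and omits the parameter-posterior and importance-reweighting stages.

```python
import math

def normal_loglik(data):
    """Maximized Gaussian log-likelihood; returns (loglik, n_params)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return -0.5 * n * (math.log(2.0 * math.pi * var) + 1.0), 2

def exponential_loglik(data):
    """Maximized exponential log-likelihood; returns (loglik, n_params)."""
    n = len(data)
    lam = n / sum(data)
    return n * math.log(lam) - lam * sum(data), 1

def akaike_weights(data, models):
    """Akaike weights: the probability that each candidate is the best
    model in the Kullback-Leibler sense, up to the AIC approximation."""
    aics = []
    for loglik_fn in models:
        ll, k = loglik_fn(data)
        aics.append(2.0 * k - 2.0 * ll)
    amin = min(aics)
    rel = [math.exp(-0.5 * (a - amin)) for a in aics]
    z = sum(rel)
    return [w / z for w in rel]

data = [9.8, 10.1, 10.0, 9.9, 10.2]  # tightly clustered, Gaussian-like
weights = akaike_weights(data, [normal_loglik, exponential_loglik])
```

With scarce data the weights spread over several candidates, which is exactly the epistemic uncertainty the paper retains instead of collapsing to a single model.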
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, and to demonstrate their applicability to real-world problems.
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Adolescents' misinterpretation of health risk probability expressions.
Cohn, L D; Schydlower, M; Foley, J; Copeland, R L
1995-05-01
To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Cross-sectional. Adolescent medicine and pediatric orthopedic outpatient units. 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).
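The finding that numerical expressions elicit less variability than lexical ones can be illustrated by comparing rating spreads; the ratings below are invented for illustration and are not the study's data.

```python
def rating_spread(ratings):
    """Population standard deviation of the numerical ratings (0-100)
    that different raters assign to one probability expression; a
    larger spread means more room for misinterpretation."""
    n = len(ratings)
    mean = sum(ratings) / n
    return (sum((x - mean) ** 2 for x in ratings) / n) ** 0.5

lexical = [20, 50, 70, 35, 60]     # invented ratings for "possibly"
numerical = [28, 30, 32, 29, 31]   # invented ratings for "30% chance"
```

A clinician comparing the two spreads would reach the paper's recommendation: quote a number, not a word, when communicating risk.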
Directory of Open Access Journals (Sweden)
Fabio Lavinsky
2013-06-01
Full Text Available PURPOSES: Microbial keratitis is commonly diagnosed worldwide, and continues to cause significant ocular morbidity, requiring prompt and appropriate treatment. The objective of this study is to describe the clinical characteristics and outcomes of patients with presumed microbial keratitis admitted to The Goldschleger Eye Institute, Sheba Medical Center, Tel Aviv University, Tel Hashomer, Israel. METHODS: A cross-sectional study was conducted, in which the medical records of patients with presumed microbial keratitis admitted during a period of 3 years were reviewed. RESULTS: Keratitis was diagnosed in 276 patients (51% males and 48.9% females). The mean age was 39.29 ± 22.30 years. The hospital length of stay ranged from 1 to 65 days (mean 5.69 ± 5.508). Fortified antibiotics were still used at discharge in 72% of the cases. Overall visual acuity improved significantly from the time of admission to the 1st-week follow-up visit (p<0.05). The degree of hypopyon and cells in the anterior chamber was significantly related to the hospital length of stay (Spearman r=0.31, p<0.001 and Spearman r=0.21, p<0.001, respectively) as well as to a worse visual outcome (Spearman r=0.32, p<0.01 and Spearman r=0.18, p=0.01, respectively). Of all patients, 2.3% required an urgent therapeutic penetrating keratoplasty, and 1% underwent evisceration. There was no enucleation. CONCLUSION: Treating keratitis aggressively and assuring patient compliance is imperative for a good final visual outcome. Inpatient treatment may have a positive impact on this outcome.
Bouyssou, Sarah; Specchi, Swan; Desquilbet, Loïc; Pey, Pascaline
2017-05-01
Noncardiogenic pulmonary edema is an important cause of respiratory disease in dogs and cats but few reports describe its radiographic appearance. The purpose of this retrospective case series study was to describe radiographic findings in a large cohort of dogs and cats with presumed noncardiogenic pulmonary edema and to test associations among radiographic findings versus cause of edema. Medical records were retrieved for dogs and cats with presumed noncardiogenic edema based on history, radiographic findings, and outcome. Radiographs were reviewed to assess lung pattern and distribution of the edema. Correlation with the cause of noncardiogenic pulmonary edema was evaluated with a Fisher's exact test. A total of 49 dogs and 11 cats were included. Causes for the noncardiogenic edema were airway obstruction (n = 23), direct pulmonary injury (n = 13), severe neurologic stimulation (n = 12), systemic disease (n = 6), near-drowning (n = 3), anaphylaxis (n = 2) and blood transfusion (n = 1). Mixed, symmetric, peripheral, multifocal, bilateral, and dorsal lung patterns were observed in 44 (73.3%), 46 (76.7%), 55 (91.7%), 46 (76.7%), 46 (76.7%), and 34 (57.6%) of 60 animals, respectively. When the distribution was unilateral, pulmonary infiltration involved mainly the right lung lobes (12 of 14, 85.7%). Increased pulmonary opacity was more often asymmetric, unilateral, and dorsal for postobstructive pulmonary edema compared to other types of noncardiogenic pulmonary edema, but no other significant correlations could be identified. In conclusion, noncardiogenic pulmonary edema may present with a quite variable radiographic appearance in dogs and cats. © 2016 American College of Veterinary Radiology.
Chilingarian, L I
2005-01-01
Individual typological features of behavior of dogs were investigated by the method of choice between low-value food available constantly and high-quality food presented with low probability. Animals were subjected to instrumental conditioning with the same conditioned stimuli but different types of reinforcement. Depression of a white pedal was always reinforced with a meat-bread-crumb mixture; depression of a black pedal was reinforced with two pieces of liver (with probabilities of 100, 40, 33, 20, or 0%). The choice of reinforcement depended on the probability of the valuable food and on individual typological features of the nervous system of a dog. Decreasing the probability of the valuable reinforcement to 40-20% revealed differences in the behavior of dogs. Dogs of the first group, presumably with the weak type of the nervous system, more frequently pressed the white pedal (always reinforced) than the black pedal, thus 'avoiding the risk' of receiving an empty cup. They displayed symptoms of neurosis: whimpering, refusals of food or of the choice of reinforcement, and obtrusive movements. Dogs of the second group, presumably with the strong type of the nervous system, more frequently pressed the black pedal (more valuable food) despite the low-probability reward until they obtained the valuable food. They did not show neurosis symptoms and were not afraid of the 'situation of risk'. A decrease in the probability of the valuable reinforcement increased the percentage of long-latency pedal presses, probably reflecting an increasing involvement of cognitive processes once the contributions of the assessments of probability and value of the reinforcement to decision making became approximately equal. Choice between the probability and value of alimentary reinforcement is a good method for revealing individual typological features of dogs.
UT Biomedical Informatics Lab (BMIL) Probability Wheel.
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
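The core of the prop is simple enough to sketch: the two slice angles are just the probability and its complement scaled to 360 degrees. The helper below is hypothetical, not the app's actual API.

```python
def wheel_angles(p):
    """Angular sizes (in degrees) of the two slices of a probability
    wheel for an event of probability p; resizing these slices as p is
    adjusted is the entire function of the mechanical prop."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be in [0, 1]")
    return p * 360.0, (1.0 - p) * 360.0
```

An app built on this mapping can then animate the slices continuously while the participant adjusts p, which is what attenuates anchoring bias.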
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Novak, Sebastian; Chatterjee, Krishnendu; Nowak, Martin A
2013-10-07
The basic idea of evolutionary game theory is that payoff determines reproductive rate. Successful individuals have a higher payoff and produce more offspring. But in evolutionary and ecological situations there is not only reproductive rate but also carrying capacity. Individuals may differ in their exposure to density limiting effects. Here we explore an alternative approach to evolutionary game theory by assuming that the payoff from the game determines the carrying capacity of individual phenotypes. Successful strategies are less affected by density limitation (crowding) and reach higher equilibrium abundance. We demonstrate similarities and differences between our framework and the standard replicator equation. Our equation is defined on the positive orthant, instead of the simplex, but has the same equilibrium points as the replicator equation. Linear stability analysis produces the classical conditions for asymptotic stability of pure strategies, but the stability properties of internal equilibria can differ in the two frameworks. For example, in a two-strategy game with an internal equilibrium that is always stable under the replicator equation, the corresponding equilibrium can be unstable in the new framework resulting in a limit cycle. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.
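A minimal sketch of the idea that payoff sets carrying capacity rather than reproductive rate, assuming the illustrative dynamic dx_i/dt = r*x_i*(1 - N/K_i) with K_i proportional to strategy i's average payoff; this exact functional form is an assumption for illustration, not necessarily the paper's equation.

```python
def simulate(x, payoff, k_base=100.0, r=0.5, dt=0.01, steps=20000):
    """Euler integration of a toy density-limited game dynamic on the
    positive orthant: each phenotype grows logistically toward a
    carrying capacity K_i = k_base * (average payoff of strategy i).
    Successful strategies tolerate crowding better, not reproduce
    faster."""
    for _ in range(steps):
        total = sum(x)
        freqs = [xi / total for xi in x]
        avg = [sum(payoff[i][j] * freqs[j] for j in range(len(x)))
               for i in range(len(x))]
        x = [xi + dt * r * xi * (1.0 - total / (k_base * avg[i]))
             for i, xi in enumerate(x)]
    return x

# Strategy 0 earns payoff 2 against everyone, strategy 1 earns 1,
# so strategy 0 tolerates crowding better (K = 200 vs 100).
final = simulate([10.0, 10.0], [[2.0, 2.0], [1.0, 1.0]])
```

As in the replicator equation, the dominated strategy is driven out, but here the winner's abundance settles at its own carrying capacity rather than at a frequency on the simplex.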
On The Left Tail-End Probabilities and the Probability Generating ...
African Journals Online (AJOL)
On The Left Tail-End Probabilities and the Probability Generating Function. ... Journal of the Nigerian Association of Mathematical Physics ... In this paper, another tail-end probability function is proposed using the left tail-end probabilities, p( ≤ i ) = Πṙ The resulting function, πx(t), is continuous and converges uniformly ...
Time-of-arrival probabilities and quantum measurements
International Nuclear Information System (INIS)
Anastopoulos, Charis; Savvidou, Ntina
2006-01-01
In this paper we study the construction of probability densities for time of arrival in quantum mechanics. Our treatment is based upon the facts that (i) time appears in quantum theory as an external parameter to the system, and (ii) propositions about the time of arrival appear naturally when one considers histories. The definition of time-of-arrival probabilities is straightforward in stochastic processes. The difficulties that arise in quantum theory stem not only from the fact that the time parameter of the Schroedinger equation does not naturally define a probability density in the continuum limit, but also from the fact that the procedure one follows is sensitive to the interpretation of the reduction procedure. We consider the issue in Copenhagen quantum mechanics and in history-based schemes like consistent histories. The benefit of the latter is that it allows a proper passage to the continuous limit--there are, however, problems related to the quantum Zeno effect and decoherence. We finally employ the histories-based description to construct Positive-Operator-Valued Measures (POVMs) for the time of arrival, which are valid for a general Hamiltonian. These POVMs typically depend on the resolution of the measurement device; for a free particle, however, this dependence cancels in the physically relevant regime and the POVM coincides with that of Kijowski.
Real analysis and probability solutions to problems
Ash, Robert P
1972-01-01
Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.
Kimata, F.
2012-12-01
Asama volcano is one of the active volcanoes in Japan; it erupted on September 1, 2004. A shallow dike intrusion is estimated in the Takamine area, 4 - 5 km west of the Asama crater, from the ground deformation detected by GPS measurements (Aoki et al., 2005). Ground deformation observation close to the pressure source should clarify the depth and volume change of the pressure sources. In May 2005 we established precise leveling routes running up to Mt. Takamine, above the presumed pressure source, from Oiwake at the southern foot of Asama volcano. The precise levelings were carried out seven times over the five years from May 2005 to June 2011, and we calculated the vertical deformation for six-month to two-year intervals between leveling epochs. In general, the deformations detected by the precise leveling are small, about 10 mm. The deformations detected in the periods May 2005 - Nov. 2005 - May 2006 - May 2009 - June 2010 - June 2011 fall into two patterns: one is definite subsidence, the other slight uplift. Murakami (2005) discusses the line-length changes between the two GPS sites of Tsumagoi and Tobu, showing extension of the line length just before the eruptions in 2004 and 2009 and contraction between the eruptions. The slight uplifts in the period May 2005 - May 2006 correspond to the period of observed extension, and subsidence to the periods May 2006 - May 2007, May 2009 - June 2010, and June 2010 - June 2011. Two pressure sources are estimated from the ground deformation detected by precise leveling: a deeper spherical deflation source at 6 km BSL depth beneath the mountainside, and a shallow dike intrusion beneath Mt. Takamine. A pressure source model was previously estimated from the leveling data for the last 100 years (Murase et al., 2007), suggestive of a dominant source of the Asama volcano. They suggest a slight inflation after 1960; however, our results show deflation of -6.6 km3/6yr in the deeper source for five years after
What probabilities tell about quantum systems, with application to entropy and entanglement
Myers, John M
2010-01-01
The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parameterized probabilities?”
Mechanisms Affecting Population Density in Fragmented Habitat
Directory of Open Access Journals (Sweden)
Lutz Tischendorf
2005-06-01
Full Text Available We conducted a factorial simulation experiment to analyze the relative importance of movement pattern, boundary-crossing probability, and mortality in habitat and matrix on population density, and its dependency on habitat fragmentation, as well as inter-patch distance. We also examined how the initial response of a species to a fragmentation event may affect our observations of population density in post-fragmentation experiments. We found that the boundary-crossing probability from habitat to matrix, which partly determines the emigration rate, is the most important determinant for population density within habitat patches. The probability of crossing a boundary from matrix to habitat had a weaker, but positive, effect on population density. Movement behavior in habitat had a stronger effect on population density than movement behavior in matrix. Habitat fragmentation and inter-patch distance may have a positive or negative effect on population density. The direction of both effects depends on two factors. First, when the boundary-crossing probability from habitat to matrix is high, population density may decline with increasing habitat fragmentation. Conversely, for species with a high matrix-to-habitat boundary-crossing probability, population density may increase with increasing habitat fragmentation. Second, the initial distribution of individuals across the landscape: we found that habitat fragmentation and inter-patch distance were positively correlated with population density when individuals were distributed across matrix and habitat at the beginning of our simulation experiments. The direction of these relationships changed to negative when individuals were initially distributed across habitat only. Our findings imply that the speed of the initial response of organisms to habitat fragmentation events may determine the direction of observed relationships between habitat fragmentation and population density. The time scale of post
Kwek, Tong Kiat; Lew, Thomas W K; Tan, Hui Ling; Kong, Sally
2009-04-01
The success of solid organ transplantation in the treatment of end-stage organ failure has fuelled a growing demand for transplantable organs worldwide that has far outstripped the supply from brain dead heart-beating donors. In Singapore, this has resulted in long waiting lists of patients for transplantable organs, especially kidneys. The Human Organ Transplant Act, introduced in 1987, is an opt-out scheme that presumes consent to removal of certain organs for transplantation upon death. Despite this legislation, the number of deceased organ donors in Singapore, at 7 to 9 per million population per year, remains low compared to many other developed countries. In this paper, we reviewed the clinical challenges and ethical dilemmas encountered in managing and identifying potential donors in the neurological intensive care unit (ICU) of a major general hospital in Singapore. The large variance in donor actualisation rates among local restructured hospitals, at 0% to 56.6% (median 8.8%), suggests that considerable room still exists for improvement. To address this, local hospitals need to review their processes and adopt changes and best practices that will ensure earlier identification of potential donors, avoid undue delays in diagnosing brain death, and provide optimal care of multi-organ donors to reduce donor loss from medical failures.
Milani, Vivian; Goldman, Suzan Menasce; Finguerman, Flora; Pinotti, Marianne; Ribeiro, Celso Scazufka; Abdalla, Nitamar; Szejnfeld, Jacob
2007-07-05
Breast cancer screening programs are critical for early detection of breast cancer. Early detection is essential for diagnosing, treating and possibly curing breast cancer. Since there are no data on the incidence of breast cancer, nationally or regionally in Brazil, our aim was to assess women by means of mammography, to determine the prevalence of this disease. The study protocol was designed in collaboration between the Department of Diagnostic Imaging (DDI), Institute of Diagnostic Imaging (IDI) and São Paulo Municipal Health Program. A total of 139,945 Brazilian women were assessed by means of mammography between April 2002 and September 2004. Using the American College of Radiology (ACR) criteria (Breast Imaging Reporting and Data System, BIRADS), the prevalence of suspected and highly suspected breast lesions was determined. The prevalence of suspected (BIRADS 4) and highly suspected (BIRADS 5) lesions increased with age, especially after the fourth decade. Accordingly, BIRADS 4 and BIRADS 5 lesions were more prevalent in the fourth, fifth, sixth and seventh decades. The presumed prevalence of suspected and highly suspected breast cancer lesions in the population of São Paulo was 0.6% and it is similar to the prevalence of breast cancer observed in other populations.
Directory of Open Access Journals (Sweden)
Vivian Milani
Full Text Available CONTEXT AND OBJECTIVE: Breast cancer screening programs are critical for early detection of breast cancer. Early detection is essential for diagnosing, treating and possibly curing breast cancer. Since there are no data on the incidence of breast cancer, nationally or regionally in Brazil, our aim was to assess women by means of mammography, to determine the prevalence of this disease. DESIGN AND SETTING: The study protocol was designed in collaboration between the Department of Diagnostic Imaging (DDI), Institute of Diagnostic Imaging (IDI) and São Paulo Municipal Health Program. METHODS: A total of 139,945 Brazilian women were assessed by means of mammography between April 2002 and September 2004. Using the American College of Radiology (ACR) criteria (Breast Imaging Reporting and Data System, BIRADS®), the prevalence of suspected and highly suspected breast lesions was determined. RESULTS: The prevalence of suspected (BIRADS® 4) and highly suspected (BIRADS® 5) lesions increased with age, especially after the fourth decade. Accordingly, BIRADS® 4 and BIRADS® 5 lesions were more prevalent in the fourth, fifth, sixth and seventh decades. CONCLUSION: The presumed prevalence of suspected and highly suspected breast cancer lesions in the population of São Paulo was 0.6% and it is similar to the prevalence of breast cancer observed in other populations.
Directory of Open Access Journals (Sweden)
T. Y. Alvin Liu
2018-01-01
Full Text Available A 37-year-old Caucasian woman presented with acute decrease in central vision in her right eye and was found to have subfoveal choroidal neovascularization (CNV) due to presumed ocular histoplasmosis syndrome (POHS). Her visual acuity improved from 20/70 to 20/20 at her 6-month follow-up, after 3 consecutive monthly intravitreal bevacizumab injections were initiated at her first visit. Although no CNV activity was seen on fluorescein angiography (FA) or spectral-domain optical coherence tomography (SD-OCT) at her 2-month, 4-month, and 6-month follow-up visits, persistent flow in the CNV lesion was detected on optical coherence tomography angiography (OCTA). OCTA shows persistent vascular flow as well as changes in vascular flow in CNV lesions associated with POHS, indicating the continued presence of patent vessels and changes in these CNV lesions, even when traditional imaging of the lesion with OCT and FA indicates stability of the lesion with no disease activity. Additional cases with longitudinal follow-up are needed to assess how OCTA should be incorporated into clinical practice.
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Analytical Study of Thermonuclear Reaction Probability Integrals
Chaudhry, M. A.; Haubold, H. J.; Mathai, A. M.
2000-01-01
An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
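The paper derives closed forms via extended gamma functions; as a purely numerical counterpart, the sketch below evaluates a reaction probability integral of the standard Gamow form with a constant cross-section factor S(E) = S0. All quantities are in illustrative dimensionless units, not the paper's notation.

```python
import numpy as np

def reaction_integral(kT, b, S0=1.0, n=20000):
    """Numerically evaluate I(kT) = \\int_0^inf S(E) exp(-E/kT - b/sqrt(E)) dE
    for a constant cross-section factor S(E) = S0 (dimensionless, illustrative).
    The integrand peaks at the Gamow peak E0 = (b*kT/2)**(2/3)."""
    E = np.linspace(1e-8, 50.0 * kT, n)  # upper cutoff: exp(-E/kT) is negligible
    y = S0 * np.exp(-E / kT - b / np.sqrt(E))
    # trapezoidal rule on the uniform grid
    return float(np.sum((y[1:] + y[:-1]) / 2.0) * (E[1] - E[0]))

# The integral grows steeply with temperature kT (barrier parameter b fixed):
i1 = reaction_integral(kT=1.0, b=5.0)
i2 = reaction_integral(kT=2.0, b=5.0)
```

This monotone growth with temperature is the qualitative behavior the analytic expressions make exact.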
The transition probabilities of the reciprocity model
Snijders, T.A.B.
1999-01-01
The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
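Two of the standard calculation methods such texts compare, sequential conditioning and counting with combinations, can be checked against each other on a generic urn example (the numbers are ours, not from the article):

```python
from fractions import Fraction
from math import comb

# Urn with 5 red and 3 blue balls; draw 2 without replacement.
red, blue = 5, 3
total = red + blue

# Sequential (conditional) method: P(both red) = (5/8) * (4/7)
p_seq = Fraction(red, total) * Fraction(red - 1, total - 1)

# Counting method: C(5,2) favorable outcomes out of C(8,2) equally likely draws
p_comb = Fraction(comb(red, 2), comb(total, 2))
# Both give 5/14, illustrating that the two approaches must agree.
```

Using `Fraction` keeps the probabilities exact, which is helpful when demonstrating the equivalence of the methods in a classroom setting.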
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
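The link the article draws between death probabilities and life expectancy can be made concrete with a toy life table; the hazard numbers below are invented for illustration, and the mid-interval convention (a_x = 0.5) is a standard simplification, not the authors' data.

```python
# Toy life table: q[x] is the probability of dying between age x and x+1.
q = [0.01] * 60 + [0.05] * 30 + [1.0]  # made-up hazards; everyone dead by age 91

# l[x]: proportion surviving to exact age x, starting from l[0] = 1.
l = [1.0]
for qx in q:
    l.append(l[-1] * (1.0 - qx))

# Person-years lived in each interval: L_x ~ (l_x + l_{x+1}) / 2,
# assuming deaths occur mid-interval. Life expectancy at birth:
e0 = sum((l[i] + l[i + 1]) / 2.0 for i in range(len(q)))
```

With these made-up hazards, e0 comes out in the low 50s, and raising any q[x] lowers it, which is exactly the kind of "what if" question that makes life tables a good vehicle for teaching probability.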
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
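For the quantum case the commentary mentions, events are closed subspaces of a Hilbert space and probabilities come from the Born rule; a minimal sketch of that rule for a single qubit follows (the state and event are illustrative, and this is the textbook formalism rather than anything specific to the commentary):

```python
import numpy as np

# Born rule: the probability of an event (a closed subspace) in state rho is
#   P(event) = Tr(rho @ P),
# where P is the orthogonal projector onto the subspace.
rho = np.array([[0.75, 0.0],
                [0.0, 0.25]])          # a qubit density matrix (illustrative)
P0 = np.array([[1.0, 0.0],
               [0.0, 0.0]])           # projector onto the subspace spanned by |0>
p = float(np.trace(rho @ P0).real)    # probability of the event "|0>"
```

Unlike a Boolean algebra of sets, the lattice of projectors is non-distributive, which is the structural difference driving the non-classical probability behavior the commentary discusses.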
Fixed setpoints introduce error in licensing probability
Energy Technology Data Exchange (ETDEWEB)
Laratta, F., E-mail: flaratta@cogeco.ca [Oakville, ON (Canada)
2015-07-01
Although we license fixed (constrained) trip setpoints to a target probability, there is no provision for error in probability calculations or how error can be minimized. Instead, we apply reverse-compliance preconditions on the accident scenario such as a uniform and slow LOR to make probability seem error-free. But how can it be? Probability is calculated from simulated pre-LOR detector readings plus uncertainties before the LOR progression is even knowable. We can conserve probability without preconditions by continuously updating field setpoint equations with on-line detector data. Programmable Digital Controllers (PDCs) in CANDU 6 plants already have variable setpoints for Steam Generator and Pressurizer Low Level. Even so, these setpoints are constrained as a ramp or step in other CANDU plants and don't exhibit unconstrained variability. Fixed setpoints penalize safety and operation margins and cause spurious trips. We nevertheless continue to design suboptimal trip setpoint comparators for all trip parameters. (author)
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how … is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
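The paper's exact single-integral equation is not reproduced in this abstract; the sketch below uses the standard form P(slip) = P(available < required) under an independence assumption, evaluated with the trapezoidal rule the abstract names. The normal distributions and their parameters are purely illustrative (the paper itself warns that normality cannot simply be assumed).

```python
import numpy as np
from math import erf, sqrt, pi

def npdf(x, mu, sd):
    """Normal density (vectorized over numpy arrays)."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * sqrt(2.0 * pi))

def ncdf(v, mu, sd):
    """Normal CDF for a scalar argument."""
    return 0.5 * (1.0 + erf((v - mu) / (sd * sqrt(2.0))))

# Single-integral form under independence:
#   P(slip) = \int f_req(u) * F_avail(u) du
u = np.linspace(0.0, 1.5, 2001)
f_req = npdf(u, 0.25, 0.05)                        # required-friction density
F_av = np.array([ncdf(v, 0.50, 0.10) for v in u])  # available-friction CDF
du = u[1] - u[0]
g = f_req * F_av
p_slip = float(np.sum((g[1:] + g[:-1]) / 2.0) * du)  # trapezoidal rule
```

For independent normals this integral has a closed form, P(slip) = Phi((mu_req - mu_av) / sqrt(sd_req^2 + sd_av^2)), which makes a convenient check on the numerical scheme, mirroring the paper's error analysis against analytical solutions.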
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Miniati, M.; Pistolesi, M.
2001-01-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
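The pretest-to-post-test step the abstract describes is usually carried out with the odds form of Bayes' theorem and a test's likelihood ratio; a minimal sketch follows, with numbers chosen for illustration only (they are not taken from the cited studies).

```python
def post_test_probability(pretest, lr):
    """Convert a pretest probability into a post-test probability:
    post-test odds = pretest odds * likelihood ratio."""
    odds = pretest / (1.0 - pretest)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Illustrative: low clinical probability of PE (10%) followed by a positive
# objective test with an assumed likelihood ratio of 8.
p = post_test_probability(0.10, 8.0)  # rises to roughly 47%
```

This is why the clinical probability matters so much: the same test result yields very different post-test probabilities for low- and high-probability patients.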
Tomovski, Zivorad; Mehrez, Khaled
2016-01-01
By making use of the familiar Mathieu series and its generalizations, the authors derive a number of new integral representations and present a systematic study of probability density functions and probability distributions associated with some generalizations of the Mathieu series. In particular, the mathematical expectation, variance, and characteristic functions related to the probability density functions of the considered probability distributions are derived. As a consequence, some ...
Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence
International Nuclear Information System (INIS)
Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del
2009-01-01
Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence is presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics of variability have also been observed in other physical systems that are characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and the sea surface temperature fluctuations.
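The Letter's specific stochastic model is not given in this abstract, but the kind of parabolic skewness-kurtosis scaling it refers to can be illustrated with a gamma distribution, a common phenomenological model for edge density fluctuations (using it here is our assumption, not the Letter's):

```python
import math

def gamma_skew_kurt(k):
    """Skewness and kurtosis of a gamma distribution with shape parameter k:
    S = 2/sqrt(k), excess kurtosis = 6/k, hence K = 3 + 1.5 * S**2."""
    S = 2.0 / math.sqrt(k)
    K = 3.0 + 6.0 / k
    return S, K

# The (S, K) pairs for any shape k fall on the parabola K = 3 + 1.5 * S**2,
# the kind of universal S-K relation reported for fluctuation data.
points = [gamma_skew_kurt(k) for k in (0.5, 1.0, 2.0, 8.0)]
```

Bursty (strongly intermittent) signals correspond to small shape parameter k: large skewness and kurtosis together, moving up the same parabola.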
Upgrading Probability via Fractions of Events
Directory of Open Access Journals (Sweden)
Frič Roman
2016-08-01
Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures) and observables (dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
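The Łukasiewicz operations the abstract invokes have a standard concrete form on [0, 1]-valued events; the sketch below shows that form and the key property motivating the embedding, namely that on crisp {0, 1} events the operations reduce to Boolean logic (this is the textbook presentation, not a reconstruction of the paper's category-theoretic machinery).

```python
# Łukasiewicz operations on [0, 1]-valued "fractions of events":
def l_or(a, b):
    """Łukasiewicz disjunction: truncated sum."""
    return min(1.0, a + b)

def l_and(a, b):
    """Łukasiewicz conjunction: truncated difference."""
    return max(0.0, a + b - 1.0)

def l_not(a):
    """Łukasiewicz negation."""
    return 1.0 - a

# On crisp {0, 1} events these reduce exactly to Boolean or / and / not:
for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        assert l_or(a, b) == float(bool(a) or bool(b))
        assert l_and(a, b) == float(bool(a) and bool(b))
        assert l_not(a) == float(not bool(a))
```

This reduction is the sense in which the upgraded theory "covers the classical probability theory as a special case", much as the rationals cover the integers.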
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the n^{th} degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
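The Gauss-Hermite approach the report discusses can be sketched in a few lines: for Im z > 0 the complex probability (Faddeeva) function is a Hermite-weighted Cauchy integral, so the n roots and weights turn it into a rational sum. The node count and test point below are our choices for illustration.

```python
import math
import numpy as np

def w_gauss_hermite(z, n=64):
    """Approximate the complex probability (Faddeeva) function
        w(z) = (i/pi) * \\int_{-inf}^{inf} exp(-t**2) / (z - t) dt,  Im z > 0,
    by n-point Gauss-Hermite quadrature:
        w(z) ~ (i/pi) * sum_k w_k / (z - x_k),
    with x_k the roots of the nth Hermite polynomial and w_k the weights."""
    x, wts = np.polynomial.hermite.hermgauss(n)
    return complex(1j / math.pi * np.sum(wts / (z - x)))

# Exact check value: w(i) = e * erfc(1) ~ 0.42758 (purely real)
approx = w_gauss_hermite(1j)
```

As the report notes, this scheme has shortcomings: accuracy degrades as z approaches the real axis, where the pole of the integrand sits close to the quadrature nodes.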
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have … reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes, provided one is willing to assume Subjective Expected Utility.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
Probabilities on Streams and Reflexive Games
Directory of Open Access Journals (Sweden)
Andrew Schumann
2014-01-01
Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
“This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.” -CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally known contributors present the material in a manner so that researchers of vari…
Padula, Andrew M; Winkel, Kenneth D
2016-04-01
A fatal outcome of a presumed tiger snake (Notechis scutatus) envenomation in a cat is described. Detectable venom components and antivenom concentrations in serum from clotted and centrifuged whole blood and in urine were measured using a sensitive and specific ELISA. The cat presented in a paralysed state with a markedly elevated serum CK but with normal clotting times. The cat was treated with intravenous fluids and received two vials of equine whole IgG bivalent (tiger and brown snake) antivenom. Despite treatment the cat's condition did not improve and it died 36 h post-presentation. The serum concentration of detectable tiger snake venom components at initial presentation was 311 ng/mL and the urine concentration 832 ng/mL; this declined to non-detectable levels in serum 15 min after intravenous antivenom. The urine concentration of detectable tiger snake venom components declined to 22 ng/mL at post-mortem. Measurement of equine anti-tiger snake venom specific antibody demonstrated a concentration of 7.2 Units/mL in serum at post-mortem, which had declined from an initial high of 13 Units/mL at 15 min post-antivenom. The ELISA data demonstrated the complete clearance of detectable venom components from serum with no recurrence in the post-mortem samples. Antivenom concentrations in serum at initial presentation were at least 100-fold higher than theoretically required to neutralise the circulating concentrations of venom. Despite the fatal outcome in this case, it was concluded that the outcome was unlikely to be due to insufficient antivenom. Copyright © 2016 Elsevier Ltd. All rights reserved.
Gupta, Gaurav; Kuppachi, Sarat; Kalil, Roberto S; Buck, Christopher B; Lynch, Charles F; Engels, Eric A
2018-01-01
Recent case series describe detection of BK polyomavirus (BKV) in urinary tract cancers in kidney transplant recipients, suggesting that BKV could contribute to the development of these cancers. We assessed risk for urinary tract cancers in kidney recipients with or without treatment for presumed BKV nephropathy (tBKVN) using data from the United States Transplant Cancer Match Study (2003-2013). Among 55 697 included recipients, 2015 (3.6%) were reported with tBKVN. Relative to the general population, incidence was similarly elevated (approximately 4.5-fold) for kidney cancer in recipients with or without tBKVN, and incidence was not increased in either group for prostate cancer. In contrast, for invasive bladder cancer, incidence was more strongly elevated in recipients with versus without tBKVN (standardized incidence ratios 4.5 vs. 1.7; N = 48 cases), corresponding to an incidence rate ratio (IRR) of 2.9 (95% confidence interval [CI] 1.0-8.2), adjusted for sex, age, transplant year, and use of polyclonal antibody induction. As a result, recipients with tBKVN had borderline increased incidence for all urothelial cancers combined (renal pelvis, ureter, and bladder cancers: adjusted IRR 2.2, 95% CI 0.9-5.4; N = 89 cases). Together with reports describing BKV detection in tumor tissues, these results support an association between BKV and urothelial carcinogenesis among kidney transplant recipients. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, taught concurrently with and integrated with statistics through interactive, tailored software applications designed to illustrate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem-solving methods
Determining probabilities of geologic events and processes
International Nuclear Information System (INIS)
Hunter, R.L.; Mann, C.J.; Cranwell, R.M.
1985-01-01
The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs
Certainties and probabilities of the IPCC
International Nuclear Information System (INIS)
2004-01-01
Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)
On Convergent Probability of a Random Walk
Lee, Y.-F.; Ching, W.-K.
2006-01-01
This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk for different initial positions.
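The note does not spell out the walk's exact rules, but the recurrence-relation approach it describes can be illustrated on a standard nearest-neighbour walk on a path with absorbing endpoints; the function below is a hypothetical sketch of that technique, not the note's own construction.

```python
def convergent_probability(n_sites, start, p_right=0.5, sweeps=20000):
    """Probability of being absorbed at site n_sites (rather than site 0)
    when starting from `start`, solved from the recurrence
    p[i] = p_right * p[i+1] + (1 - p_right) * p[i-1] by repeated sweeps."""
    p = [0.0] * (n_sites + 1)
    p[n_sites] = 1.0  # boundary conditions: p[0] = 0, p[N] = 1
    for _ in range(sweeps):
        for i in range(1, n_sites):
            p[i] = p_right * p[i + 1] + (1 - p_right) * p[i - 1]
    return p[start]
```

For the symmetric walk this reproduces the classical closed form p_i = i/N; changing `p_right` gives the biased (gambler's-ruin) absorption probabilities.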
Predicting binary choices from probability phrase meanings.
Wallsten, Thomas S; Jang, Yoonhee
2008-08-01
The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.
Bayesian optimization for computationally extensive probability distributions.
Tamura, Ryo; Hukushima, Koji
2018-01-01
An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use the extreme values of acquisition functions obtained by Gaussian processes to select the next training point, which should be located near a local or global maximum of the probability distribution. The Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently when combined with the steepest descent method, and thus it is a powerful tool for searching for a better maximizer of computationally extensive probability distributions.
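The abstract's idea can be sketched in miniature: fit a Gaussian process to the evaluated points and take the maximizer of an acquisition function (here an upper confidence bound, one common choice) as the next sampling point. The target density, kernel length scale, and UCB coefficient below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def rbf(a, b, length_scale=0.3):
    """Squared-exponential kernel on 1-D inputs."""
    d = np.subtract.outer(a, b)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def bayes_opt_max(f, lo, hi, n_iter=15, grid_size=400):
    """Search for a maximizer of f on [lo, hi] via a GP surrogate + UCB acquisition."""
    X = np.array([lo, 0.5 * (lo + hi), hi])  # deterministic initial design
    y = np.array([f(x) for x in X])
    grid = np.linspace(lo, hi, grid_size)
    for _ in range(n_iter):
        K = rbf(X, X) + 1e-6 * np.eye(len(X))  # jitter for numerical stability
        K_inv = np.linalg.inv(K)
        Ks = rbf(grid, X)
        mu = Ks @ K_inv @ y                    # GP posterior mean on the grid
        var = 1.0 - np.einsum('ij,jk,ik->i', Ks, K_inv, Ks)
        ucb = mu + 2.0 * np.sqrt(np.maximum(var, 0.0))
        x_next = grid[np.argmax(ucb)]          # extreme value of the acquisition
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    best = np.argmax(y)
    return X[best], y[best]
```

On a smooth unnormalized density such as `lambda x: np.exp(-(x - 0.7) ** 2 / 0.05)`, the search concentrates its evaluations near the mode after a handful of iterations.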
Sampling, Probability Models and Statistical Reasoning Statistical Inference
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
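As a sketch of the modelling step described above, a logistic regression can be fitted by gradient ascent on the log-likelihood; the two building attributes used here (standardized floor area and building age) are invented stand-ins for the study's actual predictors, and the data are synthetic.

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Fit P(fire | x) = sigmoid(w0 + w.x) by gradient ascent on the log-likelihood."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)       # log-likelihood gradient step
    return w

def fire_probability(w, x):
    """Predicted fire probability for one building's attribute vector."""
    return 1.0 / (1.0 + np.exp(-(w[0] + np.dot(w[1:], x))))

# Synthetic illustration: fire risk increasing with (standardized) area and age.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
logits = -1.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-logits))).astype(float)
w = fit_logistic(X, y)
```

The fitted `w` can then be applied building by building to produce the per-building probabilities that a map would visualize.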
Probability of Survival Decision Aid (PSDA)
National Research Council Canada - National Science Library
Xu, Xiaojiang; Amin, Mitesh; Santee, William R
2008-01-01
A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...
Probability of spent fuel transportation accidents
International Nuclear Information System (INIS)
McClure, J.D.
1981-07-01
The transported volume of spent fuel, incident/accident experience, and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10^-7 accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.
Liquefaction Probability Curves for Surficial Geologic Units
Holzer, T. L.; Noce, T. E.; Bennett, M. J.
2009-12-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA of 0.25 g, probabilities reach 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, and 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both…
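The 3-parameter logistic fit mentioned above has the generic form P(x) = a / (1 + exp(-b(x - c))); the sketch below implements that curve, with parameter values made up for illustration rather than taken from the study's fitted curves.

```python
import math

def liquefaction_probability(pga, upper, slope, midpoint):
    """3-parameter logistic curve: `upper` is the asymptotic probability,
    `slope` the steepness, and `midpoint` the PGA (in g) at half of `upper`."""
    return upper / (1.0 + math.exp(-slope * (pga - midpoint)))
```

At `midpoint` the curve returns exactly half its ceiling, and it saturates at `upper` for strong shaking, which is what makes the 3-parameter form convenient for tabulating probabilities.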
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship-ship collisions, ship-platform collisions, and ship groundings. The main benefit of the method is that it allows comparisons of various navigation routes.
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: “to present the mathematical analysis underlying probability results.” Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem…
Imprecise Probability Methods for Weapons UQ
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between… Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
Familiarity and preference for pitch probability profiles.
Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L
2015-05-01
We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants to identify either the more familiar or the preferred tone sequence in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to with a tone sequence generated from another pitch probability profile, at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge.
AbdelRazek, Mahmoud A; Gutierrez, Jose; Mampre, David; Cervantes-Arslanian, Anna; Ormseth, Cora; Haussen, Diogo; Thakur, Kiran T; Lyons, Jennifer L; Smith, Bryan R; O'Connor, Owen; Willey, Joshua Z; Mateen, Farrah J
2018-01-01
Human immunodeficiency virus (HIV) infection has been shown to increase both ischemic and hemorrhagic stroke risks, but there are limited data on the safety and outcomes of intravenous thrombolysis with tPA (tissue-type plasminogen activator) for acute ischemic stroke in HIV-infected patients. A retrospective chart review of intravenous tPA-treated HIV patients who presented with acute stroke symptoms was performed in 7 large inner-city US academic centers (various search years between 2000 and 2017). We collected data on HIV, National Institutes of Health Stroke Scale score, ischemic stroke risk factors, opportunistic infections, intravenous drug abuse, neuroimaging findings, and modified Rankin Scale score at last follow-up. We identified 33 HIV-infected patients treated with intravenous tPA (mean age, 51 years; 24 men), 10 of whom were stroke mimics. Sixteen of 33 (48%) patients had an HIV viral load below the limit of detection, while 10 of 33 (30%) had a low CD4 count. The mean National Institutes of Health Stroke Scale score at presentation was 9, and the mean time from symptom onset to tPA was 144 minutes (median, 159). The median modified Rankin Scale score was 1 for the 33-patient cohort and 2 for the 23-patient actual stroke cohort, measured at a median of 90 days post-stroke symptom onset. Two patients had nonfatal hemorrhagic transformation (6%; 95% confidence interval, 1%-20%), both in the actual stroke group. Two patients had varicella zoster virus vasculitis of the central nervous system, 1 had meningovascular syphilis, and 7 other patients were actively using intravenous drugs (3 cocaine, 1 heroin, and 3 unspecified), none of whom had hemorrhagic transformation. Most HIV-infected patients treated with intravenous tPA for presumed and actual acute ischemic stroke had no complications, and we observed no fatalities. Stroke mimics were common, and thrombolysis seems safe in this group. We found no data to suggest an increased risk of intravenous tPA-related complications because of concomitant…
Comparative analysis through probability distributions of a data set
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available that can help us select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea used is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
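The "distance" idea behind these tests is easiest to see for Kolmogorov-Smirnov, whose statistic is the largest gap between the empirical CDF and the candidate model's CDF. The sketch below (plain NumPy, with normal CDFs chosen only as an example) ranks two candidate fits for the same sample by that distance.

```python
import numpy as np
from math import erf, sqrt

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of `data`
    and a candidate model CDF (a vectorizable callable)."""
    x = np.sort(data)
    n = len(x)
    c = cdf(x)
    upper = np.arange(1, n + 1) / n - c  # ECDF just after each data point
    lower = c - np.arange(0, n) / n      # ECDF just before each data point
    return max(upper.max(), lower.max())

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function (no SciPy needed)."""
    z = (np.asarray(x) - mu) / (sigma * sqrt(2.0))
    return 0.5 * (1.0 + np.vectorize(erf)(z))

# Rank two candidate models for the same sample by their KS distance.
sample = np.random.default_rng(0).normal(0.0, 1.0, 2000)
d_good = ks_statistic(sample, lambda x: norm_cdf(x, 0.0, 1.0))
d_bad = ks_statistic(sample, lambda x: norm_cdf(x, 1.0, 1.0))
```

A smaller statistic means a better fit, so ordering the candidates by `ks_statistic` is exactly the model-comparison step the article describes.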
Cowles, Anne; Beatty, William W; Nixon, Sara Jo; Lutz, Lanna J; Paulk, Jason; Paulk, Kayla; Ross, Elliott D
2003-12-01
Previous studies have described patients with possible or probable Alzheimer's disease (AD) who continued to play familiar songs skillfully, despite their dementias. There are no reports about patients with dementia who successfully learned to play new songs, and two papers describe failures of patients with AD to learn to play a new song although they continued to play familiar songs competently. In the present paper we describe a moderately demented patient (SL) with probable AD who learned to play a song (Cossackaya!) on the violin that was published after the apparent onset of his dementia. He showed modest retention of the song at delays of 0 and 10 minutes. This contrasts with his profound disturbance in both recall and recognition on other anterograde memory tests (word lists, stories, figures, environmental sounds, sounds of musical instruments), and marked impairment on measures of remote memory (famous faces, autobiographical memory). SL showed milder deficits in confrontation naming, verbal fluency and attention, but no dyspraxia or aphasic comprehension deficits. Except for the Block Design test, his visuospatial skills were intact. SL's learning of the new song in the absence of any evidence of episodic memory is reminiscent of patients with temporal lobe amnesia who show better memory for song melody than for lyrics or verse, although his retention was not as good.
Oak regeneration and overstory density in the Missouri Ozarks
David R. Larsen; Monte A. Metzger
1997-01-01
Reducing overstory density is a commonly recommended method of increasing the regeneration potential of oak (Quercus) forests. However, recommendations seldom specify the probable increase in density or the size of reproduction associated with a given residual overstory density. This paper presents logistic regression models that describe this...
Mujuzi, Jamil Ddamulira
2009-01-01
The Ugandan Penal Code criminalizes same-sex relationships. The author analyzes the Ugandan High Court decision where the judge relied on the Constitution and international human rights instruments to hold that law enforcement officers must respect the rights to privacy and human dignity even of those people presumed to be in same-sex…
Directory of Open Access Journals (Sweden)
Janet M. Wojcicki
2017-01-01
Full Text Available Leukocyte telomere length is shorter in response to chronic disease processes associated with inflammation, such as diabetes mellitus and coronary artery disease. Data from the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2002 were used to explore the relationship between leukocyte telomere length and presumed NAFLD, as indicated by elevated serum alanine aminotransferase (ALT) levels, obesity, or abdominal obesity. Logistic regression models were used to evaluate the relationship between telomere length and presumed markers of NAFLD, adjusting for possible confounders. There was no relationship between elevated ALT levels, abdominal obesity, or obesity and telomere length in adjusted models in NHANES (OR 1.13, 95% CI 0.48–2.65; OR 1.17, 95% CI 0.52–2.62, resp.). Mexican-American men had shorter telomere length in relation to presumed NAFLD (OR 0.07, 95% CI 0.006–0.79), and this held using different indicators of NAFLD (OR 0.012, 95% CI 0.0006–0.24). Men of Mexican origin with presumed NAFLD had shorter telomere length than men in other population groups. Longitudinal studies are necessary to evaluate the role of telomere length as a potential predictor in assessing the pathogenesis of NAFLD in Mexicans.
Moeskops, Pim; de Bresser, Jeroen; Kuijf, Hugo J.; Mendrik, AM; Biessels, Geert Jan; Pluim, Josien P.W.; Išgum, Ivana
2018-01-01
Automatic segmentation of brain tissues and white matter hyperintensities of presumed vascular origin (WMH) in MRI of older patients is widely described in the literature. Although brain abnormalities and motion artefacts are common in this age group, most segmentation methods are not evaluated in a…
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
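The Simpson's-paradox situation that the BK-Plot is designed to display can be reproduced with the classic kidney-stone treatment numbers (a standard textbook illustration, not data from this article): treatment A wins within each stratum of the binary confounder yet loses in the pooled comparison.

```python
def rate(successes, n):
    """Observed success proportion."""
    return successes / n

# (successes, total) for each treatment within each stone-size stratum.
a_small, a_large = (81, 87), (192, 263)
b_small, b_large = (234, 270), (55, 80)

# A is better within both strata...
assert rate(*a_small) > rate(*b_small)   # small stones: ~0.93 vs ~0.87
assert rate(*a_large) > rate(*b_large)   # large stones: ~0.73 vs ~0.69

# ...but worse when the strata are pooled: Simpson's paradox.
overall_a = rate(81 + 192, 87 + 263)     # 273/350 ~ 0.78
overall_b = rate(234 + 55, 270 + 80)     # 289/350 ~ 0.83
assert overall_a < overall_b
```

The reversal happens because the confounder (stone size) is unevenly distributed across treatments, which is exactly the structure a BK-Plot makes visible.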
OPTIMAL ESTIMATION OF RANDOM PROCESSES ON THE CRITERION OF MAXIMUM A POSTERIORI PROBABILITY
Directory of Open Access Journals (Sweden)
A. A. Lobaty
2016-01-01
Full Text Available The problem of obtaining the equations for the a posteriori probability density of a stochastic Markov process with a linear measurement model is considered. Unlike common approaches, which take the minimum mean-square estimation error as the optimization criterion, here the criterion is the maximum of the a posteriori probability density of the process being estimated. The a priori probability density of the estimated Gaussian process is initially treated as a differentiable function, which allows it to be expanded in a Taylor series without the intermediate transformations via characteristic functions and harmonic decomposition. For small time intervals, the probability density of the measurement error vector is, by assumption, Gaussian with zero expectation. This makes it possible to obtain a mathematical expression for the residual function, which characterizes the deviation of the actual measurement process from its mathematical model. The optimal a posteriori estimate of the state vector is determined under the assumption that this estimate coincides with the maximum of the a posteriori probability density. On the basis of Bayes' formula for the a priori and a posteriori probability densities, this yields a Stratonovich-Kushner equation. Using the Stratonovich-Kushner equation with different forms and values of the drift vector and diffusion matrix of a Markov stochastic process, a variety of filtering, identification, smoothing, and state-forecasting tasks can be solved for both continuous and discrete systems. A discrete-continuous implementation of the developed a posteriori estimation algorithms provides specific discrete algorithms for implementation on the on-board computer of a mobile robot system.
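In standard filtering notation (an assumed notation, since the abstract gives no formulas), the Bayes relation underlying the a posteriori density and the MAP criterion described above reads:

```latex
p(x_t \mid z_{1:t}) \;=\;
\frac{p(z_t \mid x_t)\, p(x_t \mid z_{1:t-1})}{p(z_t \mid z_{1:t-1})},
\qquad
\hat{x}_t^{\mathrm{MAP}} \;=\; \arg\max_{x_t}\, p(x_t \mid z_{1:t}) .
```

The Stratonovich-Kushner equation then propagates this posterior density in continuous time between measurement updates.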
ON THE ORIGIN OF THE HIGH COLUMN DENSITY TURNOVER IN THE H I COLUMN DENSITY DISTRIBUTION
International Nuclear Information System (INIS)
Erkal, Denis; Gnedin, Nickolay Y.; Kravtsov, Andrey V.
2012-01-01
We study the high column density regime of the H I column density distribution function and argue that there are two distinct features: a turnover at N_HI ≈ 10^21 cm^-2, which is present at both z = 0 and z ≈ 3, and a lack of systems above N_HI ≈ 10^22 cm^-2 at z = 0. Using observations of the column density distribution, we argue that the H I-H_2 transition does not cause the turnover at N_HI ≈ 10^21 cm^-2 but can plausibly explain the turnover at N_HI ≳ 10^22 cm^-2. We compute the H I column density distribution of individual galaxies in the THINGS sample and show that the turnover column density depends only weakly on metallicity. Furthermore, we show that the column density distribution of galaxies, corrected for inclination, is insensitive to the resolution of the H I map or to averaging in radial shells. Our results indicate that the similarity of H I column density distributions at z = 3 and 0 is due to the similarity of the maximum H I surface densities of high-z and low-z disks, set presumably by universal processes that shape the properties of the gaseous disks of galaxies. Using fully cosmological simulations, we explore other candidate physical mechanisms that could produce a turnover in the column density distribution. We show that while turbulence within giant molecular clouds cannot affect the damped Lyα column density distribution, stellar feedback can affect it significantly if the feedback is sufficiently effective in removing gas from the central 2-3 kpc of high-redshift galaxies. Finally, we argue that it is meaningful to compare column densities averaged over ~kpc scales with those estimated from quasar spectra that probe sub-pc scales, due to the steep power spectrum of H I column density fluctuations observed in nearby galaxies.
Kasa Tom, Sharon; Welch, Henry; Kilalang, Cornelia; Tefuarani, Nakapi; Vince, John; Lavu, Evelyn; Johnson, Karen; Magaye, Ruth; Duke, Trevor
2017-05-11
The GeneXpert MTB/RIF assay (Xpert) is used for rapid, simultaneous detection of Mycobacterium tuberculosis (MTB) and rifampicin resistance. This study examined the accuracy of Xpert in children with suspected pulmonary tuberculosis (PTB). Children admitted to Port Moresby General Hospital with suspected PTB were prospectively enrolled between September 2014 and March 2015. They were classified into probable, possible and TB-unlikely groups. Sputum or gastric aspirates were tested by Xpert and smear microscopy; mycobacterial culture was undertaken on a subset. Children were diagnosed with TB on the basis of standard criteria, which were used as the primary reference standard. Xpert, smear for acid-fast bacilli (AFB) and the Edwards TB score were compared with the primary reference standard. A total of 93 children ≤14 years with suspected PTB were enrolled; 67 (72%) were classified as probable, 21 (22%) possible and 5 (5.4%) TB-unlikely. Eighty were treated for TB based on the primary reference standard. Xpert was positive in 26/93 (28%) cases overall, including 22/67 (33%) with probable TB and 4/21 (19%) with possible TB. Three (13%) samples showed rifampicin resistance. Xpert confirmed more cases of TB than AFB smear (26 vs 13, p = 0.019). The sensitivity of Xpert, AFB smear and an Edwards TB score of ≥7 was 31% (25/80), 16% (13/80) and 90% (72/80), respectively, and the specificity was 92% (12/13), 100% (13/13) and 31% (4/13), respectively, when compared with the primary reference standard. Xpert sensitivity is sub-optimal and cannot be relied upon for diagnosing TB, although a positive result is confirmatory. A detailed history and examination, standardised clinical criteria, radiographs and available tests remain the most appropriate way of diagnosing TB in children in resource-limited countries. Xpert helps confirm PTB better than AFB smear, and identifies rifampicin resistance. Practical guidelines should be used to identify children who
Probability, arrow of time and decoherence
Bacciagaluppi, Guido
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Probability analysis of nuclear power plant hazards
International Nuclear Information System (INIS)
Kovacs, Z.
1985-01-01
The probability analysis of risk, used for quantifying the risk of complex technological systems, especially of nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)
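The risk definition used above (probability of occurrence times significance of consequences) can be illustrated directly; the event names and numbers below are invented for illustration, not taken from the report.

```python
# Toy illustration of the risk definition: risk = P(event) * consequence.
# Two hypothetical release events with illustrative annual probabilities and
# consequence scores; note that a rare severe event can carry the same risk
# as a frequent mild one.
events = {
    "small release": {"prob_per_year": 1e-2, "consequence": 1.0},
    "large release": {"prob_per_year": 1e-4, "consequence": 100.0},
}
risk = {name: e["prob_per_year"] * e["consequence"] for name, e in events.items()}
total_risk = sum(risk.values())  # both events contribute equally here
```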
Quantum Probability and Spectral Analysis of Graphs
Hora, Akihito
2007-01-01
This is the first book to comprehensively cover the quantum probabilistic approach to spectral analysis of graphs. This approach has been developed by the authors and has become an interesting research area in applied mathematics and physics. The book can be used as a concise introduction to quantum probability from an algebraic aspect. Here readers will learn several powerful methods and techniques of wide applicability, which have been recently developed under the name of quantum probability. The exercises at the end of each chapter help to deepen understanding. Among the topics discussed along the way are: quantum probability and orthogonal polynomials; asymptotic spectral theory (quantum central limit theorems) for adjacency matrices; the method of quantum decomposition; notions of independence and structure of graphs; and asymptotic representation theory of the symmetric groups.
Uncertainty relation and probability. Numerical illustration
International Nuclear Information System (INIS)
Fujikawa, Kazuo; Umetsu, Koichiro
2011-01-01
The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)
7th High Dimensional Probability Meeting
Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan
2016-01-01
This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...
EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY
Directory of Open Access Journals (Sweden)
Magdalena Hykšová
2012-03-01
Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development of two parallel ways: on one hand, the theory of geometric probability was formed with minor attention paid to other applications than those concerning spatial chance games. On the other hand, practical rules of the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without the knowledge of this branch. A special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas, but remained almost unnoticed both by mathematicians and practicians.
Independent events in elementary probability theory
Csenki, Attila
2011-07-01
In Probability and Statistics taught to mathematicians as a first introduction, or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent; the operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
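The quoted statement can be checked by brute force for a small case. The sketch below takes n = 3 jointly independent events with illustrative probabilities, builds A = E1 ∪ E2 and B = E3ᶜ from disjoint subsets, and verifies the multiplication rule by enumerating all outcomes.

```python
from itertools import product

# Brute-force check for n = 3: with E1, E2, E3 jointly independent,
# A = E1 ∪ E2 and B = E3ᶜ are built from disjoint subsets of the events,
# so by the quoted statement they should be independent.
p = [0.3, 0.5, 0.2]  # illustrative values of P(E1), P(E2), P(E3)

def prob(event):
    # event: predicate on an outcome (o1, o2, o3) of truth values
    total = 0.0
    for outcome in product([True, False], repeat=3):
        w = 1.0
        for happened, pi in zip(outcome, p):
            w *= pi if happened else (1.0 - pi)
        if event(outcome):
            total += w
    return total

pA = prob(lambda o: o[0] or o[1])                     # A = E1 ∪ E2
pB = prob(lambda o: not o[2])                         # B = E3ᶜ
pAB = prob(lambda o: (o[0] or o[1]) and not o[2])     # A ∩ B
assert abs(pAB - pA * pB) < 1e-9                      # multiplication rule holds
```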
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
A basic course in probability theory
Bhattacharya, Rabi
2016-01-01
This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...
Fixation probability on clique-based graphs
Choi, Jeong-Ok; Yu, Unjong
2018-02-01
The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The clique-star family is an amplifier, while the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
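A Monte Carlo estimator of this kind can be sketched for the simplest case. The paper's clique-based graphs are more elaborate, so this hedged example uses the complete graph K_N, where the closed form (1 - 1/r)/(1 - 1/r^N) is available as a cross-check; all parameters are illustrative.

```python
import random

# Monte-Carlo estimate of the fixation probability of a single fitness-r
# mutant in the Moran process on the complete graph K_N.  Each step: a
# reproducer is chosen proportionally to fitness, and its offspring replaces
# a uniformly random *other* node.
def fixation_probability(N, r, trials, rng):
    fixations = 0
    for _ in range(trials):
        mutant = {rng.randrange(N)}            # one random initial mutant
        while 0 < len(mutant) < N:
            weights = [r if i in mutant else 1.0 for i in range(N)]
            total = sum(weights)
            u, acc, repro = rng.random() * total, 0.0, 0
            for i, w in enumerate(weights):
                acc += w
                if u <= acc:
                    repro = i
                    break
            target = rng.randrange(N - 1)       # a random node other than repro
            if target >= repro:
                target += 1
            if repro in mutant:
                mutant.add(target)
            else:
                mutant.discard(target)
        fixations += len(mutant) == N
    return fixations / trials

rng = random.Random(1)
est = fixation_probability(N=5, r=2.0, trials=2000, rng=rng)
exact = (1 - 1 / 2.0) / (1 - 1 / 2.0 ** 5)   # closed form on K_N, ~0.516
```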
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
DEFF Research Database (Denmark)
Yura, Harold; Hanson, Steen Grüner
2012-01-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
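The two-step recipe in this abstract (spectral shaping of white Gaussian noise, then a memoryless transform to the target amplitude distribution) can be sketched in one dimension for brevity; the patent treats 2-D signals. The low-pass spectrum and the exponential target distribution below are illustrative choices, not from the source.

```python
import numpy as np
from math import erf, sqrt

# Step 1: colour white Gaussian noise in the Fourier domain to impose a
# target power spectrum (illustrative first-order low-pass shape).
rng = np.random.default_rng(0)
n = 4096
white = rng.standard_normal(n)

freqs = np.fft.rfftfreq(n)
shaping = 1.0 / np.sqrt(1.0 + (freqs / 0.05) ** 2)
colored = np.fft.irfft(np.fft.rfft(white) * shaping, n)
colored /= colored.std()  # renormalise to unit variance

# Step 2: map each colored Gaussian sample through the Gaussian CDF (giving
# an approximately uniform variate) and then through the inverse CDF of the
# desired amplitude distribution, here an Exp(1), chosen only for illustration.
phi = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in colored])
signal = -np.log1p(-phi)  # inverse CDF of the exponential distribution
```

The spectral correlation survives the monotone amplitude transform only approximately, which is why the abstract calls this an engineering approach.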
Chelini, Marie-Claire; Hebets, Eileen
2017-11-01
Female-biased sexual size dimorphism (SSD) is often considered an epiphenomenon of selection for the increased mating opportunities provided by early male maturation (i.e., protandry). Empirical evidence of the adaptive significance of protandry remains nonetheless fairly scarce. We use field data collected throughout the reproductive season of an SSD crab spider, Mecaphesa celer, to test two hypotheses: Protandry provides fitness benefits to males, leading to female-biased SSD, or protandry is an indirect consequence of selection for small male size/large female size. Using field-collected data, we modeled the probability of mating success for females and males according to their timing of maturation. We found that males matured earlier than females and the proportion of virgin females decreased abruptly early in the season, but unexpectedly increased afterward. Timing of female maturation was not related to clutch size, but large females tended to have more offspring than small females. Timing of female and male maturation was inversely related to size at adulthood, as early-maturing individuals were larger than late-maturing ones, suggesting that both sexes exhibit some plasticity in their developmental trajectories. Such plasticity indicates that protandry could co-occur with any degree and direction of SSD. Our calculation of the probability of mating success along the season shows multiple male maturation time points with similar predicted mating success. This suggests that males follow multiple strategies with equal success, trading off access to virgin females with intensity of male-male competition. Our results challenge classic hypotheses linking protandry and female-biased SSD, and emphasize the importance of directly testing the often-assumed relationships between co-occurring animal traits.
Comparing coefficients of nested nonlinear probability models
DEFF Research Database (Denmark)
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB-method is a general decomposition...
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability of ship-ship collisions, ship-platform collisions, and ship groundings. The main benefit of the method is that it allows comparisons of various navigation routes and procedures by assessing the relative frequencies of collisions and groundings.
Duelling idiots and other probability puzzlers
Nahin, Paul J
2002-01-01
What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki
Probable Gastrointestinal Toxicity of Kombucha Tea
Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David
1997-01-01
Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462
Lady luck the theory of probability
Weaver, Warren
1982-01-01
""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa
Probability groups as orbits of groups
International Nuclear Information System (INIS)
Bhattarai, H.N.
2003-11-01
The set of double cosets of a group with respect to a subgroup and the set of orbits of a group with respect to a group of automorphisms have structures which can be studied as multigroups, hypergroups or Pasch geometries. When the subgroup or the group of automorphisms is finite, the multivalued products can be provided with weightages, forming so-called probability groups. It is shown in this paper that some abstract probability groups can be realized as orbit spaces of groups. (author)
Fifty challenging problems in probability with solutions
Mosteller, Frederick
1987-01-01
Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
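The standard textbook resolution of "The Unfair Subway" is that both lines run on the same 10-minute headway but the uptown train arrives one minute after the downtown one, so Marvin catches it only if he arrives in that 1-minute window. That schedule offset is the classic assumed answer, not stated in the blurb above; a quick simulation checks it.

```python
import random

# Simulate Marvin arriving at a uniformly random time within a 10-minute
# cycle.  Downtown trains arrive at t = 0 (mod 10), uptown trains at
# t = 1 (mod 10); he boards whichever comes first.
rng = random.Random(42)
trials = 100_000
uptown = 0
for _ in range(trials):
    t = rng.uniform(0, 10)
    next_down = 10.0 - t            # wait until the next downtown train
    next_up = (1.0 - t) % 10.0      # wait until the next uptown train
    if next_up < next_down:
        uptown += 1
p_mother = uptown / trials          # ~0.1, matching 2 dinners in 20 days
```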
Bayesian estimation of core-melt probability
International Nuclear Information System (INIS)
Lewis, H.W.
1984-01-01
A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
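The canonical Bayesian calculation the abstract describes can be sketched with a conjugate Gamma-Poisson update: a Gamma prior on the core-melt rate is updated by the observation of zero melts in T reactor-years. The prior parameters and exposure below are illustrative stand-ins, not the values of the Rasmussen study or the paper.

```python
# Hedged sketch: Gamma(a, b) prior on the core-melt rate (events per
# reactor-year), updated by a Poisson observation of zero events in T
# reactor-years.  The Gamma posterior is Gamma(a, b + T), so the mean rate
# drops by the factor (b + T) / b.
a, b = 0.5, 1000.0   # illustrative prior; prior mean a/b = 5e-4 per reactor-year
T = 2000.0           # illustrative reactor-years observed with zero core melts

prior_mean = a / b
post_mean = a / (b + T)
reduction = prior_mean / post_mean   # = (b + T) / b = 3 with these numbers
```

With exposure comparable to the prior's effective exposure, the posterior mean falls by a small integer factor, which is the qualitative effect (a factor of 2 to 4) reported in the abstract.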
Return probability: Exponential versus Gaussian decay
Energy Technology Data Exchange (ETDEWEB)
Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)
2006-02-13
We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes, one of which is characterized by the Gaussian decay of the return probability, and another one is the well-known regime of the exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.
International Nuclear Information System (INIS)
Kövesárki, P; Brock, I C; Quiroz, A E Nuncio
2012-01-01
This paper introduces a probability density estimator based on Green's function identities. A density model is constructed under the sole assumption that the probability density is differentiable. The method is implemented as a binary likelihood estimator for classification purposes, so issues such as mis-modeling and overtraining are also discussed. The identity behind the density estimator can be interpreted as a real-valued, non-scalar kernel method which is able to reconstruct differentiable density functions.
Sonnino, Giorgio; Steinbrecher, György; Cardinali, Alessandro; Sonnino, Alberto; Tlidi, Mustapha
2013-01-01
Using statistical thermodynamics, we derive a general expression for the stationary probability distribution of thermodynamic systems driven out of equilibrium by several thermodynamic forces. The local equilibrium is defined by imposing the minimum entropy production and the maximum entropy principle under scale invariance restrictions. The obtained probability distribution presents a singularity that has an immediate physical interpretation in terms of intermittency models. The derived reference probability distribution function is interpreted as the time and ensemble average of the real physical one. A generic family of stochastic processes describing noise-driven intermittency, whose stationary density distribution coincides exactly with the one resulting from entropy maximization, is presented.
Conditional probability on MV-algebras
Czech Academy of Sciences Publication Activity Database
Kroupa, Tomáš
2005-01-01
Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005
Failure probability of regional flood defences
Lendering, K.T.; Lang, M.; Klijn, F.; Samuels, P.
2016-01-01
Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This
Virus isolation: Specimen type and probable transmission
Indian Academy of Sciences (India)
Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT); 6 from CSF (neurological involvement); 1 from a 4-day-old child (transplacental transmission).
Eliciting Subjective Probability Distributions on Continuous Variables
1975-08-01
Approved for public release; distribution unlimited. Keywords: adjusting, proper scoring rule, fractile, subjective probability, uncertainty measures.
Collision probabilities and response matrices: an overview
International Nuclear Information System (INIS)
Leonard, A.
1975-01-01
Generally the term collision probability method is applied to a technique that employs a discretization of the integral form of the transport equation. Relative to the discrete ordinates method, the collision probability technique has the advantages of dealing with fewer variables (no angular coordinates) and generally faster convergence. Significant disadvantages include dense coupling of the variables, expensive precalculation of collision probabilities, and difficulties in treating anisotropic scattering. Various techniques for circumventing these weaknesses are described. In the response matrix method the assembly or system to be analyzed is decomposed into a number of simple subunits. The approximate Green's functions or response matrices of each type of subunit are then precalculated. To the desired accuracy, these response matrices yield the outgoing neutron currents for any given input. Thus the unknowns are the interface currents, and the coefficient matrix contains all the response matrices. A wide variety of techniques can be and have been used to generate response matrices: diffusion theory, S_n methods, Monte Carlo, collision probabilities, and even response matrices themselves. Again, the precalculations are expensive. On the other hand, once a response matrix has been computed, it may be stored and used again. Thus response matrix methods appear to be particularly advantageous for burnup, optimization, and possibly many kinetics problems where the properties of many subunits do not change. (43 references) (U.S.)
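The interface-current idea can be shown with a toy one-dimensional example, assuming nothing from the paper beyond the general scheme: each subunit is characterised by a small response matrix (reflection r, transmission t) acting on incoming partial currents, and solving a linear system for the interface currents gives the overall response. The numbers are illustrative.

```python
import numpy as np

# Two identical 1-D subunits in series, each with reflection r and
# transmission t.  Unknown interface currents with unit current incident
# on unit 1 from the left:
#   j_right = t * 1 + r * j_left   (enters unit 2 moving right)
#   j_left  = r * j_right          (enters unit 1 moving left)
# Solving this system reproduces the series formula T = t^2 / (1 - r^2).
r, t = 0.3, 0.6

A = np.array([[1.0, -r],
              [-r,  1.0]])
b = np.array([t, 0.0])
j_right, j_left = np.linalg.solve(A, b)

transmission = t * j_right   # current leaving unit 2 to the right
assert abs(transmission - t ** 2 / (1 - r ** 2)) < 1e-9
```

The same pattern scales up: response matrices of subunits on the diagonal blocks, interface-coupling terms off-diagonal, interface currents as the unknowns.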
Complexity of Fuzzy Probability Logics II
Czech Academy of Sciences Publication Activity Database
Hájek, Petr
2007-01-01
Roč. 158, č. 23 (2007), s. 2605-2611 ISSN 0165-0114 R&D Projects: GA AV ČR IAA100300503 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * probability * computational complexity Subject RIV: BA - General Mathematics Impact factor: 1.373, year: 2007
Inferring Beliefs as Subjectively Imprecise Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
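The die-rolling activity mentioned above translates directly into a simulation: students influenced by the representativeness heuristic often judge a "patterned" outcome like (6, 6) as less likely than a "random-looking" one like (6, 3), yet both have probability 1/36. The specific outcomes compared are an illustrative choice.

```python
import random

# Roll a die twice, many times, and compare the empirical frequencies of
# the ordered outcomes (6, 6) and (6, 3).  Both converge to 1/36.
rng = random.Random(7)
trials = 200_000
count_66 = count_63 = 0
for _ in range(trials):
    pair = (rng.randint(1, 6), rng.randint(1, 6))
    count_66 += pair == (6, 6)
    count_63 += pair == (6, 3)
f66, f63 = count_66 / trials, count_63 / trials   # both ~1/36 ~ 0.0278
```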
Probability & Statistics: Modular Learning Exercises. Student Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Low Probability of Intercept Laser Range Finder
2017-07-19
December 2017. The below identified patent application, LOW PROBABILITY OF INTERCEPT LASER RANGE FINDER, is available for licensing; requests for information should be addressed to... STATEMENT OF GOVERNMENT INTEREST [0001]: The invention described herein may be... Processor 30 performs a continuous sweep over the photodetector 38 output to isolate and amplify the optical signals and performs the signal processing.
Estimating the Probability of Negative Events
Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike
2009-01-01
How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…
Pade approximant calculations for neutron escape probability
International Nuclear Information System (INIS)
El Wakil, S.A.; Saad, E.A.; Hendi, A.A.
1984-07-01
The neutron escape probability from a non-multiplying slab containing internal source is defined in terms of a functional relation for the scattering function for the diffuse reflection problem. The Pade approximant technique is used to get numerical results which compare with exact results. (author)
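The abstract above compares Padé-approximant results with exact ones. As a generic illustration of the technique (not the authors' transport calculation), the [1/1] Padé approximant of e^(-x), namely (1 - x/2)/(1 + x/2), matches the Taylor series through second order yet typically tracks the function further from the origin than the matching Taylor polynomial:

```python
import math

def pade_1_1(x):
    """[1/1] Padé approximant of exp(-x): a ratio of two linear
    polynomials matching the Taylor series through order x**2."""
    return (1 - x / 2) / (1 + x / 2)

def taylor_2(x):
    """Second-order Taylor polynomial of exp(-x) about x = 0."""
    return 1 - x + x**2 / 2

x = 1.5
exact = math.exp(-x)
print(abs(pade_1_1(x) - exact))   # Padé error
print(abs(taylor_2(x) - exact))   # noticeably larger Taylor error
```

This rational-function behavior is what makes Padé approximants attractive when the quantity of interest, like an escape probability, saturates rather than grows polynomially.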
Concurrency meets probability: theory and practice (abstract)
Katoen, Joost P.
Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between
Probability & Statistics: Modular Learning Exercises. Teacher Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Investigating Probability with the NBA Draft Lottery.
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
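A weighted draw without replacement of the kind the lottery uses can be sketched in a few lines. The weights below are hypothetical placeholders, not the NBA's actual combination counts:

```python
import random

def draft_lottery(weights, rng=None):
    """Draw a full pick order without replacement: each team's chance
    at the next pick is proportional to its remaining weight.
    The weights passed in here are hypothetical, not the real odds."""
    rng = rng or random.Random(42)
    teams = list(weights)
    order = []
    while teams:
        pick = rng.choices(teams, weights=[weights[t] for t in teams])[0]
        order.append(pick)
        teams.remove(pick)
    return order

# Worst-record team gets the most lottery combinations (illustrative).
hypothetical = {"Team A": 250, "Team B": 199, "Team C": 156, "Team D": 119}
print(draft_lottery(hypothetical))
```

Students can rerun the draw many times and tabulate how often the worst-record team actually lands the first pick, connecting the permutation counting in the lesson to simulated frequencies.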
Escape probabilities for fluorescent x-rays
International Nuclear Information System (INIS)
Dance, D.R.; Day, G.J.
1985-01-01
Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
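As a numerical illustration of how exponential integrals enter such escape probabilities (a standard slab-geometry formula, not the detector-specific expressions derived in the note): the probability that a photon emitted isotropically at optical depth tau below a plane surface escapes through that surface without interacting is (1/2)·E2(tau), where E2(tau) = ∫₀¹ exp(-tau/mu) dmu.

```python
import math

def escape_probability(tau, steps=20_000):
    """(1/2) * E2(tau) via the midpoint rule over the direction
    cosine mu, using E2(tau) = integral_0^1 exp(-tau/mu) dmu.
    Illustrative only; the paper derives its own expressions."""
    total = 0.0
    for i in range(steps):
        mu = (i + 0.5) / steps
        total += math.exp(-tau / mu)
    return 0.5 * total / steps

# At tau -> 0 half the photons (the upward-directed half) escape.
print(escape_probability(1e-9))
print(escape_probability(1.0))   # substantially smaller
```

In practice one would evaluate E2 through the rational approximations for exponential integral functions that the note mentions, rather than by quadrature.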
On a paradox of probability theory
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)
Exploring Concepts in Probability: Using Graphics Calculators
Ghosh, Jonaki
2004-01-01
This article describes a project in which certain key concepts in probability were explored using graphics calculators with year 10 students. The lessons were conducted in the regular classroom where students were provided with a Casio CFX 9850 GB PLUS graphics calculator with which they were familiar from year 9. The participants in the…
Probability from a Socio-Cultural Perspective
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J.
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
The Britannica Guide to Statistics and Probability
2011-01-01
By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction
Comonotonic Book-Making with Nonadditive Probabilities
Diecidue, E.; Wakker, P.P.
2000-01-01
This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the
Reduction of Compound Lotteries with Objective Probabilities
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2015-01-01
The reduction of compound lotteries axiom (ROCL) has assumed a central role in the evaluation of behavior toward risk and uncertainty. We present experimental evidence on its validity in the domain of objective probabilities. Our battery of lottery pairs includes simple one-stage lotteries, two...
Applied probability models with optimization applications
Ross, Sheldon M
1992-01-01
Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers the Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous-time optimization models, and much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.
Confusion between Odds and Probability, a Pandemic?
Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer
2012-01-01
This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
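The distinction the manuscript stresses is easy to make precise: odds o and probability p are related by o = p/(1 - p), so "odds of 3 to 1 in favour" corresponds to p = 0.75, not p = 3. A minimal sketch of the two conversions:

```python
def odds_from_probability(p):
    """Odds in favour: o = p / (1 - p)."""
    if not 0 <= p < 1:
        raise ValueError("probability must lie in [0, 1)")
    return p / (1 - p)

def probability_from_odds(o):
    """Inverse map: p = o / (1 + o)."""
    if o < 0:
        raise ValueError("odds must be non-negative")
    return o / (1 + o)

# "3 to 1 in favour" means odds of 3, i.e. probability 0.75.
print(probability_from_odds(3))      # 0.75
print(odds_from_probability(0.75))   # 3.0
```

Conflating the two maps is exactly the kind of imprecise statistical language the five cases in the manuscript document.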
Probability in Action: The Red Traffic Light
Shanks, John A.
2007-01-01
Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
Directory of Open Access Journals (Sweden)
Johan Engelbrecht
2017-07-01
Background. The incidence of immunocompromised children with probable systemic cytomegalovirus (CMV) infection is increasing. Currently, there is no protocol for screening children for CMV retinitis in South Africa. Screening for CMV retinitis may prevent permanent visual impairment. Objectives. To determine the prevalence of retinitis in children with probable systemic CMV infection, and to assess the value of clinical and laboratory data in identifying risk factors for the development of CMV retinitis in children. Methods. A retrospective, cross-sectional study design was used. All children (≤12 years) with probable systemic CMV infection who underwent ophthalmic screening over a 5-year period were included. Presumed CMV retinitis was diagnosed by dilated fundoscopy. All cases were evaluated to identify possible risk factors for the development of CMV retinitis. Results. A total of 164 children were screened. Presumed CMV retinitis was diagnosed in 4.9% of participants. Causes of immunosuppression were HIV infection (n=7) and chemotherapy (n=1). HIV infection showed a definite trend towards association with the development of CMV retinitis in our study population (p=0.064). Conclusion. The prevalence of CMV retinitis was 4.9% in our sample. Other than HIV, we were not able to identify additional risk factors for CMV retinitis. Our results show that CD4 levels are possibly not a reliable indicator for predicting CMV retinitis.
Bounding probabilistic safety assessment probabilities by reality
International Nuclear Information System (INIS)
Fragola, J.R.; Shooman, M.L.
1991-01-01
The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. Comparing the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10⁻³ per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that they require continual reassessment of the analysis assumptions and a bounding of the analysis predictions by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence, the requirement for disciplined systematic approaches within the bounds of reality, and the associated impact on PSA probabilistic estimates.
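One simple way to turn an event-free historical record into a probability bound, consistent in magnitude with the aqueduct example above, is the classical "rule of three": if no event has occurred in n independent years, an approximate 95% upper confidence bound on the annual event probability is 3/n. (The rule of three is a standard statistical device, sketched here for illustration, not a method attributed to the authors.)

```python
def upper_bound_exact(n, confidence=0.95):
    """Exact upper confidence bound on an annual event probability p
    given zero events in n independent years:
    solve (1 - p)**n = 1 - confidence for p."""
    return 1 - (1 - confidence) ** (1 / n)

def upper_bound_rule_of_three(n):
    """Rule-of-three approximation to the exact 95% bound: 3 / n."""
    return 3 / n

# Two millennia with no observed damage bounds the annual probability
# near the 10^-3 per year figure quoted in the abstract.
n_years = 2000
print(upper_bound_exact(n_years))          # ~0.0015
print(upper_bound_rule_of_three(n_years))  # 0.0015
```

The exact bound and the 3/n shortcut agree closely for large n, which is why the shortcut is a convenient sanity check on PSA-scale probabilities.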
Establishment probability in newly founded populations
Directory of Open Access Journals (Sweden)
Gusset Markus
2012-06-01
Background. Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where −ln(1 − P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 − c1·e^(−ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results. For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is −ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions. The method we present to quantify the establishment probability of newly founded populations is generic, and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
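The relation quoted in the abstract, P0(t) = 1 − c1·e^(−ω1·t), can be checked numerically: −ln(1 − P0(t)) is a straight line in t with slope ω1 and intercept −ln(c1), so both constants can be recovered from two points on the line. The parameter values below are illustrative, not those of the wild-dog model.

```python
import math

def extinction_prob(t, c1, omega1):
    """P0(t) = 1 - c1 * exp(-omega1 * t), the extinction-by-time-t
    probability quoted in the abstract (valid past the initial transient)."""
    return 1 - c1 * math.exp(-omega1 * t)

def wissel_line(t, c1, omega1):
    """Ordinate of the 'Wissel plot': -ln(1 - P0(t)) = omega1*t - ln(c1)."""
    return -math.log(1 - extinction_prob(t, c1, omega1))

# Recover slope and intercept from two points on the line
# (illustrative parameter values only).
c1, omega1 = 1.2, 0.05
y1, y2 = wissel_line(1.0, c1, omega1), wissel_line(2.0, c1, omega1)
slope = y2 - y1               # equals omega1
intercept = y1 - slope * 1.0  # equals -ln(c1), negative when c1 > 1
print(slope, intercept)
```

A negative intercept, as the abstract states, is the signature of a population that reaches the established phase.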
The dynamics of variable-density turbulence
International Nuclear Information System (INIS)
Sandoval, D.L.
1995-11-01
The dynamics of variable-density turbulent fluids are studied by direct numerical simulation. The flow is incompressible, so that acoustic waves are decoupled from the problem, implying that density is not a thermodynamic variable. Changes in density occur due to molecular mixing. The velocity field is, in general, divergent. A pseudo-spectral numerical technique is used to solve the equations of motion. Three-dimensional simulations are performed on a grid of 128³ points. Two types of problems are studied: (1) the decay of isotropic, variable-density turbulence, and (2) buoyancy-generated turbulence in a fluid with large density fluctuations. In the case of isotropic, variable-density turbulence, the overall statistical decay behavior, for the cases studied, is relatively unaffected by the presence of density variations when the initial density and velocity fields are statistically independent. The results for this case are in quantitative agreement with previous numerical and laboratory results. In this case, the initial density field has a bimodal probability density function (pdf), which evolves in time towards a Gaussian distribution. The pdf of the density field is symmetric about its mean value throughout its evolution. If the initial velocity and density fields are statistically dependent, however, the decay process is significantly affected by the density fluctuations. For the case of buoyancy-generated turbulence, variable-density departures from the Boussinesq approximation are studied. The results of the buoyancy-generated turbulence are compared with variable-density model predictions. Both a one-point (engineering) model and a two-point (spectral) model are tested against the numerical data. Some deficiencies in these variable-density models are discussed and modifications are suggested.