International Nuclear Information System (INIS)
Liu, L.H.; Xu, X.; Chen, Y.L.
2004-01-01
The laminar flamelet equations, in combination with the joint probability density function (PDF) transport equation of mixture fraction and turbulence frequency, have been used to simulate turbulent jet diffusion flames. To check the suitability of presumed PDF shapes for the modeling of turbulence-radiation interactions (TRI), two types of presumed joint PDFs are constructed using the second-order moments of temperature and species concentrations derived from the laminar flamelet model. The time-averaged radiative source terms and the time-averaged absorption coefficients calculated by the presumed joint PDF approaches are compared with those obtained by the laminar flamelet model. The comparison shows obvious differences between the results of the independent PDF approach and the laminar flamelet model. Generally, the results of the dependent PDF approach agree better with those of the flamelet model. For the modeling of TRI, the dependent PDF approach is therefore superior to the independent PDF approach.
International Nuclear Information System (INIS)
Cao Hong-Jun; Zhang Hui-Qiang; Lin Wen-Yi
2012-01-01
Four kinds of presumed probability-density-function (PDF) models for non-premixed turbulent combustion are evaluated in flames with various stoichiometric mixture fractions by using large eddy simulation (LES). The LES code is validated against the experimental data of a classical turbulent jet flame (Sandia flame D). The mean and rms temperatures obtained by the presumed PDF models are compared with the LES results. The β-function model achieves a good prediction for different flames. The rms temperature predicted by the double-δ function model is very small and unphysical in the vicinity of the maximum mean temperature. The clip-Gaussian model and the multi-δ function model give worse predictions on the extremely fuel-rich or fuel-lean sides due to the clipping at the boundary of the mixture fraction space. The results also show that the overall prediction performance of the presumed PDF models is better at intermediate stoichiometric mixture fractions than at very small or very large ones.
Probability densities and Lévy densities
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators), formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and overfitting the data, as an alternative to employing the Bayesian or Akaike information criteria. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate that the method has general applicability for high throughput statistical inference.
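As an illustration of the nonparametric estimation problem the abstract addresses, the following sketch fits a hand-rolled Gaussian kernel density estimate to i.i.d. samples and compares it against the known density. This is a much simpler stand-in, not the authors' maximum-entropy scoring method, and the bandwidth rule and sample counts are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5000)  # i.i.d. continuous samples, as the method requires

# Hand-rolled Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth.
h = 1.06 * x.std() * len(x) ** (-1 / 5)

def kde(t):
    u = (t[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))

grid = np.linspace(-3.0, 3.0, 13)
true_pdf = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(kde(grid) - true_pdf)))  # small: the estimate tracks the true PDF
```

With 5000 samples the pointwise error stays well below the peak density of about 0.4, which is the sense in which such estimates "converge to the true PDF as sample size increases".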
Box-particle probability hypothesis density filtering
Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.
2014-01-01
This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...
Multiple model cardinalized probability hypothesis density filter
Georgescu, Ramona; Willett, Peter
2011-09-01
The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.
Modulation Based on Probability Density Functions
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
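The half-cycle histogram construction described above is easy to reproduce: uniform-in-time samples of a sinusoid pile up near the crest, giving the characteristic arcsine-law shape. The sample counts below are illustrative, not from the cited work:

```python
import numpy as np

# Uniformly sample one half cycle of a sine wave and histogram the sample
# values to approximate the waveform's PDF, as the modulation method prescribes.
t = np.linspace(0.0, np.pi, 100_000, endpoint=False)
w = np.sin(t)

pdf, edges = np.histogram(w, bins=10, range=(0.0, 1.0), density=True)
# The analytic density of sin(t) with t uniform on [0, pi) is 2 / (pi * sqrt(1 - x^2)):
# probability mass concentrates near the crest, so the last bin towers over the first.
print(pdf[-1] > 3 * pdf[0])
```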
Comparison of density estimators [estimation of probability density functions]
Energy Technology Data Exchange (ETDEWEB)
Kao, S.; Monahan, J.F.
1977-09-01
Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, and some simulations are reported. The object is to compare the performance of the various methods in small samples and their sensitivity to changes in their parameters, and to attempt to discover at what point a sample is so small that density estimation is no longer worthwhile. (RWR)
Therapeutic High-Density Barium Enema in a Case of Presumed Diverticular Hemorrhage
Directory of Open Access Journals (Sweden)
Nonthalee Pausawasdi
2011-02-01
Many patients with lower gastrointestinal bleeding do not have an identifiable source of bleeding at colonoscopy. A significant percentage of these patients will have recurrent bleeding. In many patients, the presence of multiple diverticula leads to a diagnosis of presumed diverticular bleeding. Current treatment options include therapeutic endoscopy, angiography, or surgical resection, all of which depend on the identification of the diverticular source of bleeding. This report describes a case of recurrent bleeding in an elderly patient with diverticula but no identifiable source treated successfully with barium impaction therapy. This therapeutic modality does not depend on the identification of the bleeding diverticular lesion and was well tolerated by our 86-year-old patient.
Interactive design of probability density functions for shape grammars
Dang, Minh; Lienhard, Stefan; Ceylan, Duygu; Neubert, Boris; Wonka, Peter; Pauly, Mark
2015-01-01
A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf.
Continuation of probability density functions using a generalized Lyapunov approach
Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.
2017-01-01
Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation.
Probability densities and the radon variable transformation theorem
International Nuclear Information System (INIS)
Ramshaw, J.D.
1985-01-01
D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation, in which the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation theorem is then derived from this relation.
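The content of the transformation theorem, p_Y(y) = ∫ δ(y − f(x)) p_X(x) dx = p_X(x) / |f'(x)| summed over preimages, can be checked numerically on a toy case. The choice Y = X² with X uniform on (0, 1), for which p_Y(y) = 1/(2√y), is an illustrative assumption, not an example from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200_000)   # X ~ Uniform(0, 1), so p_X = 1
y = x**2                             # Y = f(X) with f(x) = x^2

# Transformation theorem: p_Y(y) = p_X(x) / |f'(x)| = 1 / (2*sqrt(y)) on (0, 1).
hist, edges = np.histogram(y, bins=np.linspace(0.0, 1.0, 11), density=True)
mid = 0.55                           # midpoint of the bin [0.5, 0.6)
print(abs(hist[5] - 1.0 / (2.0 * np.sqrt(mid))))  # empirical density matches
```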
On Farmer's line, probability density functions, and overall risk
International Nuclear Information System (INIS)
Munera, H.A.; Yadigaroglu, G.
1986-01-01
Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value.
Probability-density-function characterization of multipartite entanglement
International Nuclear Information System (INIS)
Facchi, P.; Florio, G.; Pascazio, S.
2006-01-01
We propose a method to characterize and quantify multipartite entanglement for pure states. The method hinges upon the study of the probability density function of bipartite entanglement and is tested on an ensemble of qubits in a variety of situations. This characterization is also compared to several measures of multipartite entanglement.
Visualization techniques for spatial probability density function data
Directory of Open Access Journals (Sweden)
Udeepta D Bordoloi
2006-01-01
Novel visualization methods are presented for spatial probability density function data. These are spatial datasets in which each pixel is a random variable with multiple samples resulting from experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are applied to two datasets, and the results are discussed with the help of visualization techniques designed for spatial probability data.
An empirical probability model of detecting species at low densities.
Delaney, David G; Leung, Brian
2010-06-01
False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
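The detection-curve idea, relating sampling intensity and target density to the probability of detection through logistic regression, can be sketched as follows. The coefficients and variable names are purely hypothetical illustrations, not the fitted values from the New England/New York field experiments:

```python
import numpy as np

# Hypothetical logistic detection curve: P(detect) rises with target density and
# sampling effort, so the false-negative rate 1 - P falls. Coefficients are
# illustrative assumptions, not the paper's estimates.
def p_detect(density, effort, b0=-4.0, b_density=2.0, b_effort=0.8):
    z = b0 + b_density * density + b_effort * effort
    return 1.0 / (1.0 + np.exp(-z))

low = p_detect(density=0.5, effort=1.0)    # sparse targets, little effort
high = p_detect(density=2.0, effort=3.0)   # dense targets, more effort
print(low, high)  # detection probability increases with both covariates
```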
Assumed Probability Density Functions for Shallow and Deep Convection
Steven K Krueger; Peter A Bogenschutz; Marat Khairoutdinov
2010-01-01
The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PDF families are based on the double Gaussian form and the remaining two are the single Gaussian and a Double Delta Function (analogous to a mass flux model).
Probability Density Estimation Using Neural Networks in Monte Carlo Calculations
International Nuclear Information System (INIS)
Shim, Hyung Jin; Cho, Jin Young; Song, Jae Seung; Kim, Chang Hyo
2008-01-01
The Monte Carlo neutronics analysis requires the capability to estimate a tally distribution, such as an axial power distribution or a flux gradient in a fuel rod. This problem can be regarded as a probability density function estimation from an observation set. We apply the neural-network-based density estimation method to an observation and sampling weight set produced by Monte Carlo calculations. The neural network method is compared with the histogram and the functional expansion tally methods for estimating a non-smooth density, a fission source distribution, and an absorption rate gradient in a burnable absorber rod. The application results show that the neural network method can approximate a tally distribution quite well. (authors)
Continuation of probability density functions using a generalized Lyapunov approach
Energy Technology Data Exchange (ETDEWEB)
Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)
2017-05-01
Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
A Balanced Approach to Adaptive Probability Density Estimation
Directory of Open Access Journals (Sweden)
Julio A. Kovacs
2017-04-01
Our development of a Fast Mutual Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search, which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.
INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS
Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.
2012-01-01
The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.
Qin, Yong; Ma, Hong; Chen, Jinfeng; Cheng, Li
2015-12-01
Conventional multitarget tracking systems presume that each target can produce at most one measurement per scan. Due to the multiple ionospheric propagation paths in over-the-horizon radar (OTHR), this assumption is not valid. To solve this problem, this paper proposes a novel tracking algorithm based on the theory of finite set statistics (FISST), called the multipath probability hypothesis density (MP-PHD) filter, for cluttered environments. First, the FISST is used to derive the update equation, and then a Gaussian mixture (GM) is introduced to derive the closed-form solution of the MP-PHD filter. Moreover, the extended Kalman filter (EKF) is presented to deal with the nonlinearity of the measurement model in OTHR. Finally, simulation results are provided to demonstrate the effectiveness of the proposed filter.
Divergence from, and Convergence to, Uniformity of Probability Density Quantiles
Directory of Open Access Journals (Sweden)
Robert G. Staudte
2018-04-01
We demonstrate that questions of convergence and divergence regarding shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The pdQs are densities of continuous distributions with a common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we apply the mapping repeatedly and find that further applications of it are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.
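The pdQ construction, composing a density f with its quantile function Q and renormalizing to a density on the unit interval, has a closed form for the exponential distribution, which makes a clean numerical check. Using the exponential as the worked example is this editor's illustrative choice, not necessarily one from the paper:

```python
import numpy as np

# For the exponential distribution, Q(u) = -log(1 - u) and f(Q(u)) = 1 - u,
# so the normalized pdQ on [0, 1) is exactly 2 * (1 - u), free of location and scale.
u = np.linspace(0.0, 1.0, 100_001)[:-1]   # open at u = 1 to avoid Q(1) = inf
Q = -np.log1p(-u)                          # exponential quantile function
fQ = np.exp(-Q)                            # density evaluated along the quantiles
norm = fQ.mean()                           # Riemann estimate of the integral over [0, 1]
pdQ = fQ / norm                            # renormalized to a density on the unit interval

print(np.max(np.abs(pdQ - 2.0 * (1.0 - u))))  # tiny: matches the closed form
```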
Probability density functions for CP-violating rephasing invariants
Fortin, Jean-François; Giasson, Nicolas; Marleau, Luc
2018-05-01
The implications of the anarchy principle on CP violation in the lepton sector are investigated. A systematic method is introduced to compute the probability density functions for the CP-violating rephasing invariants of the PMNS matrix from the Haar measure relevant to the anarchy principle. Contrary to the CKM matrix, which is hierarchical, it is shown that the Haar measure, and hence the anarchy principle, are very likely to lead to the observed PMNS matrix. Predictions on the CP-violating Dirac rephasing invariant |j_D| and Majorana rephasing invariant |j_1| are also obtained. They correspond to ⟨|j_D|⟩_Haar = π/105 ≈ 0.030 and ⟨|j_1|⟩_Haar = 1/(6π) ≈ 0.053 respectively, in agreement with the experimental hint from T2K of |j_D^exp| ≈ 0.032 ± 0.005 (or ≈ 0.033 ± 0.003) for the normal (or inverted) hierarchy.
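The two quoted Haar-measure expectation values are simple closed forms, so their decimal approximations can be verified directly:

```python
import math

# Check the quoted numerical values of the Haar-measure expectations.
jD = math.pi / 105          # <|j_D|>_Haar
j1 = 1.0 / (6.0 * math.pi)  # <|j_1|>_Haar
print(round(jD, 3), round(j1, 3))  # → 0.03 0.053
```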
Structural Reliability Using Probability Density Estimation Methods Within NESSUS
Chamis, Christos C. (Technical Monitor); Godines, Cody Ric
2003-01-01
A reliability analysis studies a mathematical model of a physical system, taking into account uncertainties of design variables; common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and been used to study four different test cases that have been
Interactive design of probability density functions for shape grammars
Dang, Minh
2015-11-02
A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf. First, we propose a user interface that enables a user to quickly provide preference scores for selected shapes, and suggest sampling strategies to decide which models to present to the user for evaluation. Second, we propose a novel kernel function to encode the similarity between two procedural models. Third, we propose a framework to interpolate user preference scores by combining multiple techniques: function factorization, Gaussian process regression, automatic relevance determination, and L1 regularization. Fourth, we modify the original grammars to generate models with a pdf proportional to the user preference scores. Finally, we provide evaluations of our user interface and framework parameters, and a comparison to other exploratory modeling techniques using modeling tasks in five example shape spaces: furniture, low-rise buildings, skyscrapers, airplanes, and vegetation.
Probability Density Function Method for Observing Reconstructed Attractor Structure
Institute of Scientific and Technical Information of China (English)
陆宏伟; 陈亚珠; 卫青
2004-01-01
The probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor in computing the correlation dimensions of RR intervals of ten normal old men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, this is the first time that the PDF method has been put forward for the analysis of the reconstructed attractor structure. Numerical simulations demonstrate that the cardiac systems of healthy old men are about 6-6.5 dimensional complex dynamical systems. It is found that the PDF is not symmetrically distributed when the time delay is small, while the PDF satisfies a Gaussian distribution when the time delay is large enough. A cluster effect mechanism is presented to explain this phenomenon. By studying the shape of the PDFs, it is clearly indicated that the time delay plays a more important role than the embedding dimension in the reconstruction. The results demonstrate that the PDF method represents a promising numerical approach for observing the reconstructed attractor structure and may provide more information and new diagnostic potential about the analyzed cardiac system.
International Nuclear Information System (INIS)
Bakosi, Jozsef; Ristorcelli, Raymond J.
2010-01-01
Probability density function (PDF) methods are extended to variable-density pressure-gradient-driven turbulence. We apply the new method to compute the joint PDF of density and velocity in a non-premixed binary mixture of different-density molecularly mixing fluids under gravity. The full time-evolution of the joint PDF is captured in the highly non-equilibrium flow: starting from a quiescent state, transitioning to fully developed turbulence and finally dissipated by molecular diffusion. High-Atwood-number effects (as distinguished from the Boussinesq case) are accounted for: both hydrodynamic turbulence and material mixing are treated at arbitrary density ratios, with the specific volume, mass flux and all their correlations in closed form. An extension of the generalized Langevin model, originally developed for the Lagrangian fluid particle velocity in constant-density shear-driven turbulence, is constructed for variable-density pressure-gradient-driven flows. The persistent small-scale anisotropy, a fundamentally 'non-Kolmogorovian' feature of flows under external acceleration forces, is captured by a tensorial diffusion term based on the external body force. The material mixing model for the fluid density, an active scalar, is developed based on the beta distribution. The beta-PDF is shown to be capable of capturing the mixing asymmetry and that it can accurately represent the density through transition, in fully developed turbulence and in the decay process. The joint model for hydrodynamics and active material mixing yields a time-accurate evolution of the turbulent kinetic energy and Reynolds stress anisotropy without resorting to gradient diffusion hypotheses, and represents the mixing state by the density PDF itself, eliminating the need for dubious mixing measures. Direct numerical simulations of the homogeneous Rayleigh-Taylor instability are used for model validation.
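The beta-PDF material mixing model mentioned above rests on a standard construction: the two beta shape parameters are moment-matched to a prescribed mean and variance of the mixing scalar. The sketch below shows that matching and checks it by recovering the moments; the specific mean/variance values are illustrative assumptions, not numbers from the paper:

```python
# Moment-match beta shape parameters (a, b) to a given mean and variance of a
# scalar on [0, 1]; the construction requires var < mean * (1 - mean).
def beta_params(mean, var):
    k = mean * (1.0 - mean) / var - 1.0
    return mean * k, (1.0 - mean) * k

a, b = beta_params(0.3, 0.05)

# Consistency check: recover the moments from the shape parameters.
mean = a / (a + b)
var = a * b / ((a + b) ** 2 * (a + b + 1.0))
print(a, b, mean, var)  # the recovered moments equal the inputs
```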
Lei, Youming; Zheng, Fan
2016-12-01
Stochastic chaos induced by diffusion processes, with identical spectral density but different probability density functions (PDFs), is investigated in selected lightly damped Hamiltonian systems. The threshold amplitude of diffusion processes for the onset of chaos is derived by using the stochastic Melnikov method together with a mean-square criterion. Two quasi-Hamiltonian systems, namely, a damped single pendulum and a damped Duffing oscillator perturbed by stochastic excitations, are used as illustrative examples. Four different cases of stochastic processes are taken as the driving excitations. It is shown that in these two systems the spectral density of the diffusion processes completely determines the threshold amplitude for chaos, regardless of the shape of their PDFs, Gaussian or otherwise. Furthermore, the mean top Lyapunov exponent is employed to verify the analytical results. The results obtained by numerical simulations are in accordance with the analytical results. This demonstrates that the stochastic Melnikov method is effective in predicting the onset of chaos in quasi-Hamiltonian systems.
Assumed Probability Density Functions for Shallow and Deep Convection
Directory of Open Access Journals (Sweden)
Steven K Krueger
2010-10-01
Full Text Available The assumed joint probability density function (PDF) of vertical velocity and the conserved temperature and total water scalars has been suggested as a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PDF families are based on the double Gaussian form; the remaining two are the single Gaussian and a Double Delta Function (analogous to a mass flux model). The assumed PDF method is tested for grid sizes from 0.4 km to 204.8 km. In addition, studies are performed for PDF sensitivity to errors in the input moments and for how well the PDFs diagnose some higher-order moments. In general, the double Gaussian PDFs more accurately represent SGS cloud structure and turbulence moments in the boundary layer compared to the single Gaussian and Double Delta Function PDFs for the range of grid sizes tested. This is especially true for small SGS cloud fractions. While the most complex PDF, Lewellen-Yoh, better represents shallow convective cloud properties (cloud fraction and liquid water mixing ratio) compared to the less complex Analytic Double Gaussian 1 PDF, there appears to be no advantage in implementing Lewellen-Yoh for deep convection. However, the Analytic Double Gaussian 1 PDF better represents the liquid water flux, is less sensitive to errors in the input moments, and diagnoses higher-order moments more accurately. Between the Lewellen-Yoh and Analytic Double Gaussian 1 PDFs, it appears that neither family is distinctly better at representing cloudy layers. However, due to the reduced computational cost and fairly robust results, it appears that the Analytic Double Gaussian 1 PDF could be an ideal family for SGS cloud and turbulence parameterization.
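For the single-Gaussian family, the diagnosed SGS cloud fraction reduces to a closed form: the mass of the assumed normal distribution of total water lying above saturation. A hedged sketch of that diagnosis (variable names and the saturation threshold are illustrative, not the paper's notation):

```python
from math import erfc, sqrt

def cloud_fraction_single_gaussian(q_mean, q_sigma, q_sat):
    """Fraction of the assumed N(q_mean, q_sigma^2) PDF of total water
    exceeding saturation: C = P(q > q_sat) = 0.5 * erfc((q_sat - q_mean) / (sigma * sqrt(2)))."""
    return 0.5 * erfc((q_sat - q_mean) / (q_sigma * sqrt(2.0)))
```

When the grid-mean total water sits exactly at saturation the diagnosed cloud fraction is 0.5, and it decays toward zero as the mean dries out relative to the SGS spread, which is the behaviour the assumed-PDF method exploits for partially cloudy grid boxes.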
On the evolution of the density probability density function in strongly self-gravitating systems
International Nuclear Information System (INIS)
Girichidis, Philipp; Konstandin, Lukas; Klessen, Ralf S.; Whitworth, Anthony P.
2014-01-01
The time evolution of the probability density function (PDF) of the mass density is formulated and solved for systems in free-fall using a simple approximate function for the collapse of a sphere. We demonstrate that a pressure-free collapse results in a power-law tail on the high-density side of the PDF. The slope quickly asymptotes to the functional form P_V(ρ) ∝ ρ^(-1.54) for the (volume-weighted) PDF and P_M(ρ) ∝ ρ^(-0.54) for the corresponding mass-weighted distribution. From the simple approximation of the PDF we derive analytic descriptions for mass accretion, finding that dynamically quiet systems with narrow density PDFs lead to retarded star formation and low star formation rates (SFRs). Conversely, strong turbulent motions that broaden the PDF accelerate the collapse, causing a bursting mode of star formation. Finally, we compare our theoretical work with observations. The measured SFRs are consistent with our model during the early phases of the collapse. Comparison of observed column density PDFs with those derived from our model suggests that observed star-forming cores are roughly in free-fall.
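The two quoted exponents differ by exactly one because mass weighting multiplies the volume-weighted PDF by ρ (up to normalisation). A quick check of that bookkeeping on the power-law tail:

```python
import math

def p_v(rho, slope=-1.54):
    # Volume-weighted high-density tail (un-normalised power law)
    return rho ** slope

def p_m(rho, slope=-1.54):
    # Mass weighting multiplies the volume-weighted PDF by rho,
    # shifting the log-log slope by +1
    return rho * p_v(rho, slope)

def loglog_slope(f, r1, r2):
    # Slope of f between r1 and r2 in log-log coordinates
    return math.log(f(r2) / f(r1)) / math.log(r2 / r1)
```

Evaluating both slopes over any decade of density returns -1.54 and -0.54 respectively, matching the pair of exponents in the abstract.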
On the discretization of probability density functions and the ...
Indian Academy of Sciences (India)
Probability density fittings of corrosion test-data: Implications on ...
Indian Academy of Sciences (India)
Influence of nucleon density distribution in nucleon emission probability
International Nuclear Information System (INIS)
Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.
2014-01-01
Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study the total neutron emission in these reactions, to which many of these decay modes may contribute. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that once the populations of different states are determined, the emission probability governs the double differential neutron yield.
What is presumed when we presume consent?
Directory of Open Access Journals (Sweden)
Pierscionek Barbara K
2008-04-01
Full Text Available Background: The organ donor shortfall in the UK has prompted calls to introduce legislation to allow for presumed consent: if there is no explicit objection to donation of an organ, consent should be presumed. The current debate has not taken into account accepted meanings of presumption in law and science, or the consequences for rights of ownership that would arise should presumed consent become law. In addition, arguments revolve around the rights of the competent autonomous adult but do not always consider the more serious implications for children or the disabled. Discussion: Any action or decision made on a presumption is accepted in law and science as one based on judgement of a provisional situation. It should therefore allow the possibility of reversing the action or decision. Presumed consent to organ donation will not permit such reversal. Placing prime importance on the functionality of body organs and their capacity to sustain life, rather than on explicit consent of the individual, will lead to further debate about rights of ownership and potentially to questions about financial incentives and to whom benefits should accrue. Factors that influence donor rates are not fully understood, and attitudes of the public to presumed consent require further investigation. Presuming consent will also necessitate considering how such a measure would be applied in situations involving children and mentally incompetent adults. Summary: The presumption of consent to organ donation cannot be understood in the same way as presumption when applied to science or law. Consideration should be given to the consequences of presuming consent and to the questions of ownership and organ monetary value, as these questions are likely to arise should presumed consent be permitted. In addition, the implications of presumed consent for children and adults who are unable to object to organ donation require serious contemplation if these most vulnerable
Improved Variable Window Kernel Estimates of Probability Densities
Hall, Peter; Hu, Tien Chung; Marron, J. S.
1995-01-01
Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...
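The square-root law discussed here is usually implemented as a two-stage estimator: a fixed-bandwidth pilot fit, then per-sample bandwidths rescaled by the pilot density. A sketch of one common Abramson-style variant (the geometric-mean normalisation and constants are conventional choices, not taken from this paper):

```python
from math import exp, log, pi, sqrt

def gauss(u):
    return exp(-0.5 * u * u) / sqrt(2.0 * pi)

def fixed_kde(x, data, h):
    # Ordinary fixed-bandwidth Gaussian kernel density estimate
    return sum(gauss((x - xi) / h) for xi in data) / (len(data) * h)

def variable_kde(x, data, h0):
    # Stage 1: pilot densities at the sample points (floored to avoid /0)
    pilot = [max(fixed_kde(xi, data, h0), 1e-12) for xi in data]
    # Geometric mean keeps the overall bandwidth scale comparable to h0
    gmean = exp(sum(log(p) for p in pilot) / len(pilot))
    # Stage 2: per-sample bandwidths h_i proportional to f_pilot(x_i)^(-1/2)
    hs = [h0 * sqrt(gmean / p) for p in pilot]
    return sum(gauss((x - xi) / hi) / hi for xi, hi in zip(data, hs)) / len(data)
```

The estimate remains a proper density (each rescaled kernel still integrates to one), while low-density regions automatically receive wider, smoother kernels.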
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.
2013-01-01
Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-mega-watt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
Blue functions: probability and current density propagators in non-relativistic quantum mechanics
International Nuclear Information System (INIS)
Withers, L P Jr
2011-01-01
Like a Green function to propagate a particle's wavefunction in time, a Blue function is introduced to propagate the particle's probability and current density. Accordingly, the complete Blue function has four components. They are constructed from path integrals involving a quantity like the action that we call the motion. The Blue function acts on the displaced probability density as the kernel of an integral operator. As a result, we find that the Wigner density occurs as an expression for physical propagation. We also show that, in quantum mechanics, the displaced current density is conserved bilocally (in two places at one time), as expressed by a generalized continuity equation. (paper)
The force distribution probability function for simple fluids by density functional theory.
Rickayzen, G; Heyes, D M
2013-02-28
Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-A F^2), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft sphere at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
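Normalising the quoted Gaussian form fixes the prefactor and ties A directly to the mean-square force, since sqrt(A/pi) * exp(-A F^2) integrates to one with second moment 1/(2A). A small numerical check (A is treated as a given constant here; computing it from the pair potential is the paper's DFT result, not reproduced):

```python
from math import exp, pi, sqrt

def force_pdf(f, a):
    # Normalised P(F) = sqrt(A/pi) * exp(-A F^2); then <F^2> = 1/(2A)
    return sqrt(a / pi) * exp(-a * f * f)
```

Integrating the density and its second moment on a fine grid confirms both the normalisation and the variance relation.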
Directory of Open Access Journals (Sweden)
Jiang Ge
2017-01-01
Full Text Available System degradation is usually caused by multiple-parameter degradation. Assessment of system reliability by the universal generating function has low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on multi-parameter probability density evolution is therefore presented for complexly degraded systems. First, the system output function is established according to the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.
International Nuclear Information System (INIS)
Robinett, R.W.
2002-01-01
After briefly reviewing the definitions of classical probability densities for position, P_CL(x), and for momentum, P_CL(p), we present several examples of classical mechanical potential systems, mostly variations on such familiar cases as the infinite well and the uniformly accelerated particle, for which the classical distributions can be easily derived and visualized. We focus especially on a simple potential which interpolates between the symmetric linear potential, V(x) = F|x|, and the infinite well, which can illustrate, in a mathematically straightforward way, how the divergent δ-function classical probability density for momentum for the infinite well can be seen to arise. Such examples can help students understand the quantum mechanical momentum-space wavefunction (and its corresponding probability density) in much the same way that other semiclassical techniques, such as the WKB approximation, can be used to visualize position-space wavefunctions. (author)
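For the symmetric linear potential V(x) = F|x| discussed here, the classical position density is proportional to the inverse speed, P_CL(x) ∝ 1/v(x), with v(x) = sqrt(2(E - F|x|)/m) inside the turning points. A numerically normalised sketch (grid resolution and names are illustrative):

```python
from math import sqrt

def classical_pdf_linear(energy, force, mass, n=100_000):
    """Classical position density P_CL(x) ∝ 1/v(x) for V(x) = force*|x|,
    normalised numerically over the allowed region |x| < energy/force."""
    a = energy / force                       # classical turning point
    dx = 2.0 * a / n
    xs = [-a + (i + 0.5) * dx for i in range(n)]
    speed = [sqrt(2.0 * (energy - force * abs(x)) / mass) for x in xs]
    w = [1.0 / v for v in speed]
    norm = sum(w) * dx                       # 1/sqrt singularity is integrable
    return xs, [wi / norm for wi in w], dx
```

Plotting the result shows the density is smallest at the origin, where the particle moves fastest, and diverges (integrably) at the turning points, the classical counterpart of the quantum features the abstract describes.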
Unification of field theory and maximum entropy methods for learning probability densities
Kinney, Justin B.
2014-01-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy de...
Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation
Directory of Open Access Journals (Sweden)
Michal Halas
2012-01-01
Full Text Available This article deals with modelling the probability density function of IPTV traffic packet delay variation. Such a model is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems, and other influences such as the processing delay of the network nodes. Separating these (at least) three types of delay variation requires a way to measure each type separately. This work focuses on the delay variation caused by queueing systems, which has the main influence on the form of the probability density function.
Heisler, Lori; Goffman, Lisa
A word learning paradigm was used to teach children novel words that varied in phonotactic probability and neighborhood density. The effects of frequency and density on speech production were examined when phonetic forms were non-referential (i.e., when no referent was attached) and when phonetic forms were referential (i.e., when a referent was attached through fast mapping). Two methods of analysis were included: (1) kinematic variability of speech movement patterning; and (2) measures of segmental accuracy. Results showed that phonotactic frequency influenced the stability of movement patterning whereas neighborhood density influenced phoneme accuracy. Motor learning was observed in both non-referential and referential novel words. Forms with low phonotactic probability and low neighborhood density showed a word learning effect when a referent was assigned during fast mapping. These results elaborate on and specify the nature of interactivity observed across lexical, phonological, and articulatory domains.
Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function
Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.
2011-01-01
In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.
Probability density of wave function of excited photoelectron: understanding XANES features
Czech Academy of Sciences Publication Activity Database
Šipr, Ondřej
2001-01-01
Roč. 8, - (2001), s. 232-234 ISSN 0909-0495 R&D Projects: GA ČR GA202/99/0404 Institutional research plan: CEZ:A02/98:Z1-010-914 Keywords : XANES * PED - probability density of wave function Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.519, year: 2001
DEFF Research Database (Denmark)
Falk, Anne Katrine Vinther; Gryning, Sven-Erik
1997-01-01
In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials...
Energy Technology Data Exchange (ETDEWEB)
Wampler, William R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Myers, Samuel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Modine, Normand A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-09-01
The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.
DEFF Research Database (Denmark)
Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri
2016-01-01
The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these needs to be relatively low. In order to handle this problem an approach is suggested, which...
Probability density cloud as a geometrical tool to describe statistics of scattered light.
Yaitskova, Natalia
2017-04-01
First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.
Bakosi, J.; Franzese, P.; Boybeyi, Z.
2010-01-01
Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth & Pope with Durbin's method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous ...
Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B.
2006-01-01
In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cov...
Unification of field theory and maximum entropy methods for learning probability densities
Kinney, Justin B.
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
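As a concrete instance of the maximum entropy side of this correspondence: constraining only the first two moments yields a Gaussian, p(x) ∝ exp(l1*x + l2*x^2), and the Lagrange multipliers can be found by gradient ascent on the maxent dual, whose gradient is simply (target moment - model moment). A toy one-dimensional sketch (grid bounds, learning rate, and starting point are ad hoc choices, not from the paper's field-theory machinery):

```python
from math import exp, pi, sqrt

def maxent_fit(m1_target, m2_target, lr=0.2, iters=800, lo=-5.0, hi=5.0, n=500):
    """Fit p(x) ∝ exp(l1*x + l2*x^2) matching the first two moments.
    For targets (0, 1) the optimum is the standard Gaussian, i.e.
    l1 = 0, l2 = -1/2, peak value 1/sqrt(2*pi)."""
    dx = (hi - lo) / n
    xs = [lo + (i + 0.5) * dx for i in range(n)]
    l1, l2 = 0.0, -0.4                  # start from a normalisable density
    for _ in range(iters):
        w = [exp(l1 * x + l2 * x * x) for x in xs]
        z = sum(w) * dx
        p = [wi / z for wi in w]
        m1 = sum(x * pv * dx for x, pv in zip(xs, p))
        m2 = sum(x * x * pv * dx for x, pv in zip(xs, p))
        l1 += lr * (m1_target - m1)     # ascent on the concave dual
        l2 += lr * (m2_target - m2)
    return xs, p, dx, (l1, l2)
```

The dual is concave, so this plain ascent converges; the field-theory view in the abstract recovers exactly such estimates in its infinite-smoothness limit.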
Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon
2013-01-01
Purpose: Phonotactic probability or neighborhood density has predominantly been defined through the use of gross distinctions (i.e., low vs. high). In the current studies, the authors examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The authors examined the full range of…
International Nuclear Information System (INIS)
Zhang Zijing; Song Jie; Zhao Yuan; Wu Long
2017-01-01
Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates with fixed width and count the number of triggered sampling gates, from which the photon counting probability can be obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of the triggered time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method can improve the estimation accuracy of the echo signal intensity due to the acquisition of more detected information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high accuracy intensity image is acquired under low-light level environments. (paper)
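The link between triggered-gate counts and intensity rests on Poisson statistics: a gate triggers when at least one photon arrives, so P(trigger) = 1 - exp(-lam) and the trigger fraction determines the mean photon number. A minimal sketch of that basic inversion only (the paper's MVUE estimator additionally exploits the triggered time positions, which is not reproduced here):

```python
from math import log

def estimate_intensity(triggered, total_gates):
    """Invert Poisson trigger statistics: lam_hat = -ln(1 - k/N),
    where k of N fixed-width sampling gates fired."""
    p = triggered / total_gates
    if p >= 1.0:
        raise ValueError("all gates triggered: intensity not identifiable")
    return -log(1.0 - p)
```

Note the saturation built into the model: as the trigger fraction approaches one, the estimate diverges, which is why plain gate counting loses accuracy at high intensities and extra information (such as trigger times) helps.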
Exact joint density-current probability function for the asymmetric exclusion process.
Depken, Martin; Stinchcombe, Robin
2004-07-23
We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
International Nuclear Information System (INIS)
Li Qianshu; Lue Liqiang; Wei Gongmin
2004-01-01
This paper discusses the relationship between the Wigner function, along with other related quasiprobability distribution functions, and the probability density distribution function constructed from the wave function of the Schroedinger equation in quantum phase space, as formulated by Torres-Vega and Frederick (TF). At the same time, a general approach in solving the wave function of the Schroedinger equation of TF quantum phase space theory is proposed. The relationship of the wave functions between the TF quantum phase space representation and the coordinate or momentum representation is thus revealed
Observability of the probability current density using spin rotator as a quantum clock
International Nuclear Information System (INIS)
Home, D.; Alok Kumar Pan; Md Manirul Ali
2005-01-01
Full text: An experimentally realizable scheme is formulated which can test any quantum mechanical approach for calculating the arrival time distribution. This is specifically illustrated by using the modulus of the probability current density for calculating the arrival time distribution of spin-1/2 neutral particles at the exit point of a spin rotator (SR) which contains a constant magnetic field. Such a calculated time distribution is then used for evaluating the distribution of spin orientations along different directions for these particles emerging from the SR. Based on this, the result of spin measurement along any arbitrary direction for such an ensemble is predicted. (author)
Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon
2013-01-01
Purpose: Phonotactic probability or neighborhood density has predominantly been defined using gross distinctions (i.e., low vs. high). The current studies examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The full range of probability or density was examined by sampling five nonwords from each of four quartiles. Three- and 5-year-old children received training on nonword-nonobject pairs. Learning was measured in a picture-naming task immediately following training and 1 week after training. Results were analyzed using multi-level modeling. Results: A linear spline model best captured nonlinearities in phonotactic probability. Specifically, word learning improved as probability increased in the lowest quartile, worsened as probability increased in the mid-low quartile, and then remained stable and poor in the two highest quartiles. An ordinary linear model sufficiently described neighborhood density. Here, word learning improved as density increased across all quartiles. Conclusion: Given these different patterns, phonotactic probability and neighborhood density appear to influence different word learning processes. Specifically, phonotactic probability may affect recognition that a sound sequence is an acceptable word in the language and is a novel word for the child, whereas neighborhood density may influence creation of a new representation in long-term memory. PMID:23882005
International Nuclear Information System (INIS)
Moriya, Netzer
2010-01-01
A method, based on binomial filtering, to estimate the noise level of an arbitrary, smoothed pure signal, contaminated with an additive, uncorrelated noise component is presented. If the noise characteristics of the experimental spectrum are known, as for instance the type of the corresponding probability density function (e.g., Gaussian), the noise properties can be extracted. In such cases, both the noise level, as may arbitrarily be defined, and a simulated white noise component can be generated, such that the simulated noise component is statistically indistinguishable from the true noise component present in the original signal. In this paper we present a detailed analysis of the noise level extraction when the additive noise is Gaussian or Lorentzian. We show that the statistical parameters in these cases (mainly the variance and the half width at half maximum, respectively) can directly be obtained from the experimental spectrum even when the pure signal is erratic. Further discussion is given for cases where the noise probability density function is initially unknown.
Audio Query by Example Using Similarity Measures between Probability Density Functions of Features
Directory of Open Access Journals (Sweden)
Marko Helén
2010-01-01
Full Text Available This paper proposes a query-by-example system for generic audio. We estimate the similarity of the example signal and the samples in the queried database by calculating the distance between the probability density functions (pdfs) of their frame-wise acoustic features. Since the features are continuous valued, we propose to model them using Gaussian mixture models (GMMs) or hidden Markov models (HMMs). The models parametrize each sample efficiently and retain sufficient information for similarity measurement. To measure the distance between the models, we apply a novel Euclidean distance, approximations of the Kullback-Leibler divergence, and a cross-likelihood ratio test. The performance of the measures was tested in simulations where audio samples are automatically retrieved from a general audio database, based on the estimated similarity to a user-provided example. The simulations show that the distance between probability density functions is an accurate measure for similarity. Measures based on GMMs or HMMs are shown to produce better results than the existing methods based on simpler statistics or histograms of the features. A good performance with low computational cost is obtained with the proposed Euclidean distance.
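A Euclidean (L2) distance between GMM-modelled pdfs can be computed in closed form, because products of Gaussians integrate analytically. A hedged one-dimensional sketch of that idea (the paper works with multivariate feature pdfs and its exact formulation may differ):

```python
from math import exp, pi, sqrt

def gauss_overlap(m1, v1, m2, v2):
    # Closed form of the integral of N(x; m1, v1) * N(x; m2, v2) over x
    v = v1 + v2
    return exp(-(m1 - m2) ** 2 / (2.0 * v)) / sqrt(2.0 * pi * v)

def gmm_l2(gmm_a, gmm_b):
    """Squared L2 distance between two 1-D GMMs, each given as a list of
    (weight, mean, variance) tuples: ||p - q||^2 = <p,p> + <q,q> - 2<p,q>."""
    def cross(p, q):
        return sum(wp * wq * gauss_overlap(mp, vp, mq, vq)
                   for wp, mp, vp in p for wq, mq, vq in q)
    return cross(gmm_a, gmm_a) + cross(gmm_b, gmm_b) - 2.0 * cross(gmm_a, gmm_b)
```

No sampling or numerical integration is needed, which is why such a distance is cheap compared with Kullback-Leibler approximations or cross-likelihood tests.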
Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.
2018-05-01
As renewable energies are increasingly integrated into power systems, there is growing interest in the stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by the penetration of renewables and consequently to analyse its impacts on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) solution for the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. To solve this equation, a numerical method is adopted. A special measure is taken such that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered in a single-machine infinite-bus power system. The numerical analysis gives the same result as the Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
Protein single-model quality assessment by feature-based probability density functions.
Cao, Renzhi; Cheng, Jianlin
2016-04-04
Protein quality assessment (QA) plays an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error of each protein feature value against the true quality scores (i.e., GDT-TS scores) of protein structural models, and uses these errors to estimate a probability density distribution for quality assessment. Qprob was blindly tested in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP results show that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which was officially ranked 3rd out of 143 predictors. Its strong performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability-density-based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The Qprob web server and software are freely available at: http://calla.rnet.missouri.edu/qprob/.
Directory of Open Access Journals (Sweden)
Han Liwei
2014-07-01
Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model. So the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. Then an improved algorithm of cloud probability distribution density based on a backward cloud generator was proposed. This was used to effectively convert parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such qualitative description was expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved the proposed algorithm feasible: it revealed the changing regularity of the piezometric tube's water level and made it possible to detect seepage damage in the dam body.
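The backward cloud generator at the heart of such an algorithm recovers the numerical characteristics {Ex, En, He} from the data. A minimal sketch using the standard moment-based estimators follows; the parameters are made up rather than taken from piezometric records.

```python
import numpy as np

def backward_cloud(x):
    """Estimate cloud characteristics {Ex, En, He} from cloud drops."""
    Ex = x.mean()
    # First absolute central moment of a normal is sigma * sqrt(2/pi),
    # so this estimator targets the entropy En.
    En = np.sqrt(np.pi / 2.0) * np.abs(x - Ex).mean()
    # Hyper-entropy He is the excess spread beyond En.
    He = np.sqrt(max(x.var(ddof=1) - En ** 2, 0.0))
    return Ex, En, He

# Forward cloud generator: each drop uses its own entropy En_i ~ N(En, He^2).
rng = np.random.default_rng(1)
Ex_t, En_t, He_t = 5.0, 1.0, 0.1
En_i = rng.normal(En_t, He_t, 20000)
drops = rng.normal(Ex_t, np.abs(En_i))

Ex, En, He = backward_cloud(drops)
print(round(Ex, 2), round(En, 2))  # ≈ 5.0, 1.0
```

Generating drops with the forward generator and recovering {Ex, En, He} with the backward one is the usual sanity check for this family of algorithms.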
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot be measured directly, it cannot be predicted. However, changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
Riggs, Peter J.
2013-01-01
Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…
Probability density functions of photochemicals over a coastal area of Northern Italy
International Nuclear Information System (INIS)
Georgiadis, T.; Fortezza, F.; Alberti, L.; Strocchi, V.; Marani, A.; Dal Bo', G.
1998-01-01
The present paper surveys the findings of experimental studies and analyses of statistical probability density functions (PDFs) applied to air pollutant concentrations to provide an interpretation of the ground-level distributions of photochemical oxidants in the coastal area of Ravenna (Italy). The atmospheric-pollution data set was collected from the local environmental monitoring network for the period 1978-1989. Results suggest that the statistical distribution of surface ozone, once normalised over the solar radiation PDF for the whole measurement period, follows a log-normal law as found for other pollutants. Although the Weibull distribution also offers a good fit of the experimental data, the area's meteorological features seem to favour the former distribution once the statistical index estimates have been analysed. Local transport phenomena are discussed to explain the data tail trends
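Comparing a log-normal against a Weibull fit by maximum likelihood, as done for the normalised ozone data, can be sketched with SciPy. The synthetic concentrations below stand in for the Ravenna measurements, and the log-normal parameters are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
conc = rng.lognormal(mean=3.5, sigma=0.5, size=2000)  # synthetic concentrations

# Fit both candidate families with the location pinned at zero,
# since concentrations are non-negative.
ln_fit = stats.lognorm.fit(conc, floc=0)
wb_fit = stats.weibull_min.fit(conc, floc=0)

# Compare by total log-likelihood (higher is better).
ln_ll = stats.lognorm.logpdf(conc, *ln_fit).sum()
wb_ll = stats.weibull_min.logpdf(conc, *wb_fit).sum()
print(ln_ll > wb_ll)  # True: the log-normal fits this data better
```

In practice one would also inspect goodness-of-fit indices, as the abstract notes, since the Weibull can fit such data almost as well and the decision may hinge on meteorological considerations rather than likelihood alone.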
General Exact Solution to the Problem of the Probability Density for Sums of Random Variables
Tribelsky, Michael I.
2002-07-01
The exact explicit expression for the probability density pN(x) of a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). A quantitative description of all three parts, as well as of the entire profile, is obtained. A number of particular examples are considered in detail.
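For independent, identically distributed summands, pN(x) is the N-fold convolution of the single-summand density, which is easy to verify numerically. The sketch below uses uniform summands, whose sum follows the Irwin-Hall density.

```python
import numpy as np

def sum_pdf(p, dx, N):
    """Density of a sum of N i.i.d. variables via repeated convolution."""
    out = p
    for _ in range(N - 1):
        out = np.convolve(out, p) * dx  # discrete approximation of the integral
    return out

dx = 0.01
x = np.arange(0.0, 1.0 + dx, dx)
p1 = np.ones_like(x)            # uniform density on [0, 1]
p3 = sum_pdf(p1, dx, 3)         # Irwin-Hall density for N = 3, support [0, 3]
x3 = np.arange(p3.size) * dx
print(x3[np.argmax(p3)])        # mode sits at the centre, x = 1.5
```

As N grows this construction also exhibits the core/tail structure the paper analyses: the core approaches a Gaussian by the central limit theorem while the tail remains controlled by the single-summand distribution.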
Directory of Open Access Journals (Sweden)
Feihu Zhang
2014-01-01
Full Text Available This paper studies the problem of multiple-vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of PHD filtering. Compared to other methods, the concept of multiple-vehicle cooperative localization with spatial registration is proposed here for the first time under Random Finite Set theory. In addition, the proposed solution also addresses the challenges of multiple-vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation results demonstrate its reliability and feasibility in large-scale environments.
Directory of Open Access Journals (Sweden)
Osmar Abílio de Carvalho Júnior
2014-04-01
Full Text Available Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR), causing the usual noise-like granular aspect and complicating image classification. In SAR image analysis, spatial information can be a particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities from a complex and heterogeneous spectral response. This paper proposes Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel SAR images. The method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), and contains forest and land-use patterns. The proposed algorithm moves a window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multiple components. Initially the multi-component data should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPC). Both methods enable noise reduction as well as the ordering of multi-component data in terms of image quality. In this paper, the NAPC transformation applied to the multi-components provided large reductions in noise levels, and color composites considering the first NAPCs enhanced the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.
Development and evaluation of probability density functions for a set of human exposure factors
Energy Technology Data Exchange (ETDEWEB)
Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.
1999-06-01
The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.
International Nuclear Information System (INIS)
Watterson, Ian G.
2007-01-01
Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be obtained simply. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections.
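The direct-integration step, which combines a global-warming PDF with a local change-per-degree PDF into a CDF of net change, can be sketched numerically. The beta and normal parameters below are illustrative assumptions, not the fitted CMIP3 values.

```python
import numpy as np
from scipy import stats

# Illustrative PDFs: global warming G on [1, 4] degC (a scaled beta),
# and local scaling S, the local change per degree of global warming.
g = np.linspace(1.0, 4.0, 400)
s = np.linspace(0.0, 2.4, 400)
fg = stats.beta.pdf((g - 1.0) / 3.0, 2.0, 3.0) / 3.0
fs = stats.norm.pdf(s, loc=1.2, scale=0.3)

dg, ds = g[1] - g[0], s[1] - s[0]
W = np.outer(fs, fg) * dg * ds      # joint probability weights (independence)
G, S = np.meshgrid(g, s)            # G varies along columns, S along rows
GS = G * S                          # net local change on the joint grid

def cdf(z):
    """P(G * S <= z) by direct integration over the joint space."""
    return W[GS <= z].sum()

z = np.linspace(0.0, 10.0, 501)
c = np.array([cdf(v) for v in z])
median = z[np.searchsorted(c, 0.5)]
p10, p90 = z[np.searchsorted(c, 0.1)], z[np.searchsorted(c, 0.9)]
print(median, (p10, p90))           # best estimate and likely range
```

The same grid immediately yields exceedance probabilities, e.g. 1 - cdf(threshold), which is the quantity the abstract proposes extracting from the CDF.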
Robust functional statistics applied to Probability Density Function shape screening of sEMG data.
Boudaoud, S; Rix, H; Al Harrach, M; Marin, F
2014-01-01
Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographic (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, hindering precise real-time sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small-sample-size effects and more accurate in sEMG PDF shape screening applications.
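A toy Monte Carlo illustration of the underlying point follows; the synthetic log-normal samples stand in for sEMG amplitudes, and the kernel-density shape distance is a simplified stand-in for the CSM statistic, not the paper's method. Sample kurtosis scatters widely at small n, while a KDE-based shape distance still separates normal from log-normal shapes on average.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 50                                     # small sample, as in short sEMG epochs

# 1) Sample kurtosis scatters widely at n = 50.
kurts = [stats.kurtosis(rng.lognormal(0.0, 0.5, n)) for _ in range(1000)]
print(np.std(kurts))                       # large spread: unreliable HOS estimate

# 2) A KDE-based L2 shape distance to the standard normal density.
grid = np.linspace(-5.0, 5.0, 501)
def shape_dist(x):
    z = (x - x.mean()) / x.std(ddof=1)     # normalise out location and scale
    kde = stats.gaussian_kde(z)
    return np.sqrt(np.trapz((kde(grid) - stats.norm.pdf(grid)) ** 2, grid))

d_norm = [shape_dist(rng.normal(0.0, 1.0, n)) for _ in range(200)]
d_logn = [shape_dist(rng.lognormal(0.0, 0.5, n)) for _ in range(200)]
print(np.mean(d_logn) > np.mean(d_norm))   # shape distance flags the skewed PDF
```

The kernel smoothing regularises the small-sample estimate, which is the intuition behind combining density estimation with shape distances rather than relying on raw moments.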
Exact probability function for bulk density and current in the asymmetric exclusion process
Depken, Martin; Stinchcombe, Robin
2005-03-01
We examine the asymmetric simple exclusion process with open boundaries, a paradigm of driven diffusive systems, having a nonequilibrium steady-state transition. We provide a full derivation and expanded discussion and digression on results previously reported briefly in M. Depken and R. Stinchcombe, Phys. Rev. Lett. 93, 040602 (2004). In particular we derive an exact form for the joint probability function for the bulk density and current, both for finite systems, and also in the thermodynamic limit. The resulting distribution is non-Gaussian, and while the fluctuations in the current are continuous at the continuous phase transitions, the density fluctuations are discontinuous. The derivations are done by using the standard operator algebraic techniques and by introducing a modified version of the original operator algebra. As a by-product of these considerations we also arrive at a very simple way of calculating the normalization constant appearing in the standard treatment with the operator algebra. Like the partition function in equilibrium systems, this normalization constant is shown to completely characterize the fluctuations, albeit in a very different manner.
Tygert, Mark
2010-09-21
We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
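The "plain fact" these tests build on can be turned into a minimal Monte Carlo test: compare the smallest specified-density value among the draws with its null distribution under the specified density. This is a simplified sketch, not the paper's calibrated statistics.

```python
import numpy as np
from scipy import stats

def min_density_test(draws, pdf, sampler, n_sim=1000, rng=None):
    """p-value for the statistic min_i pdf(x_i), calibrated by simulation."""
    rng = rng or np.random.default_rng()
    t_obs = pdf(draws).min()
    # Null distribution of the statistic: draws genuinely taken from pdf.
    null = np.array([pdf(sampler(draws.size, rng)).min() for _ in range(n_sim)])
    return (null <= t_obs).mean()

rng = np.random.default_rng(4)
pdf = stats.norm.pdf
sampler = lambda n, r: r.normal(size=n)

good = rng.normal(size=100)                # really drawn from N(0, 1)
bad = rng.normal(size=100)
bad[0] = 6.0                               # one draw where the density is tiny
p_good = min_density_test(good, pdf, sampler, rng=rng)
p_bad = min_density_test(bad, pdf, sampler, rng=rng)
print(p_good, p_bad)                       # p_bad should be essentially zero
```

A Kolmogorov-Smirnov test can miss the single low-density outlier because the CDF smooths over it, whereas the density-based statistic reacts immediately, which is exactly the complementarity the abstract describes.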
Beghein, Caroline; Trampert, Jeannot
2004-01-01
The presence of radial anisotropy in the upper mantle, transition zone and top of the lower mantle is investigated by applying a model space search technique to Rayleigh and Love wave phase velocity models. Probability density functions are obtained independently for S-wave anisotropy, P-wave anisotropy, intermediate parameter η, Vp, Vs and density anomalies. The likelihoods for P-wave and S-wave anisotropy beneath continents cannot be explained by a dry olivine-rich upper mantle at depths larger than 220 km. Indeed, while shear-wave anisotropy tends to disappear below 220 km depth in continental areas, P-wave anisotropy is still present but its sign changes compared to the uppermost mantle. This could be due to an increase with depth of the amount of pyroxene relative to olivine in these regions, although the presence of water, partial melt or a change in the deformation mechanism cannot be ruled out as yet. A similar observation is made for old oceans, but not for young ones where VSH > VSV appears likely down to 670 km depth and VPH > VPV down to 400 km depth. The change of sign in P-wave anisotropy seems to be qualitatively correlated with the presence of the Lehmann discontinuity, generally observed beneath continents and some oceans but not beneath ridges. Parameter η shows a similar age-related depth pattern as shear-wave anisotropy in the uppermost mantle and it undergoes the same change of sign as P-wave anisotropy at 220 km depth. The ratio between dln Vs and dln Vp suggests that a chemical component is needed to explain the anomalies in most places at depths greater than 220 km. More tests are needed to infer the robustness of the results for density, but they do not affect the results for anisotropy.
Using probability density function in the procedure for recognition of the type of physical exercise
Directory of Open Access Journals (Sweden)
Cakić Nikola
2017-01-01
Full Text Available This paper presents a method for the recognition of physical exercises using only the triaxial accelerometer of a smartphone. The smartphone itself is free to move inside the subject's pocket. Three exercises for leg muscle strengthening, performed from the subject's standing position, were analyzed: squat, right knee rise, and lunge with the right leg. All exercises were performed with the accelerometric sensor of a smartphone placed in the pocket next to the leg used for the exercises. In order to test the proposed recognition method, the knee rise exercise of the opposite leg, with the same position of the sensor, was randomly selected. Filtering of the raw accelerometric signals was carried out using a tenth-order Butterworth low-pass filter. The filtered signals from each of the three axes were described using three signal descriptors. After the descriptors were calculated, a probability density function was constructed for each of the descriptors. The program that implemented the proposed recognition method was executed online within an Android application on the smartphone. Signals from two male and two female subjects were used as a reference for exercise recognition. The exercise recognition accuracy was 94.22% for the three performed exercises, and 85.33% for all four considered exercises.
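The processing chain for one axis (tenth-order Butterworth low-pass, per-repetition descriptors, per-descriptor probability density) can be sketched as below. The sampling rate, cutoff frequency, descriptor choice and the synthetic "exercise" signal are all assumptions, since the paper does not fix them here.

```python
import numpy as np
from scipy import signal, stats

fs = 50.0                                   # assumed accelerometer rate (Hz)
b, a = signal.butter(10, 5.0 / (fs / 2.0), btype='low')  # 10th order, 5 Hz cutoff

rng = np.random.default_rng(5)
t = np.arange(0.0, 10.0, 1.0 / fs)

def one_repetition():
    """Synthetic single-axis acceleration for one exercise execution."""
    raw = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)
    return signal.filtfilt(b, a, raw)       # zero-phase low-pass filtering

def descriptors(x):
    """Three simple signal descriptors: mean, std, RMS."""
    return np.array([x.mean(), x.std(), np.sqrt(np.mean(x ** 2))])

# Probability density of one descriptor across repeated executions.
stds = np.array([descriptors(one_repetition())[1] for _ in range(30)])
pdf = stats.gaussian_kde(stds)
print(pdf(stds.mean())[0])                  # density near the typical value
```

At recognition time, a new repetition's descriptors would be scored against the per-exercise densities and assigned to the exercise with the highest joint density.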
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine rigorously whether there is a need for data trimming and, if so, at which points it should be done.
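The ex-Gaussian (the convolution of a normal and an exponential) is available in SciPy as `exponnorm` under the parameterisation K = τ/σ, loc = μ, scale = σ, so a maximum-likelihood fit can be sketched without ExGUtils itself. The reaction-time parameters below are made-up values in milliseconds.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
mu, sigma, tau = 300.0, 40.0, 100.0        # assumed "true" RT parameters (ms)
# Ex-Gaussian samples: normal component plus an independent exponential tail.
rts = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

K, loc, scale = stats.exponnorm.fit(rts)   # maximum-likelihood fit
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
print(mu_hat, sigma_hat, tau_hat)          # should recover roughly 300, 40, 100
```

The skewed exponential tail is what lets the ex-Gaussian absorb the long right tail of empirical reaction-time data that a plain Gaussian cannot fit.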
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
On the method of logarithmic cumulants for parametric probability density function estimation.
Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane
2013-10-01
Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
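For the gamma family the MoLC equations take a closed form: the first two log-cumulants satisfy c1 = ψ(k) + ln θ and c2 = ψ'(k), where ψ is the digamma function. A minimal sketch (synthetic gamma data, not SAR amplitudes) inverts the second equation numerically:

```python
import numpy as np
from scipy import special, optimize

def molc_gamma(x):
    """Method-of-log-cumulants estimates (shape k, scale theta) for a gamma sample."""
    logx = np.log(x)
    c1, c2 = logx.mean(), logx.var(ddof=1)   # sample log-cumulants
    # Second log-cumulant of Gamma(k, theta) is psi'(k); invert numerically.
    k = optimize.brentq(lambda a: special.polygamma(1, a) - c2, 1e-3, 1e3)
    theta = np.exp(c1 - special.digamma(k))  # from c1 = psi(k) + ln(theta)
    return k, theta

rng = np.random.default_rng(7)
x = rng.gamma(shape=3.0, scale=2.0, size=5000)
k_hat, theta_hat = molc_gamma(x)
print(k_hat, theta_hat)   # ≈ 3, 2
```

Because ψ'(k) is strictly decreasing, the root is unique whenever it exists, which is one reason MoLC is computationally fast compared with iterative ML for such families.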
Power probability density function control and performance assessment of a nuclear research reactor
International Nuclear Information System (INIS)
Abharian, Amir Esmaeili; Fadaei, Amir Hosein
2014-01-01
Highlights: • In this paper, the performance assessment of the static PDF control system is discussed. • The reactor PDF model is set up based on B-spline functions. • The neutronic and thermal-hydraulic equations are solved concurrently by a reformed Hansen's method. • A principle of performance assessment is put forward for the PDF control of the nuclear reactor. - Abstract: One of the main issues in controlling a system is to keep track of the condition of the system function. The performance of the system should be inspected continuously to keep it in reliable working condition. In this study, the nuclear reactor is considered as a complicated system, and a principle of performance assessment is used to analyze the performance of the power probability density function (PDF) control of a nuclear research reactor. First, the model of the power PDF is set up; then the controller is designed to make the power PDF trace a given shape, which makes the reactor a closed-loop system. The operating data of the closed-loop reactor are used to assess the control performance against the performance assessment criteria. The modeling, controller design and performance assessment of the power PDF are all applied to the control of the Tehran Research Reactor (TRR) power in a nuclear process. In this paper, the performance assessment of the static PDF control system is discussed, the efficacy and efficiency of the proposed method are investigated, and finally its reliability is proven
Probability density function of a puff dispersing from the wall of a turbulent channel
Nguyen, Quoc; Papavassiliou, Dimitrios
2015-11-01
Study of the dispersion of passive contaminants in turbulence has proved helpful in understanding fundamental heat and mass transfer phenomena. Many simulation and experimental works have been carried out to locate and track the motion of scalar markers in a flow. One method is to combine Direct Numerical Simulation (DNS) and Lagrangian Scalar Tracking (LST) to record the locations of markers. While this has proved useful, the high computational cost remains a concern. In this study, we develop a model that can reproduce the results obtained by DNS and LST for turbulent flow. Puffs of markers with different Schmidt numbers were released into a flow field at a friction Reynolds number of 150. The point of release was at the channel wall, so that both diffusion and convection contribute to the puff dispersion pattern, defining different stages of dispersion. Based on outputs from DNS and LST, we seek the most suitable and feasible probability density function (PDF) that represents the distribution of markers in the flow field. Such a PDF would play a significant role in predicting heat and mass transfer in wall turbulence, and would prove helpful where DNS and LST are not always available.
Representation of Probability Density Functions from Orbit Determination using the Particle Filter
Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell
2012-01-01
Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher-order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy depends on the number of particles or samples used. For this method to be applicable to real-case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining the higher-order statistical information obtained using the PF. Methods such as Principal Component Analysis (PCA) utilize only up to second-order statistics and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios, a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
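A toy sketch of the PF's PDF representation for a scalar state: importance weighting against one measurement followed by systematic resampling. All numbers are illustrative; the orbit-determination state is of course multivariate.

```python
import numpy as np

rng = np.random.default_rng(0)

def systematic_resample(particles, weights, rng):
    """Systematic resampling: draw len(weights) particles ~ weights."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[np.minimum(idx, n - 1)]

# Toy scalar state: prior samples, one measurement, importance weights.
particles = rng.normal(0.0, 2.0, size=5000)          # prior N(0, 2^2)
measurement, noise_std = 1.0, 0.5                    # likelihood N(x, 0.5^2)
log_lik = -0.5 * ((measurement - particles) / noise_std) ** 2
weights = np.exp(log_lik - log_lik.max())            # stable exponentiation
weights /= weights.sum()

posterior = systematic_resample(particles, weights, rng)
# Conjugate check: the exact posterior is N(0.941, 0.485^2) for these numbers.
```

The resampled cloud is the PF's nonparametric PDF representation; compression schemes such as ICA would then act on exactly this kind of sample set.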
Bakosi, J.; Franzese, P.; Boybeyi, Z.
2007-11-01
Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth and Pope [Phys. Fluids 29, 387 (1986)] with Durbin's [J. Fluid Mech. 249, 465 (1993)] method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous transport with a nonlocal representation of the near-wall Reynolds stress anisotropy. The presence of walls is incorporated through the imposition of no-slip and impermeability conditions on particles without the use of damping or wall-functions. Information on the turbulent time scale is supplied by the gamma-distribution model of van Slooten et al. [Phys. Fluids 10, 246 (1998)]. Two different micromixing models are compared that incorporate the effect of small scale mixing on the transported scalar: the widely used interaction by exchange with the mean and the interaction by exchange with the conditional mean model. Single-point velocity and concentration statistics are compared to direct numerical simulation and experimental data at Reτ=1080 based on the friction velocity and the channel half width. The joint model accurately reproduces a wide variety of conditional and unconditional statistics in both physical and composition space.
A joint probability density function of wind speed and direction for wind energy analysis
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Bueno, Celia
2008-01-01
A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a Normal-Weibull mixture distribution, singly truncated from below. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R². The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions; (b) takes into account the frequency of null winds; (c) represents the wind direction regimes in zones with several modes or prevailing wind directions; (d) takes into account the correlation between wind speed and wind direction. It can therefore be used in several tasks involved in the evaluation of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy
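The two marginals can be sketched with scipy.stats as below; all mixture parameters are invented for illustration, and the angular-linear coupling between the marginals (the key feature of the proposed joint model) is omitted for brevity.

```python
import numpy as np
from scipy import stats

def speed_pdf(v, w=0.3, mu=2.0, sig=1.0, c=2.0, scale=7.0):
    """Below-truncated normal + Weibull mixture for wind speed (m/s)."""
    a = (0.0 - mu) / sig                       # truncation point v = 0
    tn = stats.truncnorm(a, np.inf, loc=mu, scale=sig)
    wb = stats.weibull_min(c, scale=scale)
    return w * tn.pdf(v) + (1.0 - w) * wb.pdf(v)

def direction_pdf(theta, comps=((0.6, 0.5, 4.0), (0.4, 3.0, 2.0))):
    """Finite von Mises mixture for wind direction (radians)."""
    return sum(p * stats.vonmises(kappa, loc=mu).pdf(theta)
               for p, mu, kappa in comps)

v = np.linspace(0.0, 40.0, 2001)
theta = np.linspace(-np.pi, np.pi, 2001)
fv, ft = speed_pdf(v), direction_pdf(theta)   # both integrate to one
```

In the full model these marginals are tied together through an angular-linear construction rather than multiplied as independent densities.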
Su, Nan-Yao; Lee, Sang-Hee
2008-04-01
Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from the release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal-mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: an exponential decline phase, a linear decline phase, an equilibrium phase, and a postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
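A schematic of the estimation step, with invented capture-probability data; the relation N ≈ m/P(e) below is a simplified stand-in for the authors' projection, not their exact estimator.

```python
import numpy as np

# Hypothetical equilibrium-phase data: capture probability vs. distance (m).
distance = np.array([22.0, 26.0, 30.0, 34.0, 38.0])
p_capture = np.array([0.0210, 0.0205, 0.0202, 0.0198, 0.0195])

# P(e) taken as the intercept of the linear fit over the equilibrium phase,
# as described in the abstract above.
slope, intercept = np.polyfit(distance, p_capture, 1)
p_e = float(intercept)

m_marked = 20          # hypothetical marked recaptures per census
n_est = m_marked / p_e  # simplified population projection (illustrative)
```

The point of the sketch is only the regression step: once P(e) is pinned down as the intercept, any mark-recapture estimator that consumes a capture probability can be driven by it.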
Liu, Z.; Kar, J.; Zeng, S.; Tackett, J. L.; Vaughan, M.; Trepte, C. R.; Omar, A. H.; Hu, Y.; Winker, D. M.
2017-12-01
In the CALIPSO retrieval algorithm, detection of layers in the lidar measurements is followed by their classification as "cloud" or "aerosol" using 5-dimensional probability density functions (PDFs). The five dimensions are the mean attenuated backscatter at 532 nm, the layer-integrated total attenuated color ratio, the mid-layer altitude, the integrated volume depolarization ratio and the latitude. The new version 4 (V4) level 2 (L2) data products, released in November 2016, are the first major revision to the L2 product suite since May 2010. Significant calibration changes in the V4 level 1 data necessitated substantial revisions to the V4 L2 cloud-aerosol discrimination (CAD) algorithm. Accordingly, a new set of PDFs was generated to derive the V4 L2 data products. The V4 CAD algorithm is now applied to layers detected in the stratosphere, where volcanic layers and occasional cloud and smoke layers are observed. Previously, these layers were designated as 'stratospheric' and not further classified. The V4 CAD algorithm is also applied to all layers detected at single-shot (333 m) resolution. In prior data releases, single-shot detections were uniformly classified as clouds. The CAD PDFs used in the earlier releases were generated using a full year (2008) of CALIPSO measurements. Because the CAD algorithm was not applied to stratospheric features, the properties of these layers were not incorporated into the PDFs. When building the V4 PDFs, the 2008 data were augmented with additional data from June 2011, and all stratospheric features were included. The Nabro and Puyehue-Cordon volcanoes erupted in June 2011, and volcanic aerosol layers were observed in the upper troposphere and lower stratosphere in both the northern and southern hemispheres. The June 2011 data thus provide the stratospheric aerosol properties needed for comprehensive PDF generation. In contrast to earlier versions of the PDFs, which were generated based solely on observed distributions, construction of the V4 PDFs considered the
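The CAD confidence score is commonly reported as 100·(P_cloud − P_aerosol)/(P_cloud + P_aerosol). A one-dimensional toy version is sketched below; the operational algorithm evaluates 5-D PDFs, and the Gaussians here are placeholders, not CALIPSO distributions.

```python
import numpy as np

def gaussian(mu, sig):
    """Placeholder 1-D PDF standing in for a CAD lookup table."""
    return lambda x: (np.exp(-0.5 * ((x - mu) / sig) ** 2)
                      / (sig * np.sqrt(2.0 * np.pi)))

def cad_score(x, pdf_cloud, pdf_aerosol):
    """Confidence score in [-100, 100]: +100 cloud-like, -100 aerosol-like."""
    pc, pa = pdf_cloud(x), pdf_aerosol(x)
    return 100.0 * (pc - pa) / (pc + pa)

# Invented 1-D PDFs over layer-mean attenuated backscatter (km^-1 sr^-1)
pdf_cloud = gaussian(mu=0.05, sig=0.02)
pdf_aerosol = gaussian(mu=0.005, sig=0.004)

strong_layer = cad_score(0.06, pdf_cloud, pdf_aerosol)   # clearly cloud-like
weak_layer = cad_score(0.004, pdf_cloud, pdf_aerosol)    # clearly aerosol-like
```

Scores near ±100 mark confident classifications; values near zero flag layers whose observables fall where the cloud and aerosol PDFs overlap.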
Kinetic and dynamic probability-density-function descriptions of disperse turbulent two-phase flows
Minier, Jean-Pierre; Profeta, Christophe
2015-11-01
This article analyzes the status of two classical one-particle probability density function (PDF) descriptions of the dynamics of discrete particles dispersed in turbulent flows. The first PDF formulation considers only the process made up of particle position and velocity, Zp = (xp, Up), and is represented by its PDF p(t; yp, Vp), which is the solution of a kinetic PDF equation obtained through a flux closure based on the Furutsu-Novikov theorem. The second PDF formulation includes fluid variables in the particle state vector, for example the fluid velocity seen by particles, Zp = (xp, Up, Us), and consequently handles an extended PDF p(t; yp, Vp, Vs), which is the solution of a dynamic PDF equation. For high-Reynolds-number fluid flows, a typical formulation of the latter category relies on a Langevin model for the trajectories of the fluid seen or, conversely, on a Fokker-Planck equation for the extended PDF. In the present work, a new derivation of the kinetic PDF equation is worked out and new physical expressions for the dispersion tensors entering the kinetic PDF equation are obtained by starting from the extended PDF and integrating over the fluid seen. This demonstrates that, under the same assumption of a Gaussian colored noise and irrespective of the specific stochastic model chosen for the fluid seen, the kinetic PDF description is the marginal of a dynamic PDF one. However, a detailed analysis reveals that kinetic PDF models of particle dynamics in turbulent flows described by statistical correlations constitute incomplete stand-alone PDF descriptions and, moreover, that present kinetic PDF equations are mathematically ill posed. This is shown to be a consequence of the non-Markovian character of the stochastic process retained to describe the system and of the use of an external colored noise. Furthermore, these developments bring out that well-posed PDF descriptions are essentially due to a proper choice of the variables selected to describe physical systems
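The dynamic-PDF route models the fluid velocity seen by a particle with a Langevin (Ornstein-Uhlenbeck-type) equation. A minimal Euler-Maruyama sketch with an illustrative timescale and variance follows; it is not the specific closure of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
T_L, sigma = 0.5, 1.0          # illustrative integral timescale and std dev
dt, n_steps, n_particles = 1e-3, 4000, 2000

u = np.zeros(n_particles)      # fluid velocity "seen" by each particle
for _ in range(n_steps):
    # du = -(u / T_L) dt + sqrt(2 sigma^2 / T_L) dW  (Ornstein-Uhlenbeck)
    u += (-u / T_L) * dt \
         + sigma * np.sqrt(2.0 * dt / T_L) * rng.standard_normal(n_particles)
# After ~8 timescales the ensemble is stationary: zero mean, variance sigma^2.
```

Because this process is Markovian in the extended state, its Fokker-Planck equation for p(t; Vs) is well posed, which is exactly the property the abstract contrasts with the kinetic description.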
Rispens, Judith; Baker, Anne; Duinmeijer, Iris
2015-01-01
Purpose: The effects of neighborhood density (ND) and lexical frequency on word recognition and the effects of phonotactic probability (PP) on nonword repetition (NWR) were examined to gain insight into processing at the lexical and sublexical levels in typically developing (TD) children and children with developmental language problems. Method:…
International Nuclear Information System (INIS)
Mysina, N Yu; Maksimova, L A; Ryabukho, V P; Gorbatenko, B B
2015-01-01
Investigated are the statistical properties of the phase difference of oscillations in speckle fields at two points in the far-field diffraction region, for different shapes of the scatterer aperture. Statistical and spatial nonuniformity of the probability density function of the field phase difference is established. Numerical experiments show that, for speckle fields with an oscillating alternating-sign transverse correlation function, a significant nonuniformity of the probability density function of the phase difference in the correlation region of the field complex amplitude, with the most probable values 0 and π, is observed. A natural statistical interference experiment using Young diagrams has confirmed the results of the numerical experiments. (laser applications and other topics in quantum electronics)
Organ procurement: let's presume consent
Moustarah, F
1998-01-01
In winning first prize in the Logie Medical Ethics Essay Contest in 1997, Dr. Fady Moustarah made a strong and compelling argument in favour of presumed consent in the procurement of donor organs. He stressed that a major education campaign will be needed when such a policy is adopted lest some people begin to regard physicians as "organ vultures."
Neokosmidis, Ioannis; Kamalakis, Thomas; Chipouras, Aristides; Sphicopoulos, Thomas
2005-01-01
The performance of high-powered wavelength-division multiplexed (WDM) optical networks can be severely degraded by four-wave-mixing- (FWM-) induced distortion. The multicanonical Monte Carlo method (MCMC) is used to calculate the probability-density function (PDF) of the decision variable of a receiver, limited by FWM noise. Compared with the conventional Monte Carlo method previously used to estimate this PDF, the MCMC method is much faster and can accurately estimate smaller error probabilities. The method takes into account the correlation between the components of the FWM noise, unlike the Gaussian model, which is shown not to provide accurate results.
PDE-Foam - a probability-density estimation method using self-adapting phase-space binning
Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter
2009-01-01
Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...
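The cell-splitting idea can be caricatured in a few lines: repeatedly split the highest-variance cell at the median of its widest dimension. This is a deliberately simpler criterion than PDE-Foam's actual variance minimization, and all data below are synthetic.

```python
import numpy as np

def adaptive_bins(points, n_cells):
    """Greedy cell splitting: halve the highest-variance cell at the
    median of its widest dimension until n_cells cells exist."""
    cells = [points]
    while len(cells) < n_cells:
        scores = [c.var(axis=0).sum() if len(c) > 1 else -1.0 for c in cells]
        cell = cells.pop(int(np.argmax(scores)))
        dim = int(np.argmax(np.ptp(cell, axis=0)))   # widest dimension
        cut = np.median(cell[:, dim])
        cells.append(cell[cell[:, dim] <= cut])
        cells.append(cell[cell[:, dim] > cut])
    return cells

rng = np.random.default_rng(2)
pts = rng.normal(size=(4000, 2))     # toy 2-D "event sample"
cells = adaptive_bins(pts, 16)       # 16 self-adapted hyper-rectangles
```

Dense regions end up tiled by many small cells and sparse regions by a few large ones, which is the property that makes the binned density useful for discrimination.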
Protein distance constraints predicted by neural networks and probability density functions
DEFF Research Database (Denmark)
Lund, Ole; Frimand, Kenneth; Gorodkin, Jan
1997-01-01
We predict interatomic C-α distances by two independent data driven methods. The first method uses statistically derived probability distributions of the pairwise distance between two amino acids, whilst the latter method consists of a neural network prediction approach equipped with windows taki...... method based on the predicted distances is presented. A homepage with software, predictions and data related to this paper is available at http://www.cbs.dtu.dk/services/CPHmodels/...
Complications of presumed ocular tuberculosis.
Hamade, Issam H; Tabbara, Khalid F
2010-12-01
To determine the effect of steroid treatment on visual outcome and ocular complications in patients with presumed ocular tuberculosis. Retrospective review of patients with presumptive ocular tuberculosis. The clinical diagnosis was made based on ocular findings, positive purified protein derivative (PPD) testing with more than 15 mm induration, exclusion of other causes of uveitis, and a positive ocular response to anti-tuberculous therapy (ATT) within 4 weeks. Group 1 included patients who had received oral prednisone or a subtenon injection of triamcinolone acetonide prior to ATT. Group 2 included patients who did not receive corticosteroid therapy prior to administration of ATT. Among 500 consecutive new cases of uveitis encountered in 1997-2007 there were 49 (10%) patients with presumed ocular tuberculosis. These comprised 28 (57%) male and 21 (43%) female patients with a mean age of 45 years (range 12-76 years). Four (20%) patients in group 1 had an initial visual acuity of 20/40 or better, in comparison to eight (28%) patients in group 2. At 1-year follow-up, six (30%) patients in group 1 had a visual acuity of 20/40 or better compared with 20 (69%) patients in group 2 (p = 0.007). Of the 20 eyes (26%) in group 1 that had visual acuity of < 20/50 at 1-year follow-up, 14 (70%) eyes developed severe chorioretinal lesions (p = 0.019). Early administration of corticosteroids without anti-tuberculous therapy in presumed ocular tuberculosis may lead to a poor visual outcome compared with patients who did not receive corticosteroids prior to presentation. Furthermore, the severity of the chorioretinal lesions in the group of patients given corticosteroids prior to ATT may account for the poor visual outcome. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.
Kim, Jeonglae; Pope, Stephen B.
2014-05-01
A turbulent lean-premixed propane-air flame stabilised by a triangular cylinder as a flame-holder is simulated to assess the accuracy and computational efficiency of combined dimension reduction and tabulation of chemistry. The computational condition matches the Volvo rig experiments. For the reactive simulation, the Lagrangian Large-Eddy Simulation/Probability Density Function (LES/PDF) formulation is used. A novel two-way coupling approach between LES and PDF is applied to obtain resolved density to reduce its statistical fluctuations. Composition mixing is evaluated by the modified Interaction-by-Exchange with the Mean (IEM) model. A baseline case uses In Situ Adaptive Tabulation (ISAT) to calculate chemical reactions efficiently. Its results demonstrate good agreement with the experimental measurements in turbulence statistics, temperature, and minor species mass fractions. For dimension reduction, 11 and 16 represented species are chosen and a variant of Rate Controlled Constrained Equilibrium (RCCE) is applied in conjunction with ISAT to each case. All the quantities in the comparison are indistinguishable from the baseline results using ISAT only. The combined use of RCCE/ISAT reduces the computational time for chemical reaction by more than 50%. However, for the current turbulent premixed flame, chemical reaction takes only a minor portion of the overall computational cost, in contrast to non-premixed flame simulations using LES/PDF, presumably due to the restricted manifold of purely premixed flame in the composition space. Instead, composition mixing is the major contributor to cost reduction since the mean-drift term, which is computationally expensive, is computed for the reduced representation. Overall, a reduction of more than 15% in the computational cost is obtained.
Probability density adjoint for sensitivity analysis of the Mean of Chaos
Energy Technology Data Exchange (ETDEWEB)
Blonigan, Patrick J., E-mail: blonigan@mit.edu; Wang, Qiqi, E-mail: qiqi@mit.edu
2014-08-01
Sensitivity analysis, especially adjoint-based sensitivity analysis, is a powerful tool for engineering design that allows for the efficient computation of sensitivities with respect to many parameters. However, these methods break down when used to compute sensitivities of long-time-averaged quantities in chaotic dynamical systems. This paper presents a new method for sensitivity analysis of ergodic chaotic dynamical systems, the density adjoint method. The method involves solving the governing equations for the system's invariant measure and its adjoint on the system's attractor manifold rather than in phase space. This new approach is derived for and demonstrated on one-dimensional chaotic maps and the three-dimensional Lorenz system. It is found that the density adjoint computes very finely detailed adjoint distributions and accurate sensitivities, but suffers from large computational costs.
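The invariant-measure viewpoint can be made concrete on a one-dimensional chaotic map: the fully chaotic logistic map x → 4x(1−x) has the known invariant density ρ(x) = 1/(π√(x(1−x))), whose mean is exactly 1/2, and a long-trajectory histogram recovers it. This is an illustration of the object the density adjoint acts on, not of the adjoint machinery itself.

```python
import numpy as np

# Long trajectory of the fully chaotic logistic map x -> 4 x (1 - x).
n_iter, burn = 200_000, 1000
x = 0.2345
for _ in range(burn):                 # discard transients
    x = 4.0 * x * (1.0 - x)
traj = np.empty(n_iter)
for i in range(n_iter):
    x = 4.0 * x * (1.0 - x)
    traj[i] = x

# Histogram estimate of the invariant density; the analytic result is
# rho(x) = 1 / (pi * sqrt(x * (1 - x))), with mean exactly 1/2.
hist, edges = np.histogram(traj, bins=50, range=(0.0, 1.0), density=True)
long_time_mean = float(traj.mean())
```

Sensitivities of long-time averages are then derivatives of integrals against this density, which is why differentiating the density itself (the "density adjoint") sidesteps the exponential divergence of individual trajectories.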
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set for the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
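For reference, the original spectral representation that the paper compresses can be sketched with independent random phases; the spectrum and parameters below are illustrative, and the random-function constraint that reduces the phases to two elementary random variables is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_osr(S, w_max, n_freq, t, rng):
    """X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k), phi_k ~ U(0, 2pi)."""
    dw = w_max / n_freq
    w = (np.arange(n_freq) + 0.5) * dw         # midpoint frequency grid
    amp = np.sqrt(2.0 * S(w) * dw)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_freq)
    return (amp * np.cos(np.outer(t, w) + phi)).sum(axis=1)

S = lambda w: np.exp(-w)        # toy one-sided spectrum; integral over w = 1
t = np.linspace(0.0, 200.0, 4001)
x = simulate_osr(S, w_max=10.0, n_freq=512, t=t, rng=rng)
# The sample variance of x should approach the spectral integral (about 1).
```

The hybrid approach keeps this cosine-series structure but generates the n_freq phases from a deterministic random function of two elementary variables, so a few hundred representative samples carry the full probability information.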
Chowdhury, Shakhawat
2013-05-01
The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including human health risks from disinfection by-products (R), disinfection performance (D), and the cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of the different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts and are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can introduce subjective biases into the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function and were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgrading for the WTP.
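Dempster's rule of combination, which fuses the BPAs, can be written compactly; the frames and masses below are invented for illustration and do not come from the case study.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: intersect focal sets, renormalize out the conflict."""
    fused, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + x * y
        else:
            conflict += x * y                  # mass assigned to empty set
    k = 1.0 - conflict
    return {s: v / k for s, v in fused.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
EITHER = GOOD | POOR                           # "don't know" mass
expert1 = {GOOD: 0.6, POOR: 0.1, EITHER: 0.3}  # invented BPAs
expert2 = {GOOD: 0.5, POOR: 0.2, EITHER: 0.3}
fused = combine(expert1, expert2)
```

The paper's contribution is upstream of this step: the masses fed into combine() come from cumulative probabilities of the factor PDFs rather than from expert judgment.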
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
Energy Technology Data Exchange (ETDEWEB)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
2018-01-01
We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty in the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d+1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
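As a Monte Carlo cross-check of what a one-point concentration CDF means, take the simplest case, dC/dt = −kC with a random first-order rate k and no transport, where the exact CDF is available in closed form. This is far simpler than the paper's heterogeneous advection-reaction setting and the numbers are illustrative.

```python
import numpy as np
from math import erf, log, sqrt

# dC/dt = -k C with k ~ N(mu_k, sig_k^2) gives C(t) = C0 exp(-k t), so the
# one-point CDF of the concentration has a closed form to compare against.
rng = np.random.default_rng(4)
mu_k, sig_k, c0, t = 1.0, 0.2, 1.0, 2.0

c = c0 * np.exp(-rng.normal(mu_k, sig_k, size=200_000) * t)

def mc_cdf(threshold):
    """Monte Carlo estimate of P(C(t) <= threshold)."""
    return float(np.mean(c <= threshold))

def exact_cdf(threshold):
    # P(C <= c*) = P(k >= -ln(c*/C0)/t) = 1 - Phi((-ln(c*/C0)/t - mu_k)/sig_k)
    z = (-log(threshold / c0) / t - mu_k) / sig_k
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))
```

The CDF method in the paper derives a deterministic PDE for exactly this kind of one-point CDF, avoiding the sampling error visible in the Monte Carlo estimate.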
Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei
2016-03-01
Rail irregularity is one of the main sources of train-bridge random vibration. A new random vibration theory for coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of the rail irregularity power spectral density was adopted to determine the representative points of spatial frequencies and phases and generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with a slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system using a program implemented on the MATLAB® software platform. Finally, the Newmark-β integration method and the double-edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of the responses. A case study is presented in which an ICE-3 train travels over a three-span simply supported high-speed railway bridge with excitation by random rail irregularity. The results showed that, compared to Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system are discussed.
Stephanik, Brian Michael
This dissertation describes the results of two related investigations into introductory student understanding of ideas from classical physics that are key elements of quantum mechanics. One investigation probes the extent to which students are able to interpret and apply potential energy diagrams (i.e., graphs of potential energy versus position). The other probes the extent to which students are able to reason classically about probability and spatial probability density. The results of these investigations revealed significant conceptual and reasoning difficulties that students encounter with these topics. The findings guided the design of instructional materials to address the major problems. Results from post-instructional assessments are presented that illustrate the impact of the curricula on student learning.
International Nuclear Information System (INIS)
Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.
2010-01-01
Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
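The tabulated quantities can be reproduced from any Monte Carlo rate sample: the 0.16/0.50/0.84 quantiles, and a lognormal approximation with μ = mean(ln r) and σ = std(ln r). A sketch with a purely synthetic sample follows (the scale is arbitrary, not a real reaction rate).

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic Monte Carlo rate sample; the log-scale parameters are invented.
rates = rng.lognormal(mean=-46.0, sigma=0.35, size=100_000)

low, med, high = np.quantile(rates, [0.16, 0.50, 0.84])   # low/median/high rate
log_r = np.log(rates)
mu, sigma = float(log_r.mean()), float(log_r.std())       # lognormal parameters
# For a lognormal sample the three quantiles are approximately
# exp(mu - sigma), exp(mu), exp(mu + sigma), since Phi(1) ~ 0.84.
```

A goodness-of-fit statistic such as Anderson-Darling on log_r then quantifies how well the lognormal form approximates the actual Monte Carlo distribution, as the tabulation above does.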
Energy Technology Data Exchange (ETDEWEB)
Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)
2016-10-15
This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
Liang, Yingjie; Chen, Wen
2018-04-01
The mean squared displacement (MSD) of traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous-time random walk (CTRW) model has been employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotic waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, which includes the traditional logarithmic ultraslow diffusion model as a special case. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function and is observed to increase even more slowly than that of the logarithmic model. The occurrence of very long waiting times has the largest probability in the inverse Mittag-Leffler case, compared with the power-law and logarithmic models. Monte Carlo simulations of the one-dimensional sample path of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting general ultraslow random motion.
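The logarithmic-MSD behaviour discussed above can be reproduced with a short Monte Carlo sketch. The snippet below simulates a CTRW whose waiting-time survival function decays logarithmically, S(t) = (1/ln(e + t))^μ, which is an illustrative stand-in for the traditional logarithmic model (sampling the inverse Mittag-Leffler density needs a numerical inversion and is omitted); the parameters and the overflow guard are assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def ultraslow_msd(n_particles=500, t_max=1e6, mu=1.0):
    """Mean squared displacement of a CTRW whose waiting-time survival
    function decays logarithmically: S(t) = (1 / ln(e + t))**mu."""
    positions = np.zeros(n_particles)
    for i in range(n_particles):
        t, x = 0.0, 0.0
        while True:
            u = max(rng.random(), 1e-2)            # guard against float overflow
            tau = np.exp(u ** (-1.0 / mu)) - np.e  # inverse-transform sample
            t += tau
            if t >= t_max:
                break
            x += rng.choice((-1.0, 1.0))           # unit jump left or right
        positions[i] = x
    return float(np.mean(positions ** 2))

msd_short = ultraslow_msd(t_max=1e2)   # MSD grows roughly like ln(t)
msd_long = ultraslow_msd(t_max=1e6)
```

With μ = 1 the mean jump count, and hence the MSD, grows like ln t, so the displacement after t = 10⁶ remains tiny compared with normal diffusion, for which the MSD would equal t.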
The effect of fog on the probability density distribution of the ranging data of imaging laser radar
Song, Wenhua; Lai, JianCheng; Ghassemlooy, Zabih; Gu, Zhiyong; Yan, Wei; Wang, Chunyong; Li, Zhenhua
2018-02-01
This paper presents a theoretical investigation of the probability density distribution (PDD) of ranging data for an imaging laser radar (ILR) system operating at a wavelength of 905 nm under fog conditions. Based on the physical model of the reflected laser pulses from a standard Lambertian target, a theoretical approximate model of the PDD of the ranging data is developed under different fog concentrations, which offers improved precision in target ranging and imaging. An experimental test bed for the ILR system is developed and its performance is evaluated using a dedicated indoor atmospheric chamber under homogeneously controlled fog conditions. We show that the measured results are in good agreement with both the accurate and approximate models to within a margin of error of less than 1%.
Chowdhury, Snehaunshu; Boyette, Wesley; Roberts, William L.
2017-01-01
In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating
International Nuclear Information System (INIS)
Varella, Marcio Teixeira do Nascimento
2001-12-01
We have calculated annihilation probability densities (APD) for positron collisions against the He atom and the H₂ molecule. It was found that direct annihilation prevails at low energies, while annihilation following virtual positronium (Ps) formation is the dominant mechanism at higher energies. In room-temperature collisions (10⁻² eV) the APD spread over a considerable extension, being quite similar to the electronic densities of the targets. The capture of the positron in an electronic Feshbach resonance strongly enhanced the annihilation rate in e⁺-H₂ collisions. We also discuss strategies to improve the calculation of the annihilation parameter (Z_eff), after debugging the computational codes of the Schwinger Multichannel Method (SMC). Finally, we consider the inclusion of the Ps formation channel in the SMC and show that effective configurations (pseudo eigenstates of the Hamiltonian of the collision) are able to significantly reduce the computational effort in positron scattering calculations. Cross sections for electron scattering by polyatomic molecules were obtained in three different approximations: static-exchange (SE); static-exchange-plus-polarization (SEP); and multichannel coupling. The calculations for polar targets were improved through the rotational resolution of scattering amplitudes, in which the SMC was combined with the first Born approximation (FBA). In general, elastic cross sections (SE and SEP approximations) showed good agreement with available experimental data for several targets. Multichannel calculations for e⁻-H₂O scattering, on the other hand, presented spurious structures at the electronic excitation thresholds (author)
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
A Student’s t Mixture Probability Hypothesis Density Filter for Multi-Target Tracking with Outliers
Liu, Zhuowei; Chen, Shuxin; Wu, Hao; He, Renke; Hao, Lin
2018-01-01
In multi-target tracking, outlier-corrupted process and measurement noises can severely reduce the performance of the probability hypothesis density (PHD) filter. To solve this problem, this paper proposes a novel PHD filter, called the Student's t mixture PHD (STM-PHD) filter. The proposed filter models the heavy-tailed process and measurement noises as Student's t distributions and approximates the multi-target intensity as a mixture of Student's t components to be propagated in time. A closed-form PHD recursion is then obtained based on the Student's t approximation. Our approach can make full use of the heavy-tailed characteristic of the Student's t distribution to handle situations with heavy-tailed process and measurement noises. The simulation results verify that the proposed filter can overcome the negative effect generated by outliers and maintain good tracking accuracy in the simultaneous presence of process and measurement outliers. PMID:29617348
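The motivation for the Student's t noise model is its tail mass. The hypothetical snippet below (not from the paper) compares the fraction of large-magnitude outliers produced by a Student's t model with ν = 3 against a standard Gaussian; the sample size and the threshold of 4 are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
nu = 3.0                                   # degrees of freedom: heavy tails
t_noise = rng.standard_t(nu, size=200_000)
g_noise = rng.standard_normal(200_000)

# Fraction of samples beyond |4|: the Student's t model puts far more
# probability mass in the tails than the Gaussian it replaces.
t_tail = float(np.mean(np.abs(t_noise) > 4.0))
g_tail = float(np.mean(np.abs(g_noise) > 4.0))
```

The heavy-tailed model produces orders of magnitude more "outliers", which is precisely the regime in which a Gaussian-based PHD recursion degrades.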
de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander
2017-11-01
To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice, probability density and cumulative distribution analyses (Pareto analysis) were used to rank order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
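The Pareto (rank-order, cumulative-share) analysis used in the study can be illustrated with the clinic counts quoted above; the snippet is a minimal sketch, and the dictionary simply re-encodes the reported category totals.

```python
# Category totals re-encoded from the abstract's reported clinic counts
counts = {"voice": 3223, "airway": 374, "swallowing": 186}
total = sum(counts.values())                      # 3783 new consultations

# Rank categories by frequency, then accumulate their share of visits
ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
cum_share = {}
running = 0.0
for name, n in ranked:
    running += n / total
    cum_share[name] = running                     # cumulative share of visits
```

The dominant category alone accounts for roughly 85% of consultations, which is the kind of imbalance the Pareto principle exploits for resource allocation.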
Energy Technology Data Exchange (ETDEWEB)
Rullaud, M
2004-06-01
A new model of turbulent combustion with detailed chemistry and probability density functions (PDFs) is proposed. The objective is to capture temperature and species concentrations, mainly that of CO. The PCM-FTC model (Presumed Conditional Moment - Flame Tabulated Chemistry) is based on the tabulation of laminar premixed and diffusion flames to capture the partial premixing present in aeronautical engines. Presumed PDFs are introduced to predict averaged values. The tabulation method is based on the analysis of the chemical structure of laminar premixed and diffusion flames. Hypotheses are presented, tested and validated against Sandia experimental jet flame data. The model is then introduced into turbulent flow simulation software. Three configurations are retained to quantify the predictive capability of this formulation: the Sandia D and F flames and the Stanford lifted methane/air jet flames. Good agreement is observed between experiments and simulations. The validity of the method is thus demonstrated. (author)
Cannon, Alex J.
2018-01-01
Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another (the N-dimensional probability density function transform) is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection periods are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin.
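As a baseline for what MBCn generalizes, univariate empirical quantile mapping can be sketched in a few lines; the gamma-distributed "climates" below are synthetic stand-ins, not CanRCM4 output.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_proj):
    """Empirical quantile mapping: look up each projected value's
    non-exceedance probability in the model-historical climate, then
    map it onto the observed distribution at the same probability."""
    p = np.searchsorted(np.sort(model_hist), model_proj, side="right")
    p = np.clip(p / len(model_hist), 0.0, 1.0)
    return np.quantile(obs_hist, p)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, size=5000)   # "observed" climate (mean 4)
mod = rng.gamma(2.0, 3.0, size=5000)   # biased model climate (mean 6)
corrected = quantile_map(mod, obs, mod)
```

After mapping, the corrected series reproduces the observed distribution (here, its mean), which is exactly the property MBCn extends to the full multivariate distribution.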
Presumed hereditary retinal degenerations: Ibadan experience ...
African Journals Online (AJOL)
This study describes the clinical presentation of RP, the prevalence of associated treatable disorders and the characteristics of patients with severe visual impairment and blindness. Method: A retrospective review of 52 cases presumed and diagnosed to have RP was performed on patients who presented at the Eye Clinic, ...
Stockall, Linnaea; Stringfellow, Andrew; Marantz, Alec
2004-01-01
Visually presented letter strings consistently yield three MEG response components: the M170, associated with letter-string processing (Tarkiainen, Helenius, Hansen, Cornelissen, & Salmelin, 1999); the M250, affected by phonotactic probability (Pylkkanen, Stringfellow, & Marantz, 2002); and the M350, responsive to lexical frequency (Embick,…
International Nuclear Information System (INIS)
Burgazzi, Luciano
2011-01-01
PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to the passive system performance evaluation, owing to the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for the development of a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to the passive system performance evaluation. In order to achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems or which generic databases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement derived from a well-structured procedure could be used to envisage sound probability distributions for the parameters of interest.
International Nuclear Information System (INIS)
Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.
1992-10-01
The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
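The fit-then-test workflow described above (candidate skewed distributions ranked by the Kolmogorov-Smirnov statistic) can be sketched with SciPy in place of the BMDP/STATGRAPHICS packages; the synthetic lognormal "estimates" are an assumption, since the WIPP creep data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic stand-in for the fitted M-D parameter estimates (the WIPP
# creep data themselves are not reproduced in this abstract).
estimates = rng.lognormal(mean=0.5, sigma=0.4, size=200)

# Fit candidate skewed distributions with the location pinned at zero,
# then rank them by the Kolmogorov-Smirnov statistic (smaller = better).
ln_params = stats.lognorm.fit(estimates, floc=0)
wb_params = stats.weibull_min.fit(estimates, floc=0)
ks_ln = stats.kstest(estimates, "lognorm", args=ln_params).statistic
ks_wb = stats.kstest(estimates, "weibull_min", args=wb_params).statistic
```

Because the synthetic data are lognormal, the lognormal fit should attain a small KS statistic; in the paper's setting the same comparison selects between the lognormal and Weibull candidates.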
International Nuclear Information System (INIS)
Olejnik, S.
1989-01-01
It is shown that the leading and next-to-leading non-Gaussian effects have a minor influence on the instanton density for the double-well potential: it is slightly increased, contrary to the claims of other authors. We point out a connection to recent quantitative studies of topological effects in gauge theories. (orig.)
Huang, N. E.; Long, S. R.; Bliven, L. F.; Tung, C.-C.
1984-01-01
On the basis of the mapping method developed by Huang et al. (1983), an analytic expression for the non-Gaussian joint probability density function of slope and elevation for nonlinear gravity waves is derived. Various conditional and marginal density functions are also obtained through the joint density function. The analytic results are compared with a series of carefully controlled laboratory observations, and good agreement is noted. Furthermore, the laboratory wind wave field observations indicate that the capillary or capillary-gravity waves may not be the dominant components in determining the total roughness of the wave field. Thus, the analytic results, though derived specifically for the gravity waves, may have more general applications.
2018-01-30
home range maintenance or attraction to or avoidance of landscape features, including roads (Morales et al. 2004, McClintock et al. 2012). For example...radiotelemetry and extensive road survey data are used to generate the first density estimates available for the species. The results show that southern...secretive snakes that combines behavioral observations of snake road crossing speed, systematic road survey data, and simulations of spatial
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio
2008-01-01
Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases
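The static method described here is just a numerical quadrature of the product of a wind-speed density and a power curve. A minimal sketch, assuming a Weibull wind regime and an idealized 330 kW turbine curve (the cut-in, rated and cut-out speeds are invented for illustration):

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Weibull probability density of wind speed v (shape k, scale c)."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def power_curve(v, v_in=3.0, v_rated=13.0, v_out=25.0, p_rated=330.0):
    """Idealized 330 kW turbine: cubic ramp from cut-in to rated speed."""
    ramp = p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)
    p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

# Rectangle-rule quadrature of the product pdf(v) * P(v)
v = np.linspace(0.0, 30.0, 3001)
dv = v[1] - v[0]
k, c = 2.0, 8.0                        # assumed wind regime (Rayleigh-like)
mean_power = float(np.sum(weibull_pdf(v, k, c) * power_curve(v)) * dv)  # kW
```

The resulting mean output is a modest fraction of the 330 kW rating, i.e. a plausible capacity factor; the paper's point is how sensitive this estimate is to the goodness of fit of the assumed wind-speed density.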
Presumed choroidal metastasis of Merkel cell carcinoma
International Nuclear Information System (INIS)
Small, K.W.; Rosenwasser, G.O.; Alexander, E. III; Rossitch, G.; Dutton, J.J.
1990-01-01
Merkel cell carcinoma is a rare skin tumor of neural crest origin and is part of the amine precursor uptake and decarboxylase system. It typically occurs on the face of elderly people. Distant metastasis is almost uniformly fatal. Choroidal metastasis, to our knowledge, has not been described. We report a patient with Merkel cell carcinoma who had a synchronous solid choroidal tumor and a biopsy-proven brain metastasis. Our 56-year-old patient presented with a rapidly growing, violaceous preauricular skin tumor. Computed tomography of the head disclosed incidental brain and choroidal tumors. Light and electron microscopy of biopsy specimens of both the skin and the brain lesions showed Merkel cell carcinoma. Ophthalmoscopy, fluorescein angiography, and A and B echography revealed a solid choroidal mass. The brain and skin tumors responded well to irradiation. A radioactive episcleral plaque was applied subsequently to the choroidal tumor. All tumors regressed, and the patient was doing well 28 months later. To our knowledge this is the first case of presumed choroidal metastasis of Merkel cell carcinoma
International Nuclear Information System (INIS)
Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.
2004-01-01
The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness
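The first-order Bonferroni (union) bound underlying these approximants is easy to check numerically for the idealized case of independent correlators; the template count and threshold below are arbitrary choices, not values from the paper.

```python
import numpy as np
from math import erf, sqrt

def q_gauss(t):
    """Upper-tail probability of a standard normal variate."""
    return 0.5 * (1.0 - erf(t / sqrt(2.0)))

n_templates, threshold = 50, 3.0
bonferroni = min(1.0, n_templates * q_gauss(threshold))  # union upper bound

# Monte Carlo estimate of the whole-bank false-alarm probability for the
# idealized case of independent, unit-variance Gaussian correlator outputs.
rng = np.random.default_rng(3)
outputs = rng.standard_normal((100_000, n_templates))
mc = float(np.mean(outputs.max(axis=1) > threshold))
```

For correlated templates, as in a real bank, the supremum statistics are harder to characterize, which is where the paper's Bonferroni-type inequalities and the Gaussian-correlation inequality come in.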
20 CFR 219.24 - Evidence of presumed death.
2010-04-01
When a person cannot be proven dead but evidence of death is needed, the Board may presume he or she died at a certain...
Laparoscopic power morcellation of presumed fibroids.
Brolmann, Hans A; Sizzi, Ornella; Hehenkamp, Wouter J; Rossetti, Alfonso
2016-06-01
Uterine leiomyoma is a highly prevalent benign gynecologic neoplasm that affects women of reproductive age. Surgical procedures commonly employed to treat symptomatic uterine fibroids include myomectomy or total or sub-total hysterectomy. These procedures, when performed using minimally invasive techniques, reduce the risks of intraoperative and postoperative morbidity and mortality; however, in order to remove bulky lesions from the abdominal cavity through laparoscopic ports, a laparoscopic power morcellator must be used: a device with rapidly spinning blades that cuts the uterine tissue into fragments so that it can be removed through a small incision. Although the minimally invasive approach in gynecological surgery is now firmly established in terms of recovery and quality of life, morcellation is associated with rare but sometimes serious adverse events. Parts of the morcellated specimen may be spread into the abdominal cavity and enable implantation of cells on the peritoneum. In the case of an unexpected sarcoma, the dissemination may upstage the disease and affect survival. Myoma cells may give rise to 'parasitic' fibroids, but implantation of adenomyotic cells and endometriosis has also been reported. Finally, the morcellation device may cause inadvertent injury to internal structures, such as bowel and vessels, with its rotating circular knife. This article describes how to estimate the risk of sarcoma in a presumed fibroid on the basis of epidemiologic, imaging and laboratory data. Furthermore, the first published results of in-bag morcellation are reviewed. In this procedure the specimen is contained in an insufflated sterile bag while being morcellated, potentially preventing spillage of tissue and also making direct morcellation injuries unlikely.
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the 380 V bus voltage of an industrial consumer sensitive to such sags was monitored. Different levels of DG were then inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A stochastic Monte Carlo simulation study was performed to obtain, for each DG level, the sag probability curves and the probability density by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
Conjunctivitis presumably due to Acanthamoeba
Directory of Open Access Journals (Sweden)
Ana Cristina de Carvalho Ruthes
2004-12-01
Full Text Available PURPOSE: To describe four cases of conjunctivitis presumably due to Acanthamoeba, considering diagnosis, signs, symptoms and treatment. METHODS: We reviewed the medical records of all patients with a clinical diagnosis of Acanthamoeba conjunctivitis seen between September/1998 and January/2001 at the "Hospital de Olhos do Paraná" (HOP). All eyes were submitted to a protocol of investigation that included complete ophthalmologic examination, microscopic examination and culture of conjunctival smears. RESULTS: The laboratory examination of conjunctival smears revealed contamination with Acanthamoeba by direct examination, thereafter confirmed by culture. The authors observed cysts and trophozoites of Acanthamoeba and found agreement between culture and direct examination. Most patients reported long-standing red, irritated eyes. CONCLUSION: This is the first report of conjunctivitis presumably due to Acanthamoeba according to the reviewed literature. Selected patients refractory to the usual treatment of external ocular infection should be considered for appropriate laboratory investigation in search of the etiology of the disease.
The probability factor in establishing causation
International Nuclear Information System (INIS)
Hebert, J.
1988-01-01
This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it. (NEA)
Venturi, D.; Karniadakis, G. E.
2012-08-01
By using functional integral methods we determine new evolution equations satisfied by the joint response-excitation probability density function (PDF) associated with the stochastic solution to first-order nonlinear partial differential equations (PDEs). The theory is presented for both fully nonlinear and quasilinear scalar PDEs subject to random boundary conditions, random initial conditions or random forcing terms. Particular applications are discussed for the classical linear and nonlinear advection equations and for the advection-reaction equation. By using a Fourier-Galerkin spectral method we obtain numerical solutions of the proposed response-excitation PDF equations. These numerical solutions are compared against those obtained by using more conventional statistical approaches such as probabilistic collocation and multi-element probabilistic collocation methods. It is found that the response-excitation approach yields accurate predictions of the statistical properties of the system. In addition, it allows one to directly ascertain the tails of probability distributions, thus facilitating the assessment of rare events and associated risks. The computational cost of the response-excitation method is orders of magnitude smaller than that of more conventional statistical approaches when the PDE is subject to high-dimensional random boundary or initial conditions. The question of high dimensionality for evolution equations involving multidimensional joint response-excitation PDFs is also addressed.
Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente
2017-04-29
Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors installed onboard current vehicles. On the other hand, knowledge of the values of the vehicle's parameters is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters remain within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
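A heavily simplified, one-dimensional sketch of the idea (a Kalman update followed by truncation of the estimate to physical bounds) is given below; the true roll angle, noise levels and bounds are invented, and the truncation is a crude clamp rather than the moment-matched PDF truncation of the paper.

```python
import numpy as np

def kalman_step(x, P, z, q=1e-4, r=1e-2):
    """One predict/update cycle of a scalar random-walk Kalman filter."""
    P = P + q                    # predict: process noise inflates variance
    K = P / (P + r)              # Kalman gain
    x = x + K * (z - x)          # update with roll-angle measurement z
    P = (1.0 - K) * P
    return x, P

def truncate(x, P, lo, hi):
    """Crude stand-in for PDF truncation: clamp the estimate into the
    physically admissible interval [lo, hi]."""
    return min(max(x, lo), hi), P

rng = np.random.default_rng(4)
true_roll = 0.1                  # rad; constant for this sketch
x, P = 0.0, 1.0
for _ in range(200):
    z = true_roll + 0.1 * rng.standard_normal()  # noisy sensor reading
    x, P = kalman_step(x, P, z)
    x, P = truncate(x, P, -0.6, 0.6)             # roll bounded by physics
```

The clamp guarantees the estimate never leaves the physically meaningful range, which is the role the PDF-truncation step plays around the dual Kalman filter in the paper.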
Chowdhury, Snehaunshu
2017-01-23
In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating existing soot models are reported at intervals of Δx/D = 5 along the centerline of turbulent, non-premixed, C2H4/N2 flames. The jet exit Reynolds numbers of the flames investigated were 10,000 and 20,000. A simplified burner geometry based on a published design was chosen to aid modelers. Soot was sampled directly from the flame using a sampling probe with a 0.5-mm diameter orifice and diluted with N2 by a two-stage dilution process. The overall dilution ratio was not evaluated. An SMPS system was used to analyze soot particle concentrations in the diluted samples. Sampling conditions were optimized over a wide range of dilution ratios to eliminate the effect of agglomeration in the sampling probe. Two differential mobility analyzers (DMAs) with different size ranges were used separately in the SMPS measurements to characterize the entire size range of particles. In both flames, the PDFs were found to be mono-modal in nature near the jet exit. Further downstream, the profiles were flatter with a fall-off at larger particle diameters. The geometric mean of the soot size distributions was less than 10 nm for all cases and increased monotonically with axial distance in both flames.
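The geometric statistics quoted for the size distributions can be computed from binned counts as follows; the bin diameters and concentrations below are invented stand-ins for SMPS output:

```python
import numpy as np

# Invented SMPS-like output: bin midpoint diameters (nm) and number
# concentrations dN (cm^-3). Real SMPS software reports dN/dlogDp, which
# would first be multiplied by each bin's dlogDp width.
dp = np.array([5.0, 7.0, 10.0, 15.0, 22.0, 33.0, 49.0])
dN = np.array([200.0, 800.0, 1500.0, 900.0, 400.0, 120.0, 30.0])

# Number-weighted geometric mean diameter and geometric standard deviation.
ln_d = np.log(dp)
gmd = np.exp(np.average(ln_d, weights=dN))
gsd = np.exp(np.sqrt(np.average((ln_d - np.log(gmd)) ** 2, weights=dN)))
print(10.0 < gmd < 13.0, gsd > 1.0)
```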
Institute of Scientific and Technical Information of China (English)
陆宏伟; 陈亚珠; 卫青
2004-01-01
A probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor when computing the correlation dimensions of the RR intervals of ten healthy elderly men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, this is the first time the PDF method has been put forward for analysing the structure of the reconstructed attractor. Numerical simulations demonstrate that the cardiac systems of healthy elderly men are approximately 6-6.5 dimensional complex dynamical systems. It is found that the PDF is not symmetrically distributed when the time delay is small, whereas it satisfies a Gaussian distribution when the time delay is large enough. A cluster-effect mechanism is presented to explain this phenomenon. A study of the shapes of the PDFs clearly indicates that the time delay plays a more important role than the embedding dimension in the reconstruction. The results demonstrate that the PDF method is a promising numerical approach for observing the structure of the reconstructed attractor and may provide more information and new diagnostic potential for the analysed cardiac system.
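A minimal sketch of the delay-coordinate reconstruction underlying this kind of analysis, with a synthetic stand-in for the RR-interval series (the series and embedding parameters are illustrative assumptions, not the study's data):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay reconstruction: each row is a phase point
    (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Synthetic stand-in for an RR-interval series (seconds); purely illustrative.
t = np.arange(5000)
x = 0.8 + 0.05 * np.sin(0.1 * t) + 0.01 * np.random.default_rng(1).normal(size=t.size)

pts = delay_embed(x, dim=3, tau=5)
# Coarse PDF of one reconstructed coordinate: a normalized histogram.
pdf, edges = np.histogram(pts[:, 0], bins=40, density=True)
print(pts.shape, np.isclose(np.sum(pdf * np.diff(edges)), 1.0))
```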
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Retributivist Arguments against Presuming Innocence : Answering to Duff
van Dijk, A.A.
2013-01-01
Factors justifying not presuming innocence are generally incorporated into the Presumption of Innocence (PoI). A confusing discourse has resulted: numerous guilt-presuming acts are deemed consistent with the PoI. I argue for an unusually broad PoI: any act that might convey to a reasonable actor
10 CFR 436.13 - Presuming cost-effectiveness results.
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false Presuming cost-effectiveness results. 436.13 Section 436... Methodology and Procedures for Life Cycle Cost Analyses § 436.13 Presuming cost-effectiveness results. (a) If the investment and other costs for an energy or water conservation measure considered for retrofit to...
27 CFR 70.52 - Signature presumed authentic.
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Signature presumed authentic. 70.52 Section 70.52 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE... Collection of Excise and Special (Occupational) Tax Collection-General Provisions § 70.52 Signature presumed...
26 CFR 301.6064-1 - Signature presumed authentic.
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Signature presumed authentic. 301.6064-1 Section 301.6064-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED....6064-1 Signature presumed authentic. An individual's name signed to a return, statement, or other...
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
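The claimed structural parallel between conditional entropies and conditional probabilities can be illustrated numerically via the chain rule H(X,Y) = H(X) + H(Y|X), which mirrors P(x,y) = P(x)P(y|x); the joint distribution below is an arbitrary example:

```python
import numpy as np

# An arbitrary joint distribution p(x, y) (rows: x, columns: y).
p = np.array([[0.3, 0.1],
              [0.2, 0.4]])

def H(q):
    """Shannon entropy in bits of a (joint or marginal) distribution."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

H_joint = H(p)
H_x = H(p.sum(axis=1))            # marginal entropy of x
H_y_given_x = H_joint - H_x       # chain rule, mirroring P(x,y) = P(x) P(y|x)

# Conditioning cannot increase entropy: 0 <= H(Y|X) <= H(Y).
print(0.0 <= H_y_given_x <= H(p.sum(axis=0)))
```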
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions, and dependence.
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Directory of Open Access Journals (Sweden)
Ahmad El Sayed
2015-01-01
A lifted hydrogen/nitrogen turbulent jet flame issuing into a vitiated coflow is investigated using the conditional moment closure (CMC) supplemented by the presumed mapping function (PMF) approach for the modelling of conditional mixing and velocity statistics. Using a prescribed reference field, the PMF approach yields a presumed probability density function (PDF) for the mixture fraction, which is then used in closing the conditional scalar dissipation rate (CSDR) and conditional velocity in a fully consistent manner. These closures are applied to a lifted flame and the findings are compared to previous results obtained using β-PDF-based closures over a range of coflow temperatures (Tc). The PMF results are in line with those of the β-PDF and compare well with measurements. The transport budgets in mixture fraction and physical spaces and the radical history ahead of the stabilisation height indicate that the stabilisation mechanism is sensitive to Tc. As in the previous β-PDF calculations, autoignition around the "most reactive" mixture fraction remains the controlling mechanism for sufficiently high Tc. Departure from the β-PDF predictions is observed when Tc is decreased, as PMF predicts stabilisation by means of premixed flame propagation. This conclusion is based on the observation that lean mixtures are heated by downstream burning mixtures in a preheat zone developing ahead of the stabilisation height. The spurious sources, which stem from inconsistent CSDR modelling, are further investigated. The findings reveal that their effect is small but non-negligible, most notably within the flame zone.
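A presumed β-PDF of the kind discussed here (and in the records above) is fully determined by the mixture-fraction mean and variance; the sketch below constructs one and uses it to average a conditional quantity. The parabolic temperature profile T(z) is a toy assumption, not the paper's chemistry:

```python
import numpy as np
from scipy.stats import beta

def beta_pdf_params(z_mean, z_var):
    """Shape parameters (a, b) of the presumed beta-PDF of mixture fraction,
    matched to its mean and variance (requires z_var < z_mean*(1 - z_mean))."""
    g = z_mean * (1.0 - z_mean) / z_var - 1.0
    return z_mean * g, (1.0 - z_mean) * g

a, b = beta_pdf_params(0.3, 0.02)

# PDF-weighted mean of a conditional quantity, here a toy flamelet-like
# temperature profile T(z) (illustrative only).
z = np.linspace(1e-6, 1.0 - 1e-6, 2001)
T = 300.0 + 1700.0 * 4.0 * z * (1.0 - z)
T_mean = np.sum(T * beta.pdf(z, a, b)) * (z[1] - z[0])
print(np.isclose(beta.mean(a, b), 0.3), 300.0 < T_mean < 2000.0)
```

Moment matching of this kind is what makes the β-function model cheap: only two transported moments are needed to recover an entire presumed PDF.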
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
DEFF Research Database (Denmark)
Ekelund, Flemming; Christensen, Søren; Rønn, Regin
1999-01-01
An automated modification of the most-probable-number (MPN) technique has been developed for enumeration of phagotrophic protozoa. The method is based on detection of prey depletion in microtitre plates rather than on the presence of protozoa. A transconjugant Pseudomonas fluorescens DR54 labelled w...
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
Directory of Open Access Journals (Sweden)
Farnoosh Basaligheh
2015-12-01
One of the conventional methods for the temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel, and the collected data are used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated, and three common goodness-of-fit tests were used to evaluate each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for the spacing between steel sets. It is also noted that, although the probability distribution function is the same for the two tunnel sections, the PDF parameters for the individual sections differ from each other.
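A distribution-selection workflow of this kind can be sketched as follows; note that scipy ships no Wakeby distribution, so only common candidates are compared, and the spacing data below are synthetic assumptions rather than the surveyed measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic spacings (m) between steel sets; real data would come from a survey.
spacing = rng.gamma(shape=8.0, scale=0.15, size=120)

# Fit candidate distributions and rank them by the Kolmogorov-Smirnov statistic.
# (scipy has no Wakeby distribution, so only common candidates are compared.)
candidates = ("norm", "lognorm", "gamma", "weibull_min")
results = {}
for name in candidates:
    dist = getattr(stats, name)
    params = dist.fit(spacing)
    results[name] = stats.kstest(spacing, name, args=params).statistic

best = min(results, key=results.get)          # smallest K-S statistic wins
print(best in candidates, all(0.0 <= v <= 1.0 for v in results.values()))
```

In practice one would combine several tests (as the paper does with three) rather than rely on the K-S statistic alone.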
Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew
2016-07-01
Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By evoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
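A minimal forward-problem sketch: for a hypothetical two-state channel, the state probabilities obey a master equation p' = pQ whose stationary solution is known analytically. This is the kind of deterministic evolution whose rates the inversion would tune; the rate values are invented for illustration:

```python
import numpy as np

# Two-state (closed <-> open) channel: a minimal sketch of the forward problem.
k_open, k_close = 2.0, 3.0            # transition rates (1/ms), illustrative
Q = np.array([[-k_open, k_open],
              [k_close, -k_close]])    # generator matrix, rows sum to zero

# Evolve p' = p Q to steady state by small explicit Euler steps.
p = np.array([1.0, 0.0])               # start fully closed
dt = 1e-3
for _ in range(20_000):
    p = p + dt * (p @ Q)

p_open_exact = k_open / (k_open + k_close)   # analytic stationary probability
print(np.isclose(p[1], p_open_exact, atol=1e-3))
```

An inversion scheme would compare such computed probabilities (or the full PDFs) against experimental ones and adjust `k_open` and `k_close` to minimize a cost function.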
International Nuclear Information System (INIS)
Bunder, J.E.; McKenzie, R.H.
2001-01-01
We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian which is written solely in terms of a pair of coupled 'spins' which are elements of u(1,1). This ground state is explicitly found for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Presumed Optic Disc Melanocytoma in a Young Nigerian: A ...
African Journals Online (AJOL)
homogeneous soft tissue mass with broad base arising from the choroid in the optic nerve area and projecting into the vitreous cavity. No retinal detachment or sub-retinal fluid was seen. An assessment of right presumed ODM was made. She was refracted with visual acuity improvement to 6/5 in either eye and spectacles ...
Compensatory cerebral motor control following presumed perinatal ischemic stroke
van der Hoorn, Anouk; Potgieser, Adriaan R E; Brouwer, Oebele F; de Jong, Bauke M
Case: A fifteen year-old left-handed girl presented with right-sided focal motor seizures. Neuroimaging showed a large left hemisphere lesion compatible with a middle cerebral artery stroke of presumed perinatal origin. She was not previously diagnosed with a motor deficit, although neurological
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
Directory of Open Access Journals (Sweden)
Thomas S Churcher
2017-01-01
Over a century since Ronald Ross discovered that malaria is caused by the bite of an infectious mosquito, it is still unclear how the number of parasites injected influences disease transmission. Currently it is assumed that all mosquitoes with salivary gland sporozoites are equally infectious irrespective of the number of parasites they harbour, though this has never been rigorously tested. Here we analyse >1000 experimental infections of humans and mice and demonstrate a dose-dependency for the probability of infection and the length of the host pre-patent period. Mosquitoes with a higher number of sporozoites in their salivary glands following blood-feeding are more likely to have caused infection (and to have done so more quickly) than mosquitoes with fewer parasites. A similar dose response for the probability of infection was seen for humans given a pre-erythrocytic vaccine candidate targeting the circumsporozoite protein (CSP), and in mice with and without transfusion of anti-CSP antibodies. These interventions prevented infection more efficiently from bites made by mosquitoes with fewer parasites. The importance of parasite number has widespread implications across malariology, ranging from our basic understanding of the parasite, to how vaccines are evaluated, to the way in which transmission should be measured in the field. It also provides direct evidence for why the only registered malaria vaccine, RTS,S, was only partially effective in recent clinical trials.
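A dose-dependency of this kind is often summarized with a single-hit model P(n) = 1 − exp(−b·n). The sketch below fits b by maximum likelihood to synthetic counts; the dose levels and infection counts are invented, not the paper's data:

```python
import numpy as np

# Single-hit dose-response model: P(infection | n sporozoites) = 1 - exp(-b*n).
# Synthetic counts standing in for the experimental infections (invented numbers).
n = np.array([1., 5., 10., 50., 100., 500.])     # sporozoites per mosquito (assumed)
infected = np.array([2, 8, 15, 35, 44, 50])      # hosts infected out of 50 trials
trials = np.full(n.shape, 50)

def loglik(b):
    """Binomial log-likelihood of the dose-response parameter b."""
    p = np.clip(1.0 - np.exp(-b * n), 1e-12, 1 - 1e-12)
    return np.sum(infected * np.log(p) + (trials - infected) * np.log1p(-p))

b_grid = np.logspace(-4, 0, 2000)                # simple grid-search MLE
b_hat = b_grid[np.argmax([loglik(b) for b in b_grid])]
print(0.001 < b_hat < 0.2)                       # a plausible per-parasite rate
```

Under this model each sporozoite independently establishes infection with probability roughly b, which is one simple way to formalize "more parasites, more infectious".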
International Nuclear Information System (INIS)
Kumar, A.; Rao, K.S.; Srinivasan, M.
1983-01-01
The Trombay criticality formula (TCF) has been derived by incorporating a number of well-known concepts of criticality physics to enable prediction of changes in critical size or k_eff following alterations in geometrical and physical parameters of uniformly reflected small reactor assemblies characterized by large neutron leakage from the core. The variant parameters considered are size, shape, density and diluent concentration of the core, and density and thickness of the reflector. The effect of these changes (except core size) manifests itself through σ_c, the critical surface mass density of the ''corresponding critical core.'' The quantity σ, the mass-to-surface-area ratio of the core, is essentially a measure of the product ρr extended to nonspherical systems and plays a dominant role in the TCF. The functional dependence of k_eff on σ/σ_c, the system size relative to critical, is expressed in the TCF through two alternative representations, namely the modified Wigner rational form and an exponential form, which is given
Janković, Bojan
2009-10-01
The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry in isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (E_a) value was approximately constant (E_a,int = 95.2 kJ mol^-1 and E_a,diff = 96.6 kJ mol^-1, respectively). The values of E_a calculated by both isoconversional methods are in good agreement with the value of E_a evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used for estimation of the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1-α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfE_a's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with the corresponding results of the nonisothermal decomposition process of NaHCO3.
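The link between the Weibull scale parameter and the Arrhenius equation can be sketched as follows, using the quoted β and E_a but an invented pre-factor η0 (the true pre-factor is not given here):

```python
import numpy as np

R = 8.314            # gas constant, J/(mol K)
Ea = 94_300.0        # J/mol, the value quoted from the Arrhenius analysis
beta_w = 1.07        # Weibull shape parameter reported in the study
T = np.array([380.0, 400.0, 420.0, 440.0])   # operating temperatures, K

# Arrhenius-type scale parameter; the pre-factor eta0 is an invented assumption.
eta0 = 1e-9
eta = eta0 * np.exp(Ea / (R * T))

# Isothermal Weibull conversion curve alpha(t) = 1 - exp(-(t/eta)^beta) at 400 K.
t = np.linspace(0.0, 3.0 * eta[1], 500)
alpha = 1.0 - np.exp(-(t / eta[1]) ** beta_w)

# E_a is recovered from the slope of ln(eta) versus 1/T, as done via eta.
slope = np.polyfit(1.0 / T, np.log(eta), 1)[0]
print(np.isclose(slope * R, Ea), alpha[-1] > 0.9)
```

With β close to 1 the Weibull curve is nearly first-order, and the temperature dependence sits entirely in η, which is why E_a can be read off from ln(η) vs 1/T.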
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Successful management of bilateral presumed Candida endogenous endophthalmitis following pancreatitis
Directory of Open Access Journals (Sweden)
Ricardo Evangelista Marrocos de Aragão
2016-06-01
Endogenous endophthalmitis is a rare and frequently devastating ophthalmic disease. It occurs mostly in immunocompromised patients or those with diabetes mellitus, cancer, or intravenous drug use. Candida infection is the most common cause of endogenous endophthalmitis. Ocular candidiasis develops within days to weeks of fungemia. The association of treatment for pancreatitis with endophthalmitis is unusual. Treatment with broad-spectrum antibiotics and total parenteral nutrition may explain endogenous endophthalmitis. We report the case of a patient with pancreatitis treated with broad-spectrum antibiotics and total parenteral nutrition who developed bilateral presumed Candida endogenous endophthalmitis that was successfully treated with vitrectomy and intravitreal amphotericin B.
Detection of periprosthetic joint infections in presumed aseptic patients
DEFF Research Database (Denmark)
Xu, Yijuan; Lorenzen, Jan; Thomsen, Trine Rolighed
2016-01-01
Title: Detection of periprosthetic joint infections in presumed aseptic patients. Yijuan Xu1, Jan Lorenzen1, Trine Rolighed Thomsen1,2, Kathrin Kluba3, Kathrin Chamaon3, Christoph Lohmann3. 1. Danish Technological Institute, Aarhus, Denmark. 2. Center for Microbial Communities, Department of Biotechnology, Chemistry and Environmental Engineering, Aalborg University, Denmark. 3. Department of Orthopaedics, Otto-von-Guericke University of Magdeburg, Germany. Aim: The HypOrth project (New approaches in the development of Hypoallergenic implant material in Orthopaedics: Steps to personalised medicine) aims to investigate adverse immune reactions to implant materials. For this project, it is of utmost importance to exclude patients with periprosthetic joint infections (PJIs). The aim of this study was to rule out PJIs in included patients using prolonged culture and next generation sequencing (NGS...
Fleming, C; Momin, Z A; Brensilver, J M; Brandstetter, R D
1995-03-01
Decisional capacity includes ability to comprehend information, to make an informed choice, and to communicate that choice; it is specific to the decision at hand. Presume a patient has decisional capacity; an evaluation of incapacity must be justified. Administer a standardized mental status test to help assess alertness, attention, memory, and reasoning ability. A patient scoring below 10 on the Folstein Mini-Mental State Examination (maximum score, 30) probably does not have decisional capacity; one scoring from 10 to 15 probably can designate a proxy but not make complex health care decisions. Obtain psychiatric consultations for a patient who exhibits psychological barriers to decision making.
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Reactivation of presumed adenoviral keratitis after laser in situ keratomileusis.
Safak, Nilgün; Bilgihan, Kamil; Gürelik, Gökhan; Ozdek, Sengül; Hasanreisoğlu, Berati
2002-04-01
We report a patient with reactivation of presumed adenoviral keratoconjunctivitis after laser in situ keratomileusis (LASIK) to correct high myopia. The preoperative refraction was -13.00 diopters (D) in the right eye and -14.00 D in the left eye, and the best corrected visual acuity was 20/20 in both eyes. On the first postoperative day, mild conjunctival hyperemia and multiple subepithelial infiltrations localized in the flap zone consistent with adenoviral keratoconjunctivitis were seen. After prompt treatment, the lesions resolved. As a consequence, LASIK successfully corrected the high myopia. Adenoviral keratoconjunctivitis can be reactivated after LASIK, unlike after photorefractive keratectomy, despite the absence of symptomatic and clinical findings before the procedure.
Presumed symbolic use of diurnal raptors by Neanderthals.
Directory of Open Access Journals (Sweden)
Eugène Morin
In Africa and western Eurasia, occurrences of burials and utilized ocher fragments during the late Middle and early Late Pleistocene are often considered evidence for the emergence of symbolically-mediated behavior. Perhaps less controversial for the study of human cognitive evolution are finds of marine shell beads and complex designs on organic and mineral artifacts in early modern human (EMH) assemblages conservatively dated to ≈ 100-60 kilo-years (ka) ago. Here we show that, in France, Neanderthals used skeletal parts of large diurnal raptors, presumably for symbolic purposes, at Combe-Grenal in a layer dated to marine isotope stage (MIS) 5b (≈ 90 ka) and at Les Fieux in stratigraphic units dated to the early/middle phase of MIS 3 (60-40 ka). The presence of similar objects in other Middle Paleolithic contexts in France and Italy suggests that raptors were used as means of symbolic expression by Neanderthals in these regions.
Presumed symbolic use of diurnal raptors by Neanderthals.
Morin, Eugène; Laroulandie, Véronique
2012-01-01
In Africa and western Eurasia, occurrences of burials and utilized ocher fragments during the late Middle and early Late Pleistocene are often considered evidence for the emergence of symbolically-mediated behavior. Perhaps less controversial for the study of human cognitive evolution are finds of marine shell beads and complex designs on organic and mineral artifacts in early modern human (EMH) assemblages conservatively dated to ≈ 100-60 kilo-years (ka) ago. Here we show that, in France, Neanderthals used skeletal parts of large diurnal raptors presumably for symbolic purposes at Combe-Grenal in a layer dated to marine isotope stage (MIS) 5b (≈ 90 ka) and at Les Fieux in stratigraphic units dated to the early/middle phase of MIS 3 (60-40 ka). The presence of similar objects in other Middle Paleolithic contexts in France and Italy suggests that raptors were used as means of symbolic expression by Neanderthals in these regions.
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
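The abstract does not reproduce the formulas. Assuming the standard one-parameter (Tsallis-type) deformation that results from integrating the nonsymmetrical hyperbola $t^{\tilde q - 1}$ — an assumption of this sketch, not stated in the record — the generalized logarithm and its inverse, the generalized exponential, can be written as:

```latex
\ln_{\tilde q}(x) \equiv \int_1^x t^{\,\tilde q - 1}\,\mathrm{d}t
  = \frac{x^{\tilde q} - 1}{\tilde q},
\qquad
e_{\tilde q}(x) \equiv \bigl(1 + \tilde q\, x\bigr)^{1/\tilde q},
```

both of which recover the ordinary $\ln x$ and $e^x$ in the limit $\tilde q \to 0$.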
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
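The integral-equation method itself is not reproduced in the abstract. As a point of reference, the quantity it approximates — the first-passage probability — can be estimated by brute-force Monte Carlo for a simple stationary process; the Ornstein-Uhlenbeck process and barrier level below are illustrative choices, not taken from the paper:

```python
import math
import random

def first_passage_probability(barrier=2.0, t_max=5.0, dt=0.01,
                              n_paths=2000, seed=1):
    """Monte Carlo estimate of P(|X(t)| exceeds the barrier before t_max)
    for a standard Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW
    (stationary variance 1), started from the stationary law."""
    random.seed(seed)
    crossings = 0
    for _ in range(n_paths):
        x = random.gauss(0.0, 1.0)          # stationary initial condition
        for _ in range(int(t_max / dt)):
            x += -x * dt + math.sqrt(2.0 * dt) * random.gauss(0.0, 1.0)
            if abs(x) > barrier:            # out-crossing of the barrier
                crossings += 1
                break
    return crossings / n_paths

p = first_passage_probability()
```

The integral-equation approach replaces this sampling with kernel approximations, trading simulation cost for analytical work.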
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Sahin, Ozlem; Ziaei, Alireza
2014-07-01
This study was designed to investigate whether the anti-inflammatory and antiproliferative activity of oral and intravitreal methotrexate (MTX) suppresses intraocular inflammation in patients with presumed latent syphilitic uveitis and presumed tuberculosis-related uveitis. Interventional prospective study including three cases with presumed latent syphilitic uveitis treated with intravenous penicillin and oral MTX, and two cases with presumed tuberculosis-related uveitis treated with standard antituberculosis therapy and intravitreal MTX injections. Treatment efficacy in all cases was assessed by best-corrected visual acuity, fundus fluorescein angiography, and optical coherence tomography. Four eyes of three patients with presumed latent syphilitic uveitis had improved best-corrected visual acuity, suppression of intraocular inflammation, and resolution of cystoid macular edema within 6 months of oral MTX therapy. No recurrence of intraocular inflammation was observed during a follow-up period of 6 to 18 months after cessation of MTX. Two eyes of two patients with presumed tuberculosis-related uveitis showed improved best-corrected visual acuity, suppression of intraocular inflammation, and resolution of cystoid macular edema after intravitreal injections of MTX. No recurrence of intraocular inflammation was observed during a follow-up period of 6 to 8 months after cessation of antituberculous therapy. For the first time in the treatment of presumed latent syphilitic uveitis and presumed tuberculosis-related uveitis, we believe that MTX might have an adjunctive role in suppressing intraocular inflammation, reducing uveitic macular edema, and preventing recurrences of the diseases.
Directory of Open Access Journals (Sweden)
Renato Travassos Beltrame
2009-08-01
Full Text Available Several models have been developed to evaluate the reproductive status of cows through the concentration of progesterone in milk, the effect of sex selection on the commercial production of herds, and the bioeconomic performance of the multiple ovulation and embryo transfer system in select herds. However, models describing the production of embryos in superovulated females have yet to be developed. A probability density function of the number of embryos collected from donors of the Nelore breed was determined. Records of 61,928 embryo collections from 26,767 donors from 1991 to 2005 were analyzed. Data were provided by the Brazilian Association of Zebu Breeders (ABCZ) and Controlmax Consultoria e Sistemas Ltda. The probability density function of the number of viable embryos was modeled using exponential and gamma distributions. Parameter fitting was carried out by maximum likelihood using a non-linear gradient method. Both distributions presented a similar level of precision: root mean square error (RMSE) = 0.0072 and 0.0071 for the exponential and gamma distributions, respectively; both distributions are thus deemed suitable for representing the probability density function of embryo production by Nelore females.
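As an illustration of the fitting step described above, here is a minimal maximum-likelihood fit of the exponential model. The data are simulated (the 61,928 real ABCZ/Controlmax records are not available here), and for the exponential family the MLE has the closed form λ = 1/mean, so no gradient method is needed; the gamma fit is what requires the non-linear gradient method mentioned in the abstract.

```python
import math
import random

# Synthetic stand-in for "viable embryos per collection": continuous
# exponential draws with mean 6 (an assumed value, for illustration).
random.seed(7)
counts = [random.expovariate(1.0 / 6.0) for _ in range(5000)]

# For the exponential pdf f(x) = lam * exp(-lam * x), the maximum-
# likelihood estimate of the rate is simply 1 / sample mean.
mean = sum(counts) / len(counts)
rate_mle = 1.0 / mean

def loglik(lam):
    """Exponential log-likelihood of the sample at rate lam."""
    return sum(math.log(lam) - lam * x for x in counts)
```

Any rate other than `rate_mle` gives a strictly lower log-likelihood, which is the property the gradient search exploits for the gamma case.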
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...
Energy Technology Data Exchange (ETDEWEB)
Varella, Marcio Teixeira do Nascimento
2001-12-15
We have calculated annihilation probability densities (APD) for positron collisions against the He atom and the H{sub 2} molecule. It was found that direct annihilation prevails at low energies, while annihilation following virtual positronium (Ps) formation is the dominant mechanism at higher energies. In room-temperature collisions (10{sup -2} eV) the APD spread over a considerable region, being quite similar to the electronic densities of the targets. The capture of the positron in an electronic Feshbach resonance strongly enhanced the annihilation rate in e{sup +}-H{sub 2} collisions. We also discuss strategies to improve the calculation of the annihilation parameter (Z{sub eff}), after debugging the computational codes of the Schwinger Multichannel Method (SMC). Finally, we consider the inclusion of the Ps formation channel in the SMC and show that effective configurations (pseudo eigenstates of the Hamiltonian of the collision) are able to significantly reduce the computational effort in positron scattering calculations. Cross sections for electron scattering by polyatomic molecules were obtained in three different approximations: static-exchange (SE); static-exchange-plus-polarization (SEP); and multichannel coupling. The calculations for polar targets were improved through the rotational resolution of scattering amplitudes in which the SMC was combined with the first Born approximation (FBA). In general, elastic cross sections (SE and SEP approximations) showed good agreement with available experimental data for several targets. Multichannel calculations for e{sup -}-H{sub 2}O scattering, on the other hand, presented spurious structures at the electronic excitation thresholds. (author)
[Allergy and autoimmunity: Molecular diagnostics, therapy, and presumable pathogenesis].
Arefieva, A S; Smoldovskaya, O V; Tikhonov, A A; Rubina, A Yu
2017-01-01
Allergic and autoimmune diseases represent immunopathological reactions of an organism to antigens. Although allergy results from an exaggerated immune response to foreign antigens (allergens) while autoimmune diseases are characterized by a pathological response to internal antigens (autoantigens), the underlying mechanisms of these diseases are probably common. Thus, both types of diseases represent variations of the hypersensitivity reaction. A large percentage of both the adult and pediatric population is in need of early diagnostics of these pathologies of the immune system. Considering the diversity of antibodies produced in allergic and autoimmune diseases and the difficulties accompanying clinical diagnosing, molecular diagnostics of these pathological processes should be carried out in several stages, including screening and confirmatory studies. In this review, we summarize the available data on the molecular diagnostics and therapy of allergic and autoimmune diseases and discuss the basic similarities and differences in the mechanisms of their development.
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
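A minimal numeric illustration of the definition (our construction, not an example from the paper): a fitted normal regression model M for a quantity y that evidence E says cannot be negative, such as a price or a count, assigns positive mass to the impossible region y < 0. That mass is the leakage.

```python
import math

# Standard normal CDF via the error function (stdlib only).
def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical fitted mean and spread for the normal model M.
mu, sigma = 2.0, 1.5

# Probability leakage: P_M(y < 0), an event impossible given E.
leakage = normal_cdf(0.0, mu, sigma)
```

With these illustrative parameters the model puts roughly 9% of its probability on impossible values, even though it may look well fitted otherwise — which is why such a model cannot be calibrated empirically.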
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Directory of Open Access Journals (Sweden)
Carlos Alexánder Grajales Correa
2007-07-01
Full Text Available This work considers the daily returns of a financial asset in order to model and compare the probability density of the stochastic volatility of those returns. To this end, the ARCH models and their extensions, which are in discrete time, are proposed, as well as an empirical stochastic volatility model developed by Paul Wilmott. For the discrete case, models are shown that allow estimation of the heteroscedastic conditional volatility at an instant t of time, t∈[1,T]. In the continuous case, an Itô diffusion process is associated with the stochastic volatility of the financial series, which makes it possible to discretize that process and simulate it to obtain empirical probability densities of the volatility. Finally, the results obtained with these methodologies are illustrated and compared for the financial series of the S&P 500 of the USA, the Índice de Precios y Cotizaciones of the Mexican stock exchange (IPC) and the IGBC of Colombia.
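The continuous-case procedure described above — discretize the Itô diffusion, simulate it, and build an empirical density — can be sketched with an Euler-Maruyama step. The mean-reverting drift and diffusion coefficients below are illustrative stand-ins, not the empirically calibrated coefficients of Wilmott's model:

```python
import math
import random

# Euler-Maruyama discretization of an assumed mean-reverting Ito process
# for volatility: d(sigma) = kappa*(theta - sigma) dt + xi*sigma dW.
random.seed(0)
kappa, theta, xi = 2.0, 0.2, 0.3         # assumed model parameters
dt, n_steps = 1.0 / 252.0, 252 * 20      # ~20 years of daily steps

sigma, path = 0.2, []
for _ in range(n_steps):
    sigma += (kappa * (theta - sigma) * dt
              + xi * sigma * math.sqrt(dt) * random.gauss(0.0, 1.0))
    sigma = max(sigma, 1e-6)             # keep volatility positive
    path.append(sigma)

# Empirical probability density of the volatility via a coarse histogram.
bins = 30
lo, hi = min(path), max(path)
width = (hi - lo) / bins
density = [0.0] * bins
for s in path:
    density[min(int((s - lo) / width), bins - 1)] += 1
density = [c / (len(path) * width) for c in density]   # integrates to 1
```

The resulting `density` plays the role of the empirical volatility density that the paper compares against the discrete-time ARCH-family estimates.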
Dysphagia associated with presumed pharyngeal dysfunction in 16 neonatal foals.
Holcombe, S J; Hurcombe, S D; Barr, B S; Schott, H C
2012-02-01
Dysphagia due to pharyngeal dysfunction occurs in human neonates and is associated with prematurity and hypoxic episodes. This syndrome probably occurs in neonatal foals but has not been reported. The objectives of this study were to describe 1) a series of neonatal foals with dysphagia due to pharyngeal dysfunction; 2) the progression, treatment and resolution of the dysphagia; 3) the comorbidities; and 4) the prognosis for life and athleticism for affected foals. Records from 3 referral equine hospitals were reviewed for neonatal foals with dysphagia of pharyngeal origin. Inclusion criteria were a normal to strong suckle, dysphagia evidenced by milk at the nostrils after nursing the dam, and endoscopic examination of the airway. Foals with mechanical reasons for dysphagia, botulism or hyperkalaemic periodic paralysis were not included. Sixteen neonatal foals qualified for the study. Eight (50%) were premature and/or diagnosed with hypoxic ischaemic encephalopathy. Twelve (75%) had aspiration pneumonia. Fifteen foals were discharged alive from the hospital, nursing the mare with no evidence of dysphagia (n = 14) or mild dysphagia (n = 1), a mean +/- s.d. of 7 +/- 6 days (median = 6.3 days, range 0-22 days) after hospital admission. One foal was subjected to euthanasia in hospital. Follow-up information was available for 14 animals. Thirteen of 16 (81%) were alive, including one yearling and 12 horses >2 years old. Seven of the 14 (50%) were racing, training or in work, and 6 horses were pets, breeding animals or had unknown athletic status. Two had laryngeal deficits. One foal was subjected to euthanasia within weeks of discharge from the hospital due to aspiration pneumonia. Dysphagia related to pharyngeal dysfunction occurs in equine neonates and can resolve, but may require days to weeks of supportive care. Prognosis for life is favourable and for athleticism fair.
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Rutstein, Sarah E; Siedhoff, Matthew T; Geller, Elizabeth J; Doll, Kemi M; Wu, Jennifer M; Clarke-Pearson, Daniel L; Wheeler, Stephanie B
2016-02-01
Hysterectomy for presumed leiomyomata is 1 of the most common surgical procedures performed in nonpregnant women in the United States. Laparoscopic hysterectomy (LH) with morcellation is an appealing alternative to abdominal hysterectomy (AH) but may result in dissemination of malignant cells and worse outcomes in the setting of an occult leiomyosarcoma (LMS). We sought to evaluate the cost-effectiveness of LH versus AH. Decision-analytic model of 100 000 women in the United States assessing the incremental cost-effectiveness ratio (ICER) in dollars per quality-adjusted life-year (QALY) gained (Canadian Task Force classification III). U.S. hospitals. Adult premenopausal women undergoing LH or AH for presumed benign leiomyomata. We developed a decision-analytic model from a provider perspective across 5 years, comparing the cost-effectiveness of LH to AH in terms of dollars (2014 US dollars) per QALY gained. The model included average total direct medical costs and utilities associated with the procedures, complications, and clinical outcomes. Baseline estimates and ranges for cost and probability data were drawn from the existing literature. Estimated overall deaths were lower in LH versus AH (98 vs 103). Death due to LMS was more common in LH versus AH (86 vs 71). Base-case assumptions estimated that average per-person costs were lower in LH versus AH, with a savings of $2193 ($24 181 vs $26 374). Over 5 years, women in the LH group experienced 4.99 QALY versus 4.91 QALY in the AH group (an incremental gain of .085 QALYs). LH dominated AH in base-case estimates: LH was both less expensive and yielded greater QALY gains. The ICER was sensitive to operative costs for LH and AH. Varying operative costs of AH yielded an ICER of $87 651/QALY gained (minimum) to AH being dominated (maximum). Probabilistic sensitivity analyses, in which all input parameters and costs were varied simultaneously, demonstrated a relatively robust model. The AH approach was dominated.
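The base-case dominance finding reduces to simple incremental arithmetic on the figures quoted in the abstract (note the rounded QALY values give a gain of 0.08 rather than the unrounded 0.085 the authors report):

```python
# Base-case cost-effectiveness arithmetic: laparoscopic hysterectomy
# (LH) versus abdominal hysterectomy (AH), figures from the abstract.
cost_lh, cost_ah = 24181.0, 26374.0      # average per-person cost, 2014 USD
qaly_lh, qaly_ah = 4.99, 4.91            # 5-year quality-adjusted life-years

delta_cost = cost_lh - cost_ah           # -2193: the quoted savings
delta_qaly = qaly_lh - qaly_ah           # ~0.08 QALY gained

# LH "dominates" AH when it is both cheaper and more effective; in that
# case ICER = delta_cost / delta_qaly is not reported, since the ratio
# is only meaningful when one strategy costs more AND gains more.
dominant = delta_cost < 0 and delta_qaly > 0
```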
Aetiological study of the presumed ocular histoplasmosis syndrome in the Netherlands
Ongkosuwito, J.V.; Kortbeek, L.M.; Lelij, van der A.; Molicka, E.; Kijlstra, A.; Smet, de M.D.; Suttrop-Schulten, M.S.A.
1999-01-01
Aim. To investigate whether presumed ocular histoplasmosis syndrome in the Netherlands is caused by Histoplasma capsulatum and whether other risk factors might play a role in the pathogenesis of this syndrome. Methods. 23 patients were clinically diagnosed as having presumed ocular histoplasmosis
Feline dry eye syndrome of presumed neurogenic origin: a case report.
Sebbag, Lionel; Pesavento, Patricia A; Carrasco, Sebastian E; Reilly, Christopher M; Maggs, David J
2018-01-01
A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tear film testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Presumed consent in organ donation: the devil is in the detail
Hutchinson, Odette
2008-01-01
This article follows the recent publication of the Organs for Donation Task Force report, "Organs for Transplants", and considers the debate surrounding a change in the law in favour of presumed consent in organ donation.
Evaluation of autopsy imaging (postmortem CT) to presume causes of death
International Nuclear Information System (INIS)
Nishihara, Keisuke; Sugihara, Shuji; Morioka, Nobuo; Sato, Shinya; Tsukamoto, Kazumichi; Ogawa, Toshihide
2010-01-01
A total of 123 patients who arrived at the emergency room in a state of cardiopulmonary arrest were examined by CT after death. The cause of death could be presumed by autopsy imaging (Ai) in 41 patients (33.3%), but in only 30 patients (24.4%) from postmortem inspection and clinical information alone. The presumption rate of the cause of death was thus improved to 46.3% (a 22.0-point increase) by adding the information provided by Ai. (author)
Feline dry eye syndrome of presumed neurogenic origin: a case report
Directory of Open Access Journals (Sweden)
Lionel Sebbag
2017-12-01
Full Text Available Case summary A 14-year-old female spayed Abyssinian cat, which about 1 year previously underwent thoracic limb amputation, radiotherapy and chemotherapy for an incompletely excised vaccine-related fibrosarcoma, was presented for evaluation of corneal opacity in the left eye (OS). The ocular surface of both eyes (OU) had a lackluster appearance and there was a stromal corneal ulcer OS. Results of corneal aesthesiometry, Schirmer tear test-1 (STT-1) and tear film breakup time revealed corneal hypoesthesia, and quantitative and qualitative tear film deficiency OU. Noxious olfactory stimulation caused increased lacrimation relative to standard STT-1 values suggesting an intact nasolacrimal reflex. Various lacrimostimulants were administered in succession; namely, 1% pilocarpine administered topically (15 days) or orally (19 days), and topically applied 0.03% tacrolimus (47 days). Pilocarpine, especially when given orally, was associated with notable increases in STT-1 values, but corneal ulceration remained/recurred regardless of administration route, and oral pilocarpine resulted in gastrointestinal upset. Tacrolimus was not effective. After 93 days, the cat became weak and lame and a low thyroxine concentration was detected in serum. The cat was euthanized and a necropsy performed. Both lacrimal glands were histologically normal, but chronic neutrophilic keratitis and reduced conjunctival goblet cell density were noted OU. Relevance and novel information The final diagnosis was dry eye syndrome (DES) of presumed neurogenic origin, associated with corneal hypoesthesia. This report reinforces the importance of conducting tear film testing in cats with ocular surface disease, as clinical signs of DES were different from those described in dogs.
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
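A concrete special case of the CPGF gradient property (our example, not taken from the paper) is the multinomial logit: for the logsum G(u) = log Σ_j exp(u_j), the gradient of G recovers exactly the logit choice probabilities, which we can check numerically by finite differences:

```python
import math

# Multinomial logit instance of a choice-probability generating
# function (CPGF): G(u) = log(sum_j exp(u_j)).
def cpgf(u):
    return math.log(sum(math.exp(ui) for ui in u))

# Closed-form logit choice probabilities for the same utilities.
def logit_probs(u):
    z = sum(math.exp(ui) for ui in u)
    return [math.exp(ui) / z for ui in u]

u = [1.0, 0.5, -0.2]                     # illustrative systematic utilities
h = 1e-6
grad = []
for i in range(len(u)):                  # central-difference gradient of G
    up, dn = u[:], u[:]
    up[i] += h
    dn[i] -= h
    grad.append((cpgf(up) - cpgf(dn)) / (2 * h))
```

The finite-difference gradient matches the logit probabilities to numerical precision, which is the "gradient gives the choice probabilities" property in its simplest form.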
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Data-driven probability concentration and sampling on manifold
Energy Technology Data Exchange (ETDEWEB)
Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)
2016-09-15
A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) an MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
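The core generative step described above — estimate a kernel density from the data, then draw new realizations from it — can be illustrated in one dimension with a crude Gaussian-kernel sampler. This is a drastic simplification for illustration only: it omits the diffusion-maps machinery and the reduced-order representation, and all names and numbers below are our own assumptions, not the paper's implementation.

```python
import random

def kde_sample(data, bandwidth, n, seed=0):
    """Draw n samples from a Gaussian kernel-density estimate of `data`:
    pick a data point uniformly at random, then perturb it with Gaussian
    noise of standard deviation `bandwidth`."""
    rng = random.Random(seed)
    return [rng.choice(data) + rng.gauss(0.0, bandwidth) for _ in range(n)]

# Toy dataset concentrated on two clusters (illustrative values)
data = [0.0, 0.1, 0.2, 5.0, 5.1]
samples = kde_sample(data, bandwidth=0.05, n=1000)
```

Sampling from a Gaussian KDE reduces to uniformly picking a data point and jittering it, which is why the generated points stay concentrated near the observed dataset, mirroring the "probability concentration" idea of the abstract.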
Cervera, Vicente; Mai, Wilfried; Vite, Charles H; Johnson, Victoria; Dayrell-Hart, Betsy; Seiler, Gabriela S
2011-01-01
Cerebrovascular accidents, or strokes, and gliomas are common intraaxial brain lesions in dogs. An accurate differentiation of these two lesions is necessary for prognosis and treatment decisions. The magnetic resonance (MR) imaging characteristics of 21 dogs with a presumed cerebrovascular accident and 17 with a glioma were compared. MR imaging findings were reviewed retrospectively by three observers unaware of the final diagnosis. Statistically significant differences between the appearance of gliomas and cerebrovascular accidents were identified based on lesion location, size, mass effect, perilesional edema, and appearance of the apparent diffusion coefficient map. Gliomas were predominantly located in the cerebrum (76%) compared with presumed cerebrovascular accidents that were located mainly in the cerebellum, thalamus, caudate nucleus, midbrain, and brainstem (76%). Gliomas were significantly larger compared with presumed cerebrovascular accidents and more commonly associated with mass effect and perilesional edema. Wedge-shaped lesions were seen only in 19% of presumed cerebrovascular accidents. Between the three observers, 10-47% of the presumed cerebrovascular accidents were misdiagnosed as gliomas, and 0-12% of the gliomas were misdiagnosed as cerebrovascular accidents. Diffusion weighted imaging increased the accuracy of the diagnosis for both lesions. Agreement between observers was moderate (kappa = 0.48, P < 0.01).
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
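As a minimal illustration of the maximum entropy assignment mentioned above (a generic textbook sketch, not the paper's construction): with a single mean constraint, the maximum-entropy probabilities take the Gibbs form p_i ∝ exp(-β·x_i), and β can be found by bisection. The three-level toy system and the target mean are illustrative choices of ours.

```python
import math

def maxent_dist(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over `values` subject to a fixed mean.
    The solution has the Gibbs form p_i ~ exp(-beta * x_i); beta is found
    by bisection, using that the mean is monotone decreasing in beta."""
    def mean_for(beta):
        w = [math.exp(-beta * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # mean still too high: need larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Three equally spaced "energy levels" with prescribed average 0.6
p = maxent_dist([0.0, 1.0, 2.0], target_mean=0.6)
```

Because the target mean lies below the uniform-distribution mean of 1.0, the solver returns a positive β, so the probabilities decrease with the level value, as in a Boltzmann distribution.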
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
2010-07-01
Section 301-72.1, Public Contracts... Transportation, § 301-72.1: Why is common carrier presumed to be the most advantageous method of transportation? Travel by common carrier is presumed to be the most advantageous method of transportation because it...
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic function presentation. Additionally, considerable emphasis is placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included specifically because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
Energy Technology Data Exchange (ETDEWEB)
Duong, Hong Phuoc; Janssen, Francoise; Hall, Michelle; Ismaili, Khalid [Universite Libre de Bruxelles (ULB), Department of Pediatric Nephrology, Hopital Universitaire des Enfants Reine Fabiola, Brussels (Belgium); Piepsz, Amy [Hopital Universitaire Saint-Pierre, Department of Radioisotopes, Ghent (Belgium); Khelif, Karim; Collier, Frank [Universite Libre de Bruxelles (ULB), Department of Pediatric Urology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium); Man, Kathia de [University Hospital Ghent, Department of Nuclear Medicine, Ghent (Belgium); Damry, Nash [Universite Libre de Bruxelles (ULB), Department of Pediatric Radiology, Hopital Universitaire des Enfants Reine Fabiola, Brussel (Belgium)
2015-05-01
The main criteria used for deciding on surgery in children with presumed antenatally detected pelviureteric junction obstruction (PPUJO) are the level of hydronephrosis (ultrasonography), the level of differential renal function (DRF) and the quality of renal drainage after a furosemide challenge (renography), the importance of each factor being far from generally agreed. Can we predict, on the basis of ultrasound parameters, the patient in whom radionuclide renography can be avoided? We retrospectively analysed the medical charts of 81 consecutive children with presumed unilateral PPUJO detected antenatally. Ultrasound and renographic studies performed at the same time were compared. Anteroposterior pelvic diameter (APD) and calyceal size were both divided into three levels of dilatation. Parenchymal thickness was considered either normal or significantly decreased. Acquisition of renograms under furosemide stimulation provided quantification of DRF, quality of renal drainage and cortical transit. The percentages of patients with low DRF and poor drainage were significantly higher among those with major hydronephrosis, severe calyceal dilatation or parenchymal thinning. Moreover, impaired cortical transit, which is a major risk factor for functional decline, was seen more frequently among those with very severe calyceal dilatation. However, none of the structural parameters obtained by ultrasound examination was able to predict whether the level of renal function or the quality of drainage was normal or abnormal. Alternatively, an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness were associated with a low probability of decreased renal function or poor renal drainage. In the management strategy of patients with prenatally detected PPUJO, nuclear medicine examinations may be postponed in those with an APD <30 mm, a calyceal dilatation of <10 mm and a normal parenchymal thickness. On the contrary, precise estimation of DRF and renal
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
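The experimental task this model addresses — estimating the hidden parameter of a stepwise nonstationary Bernoulli process outcome by outcome — is easy to simulate. The sliding-window estimator below is a naive baseline for illustration only, not the authors' change-point model, and all parameter values are arbitrary choices of ours.

```python
import random

def stepwise_bernoulli(segments, seed=0):
    """Generate a stepwise nonstationary Bernoulli sequence.
    `segments` is a list of (p, length) pairs: the hidden probability
    parameter p is held fixed for `length` outcomes, then steps."""
    rng = random.Random(seed)
    out = []
    for p, n in segments:
        out.extend(1 if rng.random() < p else 0 for _ in range(n))
    return out

def sliding_estimate(outcomes, window=20):
    """Trial-by-trial estimate of p from the most recent `window` outcomes."""
    est = []
    for t in range(1, len(outcomes) + 1):
        recent = outcomes[max(0, t - window):t]
        est.append(sum(recent) / len(recent))
    return est

# Hidden parameter steps from 0.2 to 0.8 halfway through the session
seq = stepwise_bernoulli([(0.2, 200), (0.8, 200)])
est = sliding_estimate(seq)
```

Unlike the intermittent, step-like updates the paper reports for human subjects, this baseline revises its estimate on every trial; comparing the two behaviors is exactly the kind of contrast the model is built to explain.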
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
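The experiment-first approach can be mirrored in a few lines of code — here a simulated fair-die experiment comparing experimental (relative-frequency) probability with the theoretical value. This is a generic illustration of ours; the article describes physical classroom experiments, not code.

```python
import random
from fractions import Fraction

def experimental_probability(event, trials, rng):
    """Estimate the probability of `event` (a predicate on a single die
    roll) by its relative frequency over repeated trials."""
    hits = sum(event(rng.randint(1, 6)) for _ in range(trials))
    return hits / trials

rng = random.Random(42)
theoretical = Fraction(1, 6)  # P(roll a three) for a fair die
experimental = experimental_probability(lambda r: r == 3, 10_000, rng)
```

With enough trials the experimental estimate settles near 1/6, making the law-of-large-numbers intuition concrete before any formula is introduced.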
Vromans, P.; van Engen, M.L.; Mol, S.
2013-01-01
Purpose: To introduce the presumed cultural similarity paradox as a possible explanation for the findings that adjusting to a culturally similar country is just as difficult as adjusting to a culturally dissimilar country. We provide a conceptual framework, enabling further understanding and research
Presumed atypical HDR syndrome associated with Band Keratopathy and pigmentary retinopathy.
Kim, Cinoo; Cheong, Hae Il; Kim, Jeong Hun; Yu, Young Suk; Kwon, Ji Won
2011-01-01
This report describes presumed atypical hypoparathyroidism, deafness, and renal dysplasia (HDR) syndrome associated with unexpected ocular findings. The patient had exotropia, bilateral band keratopathy, and pigmentary retinopathy, including attenuated retinal vessels and atrophy of the retinal pigment epithelium. Even though the calcific plaques were successfully removed, visual acuity in both eyes gradually decreased and electroretinography was extinguished. Copyright 2009, SLACK Incorporated.
A. Algra (Ale); P.J. Koudstaal (Peter Jan); J. van Gijn (Jan)
1999-01-01
Patients who have had a transient ischaemic attack or nondisabling ischaemic stroke of presumed arterial origin have an annual risk of death from all vascular causes, non-fatal stroke, or non-fatal myocardial infarction that ranges between 4% and 11% without treatment. In the
Presumed PDF modeling of microjet assisted CH4–H2/air turbulent flames
International Nuclear Information System (INIS)
Chouaieb, Sirine; Kriaa, Wassim; Mhiri, Hatem; Bournot, Philippe
2016-01-01
Highlights: • Microjet assisted CH4–H2/air turbulent flames are numerically investigated. • Temperature, species and soot are well predicted by the Presumed PDF model. • An inner flame is identified due to the microjet presence. • The addition of hydrogen to the microjet assisted flames enhances mixing. • Soot emission is reduced by 36% for a 10% enriched microjet assisted flame. - Abstract: The characteristics of microjet assisted CH4–H2/air flames in a turbulent mode are numerically investigated. Simulations are performed using the Computational Fluid Dynamics code Fluent. The Presumed PDF and the Discrete Ordinates models are considered respectively for combustion and radiation modeling. The k–ε Realizable model is adopted as a turbulence closure model. The Tesner model is used to calculate soot particle quantities. In the first part of this paper, the Presumed PDF model is compared to the Eddy Dissipation model and to slow chemistry combustion models from the literature. Results show that the Presumed PDF model correctly predicts thermal and species fields, as well as soot formation. The effect of hydrogen enrichment on CH4/air confined flames under the addition of an air microjet is investigated in the second part of this work. The results show that an inner flame forms due to the air microjet for the CH4–H2/air flames. Moreover, increasing the hydrogen percentage in the fuel mixture enhances mixing and consequently leads to a considerable reduction in soot emission.
Presumed Perinatal Stroke in a Child with Down Syndrome and Moyamoya Disease
Pysden, Karen; Fallon, Penny; Moorthy, Bhagavatheswaran; Ganesan, Vijeya
2010-01-01
Moyamoya disease describes a cerebral arteriopathy characterized by stenosis or occlusion of the terminal internal carotid and/or the proximal middle cerebral arteries. We report a female child with trisomy 21 and bilateral moyamoya disease who presented, unusually, with a presumed perinatal cerebral infarct. The clinical, radiological, and…
28 CFR 104.44 - Determination of presumed noneconomic losses for decedents.
2010-07-01
Title 28, Judicial Administration, Volume 2, revised as of 2010-07-01: Determination of presumed noneconomic losses for decedents. Section 104.44, Judicial Administration, DEPARTMENT OF JUSTICE (CONTINUED), SEPTEMBER 11TH VICTIM COMPENSATION FUND OF 2001, Amount of Compensation for Eligible Claimants. § 104.44...
2010-07-01
Title 28, Judicial Administration, Volume 2, revised as of 2010-07-01: Determination of presumed noneconomic losses for claimants who suffered physical harm. Section 104.46, Judicial Administration, DEPARTMENT OF JUSTICE (CONTINUED), SEPTEMBER 11TH VICTIM COMPENSATION FUND OF 2001, Amount of Compensation for...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
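For the best-known special case, multinomial logit, the CPGF is the log-sum-exp function, and differentiating it recovers the familiar softmax choice probabilities. The sketch below is generic textbook material, not code from the paper; the utility values are arbitrary.

```python
import math

def cpgf_mnl(v):
    """CPGF of the multinomial logit model: G(v) = log(sum_i exp(v_i)),
    computed with the max-shift trick for numerical stability."""
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probs(v):
    """Gradient of the MNL CPGF, i.e. the softmax choice probabilities."""
    m = max(v)
    w = [math.exp(x - m) for x in v]
    z = sum(w)
    return [wi / z for wi in w]

# Three alternatives with utilities 1, 2, 3 (illustrative values)
p = choice_probs([1.0, 2.0, 3.0])
```

A quick numerical check confirms the gradient property: perturbing the first utility in `cpgf_mnl` by a small step changes G at a rate equal to `p[0]`.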
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions, Branching Processes, Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing POD demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) while achieving an acceptable value for the probability of false (POF) calls and keeping the flaw sizes in the set as small as possible.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Histogram Estimators of Bivariate Densities
National Research Council Canada - National Science Library
Husemann, Joyce A
1986-01-01
One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...
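A fixed-interval histogram estimator of the kind being compared takes only a few lines: partition the range into equal bins, count, and normalize so the estimate integrates to one. This is a generic 1-D sketch of ours; the report's efficiency comparison against variable-interval estimators is not reproduced here.

```python
def histogram_density(data, bins, lo, hi):
    """Fixed-interval histogram estimate of a univariate density:
    piecewise constant, normalized so it integrates to 1 over [lo, hi)."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in data:
        if lo <= x < hi:
            counts[int((x - lo) / width)] += 1
    n = len(data)
    return [c / (n * width) for c in counts]

# Five observations on [0, 1) with five bins of width 0.2
density = histogram_density([0.1, 0.2, 0.25, 0.7, 0.9], bins=5, lo=0.0, hi=1.0)
```

Dividing each count by n times the bin width is what turns raw frequencies into a density: summing each bin's height times its width gives exactly 1.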
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability is replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty, as well as a means of describing random processes, has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Presuming consent in the ethics of posthumous sperm procurement and conception.
Kroon, Frederick
2015-12-01
This paper compares standard conceptions of consent with the conception of consent defended by Kelton Tremellen and Julian Savulescu in their attempt to re-orient the ethical debate around posthumous sperm procurement and conception, as published in Reproductive BioMedicine Online in 2015. According to their radical proposal, the surviving partner's wishes are, in effect, the only condition that needs to be considered for there to be a legitimate moral case for these procedures: the default should be presumed consent to the procedures, whether or not the agent did consent or would have consented. The present paper argues that Tremellen and Savulescu's case for this position is flawed, but offers a reconstruction that articulates what may well be a hidden, and perhaps reasonable, assumption behind the argument. But while the new argument appears more promising, the reconstruction also suggests that the position of presumed consent is currently unlikely to be acceptable as policy.
Presuming consent in the ethics of posthumous sperm procurement and conception
Directory of Open Access Journals (Sweden)
Frederick Kroon
2015-12-01
Full Text Available This paper compares standard conceptions of consent with the conception of consent defended by Kelton Tremellen and Julian Savulescu in their attempt to re-orient the ethical debate around posthumous sperm procurement and conception, as published in Reproductive BioMedicine Online in 2015. According to their radical proposal, the surviving partner’s wishes are, in effect, the only condition that needs to be considered for there to be a legitimate moral case for these procedures: the default should be presumed consent to the procedures, whether or not the agent did consent or would have consented. The present paper argues that Tremellen and Savulescu’s case for this position is flawed, but offers a reconstruction that articulates what may well be a hidden, and perhaps reasonable, assumption behind the argument. But while the new argument appears more promising, the reconstruction also suggests that the position of presumed consent is currently unlikely to be acceptable as policy.
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability, which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Scheuermeyer, Frank X; DeWitt, Christopher; Christenson, Jim; Grunau, Brian; Kestler, Andrew; Grafstein, Eric; Buxton, Jane; Barbic, David; Milanovic, Stefan; Torkjari, Reza; Sahota, Indy; Innes, Grant
2018-03-09
Fentanyl overdoses are increasing and few data guide emergency department (ED) management. We evaluate the safety of an ED protocol for patients with presumed fentanyl overdose. At an urban ED, we used administrative data and explicit chart review to identify and describe consecutive patients with uncomplicated presumed fentanyl overdose (no concurrent acute medical issues) from September to December 2016. We linked regional ED and provincial vital statistics databases to ascertain admissions, revisits, and mortality. The primary outcome was a composite of admission and death within 24 hours. Other outcomes included treatment with additional ED naloxone, development of a new medical issue while in the ED, and length of stay. A prespecified subgroup analysis assessed low-risk patients with normal triage vital signs. There were 1,009 uncomplicated presumed fentanyl overdoses, mainly by injection. Median age was 34 years, 85% were men, and 82% received out-of-hospital naloxone. One patient was hospitalized and one discharged patient died within 24 hours (combined outcome 0.2%; 95% confidence interval [CI] 0.04% to 0.8%). Sixteen patients received additional ED naloxone (1.6%; 95% CI 1.0% to 2.6%), none developed a new medical issue (0%; 95% CI 0% to 0.5%), and median length of stay was 173 minutes (interquartile range 101 to 267). Of 752 low-risk patients, none were admitted or developed a new issue, and one died postdischarge; 3 (0.4%; 95% CI 0.01% to 1.3%) received ED naloxone. In our cohort of ED patients with uncomplicated presumed fentanyl overdose, typically after injection, deterioration, admission, mortality, and postdischarge complications appear low; the majority can be discharged after brief observation. Patients with normal triage vital signs are unlikely to require ED naloxone. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Langley, Ross J; McFadzean, Jillian; McCormack, Jon
2016-01-01
We describe a 2-day-old male infant who received rocuronium as part of general anesthesia for a tracheal esophageal fistula repair. Postoperatively, he had prolonged central and peripheral neuromuscular blockade despite cessation of the rocuronium infusion several hours previously. This case discusses the presumed central nervous system effects of rocuronium in a neonate and its effective reversal with sugammadex. © 2015 John Wiley & Sons Ltd.
Presumed Cases of Mumps in Pregnancy: Clinical and Infection Control Implications
Directory of Open Access Journals (Sweden)
Svjetlana Lozo
2012-01-01
Full Text Available Recently, a mumps outbreak in New York and New Jersey was reported by the Centers for Disease Control and Prevention (CDC). Subsequently, the dissemination of the disease was rapid, and, from June 28th 2009 through January 29th 2010, a total of 1,521 cases of mumps were reported in New York and New Jersey. Seven presumed cases occurred in pregnant women cared for at our institution. Mumps diagnosis, as per the NYC Department of Health and Mental Hygiene, was based on clinical manifestations, particularly parotitis. Prior immunization with mumps vaccine and negative IgM were not adequate to rule out mumps infection. All seven of our patients had exposure to mumps in either their household or their community, and some of them had symptoms of mumps. Owing to the difficulties in interpreting the serologies of these patients, their cases led to a presumed diagnosis of mumps. The diagnosis of mumps led to the isolation of patients and of health care personnel who were in contact with them. In this paper, we detail the presenting findings, diagnostic dilemmas, and infection control challenges associated with presumed cases of mumps in pregnancy.
Directory of Open Access Journals (Sweden)
Baskaran, Prabu
2017-10-01
Full Text Available Aim: Presumed congenital simple retinal pigment epithelium hamartoma is a rare benign lesion of the macula that mimics congenital hypertrophy of the retinal pigment epithelium (RPE and combined hamartoma of the retina and the RPE; newer imaging modalities can help in diagnosis. We report three patients with presumed congenital simple RPE hamartoma, and describe the enhanced-depth imaging optical coherence tomography (EDI-OCT and fundus autofluorescence (FAF findings. Methods: Two patients were asymptomatic; one had an intraocular foreign body in addition to the hamartoma. All had a similar jet black, elevated lesion in the macula, sparing the fovea. EDI-OCT showed a characteristic hyperreflective layer with complete optical shadowing of the deeper layers; FAF showed pronounced hypoautofluorescence of the lesion. Conclusion: Multimodal imaging with FAF and EDI-OCT can help to differentiate simple RPE hamartoma from similar RPE lesions, and may serve as a useful adjunct to clinical diagnosis of this rare tumor. We present the second largest series of presumed congenital simple RPE hamartoma, and – to the best of our knowledge – the first report of FAF findings of this tumor.
International Nuclear Information System (INIS)
Zhou, J; Ding, X; Liang, J; Zhang, J; Wang, Y; Yan, D
2016-01-01
Purpose: With energy repainting in lung IMPT, the delivered dose is approximately the convolution of the dose in each breathing phase with the corresponding breathing PDF. This study computes the breathing-PDF-weighted 4D dose in lung IMPT treatment and compares it to the initial robust plan. Methods: Six lung patients were evaluated in this study. Amsterdam shroud images were generated from pre-treatment 4D cone-beam projections. The diaphragm motion curve was extracted from the shroud image and the breathing PDF was generated. Each patient was planned to 60 Gy (12 Gy x 5). In the initial plans, the ITV density on the average CT was overridden with its maximum value for planning, using two IMPT beams with robust optimization (5 mm uncertainty in patient position and 3.5% range uncertainty). The plan was applied to all 4D CT phases. The dose in each phase was deformed to a reference phase. The 4D dose was reconstructed by summing all these doses with the corresponding weightings from the PDF. Plan parameters, including maximum dose (Dmax), ITV V100, homogeneity index (HI=D2/D98), R50 (50%IDL/ITV), and the lung-GTV V12.5 and V5, were compared between the reconstructed 4D dose and the initial plans. Results: Dmax is significantly lower in the reconstructed 4D dose, 68.12±3.5 Gy, vs. 70.1±4.3 Gy in the initial plans (p=0.015). No significant difference was found for ITV V100, HI, and R50: 92.2%±15.4% vs. 96.3%±2.5% (p=0.565), 1.033±0.016 vs. 1.038±0.017 (p=0.548), and 19.2±12.1 vs. 18.1±11.6 (p=0.265), for the 4D dose and initial plans, respectively. The lung-GTV V12.5 and V5 are significantly higher in the 4D dose, 13.9%±4.8% vs. 13.0%±4.6% (p=0.021) and 17.6%±5.4% vs. 16.9%±5.2% (p=0.011), respectively. Conclusion: 4D dose reconstruction based on the phase PDF can be used to evaluate the dose received by the patient. Robust optimization based on the phase PDF may further improve patient care.
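The reconstruction step the abstract describes, summing per-phase doses (already deformed to the reference phase) weighted by the breathing-phase PDF, can be sketched as follows. This is a minimal illustration only: the function name, toy dose grids, and weights are hypothetical, not from the study.

```python
import numpy as np

def reconstruct_4d_dose(phase_doses, phase_pdf):
    """Weight each phase's dose grid (deformed to the reference phase)
    by its breathing-phase probability and sum the results."""
    phase_pdf = np.asarray(phase_pdf, dtype=float)
    phase_pdf = phase_pdf / phase_pdf.sum()  # normalize weights to a PDF
    return sum(w * d for w, d in zip(phase_pdf, phase_doses))

# Toy example: three breathing phases on a tiny 2x2 dose grid (values in Gy)
doses = [np.full((2, 2), v) for v in (60.0, 58.0, 62.0)]
weights = [0.5, 0.25, 0.25]
d4d = reconstruct_4d_dose(doses, weights)
```

With these weights the first voxel evaluates to 0.5*60 + 0.25*58 + 0.25*62 = 60 Gy; in practice each phase grid would come from deformable registration of the per-phase dose calculations.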
Del Brutto, Oscar H; Mera, Robertino M; Del Brutto, Victor J; Zambrano, Mauricio; Lama, Julio
2015-04-01
Cerebral small vessel disease is probably one of the most common pathogenetic mechanisms underlying stroke in Latin America. However, the importance of silent markers of small vessel disease, including white matter hyperintensities of presumed vascular origin, has not been assessed so far. This study aims to evaluate the prevalence and correlates of white matter hyperintensities in community-dwelling elders living in Atahualpa (rural Ecuador). Atahualpa residents aged ≥ 60 years were identified during a door-to-door survey and invited to undergo brain magnetic resonance imaging for identification and grading of white matter hyperintensities and other markers of small vessel disease. Using multivariate logistic regression models, we evaluated whether white matter hyperintensities are associated with demographics, cardiovascular health status, stroke, cerebral microbleeds, and cortical atrophy, after adjusting for the other variables. Of 258 enrolled persons (mean age, 70 ± 8 years; 59% women), 172 (67%) had white matter hyperintensities, which were moderate to severe in 63. Analyses showed significant associations of white matter hyperintensity presence and severity with age and cardiovascular health status, as well as with overt and silent strokes, and a trend for association with cerebral microbleeds and cortical atrophy. The prevalence and correlates of white matter hyperintensities in elders living in rural Ecuador are almost comparable with those reported from industrialized nations, reinforcing the concept that the burden of small vessel disease is on the rise in underserved Latin American populations. © 2014 World Stroke Organization.
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
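The post-processing step described above, turning a stack of equally likely geostatistical simulations into a map of the probability of exceeding a contamination threshold, reduces to a per-cell frequency count. A minimal sketch, with a hypothetical grid size and a made-up threshold (the real workflow would use conditional simulations honoring the measured sample values):

```python
import numpy as np

def exceedance_probability_map(simulations, threshold):
    """Given equally likely simulations of shape (n_realizations, ny, nx),
    return the per-cell fraction of realizations exceeding the threshold."""
    sims = np.asarray(simulations, dtype=float)
    return (sims > threshold).mean(axis=0)

# Toy example: 4 realizations on a 2x2 grid, clean-up threshold of 35 (arbitrary units)
rng = np.random.default_rng(0)
sims = rng.uniform(0.0, 100.0, size=(4, 2, 2))
pmap = exceedance_probability_map(sims, 35.0)
```

Each cell of `pmap` is then directly interpretable as the probability that the parcel exceeds the clean-up or personnel-hazard level, which is what feeds the cost-based decision models mentioned in the abstract.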
Probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.
1994-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Probability distribution relationships
Directory of Open Access Journals (Sweden)
Yousry Abdelkader
2013-05-01
Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one concerns the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationships with the other distributions are shown in the fourth diagram.
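One of the limiting relationships such diagrams typically record is the Poisson limit of the binomial: as n grows with n*p held fixed, Binomial(n, p) converges to Poisson(n*p). A minimal numerical check of that edge of the network (variable names are illustrative):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Hold n*p = 2 fixed while n grows; the pointwise gap to Poisson(2) shrinks
lam = 2.0
gap_small = max(abs(binomial_pmf(k, 20, lam / 20) - poisson_pmf(k, lam))
                for k in range(10))
gap_large = max(abs(binomial_pmf(k, 2000, lam / 2000) - poisson_pmf(k, lam))
                for k in range(10))
```

Here `gap_large` is orders of magnitude smaller than `gap_small`, consistent with the classical bound that the total variation distance is at most n*p^2.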
On density forecast evaluation
Diks, C.
2008-01-01
Traditionally, probability integral transforms (PITs) have been popular means for evaluating density forecasts. For an ideal density forecast, the PITs should be uniformly distributed on the unit interval and independent. However, this is only a necessary condition, and not a sufficient one, as
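The PIT check described above is straightforward to sketch: pass each observation through the forecast CDF and test whether the resulting values look Uniform(0, 1). A minimal illustration for a correctly specified Gaussian forecast (the setup is hypothetical, and uniformity alone is, as the abstract notes, only a necessary condition):

```python
import random
from statistics import NormalDist

# Draw observations from N(0, 1) and compute PITs under the (correct) N(0, 1) forecast
random.seed(42)
forecast = NormalDist(mu=0.0, sigma=1.0)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]
pits = [forecast.cdf(x) for x in data]

# For an ideal forecast the PITs are i.i.d. Uniform(0, 1):
# the sample mean should be near 1/2 and the sample variance near 1/12
mean_pit = sum(pits) / len(pits)
var_pit = sum((u - mean_pit) ** 2 for u in pits) / len(pits)
```

A misspecified forecast (e.g. too narrow a predictive density) would instead pile PIT values up near 0 and 1, which these moment checks, or a formal uniformity test, would flag.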
Probable maximum flood control
International Nuclear Information System (INIS)
DeGabriele, C.E.; Wu, C.L.
1991-11-01
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
The Antibiotic Prescribing Pathway for Presumed Urinary Tract Infections in Nursing Home Residents.
Kistler, Christine E; Zimmerman, Sheryl; Scales, Kezia; Ward, Kimberly; Weber, David; Reed, David; McClester, Mallory; Sloane, Philip D
2017-08-01
Due to the high rates of inappropriate antibiotic prescribing for presumed urinary tract infections (UTIs) in nursing home (NH) residents, we sought to examine the antibiotic prescribing pathway and the extent to which it agrees with the Loeb criteria; findings can suggest strategies for antibiotic stewardship. Chart review of 260 randomly-selected cases from 247 NH residents treated with an antibiotic for a presumed UTI in 31 NHs in North Carolina. We examined the prescribing pathway from presenting illness, to the prescribing event, illness work-up and subsequent clinical events including emergency department use, hospitalization, and death. Analyses described the decision-making processes and outcomes and compared decisions made with Loeb criteria for initiation of antibiotics. Of 260 cases, 60% had documented signs/symptoms of the presenting illness and 15% met the Loeb criteria. Acute mental status change was the most commonly documented sign/symptom (24%). NH providers (81%) were the most common prescribers and ciprofloxacin (32%) was the most commonly prescribed antibiotic. Fourteen percent of presumed UTI cases included a white blood cell count, 71% included a urinalysis, and 72% had a urine culture. Seventy-five percent of cultures grew at least one organism with ≥100,000 colony-forming units/milliliter and 12% grew multi-drug resistant organisms; 28% of antibiotics were prescribed for more than 7 days, and 7% of cases had a subsequent death, emergency department visit, or hospitalization within 7 days. Non-specific signs/symptoms appeared to influence prescribing more often than urinary tract-specific signs/symptoms. Prescribers rarely stopped antibiotics, and a minority prescribed for overly long periods. Providers may need additional support to guide the decision-making process to reduce antibiotic overuse and antibiotic resistance. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
Guandalini, Adolfo; Di Girolamo, Nicola; Santillo, Daniele; Andreani, Valentina; Corvi, Roberta; Bandini, Marina; Peruccio, Claudio
2017-09-01
To describe the epidemiology and the types of eye disorders that are presumed to be inherited (PIED) in three large Italian dog breeds. Three large Italian dog breeds: Neapolitan Mastiff (FCI code: 197), Maremma Sheepdog (FCI code: 201), and Italian Corso dog (FCI code: 343). All dogs that underwent a complete ophthalmic examination between 1992 and 2012 were included in this prospective observational study. The prevalence of eye disorders with 95% confidence intervals was reported for presumed healthy dogs and for dogs referred to a veterinary center for an ophthalmic consultation. Univariate and multivariate logistic regression techniques were used to generate odds ratios. Of 605 dogs examined during the study period, 351 dogs were affected by at least one PIED (58%; 95% CI: 54-62%). The prevalence of PIED was significantly lower in dogs presented for ophthalmic examination (53.8%) as compared to presumed healthy dogs (62.2%)(OR: 1.4; 95% CI: 1.02-1.9; P = 0.037). Also after multivariate adjustment for the period of observation, the odds of Neapolitan Mastiff (92.1%; OR: 21.4; 95% CI: 11.1-41.4) and of Cane Corso (57.7%; OR: 2.5; 95% CI: 1.7-3.6) suffering a PIED were greater than the Maremma Sheepdog (35.4%). The most common PIED in each breed were entropion (24.3% of all the PIED) in the Neapolitan Mastiff, ectropion (36.6%) in the Corso dog, and cataract (27.9%) in the Maremma Sheepdog. Clinicians should be aware that three large Italian dog breeds frequently suffer PIED. Breed standards should be reconsidered, and breeding programs should be directed at limiting such disorders. © 2016 American College of Veterinary Ophthalmologists.
Path probabilities of continuous time random walks
International Nuclear Information System (INIS)
Eule, Stephan; Friedrich, Rudolf
2014-01-01
Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
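A CTRW of the kind analyzed above is specified by its waiting-time distribution and the jump propagator of the underlying random walk. A minimal Monte Carlo sketch with heavy-tailed (Pareto) waiting times and symmetric unit jumps; the parameter choices are illustrative, not from the paper:

```python
import random

def ctrw_position(t_max, alpha, rng):
    """Simulate one CTRW path up to time t_max: Pareto-distributed waiting
    times (tail exponent alpha) separate +/-1 jumps; return the final position."""
    t, x = 0.0, 0
    while True:
        wait = rng.paretovariate(alpha)  # waiting time before the next jump
        if t + wait > t_max:
            return x  # walker is still waiting at t_max
        t += wait
        x += rng.choice((-1, 1))

# An ensemble of independent walkers; the symmetric jump law keeps the mean near 0
positions = [ctrw_position(100.0, 1.5, random.Random(seed)) for seed in range(500)]
mean_pos = sum(positions) / len(positions)
```

Histogramming `positions` at several values of `t_max` gives an empirical estimate of the propagator; the path-probability formalism of the paper expresses the same object analytically via the waiting-time density and a Dyson equation.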
Collision warning system based on probability density functions
Broek, T.H.A. van den; Ploeg, J.
2010-01-01
In this paper, a collision warning method between the host vehicle and target object(s) is studied. A probabilistic collision warning method is proposed, which is, in particular, useful for objects, e.g. vulnerable road users, whose trajectories can rapidly change heading and/or velocity with
On the probability density interpretation of smoothed Wigner functions
International Nuclear Information System (INIS)
De Aguiar, M.A.M.; Ozorio de Almeida, A.M.
1990-01-01
It has been conjectured that the averages of the Wigner function over phase space volumes, larger than those of minimum uncertainty, are always positive. This is true for Gaussian averaging, so that the Husimi distribution is positive. However, we provide a specific counterexample for the averaging with a discontinuous hat function. The analysis of the specific system of a one-dimensional particle in a box also elucidates the respective advantages of the Wigner and the Husimi functions for the study of the semiclassical limit. The falsification of the averaging conjecture is shown not to depend on the discontinuities of the hat function, by considering the latter as the limit of a sequence of analytic functions. (author)
Indoor Localization with Probability Density Functions based on Bluetooth
Wendlandt, Kai; Robertson, Patrick; Berbig, Marcus
2005-01-01
We present a simple system to help people navigate inside of buildings or even in outside areas close to buildings. It is based on the “RSSI” and “Transmit power” data of an established Bluetooth link. The system is in principle sufficient for the intended application (pedestrian, indoor), but it is certainly not a high resolution indoor location system. The achievable accuracy is dependent on the setup (number of access points and their constellation and available Bluetooth devices) but will...
Classical-Quantum Correspondence by Means of Probability Densities
Vegas, Gabino Torres; Morales-Guzman, J. D.
1996-01-01
Within the frame of the recently introduced phase space representation of nonrelativistic quantum mechanics, we propose a Lagrangian from which the phase space Schrödinger equation can be derived. From that Lagrangian, the associated conservation equations, according to Noether's theorem, are obtained. This shows that one can analyze quantum systems completely in phase space as is done in coordinate space, without additional complications.
METAPHOR: Probability density estimation for machine learning based photometric redshifts
Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2017-06-01
We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but offering the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDFs. We present here the results of a validation test of the workflow on galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template-fitting method (Le Phare).
Goal-Oriented Probability Density Function Methods for Uncertainty Quantification
2015-12-11
approximations or data-driven approaches. We investigated the accuracy of analytical techniques based on Kubo-Van Kampen operator cumulant expansions for Langevin equations driven by fractional Brownian motion and other noises.
On the Probability Density Functions of Forster-Greer-Thorbecke ...
African Journals Online (AJOL)
Distributional properties of poverty indices are generally unknown because statistical inference for poverty measures is mostly ignored in the field of poverty analysis, where attention is usually focused on identification and aggregation problems. This study considers the possibility of using the Pearson system of ...
Santos, Cleusa C.; Feitosa, Fabiana G.; Ribeiro, Maria C.; Menge, Paulo; Lira, Izabelle M.
2017-01-01
Objective To report the echocardiographic evaluation of 103 infants with presumed congenital Zika syndrome. Methods An observational retrospective study was performed at Instituto de Medicina Integral Prof. Fernando Figueira (IMIP), Recife, Brazil. 103 infants with presumed congenital Zika syndrome. All infants had microcephaly and head computed tomography findings compatible with congenital Zika syndrome. Zika IgM antibody was detected in cerebrospinal fluid samples of 23 infants. In 80 infants, the test was not performed because it was not available at that time. All infants had negative serology for HIV, syphilis, rubella, cytomegalovirus and toxoplasmosis. A complete transthoracic two-dimensional, M-mode, continuous wave and pulsed wave Doppler and color Doppler echocardiographic (PHILIPS HD11XE or HD15) examination was performed on all infants. Results 14/103 (13.5%) echocardiograms were compatible with congenital heart disease: 5 with an ostium secundum atrial septal defect, 8 had a hemodynamically insignificant small apical muscular ventricular septal defect, and one infant with dyspnea had a large membranous ventricular septal defect. The echocardiograms considered normal included 45 infants with a persistent foramen ovale and 16 with a minimal patent ductus arteriosus. Conclusions Preliminarily, this study suggests that congenital Zika syndrome may be associated with an increased prevalence of congenital heart disease. However, the types of defects noted were septal defects, a proportion of which would not be hemodynamically significant. PMID:28426680
Directory of Open Access Journals (Sweden)
Danielle Di Cavalcanti
Full Text Available To report the echocardiographic evaluation of 103 infants with presumed congenital Zika syndrome. An observational retrospective study was performed at Instituto de Medicina Integral Prof. Fernando Figueira (IMIP), Recife, Brazil. 103 infants with presumed congenital Zika syndrome. All infants had microcephaly and head computed tomography findings compatible with congenital Zika syndrome. Zika IgM antibody was detected in cerebrospinal fluid samples of 23 infants. In 80 infants, the test was not performed because it was not available at that time. All infants had negative serology for HIV, syphilis, rubella, cytomegalovirus and toxoplasmosis. A complete transthoracic two-dimensional, M-mode, continuous wave and pulsed wave Doppler and color Doppler echocardiographic (PHILIPS HD11XE or HD15) examination was performed on all infants. 14/103 (13.5%) echocardiograms were compatible with congenital heart disease: 5 with an ostium secundum atrial septal defect, 8 had a hemodynamically insignificant small apical muscular ventricular septal defect, and one infant with dyspnea had a large membranous ventricular septal defect. The echocardiograms considered normal included 45 infants with a persistent foramen ovale and 16 with a minimal patent ductus arteriosus. Preliminarily, this study suggests that congenital Zika syndrome may be associated with an increased prevalence of congenital heart disease. However, the types of defects noted were septal defects, a proportion of which would not be hemodynamically significant.
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-05-01
Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
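A toy Bayesian calculation illustrating the distinction the abstract draws: the posterior probability of a hypothesis and the predictive probability of the next instance are different numbers. The two coin hypotheses, priors, and observed counts are invented for the example.

```python
from fractions import Fraction

def posterior_and_prediction(k, n):
    """Two hypotheses about a coin, with equal priors:
    H1: fair (p = 1/2);  H2: biased (p = 3/4).
    After k heads in n tosses, return P(H2 | data) and
    P(next toss = heads | data)."""
    like1 = Fraction(1, 2) ** n                       # likelihood under H1
    like2 = Fraction(3, 4) ** k * Fraction(1, 4) ** (n - k)  # under H2
    post2 = like2 / (like1 + like2)                   # posterior of H2
    # predictive probability: mixture of the two hypotheses' rates
    pred = (1 - post2) * Fraction(1, 2) + post2 * Fraction(3, 4)
    return post2, pred

post2, pred = posterior_and_prediction(6, 8)
print(f"P(H2|data) = {float(post2):.3f}, P(next=heads) = {float(pred):.3f}")
```

With 6 heads in 8 tosses, the probability assigned to the biased-coin hypothesis (about 0.74) does not equal the probability assigned to the next instance (about 0.69), which is exactly the conceptual gap the paper discusses.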
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
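COVAL performs this transformation numerically; as a rough stand-in, the same question can be answered by Monte Carlo sampling. The load/strength safety-margin example and all distribution parameters below are invented for illustration, not taken from the code's documentation.

```python
import random
import statistics

def distribution_of_function(f, samplers, n=100_000, seed=0):
    """Approximate the distribution of f(X1, ..., Xk) by Monte Carlo,
    given samplers for the (independent) input variables.

    A sampling stand-in for a numerical transformation of the input
    probability distributions, as in reliability analysis."""
    rng = random.Random(seed)
    return [f(*(s(rng) for s in samplers)) for _ in range(n)]

# Example: safety margin = strength - load, both normally distributed.
samples = distribution_of_function(
    lambda s, l: s - l,
    [lambda r: r.gauss(10.0, 1.0),   # structural strength
     lambda r: r.gauss(6.0, 2.0)],   # random load
)
p_fail = sum(m < 0 for m in samples) / len(samples)
print(f"mean margin = {statistics.mean(samples):.2f}, P(failure) ~ {p_fail:.4f}")
```

Here the margin is itself normal, N(4, 5), so the empirical failure probability can be checked against the exact value of about 0.037.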
Learning Grasp Affordance Densities
DEFF Research Database (Denmark)
Detry, Renaud; Kraft, Dirk; Kroemer, Oliver
2011-01-01
We address the issue of learning and representing object grasp affordance models. We model grasp affordances with continuous probability density functions (grasp densities) which link object-relative grasp poses to their success probability. The underlying function representation is nonparametric and relies on kernel density estimation to provide a continuous model. Grasp densities are learned and refined from exploration, by letting a robot “play” with an object in a sequence of grasp-and-drop actions: the robot uses visual cues to generate a set of grasp hypotheses; it then executes these and records their outcomes. When a satisfactory number of grasp data is available, an importance-sampling algorithm turns these into a grasp density. We evaluate our method in a largely autonomous learning experiment run on three objects of distinct shapes. The experiment shows how learning increases success...
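A one-dimensional kernel-density-estimation sketch of the nonparametric representation described above. The sample "grasp angles" and the bandwidth are invented; the actual grasp densities live in a 6-D pose space.

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a function x -> estimated density, using a Gaussian kernel
    centered on each observed sample."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return density

# Hypothetical successful grasp angles (radians), clustered near 0.5.
pdf = gaussian_kde([0.45, 0.5, 0.52, 0.55, 0.9], bandwidth=0.1)
print(f"density at 0.5: {pdf(0.5):.2f}, at 1.5: {pdf(1.5):.6f}")
```

The estimate is high where successful grasps cluster and falls smoothly to near zero elsewhere, which is what makes such densities usable for sampling new grasp candidates.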
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
Stotler, D.P.; Goldston, R.J.
1989-09-01
A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
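A toy Monte Carlo in the spirit of the study above: sample uncertain physics parameters from assumed distributions and count the fraction of cases exceeding a performance threshold. The parameter names, distributions, the gain formula, and the threshold are all invented placeholders, not the CIT power-balance model.

```python
import random

def ignition_probability(n=50_000, seed=42):
    """Estimate a 'probability of ignition' by sampling uncertain inputs.

    All quantities below are illustrative placeholders: a confinement
    multiplier, a density-profile peaking factor, and a made-up gain Q."""
    rng = random.Random(seed)
    ignitions = 0
    for _ in range(n):
        h_factor = rng.gauss(1.0, 0.15)     # confinement multiplier (assumed)
        peaking = rng.uniform(1.0, 2.5)     # density peaking factor (assumed)
        q = 5.0 * max(h_factor, 0.0) ** 2 * peaking   # placeholder gain
        if q >= 20.0:                       # placeholder "ignition" criterion
            ignitions += 1
    return ignitions / n

print(f"P(ignition) ~ {ignition_probability():.3f}")
```

The point of the construction is that the output is a probability over the joint uncertainty of the inputs, not a single deterministic prediction.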
Geometric modeling in probability and statistics
Calin, Ovidiu
2014-01-01
This book covers topics of Informational Geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Young, John; Peacock, Sheila
2016-04-01
The year 1996 has particular significance for forensic seismologists. This was the year when the Comprehensive Test Ban Treaty (CTBT) was signed in September at the United Nations, setting an international norm against nuclear testing. Blacknest, as a long time seismic centre for research into detecting and identifying underground explosions using seismology, provided significant technical advice during the CTBT negotiations. Since 1962 seismic recordings of both presumed nuclear explosions and earthquakes from the four seismometer arrays Eskdalemuir, Scotland (EKA), Yellowknife, Canada (YKA), Gauribidanur, India (GBA), and Warramunga, Australia (WRA) have been copied, digitised, and saved. There was a possibility this archive would be lost. It was decided to process the records and catalogue them for distribution to other groups and institutions. This work continues at Blacknest but the archive is no longer under threat. In addition much of the archive of analogue tape recordings has been re-digitised with modern equipment, allowing sampling rates of 100 rather than 20 Hz.
Kim, Su Sun; Kim, Kyung Up; Kim, Sung Jun; Seo, Seung In; Kim, Hyoung Su; Jang, Myoung Kuk; Kim, Hak Yang; Shin, Woon Geon
2017-12-15
Selecting patients with an urgent need for endoscopic hemostasis is difficult based only on simple parameters of presumed acute upper gastrointestinal bleeding. This study assessed easily applicable factors to predict cases in need of urgent endoscopic hemostasis due to acute upper gastrointestinal bleeding. The consecutively included patients were divided into the endoscopic hemostasis and nonendoscopic hemostasis groups. We reviewed the enrolled patients' medical records and analyzed various variables and parameters for acute upper gastrointestinal bleeding outcomes such as demographic factors, comorbidities, symptoms, signs, laboratory findings, rebleeding rate, and mortality to evaluate simple predictive factors for endoscopic treatment. A total of 613 patients were analyzed, including 329 patients in the endoscopic hemostasis and 284 patients in the non-endoscopic hemostasis groups. In the multivariate analysis, a bloody nasogastric lavage (adjusted odds ratio [AOR], 6.786; 95% confidence interval [CI], 3.990 to 11.543; p upper gastrointestinal bleeding.
Energy Technology Data Exchange (ETDEWEB)
Yang, Jeannie C.; Ostlie, Daniel J. [Children's Mercy Hospital, Department of Surgery, Kansas City, MO (United States); Rivard, Douglas C.; Morello, Frank P. [Children's Mercy Hospital, Department of Radiology, Kansas City, MO (United States)
2008-08-15
A Meckel diverticulum is an embryonic remnant of the omphalomesenteric duct that occurs in approximately 2% of the population. Most are asymptomatic; however, they are vulnerable to inflammation with subsequent consequences including diverticulitis and perforation. We report an 11-year-old boy who underwent laparoscopic appendectomy for perforated appendicitis at an outside institution. During his convalescence he underwent percutaneous drainage of a presumed postoperative abscess. A follow-up drain study demonstrated an enteric fistula. The drain was slowly removed from the abdomen over a period of 1 week. Three weeks following drain removal the patient reported recurrent nausea and abdominal pain. A CT scan demonstrated a 3.7-cm rim-enhancing air-fluid level with dependent contrast consistent with persistent enteric fistula and abscess. Exploratory laparoscopy was performed, at which time a Meckel diverticulum was identified and resected. This case highlights the diagnostic challenge and limitations of conventional radiology in complicated Meckel diverticulum. (orig.)
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
On the shake-off probability for atomic systems
Energy Technology Data Exchange (ETDEWEB)
Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)
2016-07-15
Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work is the relationship between shake-off probabilities, target atomic number and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence-shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.
Undiagnosed and comorbid disorders in patients with presumed chronic fatigue syndrome.
Mariman, An; Delesie, Liesbeth; Tobback, Els; Hanoulle, Ignace; Sermijn, Erica; Vermeir, Peter; Pevernagie, Dirk; Vogelaers, Dirk
2013-11-01
To assess undiagnosed and comorbid disorders in patients referred to a tertiary care center with a presumed diagnosis of chronic fatigue syndrome (CFS). Patients referred for chronic unexplained fatigue entered an integrated diagnostic pathway, including internal medicine assessment, psychodiagnostic screening, physiotherapeutic assessment and polysomnography+multiple sleep latency testing. Final diagnosis resulted from a multidisciplinary team discussion. Fukuda criteria were used for the diagnosis of CFS, DSM-IV-TR criteria for psychiatric disorders, ICSD-2 criteria for sleep disorders. Out of 377 patients referred, 279 (74.0%) were included in the study [84.9% female; mean age 38.8 years (SD 10.3)]. A diagnosis of unequivocal CFS was made in 23.3%. In 21.1%, CFS was associated with a sleep disorder and/or psychiatric disorder, not invalidating the diagnosis of CFS. A predominant sleep disorder was found in 9.7%, 19.0% had a psychiatric disorder and 20.8% a combination of both. Only 2.2% was diagnosed with a classical internal disease. In the total sample, a sleep disorder was found in 49.8%, especially obstructive sleep apnea syndrome, followed by psychophysiologic insomnia and periodic limb movement disorder. A psychiatric disorder was diagnosed in 45.2%; mostly mood and anxiety disorder. A multidisciplinary approach to presumed CFS yields unequivocal CFS in only a minority of patients, and reveals a broad spectrum of exclusionary or comorbid conditions within the domains of sleep medicine and psychiatry. These findings favor a systematic diagnostic approach to CFS, suitable to identify a wide range of diagnostic categories that may be subject to dedicated care. © 2013. Published by Elsevier Inc. All rights reserved.
Consenting options for posthumous organ donation: presumed consent and incentives are not favored
Directory of Open Access Journals (Sweden)
Hammami Muhammad M
2012-11-01
Full Text Available Abstract Background Posthumous organ procurement is hindered by the consenting process. Several consenting systems have been proposed. There is limited information on public relative attitudes towards various consenting systems, especially in Middle Eastern/Islamic countries. Methods We surveyed 698 Saudi adults attending outpatient clinics at a tertiary care hospital. Preference and perception of norm regarding consenting options for posthumous organ donation were explored. Participants ranked (1, most agreeable) the following, randomly-presented, options from 1 to 11: no-organ-donation, presumed consent, informed consent by donor-only, informed consent by donor-or-surrogate, and mandatory choice; the last three options ± medical or financial incentive. Results Mean(SD) age was 32(9) years, 27% were males, 50% were patients’ companions, 60% had ≥ college education, and 20% and 32%, respectively, knew an organ donor or recipient. Mandated choice was among the top three choices for preference of 54% of respondents, with an overall median[25%,75%] ranking score of 3[2,6], and was preferred over donor-or-surrogate informed consent (4[2,7], p < 0.001), donor-only informed consent (5[3,7], p < 0.001), and presumed consent (7[3,10], p < 0.001). Distribution of ranking scores of perception of norm and preference were similar except for no-organ-donation (11[7,11] vs. 11[6,11], respectively, p = 0.002). Compared to females, males more perceived donor-or-surrogate informed consent as the norm (3[1,6] vs. 5[3,7], p < 0.001) and more preferred the mandated choice with financial incentive option (6[3,8] vs. 8[4,9], p < 0.001). Conclusions We conclude that: 1) most respondents were in favor of posthumous organ donation, 2) the mandated choice system was the most preferred and the presumed consent system was the least preferred, 3) there was no difference between preference and perception of norm in consenting-system ranking, and 4) financial (especially in females) and medical (especially in males) incentives reduced preference.
Jump probabilities in the non-Markovian quantum jump method
International Nuclear Information System (INIS)
Haerkoenen, Kari
2010-01-01
The dynamics of a non-Markovian open quantum system described by a general time-local master equation is studied. The propagation of the density operator is constructed in terms of two processes: (i) deterministic evolution and (ii) evolution of a probability density functional in the projective Hilbert space. The analysis provides a derivation for the jump probabilities used in the recently developed non-Markovian quantum jump (NMQJ) method (Piilo et al 2008 Phys. Rev. Lett. 100 180402).
The probability of a tornado missile hitting a target
International Nuclear Information System (INIS)
Goodman, J.; Koch, J.E.
1983-01-01
It is shown that tornado missile transportation is a diffusion Markovian process. Therefore, the Green's function method is applied for the estimation of the probability of hitting a unit target area. This probability is expressed through a joint density of tornado intensity and path area, a probability of tornado missile injection, and a tornado missile height distribution. (orig.)
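The factors named in the abstract can be composed into a per-tornado hit probability. The sketch below does this with a discretized intensity distribution; every number (intensity class probabilities, path areas, injection and height probabilities, the 1 m² target) is an invented placeholder, not the paper's data or method.

```python
def hit_probability():
    """Compose intensity distribution, path area, missile injection
    probability, and height distribution into a probability of a
    tornado missile striking a 1 m^2 target (all values assumed)."""
    target_area = 1.0   # m^2
    # (P(intensity class), path area m^2, P(injection), P(height reaches target))
    classes = [
        (0.70, 2e5, 1e-3, 0.5),
        (0.25, 5e5, 3e-3, 0.6),
        (0.05, 2e6, 1e-2, 0.7),
    ]
    return sum(p_i * p_inj * p_h * target_area / area
               for p_i, area, p_inj, p_h in classes)

print(f"P(hit per tornado) ~ {hit_probability():.2e}")
```

The structure mirrors the abstract's decomposition: a sum (integral) over intensity of the product of injection probability, height-distribution mass, and the target-to-path area ratio.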
Effects of Potential Lane-Changing Probability on Uniform Flow
International Nuclear Information System (INIS)
Tang Tieqiao; Huang Haijun; Shang Huayan
2010-01-01
In this paper, we use the car-following model with the anticipation effect of the potential lane-changing probability (Acta Mech. Sin. 24 (2008) 399) to investigate the effects of the potential lane-changing probability on uniform flow. The analytical and numerical results show that the potential lane-changing probability can enhance the speed and flow of uniform flow and that their increments are related to the density.
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution that is the most relevant distribution applied to statistical analysis.
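The normal distribution mentioned above can be illustrated directly with the standard library: its density, its cumulative distribution function via the error function, and the familiar two-sigma rule.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# About 95% of the probability mass lies within two standard
# deviations of the mean.
p = normal_cdf(2) - normal_cdf(-2)
print(f"P(-2 <= Z <= 2) = {p:.4f}")
```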
Consenting options for posthumous organ donation: presumed consent and incentives are not favored
2012-01-01
Background Posthumous organ procurement is hindered by the consenting process. Several consenting systems have been proposed. There is limited information on public relative attitudes towards various consenting systems, especially in Middle Eastern/Islamic countries. Methods We surveyed 698 Saudi Adults attending outpatient clinics at a tertiary care hospital. Preference and perception of norm regarding consenting options for posthumous organ donation were explored. Participants ranked (1, most agreeable) the following, randomly-presented, options from 1 to 11: no-organ-donation, presumed consent, informed consent by donor-only, informed consent by donor-or-surrogate, and mandatory choice; the last three options ± medical or financial incentive. Results Mean(SD) age was 32(9) year, 27% were males, 50% were patients’ companions, 60% had ≥ college education, and 20% and 32%, respectively, knew an organ donor or recipient. Mandated choice was among the top three choices for preference of 54% of respondents, with an overall median[25%,75%] ranking score of 3[2,6], and was preferred over donor-or-surrogate informed consent (4[2,7], p < 0.001), donor-only informed consent (5[3,7], p < 0.001), and presumed consent (7[3,10], p < 0.001). The addition of a financial or medical incentive, respectively, reduced ranking of mandated choice to 7[4,9], p < 0.001, and 5[3,8], p < 0.001; for donor-or-surrogate informed consent to 7[5,9], p < 0.001, and 5[3,7], p = 0.004; and for donor-only informed consent to 8[6,10], p < 0.001, and 5[3,7], p = 0.56. Distribution of ranking score of perception of norm and preference were similar except for no-organ donation (11[7,11] vs. 11[6,11], respectively, p = 0.002). Compared to females, males more perceived donor-or-surrogate informed consent as the norm (3[1,6] vs. 5[3,7], p < 0.001), more preferred mandated choice with financial incentive option (6[3,8] vs. 8[4,9], p < 0.001), and
Prediction and probability in sciences
International Nuclear Information System (INIS)
Klein, E.; Sacquin, Y.
1998-01-01
This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, by way of cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Probability measures, Lévy measures and analyticity in time
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich
2008-01-01
We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators, we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...
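The first method mentioned, approximating a subordinator by compound Poisson distributions, can be sketched as follows for the Gamma subordinator with Lévy density ν(x) = e^(-x)/x (unit parameters assumed). Jumps below a cutoff are replaced by their mean drift; larger jumps are kept with a Poisson number of occurrences and sizes drawn from the truncated Lévy density tabulated on a grid. The cutoff, grid, and sample sizes are illustrative choices.

```python
import math
import random

# Gamma subordinator: Levy density nu(x) = exp(-x)/x (unit parameters).
EPS, X_MAX, GRID = 0.05, 30.0, 2000
DX = (X_MAX - EPS) / GRID
XS = [EPS + i * DX for i in range(GRID)]
WEIGHTS = [math.exp(-x) / x * DX for x in XS]    # truncated Levy measure
RATE = sum(WEIGHTS)                              # jump intensity of approximation
CDF = []
_acc = 0.0
for w in WEIGHTS:
    _acc += w
    CDF.append(_acc / RATE)
DRIFT = 1.0 - math.exp(-EPS)   # mean of discarded small jumps, per unit time

def sample_increment(t, rng):
    """Draw X_t from the compound-Poisson-plus-drift approximation."""
    # Poisson(RATE * t) number of jumps, by inversion
    n_jumps, p = 0, math.exp(-RATE * t)
    cum, u = p, rng.random()
    while u > cum:
        n_jumps += 1
        p *= RATE * t / n_jumps
        cum += p
    total = DRIFT * t
    for _ in range(n_jumps):
        r = rng.random()
        lo, hi = 0, GRID - 1          # binary search in the tabulated CDF
        while lo < hi:
            mid = (lo + hi) // 2
            if CDF[mid] < r:
                lo = mid + 1
            else:
                hi = mid
        total += XS[lo]
    return total

rng = random.Random(3)
samples = [sample_increment(2.0, rng) for _ in range(4000)]
mean = sum(samples) / len(samples)
print(f"sample mean ~ {mean:.2f}  (exact Gamma(shape=2, rate=1) mean is 2)")
```

The sample mean approaches the exact Gamma(t, 1) mean of t, showing that the compound Poisson approximation recovers the subordinator's semigroup density in distribution as the cutoff shrinks.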
Probability Measures, Lévy Measures, and Analyticity in Time
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich
We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similarly to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Directory of Open Access Journals (Sweden)
Francisco Alves Filho
2015-08-01
Full Text Available This article aims to present how the image of the presumed reader has been built in the printed newspapers of the city of Teresina (Piauí State, Brazil), namely: O Piauí, Folha da Manhã, O Dia, and Diário do Povo. In order to do so, we have researched the bibliography on the studies of authors such as Charles Bazerman (2011), Carolyn Miller (2009 [1984]), and Amy Devitt (2004), representatives of Rhetorical Genre Studies. Besides these, we have consulted Bakhtin's assumptions for studying genre, mainly extracted from Marxism and the Philosophy of Language and Aesthetics of Verbal Creation, as well as Brazilian authors related to the study of discursive genres, such as Fiorin (2008), Marcuschi (2008), and Faraco (2009). With the research, we could confirm that the advertisements actually build social-historical blocks capable of mirroring the values of a society, but also capable of contributing directly to certain values being restored or forgotten.
ISC origin times for announced and presumed underground nuclear explosions at several test sites
International Nuclear Information System (INIS)
Rodean, H.C.
1979-01-01
Announced data for US and French underground nuclear explosions indicate that nearly all detonations have occurred within one or two tenths of a second after the minute. This report contains ISC origin-time data for announced explosions at two US test sites and one French test site, and includes similar data for presumed underground nuclear explosions at five Soviet sites. Origin-time distributions for these sites are analyzed for those events that appeared to be detonated very close to the minute. Particular attention is given to the origin times for the principal US and Soviet test sites in Nevada and Eastern Kazakhstan. The mean origin times for events at the several test sites range from 0.4 s to 2.8 s before the minute, with the earlier mean times associated with the Soviet sites and the later times with the US and French sites. These times indicate lower seismic velocities beneath the US and French sites, and higher velocities beneath the sites in the USSR. 9 figures, 8 tables
A discussion supporting presumed consent for posthumous sperm procurement and conception.
Tremellen, Kelton; Savulescu, Julian
2015-01-01
Conception of a child using cryopreserved sperm from a deceased man is generally considered ethically sound provided explicit consent for its use has been made, thereby protecting the man's autonomy. When death is sudden (trauma, unexpected illness), explicit consent is not possible, thereby preventing posthumous sperm procurement (PSP) and conception according to current European Society of Human Reproduction and Embryology and the American Society for Reproductive Medicine guidelines. Here, we argue that autonomy of a deceased person should not be considered the paramount ethical concern, but rather consideration of the welfare of the living (widow and prospective child) should be the primary focus. Posthumous conception can bring significant advantages to the widow and her resulting child, with most men supporting such practice. We suggest that a deceased man can benefit from posthumous conception (continuation of his 'bloodline', allowing his widow's wishes for a child to be satisfied), and has a moral duty to allow his widow access to his sperm, if she so wishes, unless he clearly indicated that he did not want children when alive. We outline the arguments favouring presumed consent over implied or proxy consent, plus practical considerations for recording men's wishes to opt-out of posthumous conception. Copyright © 2014 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Evidence for presumable feline origin of sporadic G6P[9] rotaviruses in humans.
Pietsch, Corinna; Liebert, Uwe G
2018-05-31
Species A rotaviruses are highly diverse and impose a substantial burden on human and animal health. Interspecies transmission between livestock, domestic animals and humans is commonly observed, but spread of animal-like rotaviruses within the human population is limited. During the continued monitoring of rotavirus strains in Germany, an unusual G6P[9] rotavirus strain was detected in the feces of a child. The complete rotavirus coding sequences revealed a unique G6-P[9]-I2-R2-C2-M2-A3-N2-T3-E2-H3 genotype constellation. The virus was phylogenetically related to feline G3P[9] strains and other human G6P[9] rotaviruses of presumable zoonotic origin. Analysis of the primer binding sites used for G6-specific genotyping revealed further evidence of a feline G6P[9] reservoir. Moreover, substantial deficits of conventional semi-nested PCR genotyping approaches in detecting contemporary G6P[9] strains were revealed. Rotavirus strain GER29-14 most likely resulted from a direct or recent interspecies transmission from a cat to a human. Further studies could assess nucleic acid sequences and genotype constellations of feline rotaviruses to confirm the likely feline origin of sporadic human G6P[9] strains. Copyright © 2018 Elsevier B.V. All rights reserved.
Presumed congenital infection by Zika virus: findings on psychomotor development - a case report
Directory of Open Access Journals (Sweden)
Ana Carla Gomes Botelho
Full Text Available Abstract Introduction: the identification of Zika virus (ZikV) in amniotic fluid, in the placenta, and in newborns' brains suggests a neurotropism of this agent during brain development, resulting in neuropsychomotor alterations. This study therefore reports the assessment of children diagnosed with congenital infection, presumably by ZikV, followed up at the Rehabilitation Center Prof. Ruy Neves Baptist of the Instituto de Medicina Integral Prof. Fernando Figueira (IMIP). Description: as proposed by the Ministry of Health, the following instruments were used to evaluate the neuromotor functions of four children with microcephaly aged between three and four months: the Test of Infant Motor Performance (TIMP), the functional vision assessment, the manual function development scale, and the clinical evaluation protocol for pediatric dysphagia (PAD-PED). Discussion: the children evaluated presented atypical motor performance; muscle tone and spontaneous motricity, including the symmetry and range of motion of the upper and lower limbs, were found to be altered. Functional vision showed alterations that may limit the performance of functional activities and the learning process. Regarding orofacial functions, the maturation and coordination of sucking, swallowing, and breathing had not yet reached the level expected for the children's age.
Excimer Laser Phototherapeutic Keratectomy for the Treatment of Clinically Presumed Fungal Keratitis
Directory of Open Access Journals (Sweden)
Liang-Mao Li
2014-01-01
Full Text Available This retrospective study evaluated the treatment outcomes of excimer laser phototherapeutic keratectomy (PTK) for clinically presumed fungal keratitis. Forty-seven eyes of 47 consecutive patients underwent manual superficial debridement and PTK. All corneal lesions were located in the anterior stroma and had been resistant to medical therapy for at least one week. Data were collected by retrospective chart review, with at least six months of follow-up available. After PTK, the infected corneal lesions were completely removed and the clinical symptoms resolved in 41 cases (87.2%). The mean ablation depth was 114.39±45.51 μm and the mean ablation diameter was 4.06±1.07 mm. The mean time to healing of the epithelial defect was 8.8±5.6 days. Thirty-four eyes (82.9%) showed an improvement in best spectacle-corrected visual acuity of two or more lines. PTK complications included mild to moderate corneal haze, hyperopic shift, irregular astigmatism, and corneal thinning. Six eyes (12.8%) showed progression of infection and required conjunctival flap covering, amniotic membrane transplantation, or penetrating keratoplasty. PTK is a valuable therapeutic alternative for superficial infectious keratitis: it can effectively eradicate lesions, hasten reepithelialization, and restore and preserve useful visual function. However, surgical candidates should be selected carefully.
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: decomposition integral, superdecomposition integral, probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357 (2016). http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Approaches to Evaluating Probability of Collision Uncertainty
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
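The resampling idea can be illustrated with a toy encounter-plane model. All numbers below (miss distance, covariance scale, hard-body radius range, and their assumed distributions) are invented for illustration and do not come from the paper: each outer draw perturbs the uncertain inputs, and the resulting spread of Pc values replaces the single point estimate.

```python
import math
import random

def pc_point_estimate(miss, sigma, radius, n, rng):
    """Monte Carlo 2-D collision probability: the relative position in the
    encounter plane is Gaussian with mean (miss, 0) and isotropic standard
    deviation sigma; a 'hit' is a sample inside the combined hard-body radius."""
    hits = sum(1 for _ in range(n)
               if math.hypot(rng.gauss(miss, sigma), rng.gauss(0.0, sigma)) < radius)
    return hits / n

def pc_with_uncertainty(trials=200, n=5000, seed=1):
    """Resample the uncertain inputs (hard-body radius, covariance scale) to
    obtain a distribution of Pc values rather than a single point estimate."""
    rng = random.Random(seed)
    pcs = sorted(pc_point_estimate(250.0,                                 # miss distance, m
                                   100.0 * rng.lognormvariate(0.0, 0.2),  # covariance scale
                                   rng.uniform(8.0, 12.0),                # combined radius, m
                                   n, rng)
                 for _ in range(trials))
    return pcs[len(pcs) // 2], pcs[5], pcs[-6]   # median and a rough 95% band

median_pc, lo, hi = pc_with_uncertainty()
print(median_pc, lo, hi)
```

The interval (lo, hi) conveys how much of the apparent precision of a single Pc number survives the input uncertainties.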
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
High-density limit of quantum chromodynamics
International Nuclear Information System (INIS)
Alvarez, E.
1983-01-01
By means of a formal expansion of the partition function presumably valid at large baryon densities, the propagator of the quarks is expressed in terms of the gluon propagator. This result is interpreted as implying that correlations between quarks and gluons are unimportant at high enough density, so that a kind of mean-field approximation gives a very accurate description of the physical system
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
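As a rough reconstruction (not quoted from the paper), the Goldstein-Page condition assigns candidate probabilities directly from the class operators of a coarse-grained set of histories and requires only that they be non-negative:

```latex
p(\alpha) = \operatorname{Re}\operatorname{Tr}\left[ C_\alpha \, \rho \right] \;\ge\; 0
\quad \text{for all } \alpha,
\qquad \sum_\alpha C_\alpha = I .
```

The stronger condition of medium decoherence mentioned above additionally requires the off-diagonal terms to vanish, $\operatorname{Tr}[C_\alpha \rho\, C_\beta^\dagger] \approx 0$ for $\alpha \neq \beta$, which is why linearly positive sets of histories need not be decoherent.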
Chorioretinal Lesions Presumed Secondary to Zika Virus Infection in an Immunocompromised Adult.
Henry, Christopher R; Al-Attar, Luma; Cruz-Chacón, Alexis M; Davis, Janet L
2017-04-01
Zika virus has spread rapidly throughout the Americas since 2015. The public health implications of Zika virus infection lend special importance to identifying the virus in unsuspected hosts. To describe relevant imaging studies and clinical features of chorioretinal lesions that are presumably associated with Zika virus and that share analogous features with chorioretinal lesions reported in cases of Dengue fever and West Nile virus. This is a case report from an academic referral center in Miami, Florida, of a woman in her 60s from Guaynabo, Puerto Rico, who presented with reduced visual acuity and bilateral diffuse, subretinal, confluent, placoid, and multifocal chorioretinal lesions. The patient was observed over a 5-month period. Visual acuity, clinical course, and multimodal imaging study results. Fluorescein angiography revealed early hypofluorescence and late staining of the chorioretinal lesions. Optical coherence tomography demonstrated outer retinal disruption in the placoid macular lesions. Zika RNA was detected in a plasma sample by real-time reverse transcription polymerase chain reaction testing and was suspected to be the cause of chorioretinal lesions after other viral and infectious causes were ruled out. Three weeks after the onset of symptoms, the patient's visual acuity had improved to 20/60 OD and 20/25 OS, with intraocular pressures of 18 mm Hg OD and 19 mm Hg OS. In 6 weeks, the chorioretinal lesions had healed and visual acuity had improved to 20/25 OD and 20/20 OS. Follow-up optical coherence tomography demonstrated interval recovery of the outer retina and photoreceptors. Acute-onset, self-resolving, placoid, or multifocal nonnecrotizing chorioretinal lesions may be a feature of active Zika virus chorioretinitis, as reported in other Flavivirus infections in adults. Similar findings in potentially exposed adults suggest that clinicians should consider IgM antibody or polymerase chain reaction testing for Zika virus as well as diagnostic
Joint maximum-likelihood magnitudes of presumed underground nuclear test explosions
Peacock, Sheila; Douglas, Alan; Bowers, David
2017-08-01
Body-wave magnitudes (mb) of 606 seismic disturbances caused by presumed underground nuclear test explosions at specific test sites between 1964 and 1996 have been derived from station amplitudes collected by the International Seismological Centre (ISC), by a joint inversion for mb and station-specific magnitude corrections. A maximum-likelihood method was used to reduce the upward bias of network mean magnitudes caused by data censoring, where arrivals at stations that do not report arrivals are assumed to be hidden by the ambient noise at the time. Threshold noise levels at each station were derived from the ISC amplitudes using the method of Kelly and Lacoss, which fits to the observed magnitude-frequency distribution a Gutenberg-Richter exponential decay truncated at low magnitudes by an error function representing the low-magnitude threshold of the station. The joint maximum-likelihood inversion is applied to arrivals from the sites: Semipalatinsk (Kazakhstan) and Novaya Zemlya, former Soviet Union; Singer (Lop Nor), China; Mururoa and Fangataufa, French Polynesia; and Nevada, USA. At sites where eight or more arrivals could be used to derive magnitudes and station terms for 25 or more explosions (Nevada, Semipalatinsk and Mururoa), the resulting magnitudes and station terms were fixed and a second inversion carried out to derive magnitudes for additional explosions with three or more arrivals. 93 more magnitudes were thus derived. During processing for station thresholds, many stations were rejected for sparsity of data, obvious errors in reported amplitude, or great departure of the reported amplitude-frequency distribution from the expected left-truncated exponential decay. Abrupt changes in monthly mean amplitude at a station apparently coincide with changes in recording equipment and/or analysis method at the station.
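The censoring correction at the heart of the method can be sketched with a toy network. The normal reading model, the common threshold, and the station count below are assumptions for illustration, not values from the study: stations whose reading falls below the noise threshold report nothing, a plain network mean of the reporting stations is biased upward, and a censored maximum-likelihood fit removes the bias.

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ml_magnitude(observed, n_censored, threshold, sigma=0.3):
    """Maximum-likelihood network magnitude under data censoring.  Observed
    station magnitudes enter through the normal density; each silent station
    contributes the probability P(reading < threshold).  Grid search over
    candidate magnitudes keeps the sketch dependency-free."""
    def loglik(m):
        ll = sum(-0.5 * ((a - m) / sigma) ** 2 for a in observed)
        ll += n_censored * math.log(max(norm_cdf((threshold - m) / sigma), 1e-300))
        return ll
    grid = [4.0 + 0.005 * i for i in range(601)]      # magnitudes 4.0 .. 7.0
    return max(grid, key=loglik)

# Synthetic network: true magnitude 5.0, station scatter 0.3, and stations
# stay silent when their reading falls below a common threshold of 5.1.
rng = random.Random(7)
true_m, sigma, thr = 5.0, 0.3, 5.1
readings = [rng.gauss(true_m, sigma) for _ in range(200)]
observed = [a for a in readings if a >= thr]
n_cens = len(readings) - len(observed)
naive = sum(observed) / len(observed)          # biased upward by censoring
ml = ml_magnitude(observed, n_cens, thr, sigma)
print(round(naive, 2), round(ml, 2))
```

The naive mean lands well above the true magnitude, while the censored likelihood recovers it, which is the "reduce the upward bias of network mean magnitudes" step described above.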
Kobayashi, Atsushi; Parchi, Piero; Yamada, Masahito; Mohri, Shirou; Kitamoto, Tetsuyuki
2016-06-01
As an experimental model of acquired Creutzfeldt-Jakob disease (CJD), we performed transmission studies of sporadic CJD using knock-in mice expressing human prion protein (PrP). In this model, the inoculation of the sporadic CJD strain V2 into animals homozygous for methionine at polymorphic codon 129 (129 M/M) of the PRNP gene produced quite distinctive neuropathological and biochemical features, that is, widespread kuru plaques and intermediate type abnormal PrP (PrP(Sc)). Interestingly, this distinctive combination of molecular and pathological features has been, to date, observed in acquired CJD but not in sporadic CJD. Assuming that these distinctive phenotypic traits are specific for acquired CJD, we revisited the literature and found two cases showing widespread kuru plaques despite the 129 M/M genotype, in a neurosurgeon and in a patient with a medical history of neurosurgery without dura mater grafting. By Western blot analysis of brain homogenates, we revealed the intermediate type of PrP(Sc) in both cases. Furthermore, transmission properties of brain extracts from these two cases were indistinguishable from those of a subgroup of dura mater graft-associated iatrogenic CJD caused by infection with the sporadic CJD strain V2. These data strongly suggest that the two atypical CJD cases, previously thought to represent sporadic CJD, very likely acquired the disease through exposure to prion-contaminated brain tissues. Thus, we propose that the distinctive combination of 129 M/M genotype, kuru plaques, and intermediate type PrP(Sc) represents a reliable criterion for the identification of acquired CJD cases among presumed sporadic cases. © 2015 Japanese Society of Neuropathology.
Jacquemyn, Hans; Waud, Michael; Lievens, Bart; Brys, Rein
2016-07-01
In orchid species that have populations occurring in strongly contrasting habitats, mycorrhizal divergence and other habitat-specific adaptations may lead to the formation of reproductively isolated taxa and ultimately to species formation. However, little is known about the mycorrhizal communities associated with recently diverged sister taxa that occupy different habitats. In this study, 454 amplicon pyrosequencing was used to investigate mycorrhizal communities associating with Epipactis helleborine in its typical forest habitat and with its presumed sister species E. neerlandica that almost exclusively occurs in coastal dune habitats. Samples of the phylogenetically more distant E. palustris, which co-occurred with E. neerlandica, were also included to investigate the role of habitat-specific conditions on mycorrhizal communities. A total of 105 operational taxonomic units (OTUs) of putative orchid mycorrhizal fungi were observed in the three studied species. The majority of these fungi were endophytic fungi of Helotiales and ectomycorrhizal fungi belonging to Thelephoraceae, Sebacinaceae and Inocybaceae. In addition, a large number of other ectomycorrhizal taxa were detected, including Cortinarius, Cenococcum, Tuber, Geopora, Wilcoxina, Meliniomyces, Hebeloma, Tricholoma, Russula and Peziza. Mycorrhizal communities differed significantly between the three species, but differences were most pronounced between the forest species (E. helleborine) and the two dune slack species (E. neerlandica and E. palustris). The results clearly showed that recently diverged orchid species that occupy different habitats were characterized by significantly different mycorrhizal communities and call for more detailed experiments that aim at elucidating the contribution of habitat-specific adaptations in general and mycorrhizal divergence in particular to the process of speciation in orchids. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany
Directory of Open Access Journals (Sweden)
Nguyen Tien Huy
Full Text Available BACKGROUND AND PURPOSE: Successful outcomes from bacterial meningitis require rapid antibiotic treatment; however, unnecessary treatment of viral meningitis may lead to increased toxicities and expense. Thus, improved diagnostics are required to maximize treatment and minimize side effects and cost. Thirteen clinical decision rules have been reported to distinguish bacterial from viral meningitis. However, few rules have been tested and compared in a single study, while several rules are yet to be tested by independent researchers or in pediatric populations. Thus, simultaneous testing and comparison of these rules are required to enable clinicians to select an optimal diagnostic rule for bacterial meningitis in settings and populations similar to ours. METHODS: A retrospective cross-sectional study was conducted at the Infectious Department of Pediatric Hospital Number 1, Ho Chi Minh City, Vietnam. The performance of the clinical rules was evaluated by the area under the receiver operating characteristic curve (ROC-AUC), using the method of DeLong; specificities were compared with McNemar's test. RESULTS: Our study included 129 patients, of whom 80 had bacterial meningitis and 49 had presumed viral meningitis. Spanos's rule had the highest AUC at 0.938, but this was not significantly greater than that of the other rules. No rule provided 100% sensitivity with a specificity higher than 50%. Based on our calculation of theoretical sensitivity and specificity, we suggest that a perfect rule requires at least four independent variables that possess both sensitivity and specificity higher than 85-90%. CONCLUSIONS: No clinical decision rule provided an acceptable specificity (>50%) with 100% sensitivity when applied to our data set in children. More studies in Vietnam and developing countries are required to develop and/or validate clinical rules, and better biomarkers are required to develop such a perfect rule.
Qubit-qutrit separability-probability ratios
International Nuclear Information System (INIS)
Slater, Paul B.
2005-01-01
Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake for N=6 the task of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) N x N density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 x 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable-plus-nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained from the 35-dimensional volumes appear--independently of the metric employed (each of the seven inducing Haar measure)--to be twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is also quite clearly close to an integer.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not for the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.
Quantum operations, state transformations and probabilities
International Nuclear Information System (INIS)
Chefles, Anthony
2002-01-01
In quantum operations, probabilities characterize both the degree of the success of a state transformation and, as density operator eigenvalues, the degree of mixedness of the final state. We give a unified treatment of pure→pure state transformations, covering both probabilistic and deterministic cases. We then discuss the role of majorization in describing the dynamics of mixing in quantum operations. The conditions for mixing enhancement for all initial states are derived. We show that mixing is monotonically decreasing for deterministic pure→pure transformations, and discuss the relationship between these transformations and deterministic local operations with classical communication entanglement transformations
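Majorization, the order relation used above to describe mixing, can be checked with a short helper. The example vectors are illustrative: a pure state's spectrum majorizes any probability vector, and every probability vector majorizes the uniform (maximally mixed) one, matching the idea that mixing flattens spectra.

```python
from itertools import accumulate

def majorizes(y, x, tol=1e-12):
    """True if y majorizes x: equal totals, and every partial sum of the
    descending-sorted y dominates the corresponding partial sum of x."""
    ys = sorted(y, reverse=True)
    xs = sorted(x, reverse=True)
    if abs(sum(ys) - sum(xs)) > tol:
        return False
    return all(sy >= sx - tol
               for sy, sx in zip(accumulate(ys), accumulate(xs)))

pure = [1.0, 0.0, 0.0, 0.0]       # spectrum of a pure state
mixed = [0.4, 0.3, 0.2, 0.1]
uniform = [0.25] * 4              # maximally mixed spectrum
print(majorizes(pure, mixed), majorizes(mixed, uniform), majorizes(uniform, mixed))
```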
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors, and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods, and exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data applications, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available, so that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations exist in R and may be used for applications. PMID:21915433
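Although the paper's sample code is in R, the nearest-neighbor probability machine can be sketched in Python. The synthetic data set and the choice k=200 are assumptions made here for illustration: labels are generated so that the true conditional probability is known, and the k-NN vote recovers it.

```python
import random

def knn_probability(train, x, k):
    """Estimate P(Y = 1 | x) as the fraction of positive labels among the
    k nearest training points (1-D Euclidean distance)."""
    nearest = sorted(train, key=lambda xy: abs(xy[0] - x))[:k]
    return sum(y for _, y in nearest) / k

# Synthetic data with a known truth: P(Y = 1 | x) = x on [0, 1].
rng = random.Random(0)
train = []
for _ in range(5000):
    u = rng.random()
    train.append((u, 1 if rng.random() < u else 0))

est = knn_probability(train, 0.7, k=200)
print(round(est, 2))   # a consistent estimate: close to the true value 0.7
```

The same vote-averaging over terminal nodes is what turns a regression random forest into a probability machine.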
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
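The central claim, that plug-in thresholds make the realized failure frequency exceed the nominal level for location-scale families, can be checked by simulation. The normal risk factor and the specific sample size below are illustrative assumptions; by the location-scale argument in the abstract, the resulting frequency does not depend on the true mean and standard deviation chosen.

```python
import math
import random
from statistics import mean, stdev

def norm_quantile(p):
    """Standard normal quantile by bisection on the erf-based CDF."""
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def failure_frequency(n_data, nominal_p, trials, seed=3):
    """Realized failure frequency when the control threshold is set at the
    plug-in (1 - nominal_p) quantile of a normal risk factor fitted to
    n_data past observations.  Parameter error pushes the realized
    frequency above the nominal target nominal_p."""
    rng = random.Random(seed)
    z = norm_quantile(1.0 - nominal_p)
    failures = 0
    for _ in range(trials):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n_data)]
        threshold = mean(sample) + z * stdev(sample)
        if rng.gauss(0.0, 1.0) > threshold:       # next period's risk factor
            failures += 1
    return failures / trials

freq = failure_frequency(n_data=20, nominal_p=0.05, trials=50000)
print(freq)   # noticeably above the nominal 0.05
```

Approach (1) in the abstract amounts to shrinking nominal_p until the realized frequency hits the required level for the given n_data.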
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
2016-06-01
Reports an error in "Presumed fair: Ironic effects of organizational diversity structures" by Cheryl R. Kaiser, Brenda Major, Ines Jurcevic, Tessa L. Dover, Laura M. Brady and Jenessa R. Shapiro (Journal of Personality and Social Psychology, 2013[Mar], Vol 104[3], 504-519). In the article, a raw data merging error in one racial discrimination claim condition from Experiment 6 inadvertently resulted in data analyses on an inaccurate data set. When the error was discovered by the authors and corrected, all analyses reported in Experiment 6 for claim validity, seriousness of the claim, and support for the claimant were inaccurate and none were statistically significant. The conclusions should be altered to indicate that participants with management experience who reflected on their own workplace diversity policies did not show the predicted effects. The literature review, remaining five studies, and remaining conclusions in the article are unaffected by this error. Experiment 6 should also report that 26.4% (not 26.4.7%) of participants had a graduate degree and eight participants (not 8%) did not provide educational data. Experiment 5 should have referred to the claim validity measure as a six-item measure (α = .92) rather than a five-item measure; analyses on claim validity are accurate in text. Table 2's note should have said standard errors, not standard deviations. (The following abstract of the original article appeared in record 2012-31077-001.) This research tests the hypothesis that the presence (vs. absence) of organizational diversity structures causes high-status group members (Whites, men) to perceive organizations with diversity structures as procedurally fairer environments for underrepresented groups (racial minorities, women), even when it is clear that underrepresented groups have been unfairly disadvantaged within these organizations. Furthermore, this illusory sense of fairness derived from the mere presence of diversity structures causes high
Joint probabilities and quantum cognition
International Nuclear Information System (INIS)
Acacio de Barros, J.
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
The Probabilities of Unique Events
2012-08-30
Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the
Probability Matching, Fast and Slow
Koehler, Derek J.; James, Greta
2014-01-01
A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Converting dose distributions into tumour control probability
International Nuclear Information System (INIS)
Nahum, A.E.
1996-01-01
The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
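The inter-patient heterogeneity effect described above can be sketched by averaging the Poisson TCP over a Gaussian spread in radiosensitivity. The clonogen number, α values, and doses below are illustrative assumptions, and the linear-only cell-kill term is a simplification of the radiobiological model in the article.

```python
import math
import random

random.seed(1)

def tcp_population(dose, n_clonogens=1e7, alpha_mean=0.3,
                   sigma_alpha=0.08, n_patients=4000):
    """Poisson TCP averaged over a population with a Gaussian spread in
    radiosensitivity alpha (linear cell kill only, uniform dose in Gy)."""
    total = 0.0
    for _ in range(n_patients):
        alpha = max(random.gauss(alpha_mean, sigma_alpha), 0.0)
        surviving = n_clonogens * math.exp(-alpha * dose)
        total += math.exp(-surviving)  # Poisson probability of zero survivors
    return total / n_patients

# Heterogeneity (sigma_alpha > 0) flattens the dose-response curve:
# higher TCP at low dose, lower TCP at high dose, compared with a
# homogeneous population.
print(tcp_population(55.0), tcp_population(55.0, sigma_alpha=0.0))
```

Comparing doses on either side of the midpoint shows the shallower, clinically realistic slope that the article attributes to σ_α.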
Consistent probabilities in loop quantum cosmology
International Nuclear Information System (INIS)
Craig, David A; Singh, Parampreet
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
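The simultaneous-interval idea can be sketched with a generic Monte Carlo calibration (this is not the authors' construction; the sample size, α level, and simulation count are arbitrary): simulate many normal samples, and widen per-position interval cuts until all points fall inside the intervals at once in the desired fraction of simulations.

```python
import random
import statistics

random.seed(7)

def standardized_order_stats(n):
    """Sorted, sample-standardized values of a standard-normal sample."""
    xs = sorted(random.gauss(0.0, 1.0) for _ in range(n))
    m, s = statistics.fmean(xs), statistics.stdev(xs)
    return [(x - m) / s for x in xs]

def simultaneous_bands(n, alpha=0.05, sims=2000):
    """Monte Carlo intervals such that, for normal data, all n plotted points
    fall inside them simultaneously with probability ~1 - alpha."""
    data = [standardized_order_stats(n) for _ in range(sims)]
    cols = [sorted(s[i] for s in data) for i in range(n)]
    # Widen per-position quantile cuts until the simultaneous coverage,
    # estimated on the same simulations, reaches 1 - alpha.
    for cut in range(50, -1, -1):
        bands = [(col[cut], col[sims - 1 - cut]) for col in cols]
        covered = sum(
            all(lo <= s[i] <= hi for i, (lo, hi) in enumerate(bands))
            for s in data
        )
        if covered / sims >= 1.0 - alpha:
            return bands
    return bands

bands = simultaneous_bands(n=20)
sample = standardized_order_stats(20)
print(all(lo <= x <= hi for x, (lo, hi) in zip(sample, bands)))
```

A genuinely normal sample then falls entirely inside the bands with probability about 1 - α, which is the objective decision rule the abstract describes.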
Nonspherical atomic ground-state densities and chemical deformation densities from x-ray scattering
International Nuclear Information System (INIS)
Ruedenberg, K.; Schwarz, W.H.E.
1990-01-01
Presuming that chemical insight can be gained from the difference between the molecular electron density and the superposition of the ground-state densities of the atoms in a molecule, it is pointed out that, for atoms with degenerate ground states, an unpromoted "atom in a molecule" is represented by a specific ensemble of the degenerate atomic ground-state wave functions and that this ensemble is determined by the anisotropic local surroundings. The resulting atomic density contributions are termed oriented ground state densities, and the corresponding density difference is called the chemical deformation density. The constraints implied by this conceptual approach for the atomic density contributions are formulated and a method is developed for determining them from x-ray scattering data. The electron density of the appropriate promolecule and its x-ray scattering are derived, the determination of the parameters of the promolecule is outlined, and the chemical deformation density is formulated
U.S. Environmental Protection Agency — Road density is generally highly correlated with amount of developed land cover. High road densities usually indicate high levels of ecological disturbance. More...
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are developed, so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
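The core Gaussian exceedance computation behind such an estimate can be sketched as follows. The criterion and standard deviation values are invented for illustration (real VC criteria are specified in terms of velocity, not displacement, so this is only a schematic of the probability step):

```python
from statistics import NormalDist

def exceedance_probability(criterion, sigma):
    """Probability that a zero-mean Gaussian response exceeds the
    criterion in magnitude: 2 * (1 - Phi(criterion / sigma))."""
    return 2.0 * (1.0 - NormalDist(0.0, sigma).cdf(criterion))

# Hypothetical criterion of 3 um against a response with sigma = 1.5 um:
p = exceedance_probability(3.0, 1.5)
print(round(p, 4))  # 2 * (1 - Phi(2)) ~= 0.0455
```

Sweeping damping and natural frequency through a response model and evaluating this probability at each point reproduces, in miniature, the optimization the abstract describes.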
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The purpose of the probability tables is: - to obtain dense data representation - to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...
Use of probability tables for propagating uncertainties in neutronics
International Nuclear Information System (INIS)
Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.
2017-01-01
Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
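The probability-table representation mentioned above, a random variable summarized as weighted bands used for quadrature, can be sketched minimally. The band count, sample size, and log-normal test variable are illustrative assumptions, not the moment-based CALENDF formalism itself; each band here simply carries equal weight and the conditional mean of its samples (first-moment preserving).

```python
import math
import random

random.seed(3)

def probability_table(samples, n_bands=8):
    """Equal-probability bands: each band gets its empirical weight and the
    conditional mean of the samples falling in it."""
    xs = sorted(samples)
    size = len(xs) // n_bands
    table = []
    for b in range(n_bands):
        chunk = xs[b * size:(b + 1) * size] if b < n_bands - 1 else xs[b * size:]
        table.append((len(chunk) / len(xs), sum(chunk) / len(chunk)))
    return table

def table_expectation(table, f):
    """Quadrature over the table: E[f(X)] ~= sum of weight * f(band value)."""
    return sum(w * f(x) for w, x in table)

samples = [random.lognormvariate(0.0, 0.5) for _ in range(100000)]
table = probability_table(samples)
exact_mean = math.exp(0.5 ** 2 / 2)  # E[X] for lognormal(0, 0.5)
print(abs(table_expectation(table, lambda x: x) - exact_mean) < 0.02)
```

Propagating an uncertainty then amounts to pushing each band value through the physical model and recombining with the band weights, which is the multiband idea the abstract applies to the eigenvalue and depletion problems.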
Electrofishing capture probability of smallmouth bass in streams
Dauwalter, D.C.; Fisher, W.L.
2007-01-01
Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
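The catch-adjustment step, dividing the count by the cumulative capture probability, can be sketched directly. The per-pass probability and catch numbers are hypothetical, and the equal-independent-pass assumption is a simplification: the article instead predicts capture probability from logistic regression on fish length and habitat covariates.

```python
def cumulative_capture_probability(p_single, n_passes):
    """Probability a fish is captured at least once in n electrofishing
    passes, assuming an equal, independent capture probability per pass."""
    return 1.0 - (1.0 - p_single) ** n_passes

def abundance_estimate(total_caught, p_single, n_passes):
    """Adjust the raw catch by the cumulative capture probability."""
    return total_caught / cumulative_capture_probability(p_single, n_passes)

# Hypothetical: 3 passes at a 0.4 per-pass capture probability, 47 bass caught.
p_cum = cumulative_capture_probability(0.4, 3)  # 1 - 0.6**3 = 0.784
print(round(abundance_estimate(47, 0.4, 3)))    # 60
```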
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
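One way to sketch a decaying forecast of this kind is a Bayesian update under an assumed delay-time distribution: as hours pass with no 10 pfu onset, the probability that an event is still coming shrinks. The exponential delay model, median delay, and prior below are invented for illustration; the article derives its algorithm from the observed NOAA delay-time data, not from this parametric form.

```python
import math

def dynamic_sep_probability(p0, hours_elapsed, median_delay_hours=6.0):
    """Decay an initial SEP event probability p0 as time passes with no
    onset, assuming an exponential delay-time distribution for real events."""
    lam = math.log(2) / median_delay_hours     # median delay -> rate
    survival = math.exp(-lam * hours_elapsed)  # P(onset later than t | event)
    # Bayes: P(event | no onset by t)
    return p0 * survival / (p0 * survival + (1.0 - p0))

# A 50% forecast decays as hours pass with no SEP onset:
for t in (0, 6, 12, 24):
    print(t, round(dynamic_sep_probability(0.5, t), 3))  # 0.5, 0.333, 0.2, 0.059
```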
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest known earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
Probability and statistics: A reminder
International Nuclear Information System (INIS)
Clement, B.
2013-01-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
On probability-possibility transformations
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
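One widely used probability-to-possibility transformation (a sketch for illustration; the paper compares several such schemes, not necessarily this one) ranks outcomes by decreasing probability and assigns each the tail sum from its own rank:

```python
def prob_to_poss(p):
    """Cumulative-sum probability-to-possibility transformation:
    sort outcomes by decreasing probability; each outcome's
    possibility is the total probability of outcomes no more
    likely than itself (including itself)."""
    order = sorted(range(len(p)), key=lambda i: -p[i])
    poss = [0.0] * len(p)
    tail = 1.0
    for i in order:
        poss[i] = tail
        tail -= p[i]
    return poss

print(prob_to_poss([0.5, 0.3, 0.2]))
```

The most probable outcome always receives possibility 1, so the result is a normalized possibility distribution.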
Absolute Kr I and Kr II transition probabilities
International Nuclear Information System (INIS)
Brandt, T.; Helbig, V.; Nick, K.P.
1982-01-01
Transition probabilities for 11 KrI and 9 KrII lines between 366.5 and 599.3nm were obtained from measurements with a wall-stabilised arc at atmospheric pressure in pure krypton. The population densities of the excited krypton levels were calculated under the assumption of LTE from electron densities measured by laser interferometry. The uncertainties for the KrI and the KrII data are 15 and 25% respectively. (author)
Ho, Shirley S; Poorisat, Thanomwong; Neo, Rachel L; Detenber, Benjamin H
2014-01-01
This study uses the influence of presumed media influence model as the theoretical framework to examine how perceived social norms (i.e., descriptive, subjective, and injunctive norms) will mediate the influence of pro- and antidrinking media messages on adolescents' intention to consume alcohol in rural Thailand. Data collected from 1,028 high school students indicate that different mechanisms underlie drinking intentions between nondrinkers and those who have consumed alcohol or currently drink. Among nondrinkers, perceived peer attention to prodrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to consume alcohol through all three types of perceived social norms. Among drinkers, perceived peer attention to pro- and antidrinking messages indirectly influenced adolescents' prodrinking attitudes and intentions to drink alcohol through perceived subjective norm. The findings provide support for the extended influence of presumed media influence model and have practical implications for how antidrinking campaigns targeted at teenagers in Thailand might be designed.
Ang, Leslie; Kee, Aera; Yeo, Tun Hang; Dinesh, V G; Ho, Su Ling; Teoh, Stephen C; Agrawal, Rupesh
2018-02-01
To report the clinical features and outcome of patients with presumed tubercular uveitis (TBU). Retrospective analysis of patients with presumed TBU at a tertiary referral eye care centre in Singapore between 2007 and 2012 was done. Main outcome measures were failure of complete resolution of uveitis or recurrence of inflammation. Fifty-three patients (mean age 44.18 ± 15.26 years; 54.72% male) were included. 19 (35.85%) had bilateral involvement, with panuveitis and anterior uveitis being the most common presentations. 36 (67.92%) patients received antitubercular therapy (ATT), and 28 received concurrent systemic steroids. 15 (28.30%) eyes of 11 (30.55%) patients in the ATT group and 4 (21.05%) eyes of 3 (17.64%) patients in the non-ATT group had treatment failure (p value = 0.51). The use of ATT, with or without concurrent corticosteroid, may not have a statistically significant impact in improving treatment success in patients with presumed TBU.
DEFF Research Database (Denmark)
Garnett, E S; Webber, C E; Coates, G
1977-01-01
The density of a defined volume of the human lung can be measured in vivo by a new noninvasive technique. A beam of gamma-rays is directed at the lung and, by measuring the scattered gamma-rays, lung density is calculated. The density in the lower lobe of the right lung in normal man during quiet breathing in the sitting position ranged from 0.25 to 0.37 g.cm-3. Subnormal values were found in patients with emphysema. In patients with pulmonary congestion and edema, lung density values ranged from 0.33 to 0.93 g.cm-3. The lung density measurement correlated well with the findings in chest radiographs, but the lung density values were more sensitive indices. This was particularly evident in serial observations of individual patients.
International Nuclear Information System (INIS)
Coleman, J.H.
1980-10-01
A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor resulting from several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of doses from the intermittent releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used in the calculation of the convolution.
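The convolution step can be sketched directly; the report uses the fast Fourier transform for efficiency, but a direct double sum over two sampled densities (illustrative uniform densities below, not the report's data) produces the same accumulated-dose density:

```python
def convolve_pdf(f, g, dx):
    """Discrete convolution of two probability densities sampled
    on a uniform grid of spacing dx. An FFT-based product of
    transforms would compute the same thing faster."""
    h = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[i + j] += fi * gj * dx
    return h

dx = 0.1
single_release = [1.0] * 10            # uniform density; integrates to 1.0
accumulated = convolve_pdf(single_release, single_release, dx)
print(abs(sum(accumulated) * dx - 1.0) < 1e-9)  # -> True (still a density)
```

Convolving in more single-release densities, one per release, yields the distribution of the total accumulated dose.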
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
André C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Probability matching and strategy availability.
Koehler, Derek J; James, Greta
2010-09-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
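The gap between the two strategies is easy to quantify: if the more frequent outcome occurs with probability p, matching predicts it with probability p while maximizing always predicts it. A small sketch (function name and numbers illustrative, not from the study):

```python
def expected_accuracy(p, strategy):
    """Expected hit rate in a binary prediction task where the
    frequent outcome occurs with probability p (p >= 0.5)."""
    if strategy == "matching":
        # predict frequent outcome w.p. p, rare outcome w.p. 1 - p
        return p * p + (1 - p) * (1 - p)
    # maximizing: always predict the frequent outcome
    return p

print(expected_accuracy(0.75, "matching"))    # -> 0.625
print(expected_accuracy(0.75, "maximizing"))  # -> 0.75
```

Maximizing dominates matching for every p above 0.5, which is why matching counts as the inferior strategy that nonetheless comes readily to mind.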
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Logic, Probability, and Human Reasoning
2015-01-01
accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions...[3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Probability matching and strategy availability
J. Koehler, Derek; Koehler, Derek J.; James, Greta
2010-01-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...
Analytic formulation of neutrino oscillation probability in constant matter
International Nuclear Information System (INIS)
Kimura, Keiichi; Takamura, Akira; Yokomakura, Hidekazu
2003-01-01
In this paper, based on the work (Kimura K et al 2002 Phys. Lett. B 537 86), we present a simple derivation of an exact and analytic formula for the neutrino oscillation probability. We consider three-flavour neutrino oscillations in matter with constant density.
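The exact three-flavour matter formula is too long for an abstract, but the familiar two-flavour vacuum limit shows the structure of such oscillation probabilities (parameter values below are purely illustrative, not taken from the paper):

```python
import math

def p_osc(theta, dm2_ev2, L_over_E):
    """Two-flavour vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV])."""
    return math.sin(2 * theta) ** 2 * math.sin(1.267 * dm2_ev2 * L_over_E) ** 2

# At maximal mixing and the first oscillation maximum, P -> 1:
theta = math.pi / 4
dm2 = 2.5e-3                                # eV^2, illustrative
L_over_E = (math.pi / 2) / (1.267 * dm2)    # km/GeV
print(round(p_osc(theta, dm2, L_over_E), 6))  # -> 1.0
```

Matter effects modify both the effective mixing angle and the effective mass splitting, which is what the paper's exact formula captures.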
Probabilities of filaments in a Poissonian distribution of points -I
International Nuclear Information System (INIS)
Betancort-Rijo, J.
1989-01-01
Statistical techniques are devised to assess the likelihood of a Poisson sample of points in two and three dimensions, containing specific filamentary structures. For that purpose, the expression of Otto et al. (1986, Astrophys. J., 304) for the probability density of clumps in a Poissonian distribution of points is generalized for any value of the density contrast. A way of counting filaments differing from that of Otto et al. is proposed, because at low density contrast the filaments counted by Otto et al. are distributed in a clumpy fashion, each clump of filaments corresponding to a distinct observed filament. (author)
The probability representation as a new formulation of quantum mechanics
International Nuclear Information System (INIS)
Man'ko, Margarita A; Man'ko, Vladimir I
2012-01-01
We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
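A minimal flavour of the bounding arithmetic underlying PBA (a generic sketch, not the paper's method): when only marginal probabilities are known and the dependence between events is unspecified, the best-possible Fréchet bounds on their conjunction are:

```python
def frechet_and(p_a, p_b):
    """Best-possible (Frechet) bounds on P(A and B) given only the
    marginal probabilities, with the dependence between A and B
    left completely unspecified."""
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

print(frechet_and(0.2, 0.3))  # -> (0.0, 0.2)
```

"Pinching" an input to a precise value collapses its interval and narrows the output bounds, which is the sensitivity-analysis move the abstract describes.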
Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment
Carpenter, James R.; Markley, F Landis
2014-01-01
This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
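Wald's sequential test compares an accumulated log-likelihood ratio against two fixed thresholds derived from the targeted false-alarm rate alpha and missed-detection rate beta; a generic sketch (names and numbers illustrative, not the paper's implementation):

```python
import math

def sprt_decision(log_lr, alpha, beta):
    """One step of Wald's sequential probability ratio test.
    Accept H1 when the accumulated log-likelihood ratio crosses
    log((1-beta)/alpha); accept H0 below log(beta/(1-alpha));
    otherwise keep collecting data."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    if log_lr >= upper:
        return "accept H1"   # risk too high: plan a mitigation maneuver
    if log_lr <= lower:
        return "accept H0"   # collision hypothesis rejected
    return "continue"        # not yet decidable: take more measurements

print(sprt_decision(3.5, alpha=0.01, beta=0.05))  # -> continue
```

The thresholds depend only on the tolerated error rates, which is why the test is independent of how the collision probability itself is computed.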
Directory of Open Access Journals (Sweden)
Huping Xue
Full Text Available BACKGROUND: Horizontal gene transfer (HGT) is recognized as one of the major forces for bacterial genome evolution. Many clinically important bacteria may acquire virulence factors and antibiotic resistance through HGT. The comparative genomic analysis has become an important tool for identifying HGT in emerging pathogens. In this study, the Serine-Aspartate Repeat (Sdr) family has been compared among different sources of Staphylococcus aureus (S. aureus) to discover sequence diversities within their genomes. METHODOLOGY/PRINCIPAL FINDINGS: Four sdr genes were analyzed for 21 different S. aureus strains and 218 mastitis-associated S. aureus isolates from Canada. Comparative genomic analyses revealed that S. aureus strains from bovine mastitis (RF122 and mastitis isolates in this study), ovine mastitis (ED133), pig (ST398), chicken (ED98), and human methicillin-resistant S. aureus (MRSA) (TCH130, MRSA252, Mu3, Mu50, N315, 04-02981, JH1 and JH9) were highly associated with one another, presumably due to HGT. In addition, several types of insertion and deletion were found in sdr genes of many isolates. A new insertion sequence was found in mastitis isolates, which was presumably responsible for the HGT of the sdrC gene among different strains. Moreover, the sdr genes could be used to type S. aureus. Regional difference of sdr genes distribution was also indicated among the tested S. aureus isolates. Finally, certain associations were found between sdr genes and subclinical or clinical mastitis isolates. CONCLUSIONS: Certain sdr gene sequences were shared in S. aureus strains and isolates from different species presumably due to HGT. Our results also suggest that the distributional assay of virulence factors should detect the full sequences or full functional regions of these factors. The traditional assay using short conserved regions may not be accurate or credible. These findings have important implications with regard to animal husbandry practices that may
Directory of Open Access Journals (Sweden)
Shadi S. Yarandi
2014-01-01
Full Text Available Despite using imaging studies, tissue sampling, and serologic tests, about 5–10% of surgeries done for presumed pancreatic malignancies will have benign findings on final pathology. Endoscopic ultrasound (EUS) is used with increasing frequency to study pancreatic masses. The aim of this study is to examine the effect of EUS on the prevalence of benign diseases undergoing Whipple over the last decade. Patients who underwent the Whipple procedure for presumed malignancy at Emory University Hospital from 1998 to 2011 were selected. Demographic data, history of smoking and drinking, history of diabetes and pancreatitis, imaging data, pathology reports, and tumor markers were extracted. 878 patients were found. 95 (10.82%) patients had benign disease. The prevalence of benign findings had increased over the recent years despite using more EUS. Logistic regression models showed that abdominal pain (OR: 5.829, 95% CI 2.681–12.674, P ≤ 0.001) and alcohol abuse (OR: 3.221, 95% CI: 1.362–7.261, P: 0.002) were predictors of benign diseases. Jaundice (OR: 0.221, 95% CI: 0.084–0.58, P: 0.002), mass (OR: 0.145, 95% CI: 0.043–0.485, P: 0.008), and ductal dilation (OR: 0.297, 95% CI 0.134–0.657, P: 0.003) were associated with malignancy. Use of imaging studies, ERCP, and EUS has not decreased the percentage of benign findings after surgery for presumed pancreatic malignancy.
Lectures on probability and statistics
International Nuclear Information System (INIS)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
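The a-priori dice calculation mentioned in these notes can be made concrete by direct enumeration of the equally likely outcomes; for instance, the probability that two fair dice sum to 7:

```python
from fractions import Fraction
from itertools import product

# A priori: all 36 ordered outcomes of two fair dice are equally likely.
outcomes = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_seven)  # -> 1/6
```

The inverse, statistical problem would start instead from observed rolls and ask how likely each face of each die is, which is a much harder inference.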
Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.
Hartley, Jane E K; Wight, Daniel; Hunt, Kate
2014-06-01
Using empirical data from group discussions and in-depth interviews with 13 to 15-year olds in Scotland, this study explores how teenagers' alcohol drinking and sexual/romantic relationships were shaped by their quest for appropriate gendered identities. In this, they acknowledged the influence of the media, but primarily in relation to others, not to themselves, thereby supporting Milkie's 'presumed media influence' theory. Media portrayals of romantic/sexual relationships appeared to influence teenagers' constructions of gender-appropriate sexual behaviour more than did media portrayals of drinking behaviour, perhaps because the teenagers had more firsthand experience of observing drinking than of observing sexual relationships. Presumed media influence may be less influential if one has experience of the behaviour portrayed. Drinking and sexual behaviour were highly interrelated: sexual negotiation and activities were reportedly often accompanied by drinking. For teenagers, being drunk or, importantly, pretending to be drunk, may be a useful way to try out what they perceived to be gender-appropriate identities. In sum, teenagers' drinking and sexual/romantic relationships are primary ways in which they do gender and the media's influence on their perceptions of appropriate gendered behaviour is mediated through peer relationships. © 2014 The Authors. Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for SHIL (SHIL).
Hartley, Jane E K; Wight, Daniel; Hunt, Kate
2014-01-01
Using empirical data from group discussions and in-depth interviews with 13 to 15-year olds in Scotland, this study explores how teenagers’ alcohol drinking and sexual/romantic relationships were shaped by their quest for appropriate gendered identities. In this, they acknowledged the influence of the media, but primarily in relation to others, not to themselves, thereby supporting Milkie's ‘presumed media influence’ theory. Media portrayals of romantic/sexual relationships appeared to influence teenagers’ constructions of gender-appropriate sexual behaviour more than did media portrayals of drinking behaviour, perhaps because the teenagers had more firsthand experience of observing drinking than of observing sexual relationships. Presumed media influence may be less influential if one has experience of the behaviour portrayed. Drinking and sexual behaviour were highly interrelated: sexual negotiation and activities were reportedly often accompanied by drinking. For teenagers, being drunk or, importantly, pretending to be drunk, may be a useful way to try out what they perceived to be gender-appropriate identities. In sum, teenagers’ drinking and sexual/romantic relationships are primary ways in which they do gender and the media's influence on their perceptions of appropriate gendered behaviour is mediated through peer relationships. PMID:24443822
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Does probability of occurrence relate to population dynamics?
Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M
2014-12-01
Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence
Does probability of occurrence relate to population dynamics?
Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.
2014-01-01
Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence
Excluding joint probabilities from quantum theory
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
Domestic wells have high probability of pumping septic tank leachate
Bremer, J. E.; Harter, T.
2012-08-01
Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
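The overlap question at the core of this study can be sketched with a toy Monte Carlo that places drainfield centers uniformly at random and counts how often at least one falls inside the well's source area. This is a strong simplification of the detailed flow-and-transport modeling in the study; the function name, the uniform-placement assumption, and all numbers below are illustrative only.

```python
import random

def intersection_probability(n_systems, domain_area, source_area,
                             n_trials=20000, seed=7):
    """Fraction of trials in which at least one septic drainfield center
    falls inside the well's source area, with drainfields placed
    uniformly at random over the domain (toy model, not the study's
    groundwater simulation)."""
    rng = random.Random(seed)
    p_single = source_area / domain_area  # chance one drainfield lands inside
    hits = 0
    for _ in range(n_trials):
        if any(rng.random() < p_single for _ in range(n_systems)):
            hits += 1
    return hits / n_trials

# Ten septic systems on a domain 20x the source area; the analytic
# benchmark for this uniform toy model is 1 - (1 - 0.05)**10 ≈ 0.40.
```

Even this crude sketch reproduces the qualitative finding: the intersection probability rises quickly with septic system density.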
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes
Cross Check of NOvA Oscillation Probabilities
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics
2018-01-12
In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities with an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
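The cross-check strategy (computing the same probability along two independent routes and comparing the relative difference) can be illustrated with a two-flavor vacuum toy problem. This is not the NOvA code or the full 3-flavor matter calculation; the function names and parameter values are hypothetical, and 1.267 is the standard unit-conversion factor for Δm² in eV², L in km, and E in GeV.

```python
import cmath
import math

def prob_two_flavor(theta, dm2_ev2, l_over_e):
    """Two-flavor vacuum appearance probability from the standard formula."""
    return math.sin(2 * theta) ** 2 * math.sin(1.267 * dm2_ev2 * l_over_e) ** 2

def prob_two_flavor_amplitude(theta, dm2_ev2, l_over_e):
    """Same probability computed along an independent route: square the
    flavor-basis transition amplitude built from mass-eigenstate phases."""
    phase = 2 * 1.267 * dm2_ev2 * l_over_e
    amp = math.sin(theta) * math.cos(theta) * (1 - cmath.exp(-1j * phase))
    return abs(amp) ** 2
```

Because the two routes are algebraically identical, their relative difference should sit at floating-point roundoff, mirroring the note's agreement criterion.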
The maximum entropy method of moments and Bayesian probability theory
Bretthorst, G. Larry
2013-08-01
The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1 weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue, rather there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
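The maximum-entropy form with moment constraints, p(x) ∝ exp(-Σ λ_k x^k), can be sketched for the simplest case of a single first-moment constraint on [0, 1]. The grid-and-bisection solver below is an illustrative stand-in for the Newton-type updates usually used to find the Lagrange multipliers; the function name and tolerances are made up.

```python
import math

def maxent_mean_constrained(target_mean, grid_n=2001, iters=100):
    """Maximum-entropy density on [0, 1] matching one first-moment
    constraint: p(x) ∝ exp(-lam * x).  Returns the Lagrange multiplier
    lam, found by bisection on the mean it implies."""
    xs = [i / (grid_n - 1) for i in range(grid_n)]

    def implied_mean(lam):
        w = [math.exp(-lam * x) for x in xs]
        return sum(x * wi for x, wi in zip(xs, w)) / sum(w)

    lo, hi = -50.0, 50.0  # implied mean decreases as lam grows
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if implied_mean(mid) > target_mean:
            lo = mid  # mean still too high -> need larger lam
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A target mean of 1/2 recovers lam ≈ 0 (the uniform density), while a target mean below 1/2 forces a positive multiplier, i.e., a density decaying in x.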
International Nuclear Information System (INIS)
Ignatyuk, A.V.
1998-01-01
For any applications of the statistical theory of nuclear reactions it is very important to obtain the parameters of the level density description from reliable experimental data. The cumulative numbers of low-lying levels and the average spacings between neutron resonances are usually used as such data. The level density parameters fitted to such data are compiled in the RIPL Starter File for the three models most frequently used in practical calculations: i) For the Gilbert-Cameron model the parameters of the Beijing group, based on rather recent compilations of the neutron resonance and low-lying level densities and included into the beijing-gc.dat file, are chosen as recommended. As alternative versions the parameters provided by other groups are given in the files: jaeri-gc.dat, bombay-gc.dat, obninsk-gc.dat. Additionally the iljinov-gc.dat and mengoni-gc.dat files include sets of the level density parameters that take into account the damping of shell effects at high energies. ii) For the back-shifted Fermi gas model the beijing-bs.dat file is selected as the recommended one. Alternative parameters of the Obninsk group are given in the obninsk-bs.dat file and those of Bombay in bombay-bs.dat. iii) For the generalized superfluid model the Obninsk group parameters included into the obninsk-bcs.dat file are chosen as recommended ones and the beijing-bcs.dat file is included as an alternative set of parameters. iv) For the microscopic approach to the level densities the files are: obninsk-micro.for - FORTRAN 77 source for the microscopic statistical level density code developed in Obninsk by Ignatyuk and coworkers; moller-levels.gz - Moeller single-particle level and ground state deformation data base; moller-levels.for - retrieval code for the Moeller single-particle level scheme. (author)
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
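The post-processing step described above, turning an ensemble of equally likely simulated fields into a probability-of-exceedance map, is simple to state in code. The function name and the tiny example maps below are hypothetical; the point is only the counting operation itself.

```python
def exceedance_probability_map(realizations, threshold):
    """Per-cell fraction of equally likely simulated maps whose value
    exceeds the cleanup threshold -- a direct probability map."""
    n = len(realizations)
    rows = len(realizations[0])
    cols = len(realizations[0][0])
    return [[sum(r[i][j] > threshold for r in realizations) / n
             for j in range(cols)]
            for i in range(rows)]

# Three 1x2 realizations of contaminant level (hypothetical units):
maps = [[[1.0, 5.0]], [[3.0, 5.0]], [[0.5, 5.0]]]
# against a threshold of 2.0, cell (0,0) exceeds in 1 of 3 realizations,
# cell (0,1) in all 3
```

Each cell of the output is directly interpretable as the probability of exceeding the specified contamination level, which is what makes the map useful for remediation decisions.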
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, satisfying w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
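Prelec's one-parameter weighting function w(p) = exp(-(-ln p)^α) is compact enough to state directly in code, and its fixed points can be checked numerically. The default α = 0.65 below is an arbitrary illustrative choice, not a parameter estimated in the study.

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) one-parameter probability weighting function
    w(p) = exp(-(-ln p)**alpha), with w(0) = 0, w(1/e) = 1/e, w(1) = 1."""
    if p == 0.0:
        return 0.0  # limiting value; -ln(0) is not finite
    return math.exp(-((-math.log(p)) ** alpha))
```

For any 0 < α < 1 the function overweights small probabilities and underweights large ones, with 1/e as the fixed point regardless of α, which is the inverse-S shape central to prospect theory.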
Ho, Shirley S; Lee, Edmund W J; Ng, Kaijie; Leong, Grace S H; Tham, Tiffany H M
2016-09-01
Using the influence of presumed media influence (IPMI) model as the theoretical framework, this study examines how injunctive norms and personal norms mediate the influence of healthy lifestyle media messages on public intentions to engage in two types of healthy lifestyle behaviors: physical activity and healthy diet. Nationally representative data collected from 1,055 adults in Singapore demonstrate partial support for the key hypotheses that make up the extended IPMI model, highlighting the importance of a norms-based approach in health communication. Our results indicate that perceived media influence on others indirectly shaped public intentions to engage in healthy lifestyle behaviors through personal norms and attitude, providing partial theoretical support for the extended IPMI model. Practical implications for health communicators in designing health campaign media messages to motivate the public to engage in healthy lifestyles are discussed.
Density limit study on the W7-AS stellarator
International Nuclear Information System (INIS)
Grigull, P.; Giannone, L.; Stroth, U.
1998-01-01
Data from currentless NBI discharges in W7-AS strongly indicate that the maximum density for quasi-stationary operation is limited by detachment from limiters. The threshold density at the edge scales with P_s^0.5 B^0.8 (with P_s being the net power flow across the LCMS), which is consistent with an edge-based analytic estimation presuming constant threshold downstream temperatures. (author)
International Nuclear Information System (INIS)
Otake, S.; Yamana, D.; Tsuruta, Y.; Mizutani, H.; Ohba, S.
1998-01-01
The aim of this study was to determine the spectrum of MR findings of presumed amyloid arthropathy of the hip joints in patients on long-term hemodialysis. We prospectively performed T1- and T2-weighted spin-echo imaging on 152 consecutive patients on hemodialysis. The duration of hemodialysis ranged from 5 months to 24 years, 2 months (mean: 8 years, 8 months). The frequency, location, and signal intensity of bone lesions were assessed. In 12 cases with contrast-enhanced MR examination, enhancement pattern of bone lesions, synovial lesions, and intra-articular lesions were characterized. Bone lesions presumed to be amyloid deposits were identified in 60 patients (39 %). Magnetic resonance imaging revealed that amyloid lesions were more extensive than anticipated by plain radiographs. All bone lesions showed decreased signal intensity on T1-weighted images. On T2-weighted images, bone lesions showed increased signal intensity in 32 patients (54 %), decreased signal intensity in 11 patients (18 %), and both increased and decreased signal intensity in 17 patients (28 %). Following intravenous injection of gadolinium-based contrast, all bone lesions showed moderate enhancement. Synovial thickening could not be identified on T1- and T2-weighted images. However, contrast-enhanced images showed thickened synovial membrane, which could be differentiated from joint fluid. Intra-articular nodules showed decreased or intermediate signal intensity on T1-weighted images and decreased signal intensity on T2-weighted images; the intra-articular nodules were contiguous with subchondral bone lesions. Magnetic resonance imaging is useful for evaluating the distribution and extent of amyloidosis of the hip joints in patients undergoing long-term hemodialysis. (orig.)
Bouyssou, Sarah; Specchi, Swan; Desquilbet, Loïc; Pey, Pascaline
2017-05-01
Noncardiogenic pulmonary edema is an important cause of respiratory disease in dogs and cats but few reports describe its radiographic appearance. The purpose of this retrospective case series study was to describe radiographic findings in a large cohort of dogs and cats with presumed noncardiogenic pulmonary edema and to test associations among radiographic findings versus cause of edema. Medical records were retrieved for dogs and cats with presumed noncardiogenic edema based on history, radiographic findings, and outcome. Radiographs were reviewed to assess lung pattern and distribution of the edema. Correlation with the cause of noncardiogenic pulmonary edema was evaluated with a Fisher's exact test. A total of 49 dogs and 11 cats were included. Causes for the noncardiogenic edema were airway obstruction (n = 23), direct pulmonary injury (n = 13), severe neurologic stimulation (n = 12), systemic disease (n = 6), near-drowning (n = 3), anaphylaxis (n = 2) and blood transfusion (n = 1). Mixed, symmetric, peripheral, multifocal, bilateral, and dorsal lung patterns were observed in 44 (73.3%), 46 (76.7%), 55 (91.7%), 46 (76.7%), 46 (76.7%), and 34 (57.6%) of 60 animals, respectively. When the distribution was unilateral, pulmonary infiltration involved mainly the right lung lobes (12 of 14, 85.7%). Increased pulmonary opacity was more often asymmetric, unilateral, and dorsal for postobstructive pulmonary edema compared to other types of noncardiogenic pulmonary edema, but no other significant correlations could be identified. In conclusion, noncardiogenic pulmonary edema may present with a quite variable radiographic appearance in dogs and cats. © 2016 American College of Veterinary Radiology.
Exploring effective interactions through transition charge density ...
Indian Academy of Sciences (India)
Systematics like reduced transition probabilities B(E2) and static quadrupole moments Q(2) ... approximations of solving large scale shell model problems in Monte Carlo methods ... We present the theoretical study of transition charge densities.
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
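The kind of smooth probabilistic prescription the authors advocate, in place of a sharp NS/BH mass cut, can be sketched with a simple parametrization. The logistic shape, the pivot mass of 25 solar masses, and the width below are invented for illustration and are not the paper's constraints.

```python
import math
import random

def p_bh(m_zams, m_half=25.0, width=5.0):
    """Illustrative logistic P_BH(M_ZAMS): rises from near 0 (low-mass
    stars leave NSs) toward 1 (high-mass stars leave BHs).  The pivot
    and width are made-up numbers, not fitted values."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

def sample_remnant(m_zams, rng=None):
    """One population-synthesis draw: NS or BH for a star of this mass."""
    rng = rng if rng is not None else random.Random()
    return "BH" if rng.random() < p_bh(m_zams) else "NS"
```

Used inside a population synthesis loop, `sample_remnant` makes the remnant type stochastic at fixed ZAMS mass, which is exactly what a probabilistic P_BH(M_ZAMS) implies.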
Evaluation of burst probability for tubes by Weibull distributions
International Nuclear Information System (INIS)
Kao, S.
1975-10-01
The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method to be called the ''density-gram'' is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data is very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities
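Once shape and scale parameters have been fitted, left-tail burst probabilities and their inverse follow directly from the two-parameter Weibull CDF. The sketch below shows that step only; the function names and the shape/scale values in the test are hypothetical, not fitted tube data.

```python
import math

def weibull_burst_probability(pressure, shape, scale):
    """Left-tail (burst) probability at a given pressure for a
    two-parameter Weibull distribution: F(x) = 1 - exp(-(x/scale)**shape)."""
    if pressure <= 0:
        return 0.0
    return 1.0 - math.exp(-((pressure / scale) ** shape))

def burst_pressure_at(prob, shape, scale):
    """Inverse CDF: the pressure whose left-tail burst probability is prob."""
    return scale * (-math.log(1.0 - prob)) ** (1.0 / shape)
```

The inverse form is the practically useful one here: it answers "at what pressure does the burst probability reach, say, 1%?", which is the left-tail estimate the abstract refers to.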
How Life History Can Sway the Fixation Probability of Mutants
Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne
2016-01-01
In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
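The age-structure-free baseline that this study generalizes, the fixation probability of a single mutant with constant selective advantage r in a well-mixed Moran population of size n, has a standard closed form, sketched below. The function name is ours; the formula itself is textbook.

```python
def moran_fixation_probability(r, n):
    """Fixation probability of one mutant with constant relative fitness r
    in a well-mixed Moran population of size n (no demographic structure):
    rho = (1 - 1/r) / (1 - 1/r**n), with rho = 1/n for a neutral mutant."""
    if r == 1.0:
        return 1.0 / n  # neutral case
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** n)
```

The paper's point is that with age structure this baseline no longer tells the whole story: the fixation probability of an advantageous mutant can peak or dip at intermediate fractions of young individuals.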
International Nuclear Information System (INIS)
Rodrigues, F. K.; Addy, B.; Armah, G.; Fobil, J.; Steiner-Asiedu, M.; Efavi, J.
2012-01-01
Repeated exposures of Shigella dysenteriae strain A to ultra-violet radiation (253.7 nm) with intervening outgrowth of survivors gave rise to clear bacteriophage plaques. Isolation, propagation and partial purification of the new Shd-4LI0 phages showed that they are similar in morphology to the Myxobacteriaphage Mx-4 described earlier. The new phages retained the general characteristics of S. dysenteriae phage Shd-4L3, including serological properties and phage typing. It is suggested that ultra-violet irradiation may have played a role in the transformation and excision of the presumed lysogen of S. dysenteriae strain A into a lytic phase. Phage Shd-4LI0 was subsequently partially characterized. It has a density of 1.61, a DNA:protein ratio of 0.42 and thus a cryptogram of D/2:54.3/32.5:X/X:B/O. The phage was further characterised by fractionation of its protein using SDS-polyacrylamide gel electrophoresis. DNA extracted from phages was hydrolysed with restriction endonuclease R·EcoR1. The restriction fragments were catalogued and their apparent molecular weights calculated from electrophoresis gels calibrated with fragments from DNA of coliphage λ. From the total fragments obtained with nuclease R·EcoR1, the apparent minimum molecular weight of phage Shd-4LI0 DNA was found to be 54.3 x 10^6 Daltons. The molecular weight of the phage DNA was also calculated from measurements of contour length of purified DNA samples, using the formula MW = 1.97 x 10^10 x l/(magnification), where l is the measured length of DNA in centimetres. The very close relatedness with phage Shd-4L3 was confirmed by these techniques. (au)
Foundations of the theory of probability
Kolmogorov, AN
2018-01-01
This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
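The basic calculation the article builds on, each side's probability being proportional to the arc it subtends, is a one-liner worth stating exactly. The function name and the 1:2:3 weights are illustrative; using `Fraction` keeps the connection to exact arithmetic that students can check by hand.

```python
from fractions import Fraction

def spinner_probabilities(weights):
    """Probability of each side of a biased spinner, proportional to the
    integer arc weight assigned to that side."""
    total = sum(weights)
    return [Fraction(w, total) for w in weights]

# A three-sided spinner whose arcs are in the ratio 1:2:3
```

Exact fractions make the sanity check transparent: the probabilities are 1/6, 1/3, and 1/2, and they sum to 1.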
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan Cort
2013-10-01
Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
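The absolute-versus-conditional distinction in the abstract above can be made concrete with counts. This is an illustrative sketch only; the cue pairs and trial counts are invented, not the study's data:

```python
from collections import Counter

# Invented (cue-pair, target-color) trial records for illustration; the
# study's actual conditional probabilities varied from 0.1 to 0.9.
trials = ([("AB", "red")] * 9 + [("AB", "green")] * 1 +
          [("CD", "red")] * 1 + [("CD", "green")] * 9)

# Absolute probability of each color, ignoring the cues.
color_counts = Counter(color for _, color in trials)
p_red = color_counts["red"] / len(trials)

def p_color_given_cue(color, cue):
    """Conditional probability of `color` given a particular cue pair."""
    cue_colors = [c for q, c in trials if q == cue]
    return cue_colors.count(color) / len(cue_colors)

print(p_red)                           # 0.5
print(p_color_given_cue("red", "AB"))  # 0.9
print(p_color_given_cue("red", "CD"))  # 0.1
```

Here the target is red half the time overall, yet a participant who tracks the cues can hold a much sharper expectation on any given trial, which is the structure the search task exploited.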
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab]; Denton, Peter B. [Copenhagen U.]; Minakata, Hisakazu [Madrid, IFT]
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
Henmi, Shuichi
2013-01-01
The author considered algorithms to presume the lesion location from a series of X-ray images obtained by four-direction radiography without a blind area for the U region of the stomach. The objects of analysis were six cases in which protruding lesions were noticed in the U region. Firstly, from the length of the short axis and the measured lateral width of the U region projected on the film, we presumed the length of the longitudinal axis and the angle between the short axis and the film. Secondly, for every image we calculated the ratios of the distances from the right and left stomach walls to the lateral width at the height passing through the center of the lesion. Using the lesion locations calculated from these values, we took the values that almost agreed between two images to be the lesion location. As a result of the analysis, there were some cases in which the lesion location could be presumed with certainty, some in which it could be presumed only uncertainly, and others in which it could not be presumed at all. Since the form of the U region can be distorted by a change of position, or the angle between the longitudinal axis and the sagittal plane can change, errors may have been introduced into the calculation; this was considered the reason why the lesion location could not always be presumed.
Void probability scaling in hadron nucleus interactions
International Nuclear Information System (INIS)
Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima
2002-01-01
Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
Variable kernel density estimation in high-dimensional feature spaces
CSIR Research Space (South Africa)
Van der Walt, Christiaan M
2017-02-01
Full Text Available Estimating the joint probability density function of a dataset is a central task in many machine learning applications. In this work we address the fundamental problem of kernel bandwidth estimation for variable kernel density estimation in high...
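A variable (sample-point) kernel density estimator of the kind named in this abstract can be sketched in a few lines. The k-nearest-neighbour bandwidth rule below is one common heuristic and an assumption on my part; the paper's own bandwidth estimation method may differ:

```python
import numpy as np

def variable_kde(x_eval, data, k=2):
    """Sample-point variable-bandwidth Gaussian KDE (illustrative sketch).

    Each data point gets its own bandwidth -- here the distance to its
    k-th nearest neighbour, a common heuristic. The bandwidth rule and
    estimator in the paper itself may differ.
    """
    data = np.asarray(data, dtype=float)
    x_eval = np.asarray(x_eval, dtype=float)
    # Per-point bandwidth: distance to the k-th nearest other point
    # (index 0 of each sorted row is the zero self-distance).
    dists = np.abs(data[:, None] - data[None, :])
    dists.sort(axis=1)
    h = np.maximum(dists[:, min(k, len(data) - 1)], 1e-12)
    # Average of Gaussian kernels, one per data point, each with its own h.
    z = (x_eval[:, None] - data[None, :]) / h[None, :]
    kernels = np.exp(-0.5 * z ** 2) / (h[None, :] * np.sqrt(2.0 * np.pi))
    return kernels.mean(axis=1)

grid = np.linspace(-4.0, 6.0, 2001)
density = variable_kde(grid, [0.0, 0.3, 1.0, 1.8, 2.0])
print(float(np.sum(density) * (grid[1] - grid[0])))  # ≈ 1 (density normalizes)
```

Because each kernel integrates to one regardless of its bandwidth, the averaged estimate is a proper density; the per-point bandwidths are what let the estimator adapt to locally sparse or dense regions, the issue the paper addresses in high dimensions.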
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
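The contrast between the interpretations described above can be shown with the deck itself. A small sketch (the simulation is mine, not the article's activity): the classical value comes from counting equally likely cases, while the frequentist value is approximated by relative frequency over repeated draws.

```python
import random

# Classical (aprioristic) reading: 4 aces among 52 equally likely cards.
classical = 4 / 52

# Frequentist reading: the limiting relative frequency, approximated by
# simulating many draws with replacement from the deck.
random.seed(1)
deck = ["ace"] * 4 + ["other"] * 48
draws = [random.choice(deck) for _ in range(100_000)]
frequentist = draws.count("ace") / len(draws)

# The subjective (Bayesian) reading would instead treat 4/52 as a degree
# of belief, to be revised as evidence (e.g. cards already seen) accumulates.
print(classical, frequentist)  # both near 0.077
```

The two numbers agree to within sampling error, which is exactly the point of comparison the activity invites students to make.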
Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.
2011-01-01
Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
Dependent Human Error Probability Assessment
International Nuclear Information System (INIS)
Simic, Z.; Mikulicic, V.; Vukovic, I.
2006-01-01
This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
Zhang, Jiaxin; Shields, Michael D.
2018-01-01
This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation are retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and associated probabilities that each method is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
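The reweighting idea at the heart of the abstract above — sample once from a shared importance density, then reweight the same samples under each plausible model — can be sketched compactly. The two Gaussian candidate models and their model probabilities below are invented stand-ins for the multimodel-inference output, not values from the paper:

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    """Gaussian density, written out to keep the sketch dependency-free."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)

# Hypothetical plausible models (mean, std) and their model probabilities --
# stand-ins for the multimodel-inference output described in the abstract.
models = [(0.0, 1.0), (0.5, 1.2)]
model_probs = np.array([0.6, 0.4])

# Importance density q: the probability-weighted mixture of the models,
# so it is representative of all of them at once.
n = 100_000
comp = rng.choice(len(models), size=n, p=model_probs)
x = np.empty(n)
for i, (mu, s) in enumerate(models):
    mask = comp == i
    x[mask] = rng.normal(mu, s, mask.sum())
q = sum(p * norm_pdf(x, mu, s) for p, (mu, s) in zip(model_probs, models))

# One sample set, reweighted once per candidate model: w_k = p_k / q.
# Here the propagated quantity is simply E[X] under each model.
estimates = [float(np.mean(norm_pdf(x, mu, s) / q * x)) for mu, s in models]
overall = float(np.dot(model_probs, estimates))  # ≈ 0.6*0.0 + 0.4*0.5 = 0.2
```

The expensive forward model is evaluated only once per sample; switching between candidate probability models costs only a cheap reweighting, which is the source of the computational savings the paper reports.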
Deal, J. H.
1975-01-01
One approach to the problem of simplifying complex nonlinear filtering algorithms is through using stratified probability approximations where the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.
Baskerville, Jerry Ray; Herrick, John
2012-02-01
This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between pretest probability of a significant finding and pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there was no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The legal risk presumed was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21 patients (15%; 95% confidence interval, ±5.9%) would not have been subjected to MDCT if there was no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk vs the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated pretest probability of a significant finding on head MDCT scans and presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
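The classical conditionalization that this paper generalizes is ordinary Bayes-theorem updating. A minimal sketch of the classical case only (the disease/test numbers are invented for illustration; the paraconsistent extension itself is beyond a few lines):

```python
def bayes_update(prior, likelihood):
    """Classical conditionalization: posterior proportional to prior * likelihood."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

# Hypothetical numbers: P(D) = 0.01; test sensitivity 0.95, false-positive rate 0.05.
posterior = bayes_update({"D": 0.01, "not-D": 0.99},
                         {"D": 0.95, "not-D": 0.05})
print(round(posterior["D"], 3))  # 0.161
```

The paraconsistent version replaces the underlying logic of the hypothesis space, so that contradictory evidence need not trivialize the update; the normalization-and-reweight shape of the computation is the part that carries over.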
Chilingarian, L I
2005-01-01
Individual typological features of the behavior of dogs were investigated by the method of choice between low-value food constantly available and high-quality food presented with low probability. Animals were subjected to instrumental conditioning with the same conditioned stimuli but different types of reinforcement. Depression of a white pedal was always reinforced with a meat-bread-crumb mixture; depression of a black pedal was reinforced with two pieces of liver (with probabilities of 100, 40, 33, 20, or 0%). The choice of reinforcement depended on the probability of the valuable food and the individual typological features of the nervous system of a dog. Decreasing the probability of the valuable reinforcement to 40-20% revealed differences in the behavior of dogs. Dogs of the first group, presumably with the weak type of the nervous system, more frequently pressed the white pedal (always reinforced) than the black pedal, thus "avoiding a situation of risk" of receiving an empty cup. They displayed symptoms of neurosis: whimpering, refusals of food or of the choice of reinforcement, and obtrusive movements. Dogs of the second group, presumably with the strong type of the nervous system, more frequently pressed the black pedal (more valuable food) for the low-probability reward until they obtained the valuable food. They did not show neurosis symptoms and were not afraid of the "situation of risk". A decrease in the probability of the valuable reinforcement increased the percentage of long-latency depressions of pedals. It can probably be suggested that this phenomenon was associated with the increasing involvement of cognitive processes, when the contributions of the assessments of probability and value of the reinforcement to decision making became approximately equal. Choice between the probability and value of alimentary reinforcement is a good method for revealing individual typological features of dogs.
Wright, Jason D; Cui, Rosa R; Wang, Anqi; Chen, Ling; Tergas, Ana I; Burke, William M; Ananth, Cande V; Hou, June Y; Neugut, Alfred I; Temkin, Sarah M; Wang, Y Claire; Hershman, Dawn L
2015-11-01
Electric power morcellation during laparoscopic hysterectomy allows some women to undergo minimally invasive surgery but may disrupt underlying occult malignancies and increase the risk of tumor dissemination. We developed a state transition Markov cohort simulation model of the risks and benefits of hysterectomy (abdominal, laparoscopic, and laparoscopic with electric power morcellation) for women with presumed benign gynecologic disease. The model considered perioperative morbidity, mortality, risk of cancer and dissemination, and outcomes in women with an underlying malignancy. We explored the effectiveness from a societal perspective stratified by age (women younger than 40 years, 40-49 years, 50-59 years, and 60 years and older). Per 10 000 women younger than age 40 years, laparoscopic hysterectomy with morcellation was associated with 1.57 more cases of disseminated cancer and 0.97 fewer deaths than abdominal hysterectomy. The excess cases of disseminated cancer per 10 000 women with morcellation compared with abdominal hysterectomy increased with age to 47.54 per 10 000 in women age 60 years and older. Compared with abdominal hysterectomy, this resulted in 0.30 (age 40-49 years), 5.07 (age 50-59 years), and 18.14 (age 60 years and older) excess deaths per 10 000 women in the respective age groups. Laparoscopic hysterectomy without morcellation is the most beneficial approach of the three methods of hysterectomy studied. In older women, the risks of electric power morcellation may outweigh the benefits of minimally invasive hysterectomy. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Wisselink, Marinus A; Willemse, Ton
2009-04-01
The objective of this study was to compare the efficacy of cyclosporine A (CsA) and prednisolone in feline atopic dermatitis (AD) in a randomised, controlled, double-blind study. Twenty-nine cats with feline AD were randomly allocated to two groups. Eleven cats were treated orally with prednisolone (1 mg/kg SID) and 18 were treated with CsA (5 mg/kg/day) for 4 weeks. At day 0 (D0) and D28, skin lesions were graded by means of the canine atopic dermatitis extent and severity index (CADESI). Skin biopsies and intradermal allergy tests were performed at D0 and blood samples for haematology and serum biochemistry were collected at D0 and D28. During the trial the cat owners were asked to evaluate the intensity of the pruritus once weekly on a linear analog scale and to record side effects. Based on the CADESI there was no significant difference between the two groups in the amount of remission (P=0.0562) or in the number of cats that improved by >25% (P=0.0571). The effect of CsA and prednisolone on pruritus as evaluated by the owners was not significantly different (P=0.41) between the two groups. No serious side effects were observed. The conclusion was that CsA is an effective alternative to prednisolone therapy in cats with presumed atopic dermatitis.
Directory of Open Access Journals (Sweden)
T. Y. Alvin Liu
2018-01-01
Full Text Available A 37-year-old Caucasian woman presented with acute decrease in central vision in her right eye and was found to have subfoveal choroidal neovascularization (CNV) due to presumed ocular histoplasmosis syndrome (POHS). Her visual acuity improved from 20/70 to 20/20 at her 6-month follow-up, after 3 consecutive monthly intravitreal bevacizumab injections were initiated at her first visit. Although no CNV activity was seen on fluorescein angiography (FA) or spectral-domain optical coherence tomography (SD-OCT) at her 2-month, 4-month, and 6-month follow-up visits, persistent flow in the CNV lesion was detected on optical coherence tomography angiography (OCTA). OCTA shows persistent vascular flow as well as changes in vascular flow in CNV lesions associated with POHS, indicating the continued presence of patent vessels and changes in these CNV lesions, even when traditional imaging of the lesion with OCT and FA indicates stability of the lesion with no disease activity. Additional cases with longitudinal follow-up are needed to assess how OCTA should be incorporated into clinical practice.
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as
Residual Defect Density in Random Disks Deposits.
Topic, Nikola; Pöschel, Thorsten; Gallas, Jason A C
2015-08-03
We investigate the residual distribution of structural defects in very tall packings of disks deposited randomly in large channels. By performing simulations involving the sedimentation of up to 50 × 10^9 particles we find all deposits to consistently show a non-zero residual density of defects obeying a characteristic power-law as a function of the channel width. This remarkable finding corrects the widespread belief that the density of defects should vanish algebraically with growing height. A non-zero residual density of defects implies a type of long-range spatial order in the packing, as opposed to only local ordering. In addition, we find deposits of particles to involve considerably less randomness than generally presumed.
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
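As a rough sketch of the core computation behind such an app (my own illustration, not the BMIL implementation), mapping a probability to the two colored slice angles is a one-line proportion:

```python
def wheel_slices(p):
    """Central angles, in degrees, of the two colored slices for probability p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be in [0, 1]")
    return p * 360.0, (1.0 - p) * 360.0

print(wheel_slices(0.25))  # (90.0, 270.0)
```

In a real elicitation session the mapping runs both ways: the participant drags a slice boundary, and the app inverts the proportion to read off the implied probability.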
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.
* Good and solid introduction to probability theory and stochastic processes
* Logically organized; writing is presented in a clear manner
* Choice of topics is comprehensive within the area of probability
* Ample homework problems are organized into chapter sections
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Defining Probability in Sex Offender Risk Assessment.
Elwood, Richard W
2016-12-01
There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of targets locations and features.
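The ring design from Experiment 2 above can be sketched as a two-stage sampler: pick a circle with the stated probabilities, then a uniform angle on it. The three radii and the (0.1, 0.8, 0.1) split are illustrative assumptions; the abstract only states that one of three concentric circles held 80% of targets:

```python
import math
import random

random.seed(0)

def sample_target(radii=(50.0, 100.0, 150.0), probs=(0.1, 0.8, 0.1)):
    """Draw a target location: pick a ring, then a uniform angle on it.

    The radii and probability split are invented for illustration; the
    study only states that one circle contained 80% of targets.
    """
    r = random.choices(radii, weights=probs)[0]
    theta = random.uniform(0.0, 2.0 * math.pi)
    return r * math.cos(theta), r * math.sin(theta)

points = [sample_target() for _ in range(10_000)]
share_middle = sum(abs(math.hypot(x, y) - 100.0) < 1e-6
                   for x, y in points) / len(points)
print(share_middle)  # close to 0.8
```

Because the high-probability zone is a whole ring, two successive samples from it are rarely close together, which is what lets the design separate probability cuing from spatial repetition priming.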
Mechanisms Affecting Population Density in Fragmented Habitat
Directory of Open Access Journals (Sweden)
Lutz Tischendorf
2005-06-01
Full Text Available We conducted a factorial simulation experiment to analyze the relative importance of movement pattern, boundary-crossing probability, and mortality in habitat and matrix on population density, and its dependency on habitat fragmentation, as well as inter-patch distance. We also examined how the initial response of a species to a fragmentation event may affect our observations of population density in post-fragmentation experiments. We found that the boundary-crossing probability from habitat to matrix, which partly determines the emigration rate, is the most important determinant for population density within habitat patches. The probability of crossing a boundary from matrix to habitat had a weaker, but positive, effect on population density. Movement behavior in habitat had a stronger effect on population density than movement behavior in matrix. Habitat fragmentation and inter-patch distance may have a positive or negative effect on population density. The direction of both effects depends on two factors. First, when the boundary-crossing probability from habitat to matrix is high, population density may decline with increasing habitat fragmentation. Conversely, for species with a high matrix-to-habitat boundary-crossing probability, population density may increase with increasing habitat fragmentation. Second, the initial distribution of individuals across the landscape: we found that habitat fragmentation and inter-patch distance were positively correlated with population density when individuals were distributed across matrix and habitat at the beginning of our simulation experiments. The direction of these relationships changed to negative when individuals were initially distributed across habitat only. Our findings imply that the speed of the initial response of organisms to habitat fragmentation events may determine the direction of observed relationships between habitat fragmentation and population density. The time scale of post
Is probability of frequency too narrow?
International Nuclear Information System (INIS)
Martz, H.F.
1993-01-01
Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
Probability function of breaking-limited surface elevation [wind-generated waves of the ocean]
Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.
1989-01-01
The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking, ζ_b(t), is first related to the original wave elevation ζ(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for ζ(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of ζ_b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.
Directory of Open Access Journals (Sweden)
Morris Robin G
2008-10-01
Background: Presumed obligate carriers (POCs) are the first-degree relatives of people with schizophrenia who, although they do not exhibit the disorder, are in its direct lineage. This subpopulation of first-degree relatives could therefore provide very important information for the investigation of endophenotypes for schizophrenia, which could clarify the often contradictory findings in schizophrenia high-risk populations. To date, despite the extant literature on schizophrenia endophenotypes, we are aware of only one other study that examined the neural mechanisms underlying cognitive abnormalities in this group. The aim of this study was to investigate whether a more homogeneous group of relatives, such as POCs, have neural abnormalities that may be related to schizophrenia. Methods: We used functional magnetic resonance imaging (fMRI) to collect blood oxygenation level-dependent (BOLD) response data in six POCs and eight unrelated healthy controls performing under conditions of sustained, selective, and divided attention. Results: The POCs showed alterations in a widely distributed network of regions involved in attention processes, such as the prefrontal and temporal (including the parahippocampal gyrus) cortices, in addition to the anterior cingulate gyrus. More specifically, a general reduction in BOLD response was found in these areas compared to the healthy participants during attention processes. Conclusion: These preliminary findings of decreased activity in POCs indicate that this more homogeneous population of unaffected relatives shares similar neural abnormalities with people with schizophrenia, suggesting that reduced BOLD activity in the attention network may be an intermediate marker for schizophrenia.
Padula, Andrew M; Winkel, Kenneth D
2016-04-01
A fatal outcome of a presumed tiger snake (Notechis scutatus) envenomation in a cat is described. Detectable venom components and antivenom concentrations in serum (from clotted and centrifuged whole blood) and urine were measured using a sensitive and specific ELISA. The cat presented in a paralysed state with a markedly elevated serum CK but normal clotting times. The cat was treated with intravenous fluids and received two vials of equine whole-IgG bivalent (tiger and brown snake) antivenom. Despite treatment the cat's condition did not improve and it died 36 h post-presentation. The serum concentration of detectable tiger snake venom components at initial presentation was 311 ng/mL and the urine concentration 832 ng/mL; the serum level declined to non-detectable 15 min after intravenous antivenom. The urine concentration of detectable tiger snake venom components declined to 22 ng/mL at post-mortem. Measurement of equine anti-tiger-snake-venom specific antibody demonstrated a serum concentration of 7.2 Units/mL at post-mortem, which had declined from an initial high of 13 Units/mL at 15 min post-antivenom. The ELISA data demonstrated complete clearance of detectable venom components from serum with no recurrence in the post-mortem samples. Antivenom concentrations in serum at initial presentation were at least 100-fold higher than theoretically required to neutralise the circulating concentrations of venom. Despite the fatal outcome in this case, it was concluded that it was unlikely to be due to insufficient antivenom.
Directory of Open Access Journals (Sweden)
Mullié Catherine
2012-03-01
Background: A better anti-malarial efficacy and lower neurotoxicity have been reported for the (+)-enantiomer of mefloquine (MQ). However, the importance of stereoselectivity remains poorly understood, as the anti-malarial activity of pure-enantiomer MQ analogues has never been described. Building on these observations, a series of enantiopure 4-aminoalcohol quinoline derivatives was previously synthesized to optimize efficacy and reduce possible adverse effects. Their in vitro activity against the Plasmodium falciparum W2 and 3D7 strains is reported here, along with their inhibition of β-haematin formation and of the peroxidative degradation of haemin, two possible mechanisms of action of anti-malarial drugs. Results: The (S)-enantiomers of this series of 4-aminoalcohol quinoline derivatives were found to be at least as effective as both chloroquine (CQ) and MQ. The derivative with a 5-carbon side chain was the most efficient against both P. falciparum strains. (R)-enantiomers displayed an activity decreased by 2- to 15-fold compared to their (S) counterparts. The inhibition of β-haematin formation was significantly stronger with all tested compounds than with MQ, irrespective of stereochemistry. Similarly, the inhibition of haemin peroxidation was significantly higher for both the (S)- and (R)-enantiomers of derivatives with a side chain of five or six carbons than for MQ and CQ. Conclusions: The prominence of stereochemistry in the anti-malarial activity of 4-aminoalcohol quinoline derivatives is confirmed. The inhibition of β-haematin formation and haemin peroxidation can be put forward as presumed mechanisms of action but do not account for the stereoselectivity of action witnessed in vitro.
What probabilities tell about quantum systems, with application to entropy and entanglement
Myers, John M
2010-01-01
The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parameterized probabilities?”
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
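The single-integral form described in this abstract, evaluated with the trapezoidal rule, can be sketched as follows. The normal distributions and friction parameters below are illustrative assumptions, not the study's data (the paper's point is precisely that normality cannot be assumed automatically):

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def prob_of_slip(req_mu, req_sd, avail_mu, avail_sd, lo=0.0, hi=1.5, n=2000):
    """P(slip) = P(available < required) = integral over x of
    f_required(x) * F_available(x) dx, evaluated with the trapezoidal rule."""
    h = (hi - lo) / n
    ys = [norm_pdf(lo + i * h, req_mu, req_sd) * norm_cdf(lo + i * h, avail_mu, avail_sd)
          for i in range(n + 1)]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])

# Illustrative numbers only: required friction ~ N(0.3, 0.05), available ~ N(0.5, 0.1).
p = prob_of_slip(0.3, 0.05, 0.5, 0.1)
print(round(p, 4))  # matches the closed form Phi(-0.2 / sqrt(0.1**2 + 0.05**2)) ~ 0.037
```

Because the integrand takes arbitrary density and distribution functions, any combination of (possibly non-normal) available and required friction distributions can be substituted for the two normal curves used here.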
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...
Introducing Disjoint and Independent Events in Probability.
Kelly, I. W.; Zwiers, F. W.
Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Collective probabilities algorithm for surface hopping calculations
International Nuclear Information System (INIS)
Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto
2003-01-01
General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
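As background for the population-matching condition such hopping algorithms enforce, here is a minimal sketch of a fewest-switches-style hopping rule applied to an invented two-state population history. The linear population ramp and all numbers are toy assumptions for illustration; this is not the CP algorithm of the paper:

```python
import random

def hop_probability(pop_current, pop_target_prev, pop_target_now):
    """Fewest-switches-style transition probability for a two-state system:
    hop only when the target quantum population grows, with probability equal
    to the population gained divided by the current state's population."""
    if pop_current <= 0.0:
        return 0.0
    return min(1.0, max(0.0, (pop_target_now - pop_target_prev) / pop_current))

random.seed(0)
n_traj, n_steps = 5000, 100
in_state_2 = 0
for _ in range(n_traj):
    state = 1
    for k in range(1, n_steps + 1):
        # Toy quantum population of state 2 rising linearly from 0 to 0.4.
        p2_prev, p2_now = 0.4 * (k - 1) / n_steps, 0.4 * k / n_steps
        if state == 1 and random.random() < hop_probability(1.0 - p2_prev, p2_prev, p2_now):
            state = 2
    if state == 2:
        in_state_2 += 1
print(in_state_2 / n_traj)  # classical fraction in state 2 tracks the quantum population 0.4
```

With this rule the expected fraction of trajectories ending in state 2 equals the final quantum population, which is exactly the average-population equality the general equations in the abstract formalize.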
Examples of Neutrosophic Probability in Physics
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-01-01
This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents examples of determining the Neutrosophic Probability of the 1957 experiment of Chien-Shiung Wu et al., and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Some open problems in noncommutative probability
International Nuclear Information System (INIS)
Kruszynski, P.
1981-01-01
A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
47 CFR 1.1623 - Probability calculation.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Against All Odds: When Logic Meets Probability
van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.
2017-01-01
This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
The probability of the false vacuum decay
International Nuclear Information System (INIS)
Kiselev, V.; Selivanov, K.
1983-01-01
The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. A method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given
Probability: elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
The transition probabilities of the reciprocity model
Snijders, T.A.B.
1999-01-01
The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Measurement of probability distributions for internal stresses in dislocated crystals
Energy Technology Data Exchange (ETDEWEB)
Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)
2014-11-03
Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis, based on the so-called “restricted second moment of the probability distribution”, can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
Joint probability discrimination between stationary tissue and blood velocity signals
DEFF Research Database (Denmark)
Schlaikjer, Malene; Jensen, Jørgen Arendt
2001-01-01
This study presents a new statistical discriminator. Investigation of the RF-signals reveals that features can be derived that distinguish the segments of the signal which do and do not carry information on the blood flow. In this study 4 features have been determined: (a) the energy content in the segments before and after echo-canceling, and (b) the amplitude variations between samples in consecutive RF-signals before and after echo-canceling. The statistical discriminator was obtained by computing the probability density functions (PDFs) for each feature through histogram analysis of data. The discrimination is performed by determining the joint probability of the features for the segment under investigation and choosing the segment type that is most likely. The method was tested on simulated data resembling RF-signals from the carotid artery.
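The histogram-based joint-probability discrimination described in this record can be sketched as follows. The two Gaussian feature clouds are synthetic stand-ins for the paper's RF-signal features, and the features are treated as independent within each class for simplicity, an assumption the paper's joint PDFs need not make:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the two feature values per segment type
# (assumed data, not the paper's RF measurements).
tissue = rng.normal([0.0, 0.0], 1.0, size=(5000, 2))
blood = rng.normal([3.0, 2.0], 1.0, size=(5000, 2))

bins = np.linspace(-5.0, 8.0, 40)

def feature_pdfs(samples):
    # One histogram-estimated PDF per feature, as in the abstract.
    return [np.histogram(samples[:, j], bins=bins, density=True)[0] for j in range(2)]

pdfs = {"tissue": feature_pdfs(tissue), "blood": feature_pdfs(blood)}

def classify(x):
    """Choose the segment type with the larger joint probability of the features
    (features treated as independent within a class in this sketch)."""
    eps = 1e-12  # avoid zero-probability bins
    scores = {}
    for label, (p0, p1) in pdfs.items():
        idx = np.clip(np.digitize(x, bins) - 1, 0, len(p0) - 1)
        scores[label] = (p0[idx[0]] + eps) * (p1[idx[1]] + eps)
    return max(scores, key=scores.get)

print(classify(np.array([0.1, -0.2])), classify(np.array([2.9, 2.1])))
```

A segment is assigned to whichever class makes its observed feature pair most probable, which is the "choose the most likely segment type" rule of the abstract.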
The enigma of probability and physics
International Nuclear Information System (INIS)
Mayants, L.
1984-01-01
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. The EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophical literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
International Nuclear Information System (INIS)
Shimada, Yoshio
2000-01-01
It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability, and that the change differs depending on the initiating-event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1, and when Japanese or American initiating-event frequency data are used. The analysis showed: (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Miniati, M.; Pistolesi, M.
2001-01-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence
International Nuclear Information System (INIS)
Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del
2009-01-01
Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence are presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics of variability have also been observed in other physical systems characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and sea surface temperature fluctuations.
Upgrading Probability via Fractions of Events
Directory of Open Access Journals (Sweden)
Frič Roman
2016-08-01
The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
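The Łukasiewicz operations on events mentioned in this abstract can be illustrated concretely. The standard Łukasiewicz connectives on degrees in [0, 1] are sketched below; the abstract's full categorical construction is, of course, much richer than these pointwise formulas:

```python
def luk_not(a):
    """Łukasiewicz negation."""
    return 1.0 - a

def luk_and(a, b):
    """Łukasiewicz (strong) conjunction: max(0, a + b - 1)."""
    return max(0.0, a + b - 1.0)

def luk_or(a, b):
    """Łukasiewicz (strong) disjunction: min(1, a + b)."""
    return min(1.0, a + b)

# On crisp {0, 1} events these reduce to the Boolean operations...
assert luk_and(1, 1) == 1 and luk_and(1, 0) == 0
assert luk_or(0, 0) == 0 and luk_or(1, 0) == 1
# ...while "fractions" of events take any degree in [0, 1]:
print(luk_and(0.6, 0.7), luk_or(0.6, 0.7), luk_not(0.6))
```

Restricted to indicator functions the connectives agree with classical logic, which is the sense in which the upgraded theory covers classical probability as a special case.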
Failure probability analysis of optical grid
Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng
2008-11-01
Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analysing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied according to the respective application failure probabilities. In an optical grid, when an application modelled as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, achieving a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
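One way a task-based analysis can quantify the whole application's failure probability is sketched below, under the simplifying assumptions that tasks fail independently and that a backup is a single extra replica of a task. These assumptions, and all numbers, are illustrative; the paper's actual model and the MDSA algorithm are not reproduced here:

```python
def app_failure_probability(task_fail, backups=frozenset()):
    """Failure probability of a DAG application whose tasks fail independently:
    the application succeeds only if every task succeeds. A task listed in
    `backups` is replicated once, so it fails only if both replicas fail."""
    p_success = 1.0
    for i, p in enumerate(task_fail):
        p_eff = p * p if i in backups else p
        p_success *= 1.0 - p_eff
    return 1.0 - p_success

tasks = [0.01, 0.02, 0.05]  # hypothetical per-task failure probabilities
p_plain = app_failure_probability(tasks)
p_backed = app_failure_probability(tasks, backups={2})  # replicate the riskiest task
print(p_plain, p_backed)
```

Comparing `p_plain` and `p_backed` shows how backup strategies trade extra resource usage for a lower application failure probability, which is the comparison the abstract describes.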
Cowles, Anne; Beatty, William W; Nixon, Sara Jo; Lutz, Lanna J; Paulk, Jason; Paulk, Kayla; Ross, Elliott D
2003-12-01
Previous studies have described patients with possible or probable Alzheimer's disease (AD) who continued to play familiar songs skillfully, despite their dementias. There are no reports about patients with dementia who successfully learned to play new songs, and two papers describe failures of patients with AD to learn to play a new song although they continued to play familiar songs competently. In the present paper we describe a moderately demented patient (SL) with probable AD who learned to play a song (Cossackaya!) on the violin that was published after the apparent onset of his dementia. He showed modest retention of the song at delays of 0 and 10 minutes. This contrasts with his profound disturbance in both recall and recognition on other anterograde memory tests (word lists, stories, figures, environmental sounds, sounds of musical instruments), and marked impairment on measures of remote memory (famous faces, autobiographical memory). SL showed milder deficits in confrontation naming, verbal fluency and attention, but no dyspraxia or aphasic comprehension deficits. Except for the Block Design test, his visuospatial skills were intact. SL's learning of the new song in the absence of any evidence of episodic memory is reminiscent of patients with temporal lobe amnesia who show better memory for song melody than for lyrics or verse, although his retention was not as good.
Directory of Open Access Journals (Sweden)
Janet M. Wojcicki
2017-01-01
Full Text Available Leukocyte telomere length is shorter in response to chronic disease processes associated with inflammation, such as diabetes mellitus and coronary artery disease. Data from the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2002 were used to explore the relationship between leukocyte telomere length and presumed NAFLD, as indicated by elevated serum alanine aminotransferase (ALT) levels, obesity, or abdominal obesity. Logistic regression models were used to evaluate the relationship between telomere length and presumed markers of NAFLD, adjusting for possible confounders. There was no relationship between elevated ALT levels, abdominal obesity, or obesity and telomere length in adjusted models in NHANES (OR 1.13, 95% CI 0.48–2.65; OR 1.17, 95% CI 0.52–2.62, resp.). Mexican-American men had shorter telomere length in relation to presumed NAFLD (OR 0.07, 95% CI 0.006–0.79) and using different indicators of NAFLD (OR 0.012, 95% CI 0.0006–0.24). Men of Mexican origin with presumed NAFLD had shorter telomere length than men in other population groups. Longitudinal studies are necessary to evaluate the role of telomere length as a potential predictor of the pathogenesis of NAFLD in Mexicans.
Uncertainty about probability: a decision analysis perspective
International Nuclear Information System (INIS)
Howard, R.A.
1988-01-01
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it is further found that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
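The coin-tossing claim can be checked directly with a conjugate Beta prior over the definitive number (the long-run heads frequency). The specific prior below is an illustrative choice, not one taken from the paper.

```python
from fractions import Fraction

# Hypothetical prior over the "definitive number" p: Beta(a, b).
# The probability to assign to the next toss is the MEAN of this
# distribution (not the median), here a / (a + b).
a, b = Fraction(2), Fraction(2)        # symmetric but uncertain prior
p_heads = a / (a + b)                  # mean of Beta(2, 2) = 1/2
print(p_heads)                         # 1/2

# Observe one head: the posterior is Beta(a + 1, b), so the
# probability assigned to the next head strictly increases.
p_heads_after = (a + 1) / (a + b + 1)  # 3/5 > 1/2
print(p_heads_after)                   # 3/5
```

Only a degenerate prior (all mass on a single p, i.e. absolute certainty about the coin's fairness) leaves the assigned probability unchanged after seeing a head, which is exactly the abstract's point.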
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Dependency models and probability of joint events
International Nuclear Information System (INIS)
Oerjasaeter, O.
1982-08-01
Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities, for estimating the probability of joint events and event sequences is described. The applicability of this model is demonstrated through various examples. It is concluded that the described dependency model is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common-cause and time-dependent failure mechanisms are involved. (Auth.)
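The importance of conditional probabilities for joint events can be seen in a minimal common-cause example: assuming independence between two components grossly underestimates the joint failure probability when a shared failure mechanism exists. The model and numbers below are hypothetical, not the report's.

```python
# Hypothetical two-component system with a common-cause failure mode:
# each component can fail independently, and a shared shock fails both.
p_ind = 0.01    # independent failure probability per component
p_cc = 0.001    # probability of a common-cause shock failing both

# Marginal failure probability of one component (either mechanism).
p_a = p_ind + p_cc - p_ind * p_cc

# Naive joint estimate assuming independence: P(A)P(B).
p_joint_indep = p_a * p_a

# Correct joint probability: the shock occurs, or (no shock and)
# both components fail independently.
p_joint = p_cc + (1 - p_cc) * p_ind ** 2

print(p_joint_indep)   # ~1.2e-4
print(p_joint)         # ~1.1e-3, roughly an order of magnitude larger
```

The conditional form makes the dependency explicit: P(B fails | A fails) is far larger than P(B fails) once the common cause is modeled.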
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Probabilities on Streams and Reflexive Games
Directory of Open Access Journals (Sweden)
Andrew Schumann
2014-01-01
Full Text Available Probability measures on streams (e.g., on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and the corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of Gauss-Hermite quadrature for the complex probability function.
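The Gauss-Hermite approach mentioned above can be sketched in a few lines: the complex probability (Faddeeva) function w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt over the real line, valid for Im(z) > 0, is approximated by replacing the integral with the quadrature sum. The node count and test point below are illustrative choices, not the document's.

```python
import numpy as np

def w_gauss_hermite(z, n=40):
    """Approximate the complex probability function w(z) for Im(z) > 0
    with n-point Gauss-Hermite quadrature: the weight exp(-t^2) is
    absorbed by the rule, leaving the sum of w_k / (z - t_k)."""
    t, wts = np.polynomial.hermite.hermgauss(n)   # roots/weights of H_n
    return (1j / np.pi) * np.sum(wts / (z - t))

# Check against a known value: w(i) = exp(1) * erfc(1) ≈ 0.42758 (real).
approx = w_gauss_hermite(1j)
print(approx)
```

This also illustrates the method's main shortcoming noted in the abstract: the summand has a pole at t = z, so accuracy degrades as z approaches the real axis and n must grow accordingly.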
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, together with the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate multidimensional data analysis of the kind considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
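One concrete aggregation over distribution-valued cells is summation: for independent cells, the distribution of the sum is the convolution of the cell distributions, and the result can be stored once and reused as a pre-aggregate. This is an illustrative sketch of that idea, not the paper's algorithm or data model.

```python
from collections import defaultdict

def convolve(p, q):
    """Distribution of X + Y for independent discrete X ~ p and Y ~ q,
    each given as a {value: probability} mapping."""
    out = defaultdict(float)
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] += px * qy
    return dict(out)

# Two cells whose aggregate values are (illustrative) distributions.
a = {0: 0.5, 1: 0.5}
b = {0: 0.5, 1: 0.5}

# Pre-aggregate once; later roll-ups can convolve further cells into
# this stored result instead of recomputing from the base data.
ab = convolve(a, b)
print(ab)          # {0: 0.25, 1: 0.5, 2: 0.25}
```

The support of the exact convolution grows with every cell aggregated, which is why the paper's methods resort to approximate representations of the distributions.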
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters, and the distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
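The scale-parameter component of the identification problem can be demonstrated with a short Monte Carlo in the spirit of (but not reproducing) the article's simulations: two groups share the same structural effect on a latent outcome, yet their linear probability model slopes differ because the residual scale differs.

```python
import numpy as np

rng = np.random.default_rng(0)

def lpm_slope(scale, n=200_000, beta=1.0):
    """Binary outcome from a latent model y = 1{beta*x + e > 0} with
    e ~ N(0, scale^2); return the OLS slope of the linear probability
    model y ~ x.  All parameter values here are illustrative."""
    x = rng.normal(size=n)
    e = rng.normal(scale=scale, size=n)
    y = (beta * x + e > 0).astype(float)
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Identical structural effect (beta = 1) in both groups; only the
# residual scale differs -- yet the LPM slopes diverge markedly.
slope_g1 = lpm_slope(scale=1.0)   # theory: 1/sqrt(2*pi*2)  ≈ 0.282
slope_g2 = lpm_slope(scale=3.0)   # theory: 1/sqrt(2*pi*10) ≈ 0.126
print(slope_g1, slope_g2)
```

A naive comparison would conclude the effect is more than twice as large in group 1, even though beta is identical, illustrating why cross-group LPM coefficient comparisons need the restrictions the article proposes.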