Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice prob...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
PROBABILITY MODEL OF GUNTHER GENERATOR
Institute of Scientific and Technical Information of China (English)
[No author listed]
2007-01-01
This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent, uniformly distributed 0-1 random variable sequence. This provides a theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.
Generating pseudo-random discrete probability distributions
Energy Technology Data Exchange (ETDEWEB)
Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica
2015-08-15
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described.
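The abstract names the iid, normalization, and trigonometric methods without giving details. As a hedged sketch (my own illustration, not the paper's code — the correspondence to its "normalization" method is a guess), one standard way to draw an unbiased probability vector, i.e. one uniform on the simplex, is to normalize iid Exponential(1) draws, which yields the flat Dirichlet distribution:

```python
import numpy as np

def random_prob_vector(d, rng):
    """Sample a probability vector uniformly on the (d-1)-simplex.

    Normalizing iid Exponential(1) draws gives the flat Dirichlet
    distribution; naively normalizing iid Uniform(0,1) draws does not.
    """
    e = rng.exponential(1.0, size=d)
    return e / e.sum()

rng = np.random.default_rng(0)
p = random_prob_vector(4, rng)
print(p, p.sum())  # entries are nonnegative and sum to 1
```

Note the contrast with the biased alternative: dividing iid uniforms by their sum concentrates mass away from the simplex corners.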
Conditional Probability Analyses of the Spike Activity of Single Neurons
Gray, Peter R.
1967-01-01
With the objective of separating stimulus-related effects from refractory effects in neuronal spike data, various conditional probability analyses have been developed. These analyses are introduced and illustrated with examples based on electrophysiological data from auditory nerve fibers. The conditional probability analyses considered here involve the estimation of the conditional probability of a firing in a specified time interval (defined relative to the time of the stimulus presentation), given that the last firing occurred during an earlier specified time interval. This calculation enables study of the stimulus-related effects in the spike data with the time-since-the-last-firing as a controlled variable. These calculations indicate that auditory nerve fibers “recover” from the refractory effects that follow a firing in the following sense: after a “recovery time” of approximately 20 msec, the firing probabilities no longer depend on the time-since-the-last-firing. Probabilities conditional on this minimum time since the last firing are called “recovered probabilities.” The recovered probabilities presented in this paper are contrasted with the corresponding poststimulus time histograms, and the differences are related to the refractory properties of the nerve fibers. PMID: 19210997
Estimating probable flaw distributions in PWR steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)
1997-02-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
Generating Probability Distributions using Multivalued Stochastic Relay Circuits
Lee, David
2011-01-01
The problem of random number generation dates back to von Neumann's work in 1951. Since then, many algorithms have been developed for generating unbiased bits from complex correlated sources as well as for generating arbitrary distributions from unbiased bits. An equally interesting, but less studied aspect is the structural component of random number generation as opposed to the algorithmic aspect. That is, given a network structure imposed by nature or physical devices, how can we build networks that generate arbitrary probability distributions in an optimal way? In this paper, we study the generation of arbitrary probability distributions in multivalued relay circuits, a generalization in which relays can take on any of N states and the logical 'and' and 'or' are replaced with 'min' and 'max' respectively. Previous work was done on two-state relays. We generalize these results, describing a duality property and networks that generate arbitrary rational probability distributions. We prove that these network...
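A minimal sketch of the min/max composition described above (my own toy enumeration, not the paper's network constructions): the output distribution of two independent N-state relays combined in series (min) or in parallel (max) can be computed by enumerating joint states:

```python
import itertools
import numpy as np

def compose(pmf_a, pmf_b, op):
    """Output pmf of two independent N-state relays combined with `op`
    (min models a series connection, max a parallel connection)."""
    n = len(pmf_a)
    out = np.zeros(n)
    for i, j in itertools.product(range(n), range(n)):
        out[op(i, j)] += pmf_a[i] * pmf_b[j]
    return out

fair = np.ones(3) / 3              # a three-state relay, uniform over {0,1,2}
series = compose(fair, fair, min)  # pmf [5/9, 3/9, 1/9]
parallel = compose(fair, fair, max)
print(series, parallel)
```

For two-state relays this reduces to the familiar p² (series) and 1-(1-p)² (parallel) closure probabilities.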
Analysing uncertainties: Towards comparing Bayesian and interval probabilities
Blockley, David
2013-05-01
Two assumptions, commonly made in risk and reliability studies, have a long history. The first is that uncertainty is either aleatoric or epistemic. The second is that standard probability theory is sufficient to express uncertainty. The purposes of this paper are to provide a conceptual analysis of uncertainty and to compare Bayesian approaches with interval approaches, with an example relevant to research on climate change. The analysis reveals that the categorisation of uncertainty as either aleatoric or epistemic is unsatisfactory for practical decision making. It is argued that uncertainty emerges from three conceptually distinctive and orthogonal attributes (FIR): fuzziness, incompleteness (epistemic), and randomness (aleatory). Characterisations of uncertainty, such as ambiguity, dubiety and conflict, are complex mixes of interactions in an FIR space. To manage future risks in complex systems it will be important to recognise the extent to which we 'don't know' about possible unintended and unwanted consequences, or unknown-unknowns. In this way we may be more alert to unexpected hazards. The Bayesian approach is compared with an interval probability approach to show one way in which conflict due to incomplete information can be managed.
Unturbe, Jesús; Corominas, Josep
2007-09-01
Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically dispersed manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes the SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
Sharp Bounds by Probability-Generating Functions and Variable Drift
DEFF Research Database (Denmark)
Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten
2011-01-01
We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al. (G...
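To illustrate the probability-generating-function technique in its simplest form (a textbook identity, not the paper's drift argument): the mean of a distribution is the derivative of its PGF at s = 1, here checked numerically for a geometric variable:

```python
def geometric_pgf(s, p):
    """PGF G(s) = E[s^X] of a geometric variable X in {1, 2, ...}
    with success probability p: G(s) = p*s / (1 - (1-p)*s)."""
    return p * s / (1.0 - (1.0 - p) * s)

p = 0.25
h = 1e-6
# G'(1) = E[X]; for this geometric, E[X] = 1/p = 4
mean = (geometric_pgf(1.0, p) - geometric_pgf(1.0 - h, p)) / h
print(mean)
```

Higher moments follow from higher derivatives at s = 1, which is what makes PGFs a compact bookkeeping device for runtime distributions.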
Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.
Directory of Open Access Journals (Sweden)
Amber M Sprenger
2011-06-01
We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and at judgment should affect the degree to which participants' probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in subadditivity in later probability judgments made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.
Marewski, Julian N; Hoffrage, Ulrich
2013-06-01
A lot of research in cognition and decision making suffers from a lack of formalism. The quantum probability program could help to improve this situation, but we wonder whether it would provide even more added value if its presumed focus on outcome models were complemented by process models that are, ideally, informed by ecological analyses and integrated into cognitive architectures.
Institute of Scientific and Technical Information of China (English)
[No author listed]
2009-01-01
[Usage examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps", it indicates a high likelihood, usually a positive inference or judgment made on the basis of the current situation.
The relative impact of sizing errors on steam generator tube failure probability
Energy Technology Data Exchange (ETDEWEB)
Cizelj, L.; Dvorsek, T. [Jozef Stefan Inst., Ljubljana (Slovenia)
1998-07-01
Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting steam generator tubes made of Inconel 600. This has driven the development and licensing of degradation-specific maintenance approaches, which address the two main failure modes of the degraded piping: tube rupture and excessive leakage through degraded tubes. A methodology for assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out the better performance of degradation-specific over generic approaches in terms of (1) a lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates, and (3) fewer tubes plugged. A sensitivity analysis was also performed, pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent in the regression models used to correlate defect size and tube burst pressure. The uncertainties that can be estimated from in-service inspections are analysed further in this paper. The defect growth was found to have a significant, and to some extent unrealistic, impact on the probability of single tube rupture. Since the defect growth estimates were based on past inspection records, they depend strongly on the sizing errors. Therefore, an attempt was made to filter out the sizing errors and to arrive at more realistic estimates of the defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used were obtained from a series of inspection results from the Krsko NPP, which has two Westinghouse D-4 steam generators. The results obtained are considered useful in the safety assessment and maintenance of affected steam generators.
Demand and choice probability generating functions for perturbed consumers
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2011-01-01
generating function to be consistent with utility maximization. Within a budget, the convex hull of the demand correspondence is the subdifferential of the demand generating function. The additive random utility discrete choice model (ARUM) is a special case with finite budget sets where utility...
Finite-order universal portfolios generated by probability mass functions
Tan, Choon Peng; Chu, Sin Yen; Pan, Wei Yeing
2015-05-01
It is shown that the finite-order universal portfolios generated by independent discrete random variables are constant rebalanced portfolios. The case where the universal portfolios are generated by the moments of the joint Dirichlet distribution is studied. The performance of the low-order Dirichlet universal portfolios on a stock-price data set is analyzed. It is demonstrated that their performance is comparable to, and in some cases outperforms, the moving-order Cover-Ordentlich universal portfolios, with faster implementation time and higher wealth achieved.
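A constant rebalanced portfolio, mentioned above, multiplies wealth each period by the inner product of the fixed weight vector with that period's price relatives. A minimal sketch with hypothetical data (not the paper's stock-price data set):

```python
import numpy as np

def crp_wealth(b, X):
    """Final wealth of a constant rebalanced portfolio b over a
    sequence of price-relative vectors X (one row per period)."""
    return float(np.prod(X @ b))

# toy market: two assets; the second doubles or halves each period
X = np.array([[1.0, 2.0],
              [1.0, 0.5],
              [1.0, 2.0],
              [1.0, 0.5]])
b = np.array([0.5, 0.5])   # rebalance to 50/50 before every period
w = crp_wealth(b, X)
print(w)  # each up/down pair multiplies wealth by 1.5 * 0.75 = 1.125
```

This volatile toy market is the classic setting in which rebalancing grows wealth while buy-and-hold on either asset alone does not.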
Inverse probability of censoring weighted estimates of Kendall's τ for gap time analyses.
Lakhal-Chaieb, Lajmi; Cook, Richard J; Lin, Xihong
2010-12-01
In life history studies, interest often lies in the analysis of the interevent, or gap times and the association between event times. Gap time analyses are challenging however, even when the length of follow-up is determined independently of the event process, because associations between gap times induce dependent censoring for second and subsequent gap times. This article discusses nonparametric estimation of the association between consecutive gap times based on Kendall's τ in the presence of this type of dependent censoring. A nonparametric estimator that uses inverse probability of censoring weights is provided. Estimates of conditional gap time distributions can be obtained following specification of a particular copula function. Simulation studies show the estimator performs well and compares favorably with an alternative estimator. Generalizations to a piecewise constant Clayton copula are given. Several simulation studies and illustrations with real data sets are also provided.
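As a hedged sketch of the pair-counting idea (the paper's actual IPCW estimator is more involved and handles dependent censoring), Kendall's τ can be computed over pairs of observations, with an optional per-pair weight slot where inverse-probability-of-censoring weights would enter:

```python
from itertools import combinations

def kendall_tau(x, y, w=None):
    """Kendall's tau by pair counting; `w` is an optional per-subject
    weight vector (e.g. inverse censoring probabilities, as a sketch)."""
    n = len(x)
    num = den = 0.0
    for i, j in combinations(range(n), 2):
        wij = 1.0 if w is None else w[i] * w[j]
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s != 0:                       # skip tied pairs
            num += wij * (1 if s > 0 else -1)
            den += wij
    return num / den

tau = kendall_tau([1, 2, 3, 4], [1, 3, 2, 4])
print(tau)  # 5 concordant, 1 discordant pair -> (5 - 1) / 6
```

In the gap-time setting, the weights downweight pairs that are unlikely to be fully observed, restoring unbiasedness under censoring.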
Directory of Open Access Journals (Sweden)
Gogarten J Peter
2002-02-01
Background: Horizontal gene transfer (HGT) played an important role in shaping microbial genomes. In addition to genes under sporadic selection, HGT also affects housekeeping genes and those involved in information processing, even ribosomal RNA encoding genes. Here we describe tools that provide an assessment and graphic illustration of the mosaic nature of microbial genomes. Results: We adapted Maximum Likelihood (ML) mapping to the analyses of all detected quartets of orthologous genes found in four genomes. We have automated the assembly and analyses of these quartets of orthologs given a selection of four genomes. We compared the ML-mapping approach to the more rigorous Bayesian probability and Bootstrap mapping techniques. The latter two approaches appear to be more conservative than ML-mapping, but qualitatively all three approaches give equivalent results. All three tools were tested on mitochondrial genomes, which presumably were inherited as a single linkage group. Conclusions: In some instances of interphylum relationships we find nearly equal numbers of quartets strongly supporting the three possible topologies. In contrast, our analyses of genome quartets containing the cyanobacterium Synechocystis sp. indicate that a large part of the cyanobacterial genome is related to that of low-GC Gram positives. Other groups that had been suggested as sister groups to the cyanobacteria contain many fewer genes that group with the Synechocystis orthologs. Interdomain comparisons of genome quartets containing the archaeon Halobacterium sp. revealed that Halobacterium sp. shares more genes with Bacteria that live in the same environment than with Bacteria that are more closely related based on rRNA phylogeny. Many of these genes encode proteins involved in substrate transport and metabolism and in information storage and processing. The performed analyses demonstrate that relationships among prokaryotes cannot be accurately
Banking on a bad bet. Probability matching in risky choice is linked to expectation generation.
James, Greta; Koehler, Derek J
2011-06-01
Probability matching is the tendency to match choice probabilities to outcome probabilities in a binary prediction task. This tendency is a long-standing puzzle in the study of decision making under risk and uncertainty, because always predicting the more probable outcome across a series of trials (maximizing) would yield greater predictive accuracy and payoffs. In three experiments, we tied the predominance of probability matching over maximizing to a generally adaptive cognitive operation that generates expectations regarding the aggregate outcomes of an upcoming sequence of events. Under conditions designed to diminish the generation or perceived applicability of such expectations, we found that the frequency of probability-matching behavior dropped substantially and maximizing became the norm.
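The accuracy gap between matching and maximizing follows from a one-line expected-value computation; a minimal sketch, assuming a stationary binary outcome whose favored side occurs with known probability p:

```python
def expected_accuracy(p, strategy):
    """Expected one-trial accuracy when the favored outcome occurs
    with probability p (assume p >= 0.5)."""
    if strategy == "maximize":
        return p                           # always predict the favored outcome
    if strategy == "match":
        return p * p + (1 - p) * (1 - p)   # predict each side at its own rate
    raise ValueError(strategy)

p = 0.75
print(expected_accuracy(p, "maximize"), expected_accuracy(p, "match"))
# maximizing beats matching whenever p != 0.5
```

With p = 0.75 the gap is 0.75 versus 0.625, which is why persistent matching across many trials is costly.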
Energy Technology Data Exchange (ETDEWEB)
Hall, Jim W.; Lawry, Jonathan
2004-09-01
Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM.
Fortran code for generating random probability vectors, unitaries, and quantum states
Directory of Open Access Journals (Sweden)
Jonas Maziero
2016-03-01
The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
Minimization of Handoff Failure Probability for Next-Generation Wireless Systems
Sarddar, Debabrata; Saha, Souvik Kumar; Banerjee, Joydeep; Biswas, Utpal; Naskar, M K; DOI: 10.5121/ijngn.2010.2204
2010-01-01
During the past few years, advances in mobile communication theory have enabled the development and deployment of different wireless technologies, complementary to each other. Hence, their integration can realize a unified wireless system that has the best features of the individual networks. Next-Generation Wireless Systems (NGWS) integrate different wireless systems, each of which is optimized for some specific services and coverage area, to provide ubiquitous communications to mobile users. In this paper, we propose to enhance the handoff performance of mobile IP in wireless IP networks by reducing the false handoff probability in the NGWS handoff management protocol. Based on the information on false handoff probability, we analyze the effect of mobile speed and handoff signaling delay on it.
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
Fortran code for generating random probability vectors, unitaries, and quantum states
Maziero, Jonas
2015-01-01
The usefulness of generating random configurations is recognized in a variety of contexts, as for instance in the simulation of physical systems, in the verification of bounds and/or ansatz solutions for optimization problems, and in secure communications. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and the several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
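One widely used construction for the random unitaries mentioned here is QR decomposition of a complex Ginibre matrix with a phase correction on R's diagonal so the result is Haar distributed. This is a Python sketch of that standard method, not the authors' Fortran code:

```python
import numpy as np

def random_unitary(d, rng):
    """Haar-distributed d x d unitary via QR of a Ginibre matrix,
    with the diagonal-phase fix described by Mezzadri (2007)."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    lam = np.diag(r).copy()
    q *= lam / np.abs(lam)   # rescale each column by the phase of R's diagonal
    return q

rng = np.random.default_rng(7)
U = random_unitary(3, rng)
print(np.allclose(U.conj().T @ U, np.eye(3)))  # True: U is unitary
```

Without the phase fix, plain QR output is not Haar distributed because the decomposition is only unique up to diagonal phases.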
Statistical inference of the generation probability of T-cell receptors from sequence repertoires.
Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G
2012-10-02
Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2016-06-14
A false positive is the mistake of inferring an effect when none exists, and although α controls the false positive (Type I error) rate in classical hypothesis testing, a given α value is accurate only if the underlying model of randomness appropriately reflects experimentally observed variance. Hypotheses pertaining to one-dimensional (1D) (e.g. time-varying) biomechanical trajectories are most often tested using a traditional zero-dimensional (0D) Gaussian model of randomness, but variance in these datasets is clearly 1D. The purpose of this study was to determine the likelihood that analyzing smooth 1D data with a 0D model of variance will produce false positives. We first used random field theory (RFT) to predict the probability of false positives in 0D analyses. We then validated RFT predictions via numerical simulations of smooth Gaussian 1D trajectories. Results showed that, across a range of public kinematic, force/moment and EMG datasets, the median false positive rate was 0.382 and not the assumed α=0.05, even for a simple two-sample t test involving N=10 trajectories per group. The median false positive rate for experiments involving three-component vector trajectories was p=0.764. This rate increased to p=0.945 for two three-component vector trajectories, and to p=0.999 for six three-component vectors. This implies that experiments involving vector trajectories have a high probability of yielding 0D statistical significance when there is, in fact, no 1D effect. Either (a) explicit a priori identification of 0D variables or (b) adoption of 1D methods can more tightly control α.
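The inflation of the family-wise error rate can be reproduced in a small simulation. This is a rough sketch using moving-average smoothing and a fixed pooled-variance critical value, not the paper's random field theory machinery, and the exact rate depends on the smoothness assumed:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, sims = 10, 101, 200
crit = 2.1009   # two-tailed t critical value, alpha = 0.05, df = 18

def smooth(x, k=11):
    """Moving-average smoothing along the last axis (crude 1D smoothness)."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), -1, x)

hits = 0
for _ in range(sims):
    a = smooth(rng.standard_normal((N, T)))   # two groups, no true effect
    b = smooth(rng.standard_normal((N, T)))
    t = (a.mean(0) - b.mean(0)) / np.sqrt(a.var(0, ddof=1)/N + b.var(0, ddof=1)/N)
    hits += np.any(np.abs(t) > crit)          # any pointwise "significant" result?
print(hits / sims)  # well above the nominal 0.05
```

Each of the ~101 correlated pointwise tests gets its own chance at a false positive, so declaring significance whenever any point crosses the 0D threshold inflates the error rate far beyond α.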
On the probability of exceeding allowable leak rates through degraded steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Cizelj, L.; Sorsek, I. [Jozef Stefan Institute, Ljubljana (Slovenia); Riesch-Oppermann, H. [Forschungszentrum Karlsruhe (Germany)
1997-02-01
This paper discusses some possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds a predefined acceptable value. The probabilistic approach aims at a reliable and computationally tractable estimate of the failure probability. A closed-form solution is given for the special case of exponentially distributed individual leak rates. Also, some possibilities for the use of computationally efficient First- and Second-Order Reliability Methods (FORM and SORM) are discussed. The first numerical example compares the results of the approximate methods with the closed-form results; SORM in particular shows acceptable agreement. The second numerical example considers a realistic case from the NPP in Krsko, Slovenia.
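For the special case of iid exponentially distributed individual leak rates mentioned above, the total leak is Erlang distributed, so the exceedance probability has a closed form. A sketch with hypothetical parameters (not the paper's plant data), checked against Monte Carlo:

```python
import math
import numpy as np

def p_total_leak_exceeds(n, lam, L):
    """P(sum of n iid Exponential(rate=lam) leak rates > L):
    the Erlang(n, lam) survival function."""
    x = lam * L
    return math.exp(-x) * sum(x**k / math.factorial(k) for k in range(n))

n, lam, L = 5, 2.0, 4.0          # hypothetical: 5 leaking tubes, limit L
exact = p_total_leak_exceeds(n, lam, L)

rng = np.random.default_rng(3)   # Monte Carlo cross-check
mc = (rng.exponential(1/lam, size=(200_000, n)).sum(axis=1) > L).mean()
print(exact, mc)                 # the two estimates agree closely
```

The closed form is what makes this special case a useful benchmark for the approximate FORM/SORM estimates discussed in the abstract.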
A generative probability model of joint label fusion for multi-atlas based brain segmentation.
Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang
2014-08-01
Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate possible mis-alignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on simple patch similarity, and thus do not necessarily provide an optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, with the goal of labeling each point in the target image by the most representative atlas patches that also have the largest unanimity in labeling the underlying point correctly. Specifically, a sparsity constraint is imposed upon the label fusion weights in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risk of including misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies are further recursively updated based on the latest labeling results to correct possible labeling errors, which falls into the Expectation Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved in comparison with conventional patch-based labeling
Making Heads or Tails of Probability: An Experiment with Random Generators
Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie
2013-01-01
Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…
Generation of Transition Probability Data: Can quantity and quality be balanced?
Curry, J. J.; Froese Fisher, C.
2008-10-01
The possibility of truly predictive plasma modeling rests on the availability of large quantities of accurate atomic and molecular data. These include a variety of collision cross-sections and radiative transition data. An example of current interest concerns radiative transition probabilities for neutral Ce, an additive in highly-efficient metal-halide lamps. Transition probabilities have been measured for several hundred lines (Bisson et al., JOSA B 12, 193, 1995 and Lawler et al., unpublished), but the number of observed and classified transitions in the range of 340 nm to 1 μm is in excess of 21,000 (Martin, unpublished). Since the prospect for measuring more than a thousand or so of these transitions is rather low, an important question is whether calculation can adequately fill the void. In this case, we are interested only in electric dipole transitions. Furthermore, we require only that the transition probabilities have an average accuracy of ~20%. We will discuss our efforts to calculate a comprehensive set of transition probabilities for neutral Ce using the Cowan (The Theory of Atomic Structure and Spectra, 1981) and GRASP (Jönsson et al., Comput. Phys. Commun. 176, 559-579, 2007) codes. We will also discuss our efforts to quantify the accuracy of the results.
Ivana, Pavol; Ivanova, Marika
2016-01-01
The Maxwell differential form of Faraday's law erroneously predicts that, in a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in the Faraday homopolar generator, so that an electric intensity should be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator which proves that the movement of an electrically neutral conductor along the radials of a homogeneous magnetic field does not induce any voltage. A new description of the operation of the Faraday (brushed) homopolar generator is presented here, as equipment that satisfies the necessary and sufficient condition for induction to arise, whereas the brushless device meets only a necessary condition, not a sufficient one. This article includes a mathematical analysis showing that the current differential concept of the creation of the rotational intensity vector is an incorrect theoretical notion with minimal impact on the design of kno...
Hu, Yaogang; Li, Hui; Liao, Xinglin; Song, Erbing; Liu, Haitao; Chen, Z.
2016-08-01
This study determines the early deterioration condition of critical components of a wind turbine generator system (WTGS). Because of the uncertain, fluctuating, and intermittent nature of wind, early deterioration evaluation poses a challenge to traditional vibration-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indication are not easily disturbed by uncontrollable noise or the uncertainty of wind. This paper provides a probability evaluation method for the early deterioration condition of critical components based only on temperature characteristic parameters. First, a dynamic threshold for the deterioration degree function is proposed by analyzing operational data relating temperature and rotor speed. Second, a probability evaluation method for the early deterioration condition is presented. Finally, two cases show the validity of the proposed method in detecting early deterioration and in tracking further deterioration of the critical components.
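The dynamic-threshold idea can be illustrated with a toy sketch: fit a healthy-state temperature-versus-rotor-speed baseline, then score new observations by a one-sided Gaussian tail probability. The linear baseline and Gaussian residual model are simplifying assumptions for illustration, not the paper's exact formulation:

```python
from math import erf, sqrt
import numpy as np

def fit_baseline(speed, temp):
    """Fit a healthy-state linear temperature~speed model;
    return coefficients and the residual standard deviation."""
    X = np.c_[np.ones_like(speed), speed]
    coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
    resid = temp - X @ coef
    return coef, resid.std(ddof=2)

def deterioration_probability(speed, temp, coef, sigma):
    """Probability that the observed temperature exceeds the
    speed-dependent (dynamic) baseline, via a Gaussian CDF of the
    standardized residual: values near 1 flag deterioration."""
    expected = coef[0] + coef[1] * speed
    z = (temp - expected) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

The threshold is "dynamic" in the sense that the expected temperature, and hence the alarm level, moves with rotor speed rather than being a fixed constant.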
DEFF Research Database (Denmark)
Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo
2015-01-01
is the dependency of the parameter estimates from the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...... extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preferences data, which consist of actual route choices...... collected in Cagliari, Italy, shows the feasibility of generating routes stochastically in a high-resolution network and calculating the correction factor. The model estimation with and without correction illustrates how the correction not only improves the goodness of fit but also turns illogical signs...
PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL
Energy Technology Data Exchange (ETDEWEB)
Bovy, Jo; Hogg, David W.; Weaver, Benjamin A. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Hennawi, Joseph F. [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); McMahon, Richard G. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Schiminovich, David [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Sheldon, Erin S. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Brinkmann, Jon [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Schneider, Donald P., E-mail: jo.bovy@nyu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States)
2012-04-10
We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques, which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data, and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
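The density-integration step can be sketched by letting a small two-component Gaussian mixture stand in for the extreme-deconvolution density p(flux, z) and normalizing over a redshift grid. The one-dimensional "flux" and all mixture parameters below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def redshift_posterior(flux, weights, means, covs, z_grid):
    """p(z | flux) on a grid, from a joint 2D mixture density p(flux, z).
    Each component k has weight w_k, mean (mu_flux, mu_z), 2x2 covariance."""
    dens = np.zeros_like(z_grid, dtype=float)
    for w, mu, C in zip(weights, means, covs):
        # evaluate the 2D Gaussian along the slice flux = const
        x = np.stack([np.full_like(z_grid, flux), z_grid], axis=-1) - mu
        Cinv = np.linalg.inv(C)
        norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(C)))
        dens += w * norm * np.exp(-0.5 * np.einsum('ni,ij,nj->n', x, Cinv, x))
    # normalize the flux slice over redshift to obtain the posterior
    return dens / np.trapz(dens, z_grid)
```

Integrating the same joint density over z in a fixed redshift window (instead of normalizing) would give the quasar flux density in that window, as described in the abstract.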
Photometric redshifts and quasar probabilities from a single, data-driven generative model
Energy Technology Data Exchange (ETDEWEB)
Bovy, Jo [New York Univ. (NYU), NY (United States); Myers, Adam D. [Univ. of Wyoming, Laramie, WY (United States); Max Planck Inst. for Medical Research, Heidelberg (Germany); Hennawi, Joseph F. [Max Planck Inst. for Medical Research, Heidelberg (Germany); Hogg, David W. [Max Planck Inst. for Medical Research, Heidelberg (Germany); New York Univ. (NYU), NY (United States); McMahon, Richard G. [Univ. of Cambridge (United Kingdom); Schiminovich, David [Columbia Univ., New York, NY (United States); Sheldon, Erin S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brinkmann, Jon [Apache Point Observatory and New Mexico State Univ., Sunspot, NM (United States); Schneider, Donald P. [Pennsylvania State Univ., University Park, PA (United States); Weaver, Benjamin A. [New York Univ. (NYU), NY (United States)
2012-03-20
We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach
Directory of Open Access Journals (Sweden)
M. Srinivas
1996-01-01
Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
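A minimal real-coded GA of the kind described can be sketched as follows, here maximizing a toy testability cost for a 3-input AND gate: the output-toggle probability q(1-q), where q is the product of the input signal probabilities, is maximized when q = 0.5. The operators, parameters, and cost function are illustrative assumptions, not those of the paper:

```python
import random

def ga_optimize(cost, n_inputs, pop_size=30, gens=60, pmut=0.2, seed=1):
    """Maximize cost(probs) over per-input signal probabilities in [0, 1]
    with a basic real-coded GA: binary tournament selection, uniform
    crossover, and clipped Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_inputs)] for _ in range(pop_size)]

    def tourney():
        a, b = rng.sample(pop, 2)
        return a if cost(a) >= cost(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tourney(), tourney()
            # uniform crossover: each gene taken from either parent
            child = [p1[i] if rng.random() < 0.5 else p2[i]
                     for i in range(n_inputs)]
            for i in range(n_inputs):
                if rng.random() < pmut:  # Gaussian mutation, clipped to [0, 1]
                    child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            nxt.append(child)
        pop = nxt
    return max(pop, key=cost)
```

In the paper's setting the cost would instead come from the COP testability measure of a benchmark circuit; the GA machinery is unchanged.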
DEFF Research Database (Denmark)
Hu, Y.; Li, H.; Liao, X.;
2016-01-01
This study determines the early deterioration condition of critical components of a wind turbine generator system (WTGS). Because of the uncertain, fluctuating, and intermittent nature of wind, early deterioration evaluation poses a challenge to traditional vibration......-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indication are not easily disturbed by uncontrollable noise or the uncertainty of wind. This paper provides a probability evaluation...... method for the early deterioration condition of critical components based only on temperature characteristic parameters. First, a dynamic threshold for the deterioration degree function is proposed by analyzing operational data relating temperature and rotor speed. Second, a probability evaluation method...
Influence of disorder on generation and probability of extreme events in Salerno lattices
Mančić, A.; Maluckov, A.; Hadžievski, Lj.
2017-03-01
Extreme events (EEs) in nonlinear and/or disordered one-dimensional photonic lattice systems described by the Salerno model with on-site disorder are studied. The goal is to explain particular properties of these phenomena, essentially related to localization of light in the presence of nonlinear and/or nonlocal couplings in the considered systems. Combining statistical and nonlinear dynamical methods and measures developed in the framework of the theory of localization phenomena in disordered and nonlinear systems, particularities of EEs are qualitatively clarified. Findings presented here indicate that the best environment for the creation of EEs is provided by disordered near-integrable Salerno lattices. In addition, it has been shown that the leading role in the generation and dynamical properties of EEs in the considered model is played by modulation instability, i.e., by nonlinearities in the system, although EEs can be induced in linear lattices with on-site disorder too.
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood of the emission factors and energy efficiencies, as random variables, taking on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life
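The generation-mix weighting implied above is a simple share-weighted average of per-technology emission factors (g/kWh). The technology shares and factor values below are illustrative placeholders, not GREET values:

```python
def mix_emission_factor(mix_shares, emission_factors):
    """Average emission factor (g/kWh) of an electricity mix:
    sum over technologies of (generation share * technology EF).
    mix_shares and emission_factors are dicts keyed by technology."""
    assert abs(sum(mix_shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(share * emission_factors[tech]
               for tech, share in mix_shares.items())
```

In GREET this step is repeated per pollutant and per region, with the EFs themselves drawn from fitted PDFs rather than point values.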
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Mazurana, Dyan; Benelli, Prisca; Walker, Peter
2013-07-01
Humanitarian aid remains largely driven by anecdote rather than by evidence. The contemporary humanitarian system has significant weaknesses with regard to data collection, analysis, and action at all stages of response to crises involving armed conflict or natural disaster. This paper argues that humanitarian actors can best determine and respond to vulnerabilities and needs if they use sex- and age-disaggregated data (SADD) and gender and generational analyses to help shape their assessments of crises-affected populations. Through case studies, the paper shows how gaps in information on sex and age limit the effectiveness of humanitarian response in all phases of a crisis. The case studies serve to show how proper collection, use, and analysis of SADD enable operational agencies to deliver assistance more effectively and efficiently. The evidence suggests that the employment of SADD and gender and generational analyses assists in saving lives and livelihoods in a crisis.
Akın Avşaroğlu; Suphi URAL
2017-01-01
The purpose of this study is to reduce and analyse flow-accelerated corrosion in thermal plant heat recovery steam generators. All of these studies were performed at a new and a 16-year-old combined cycle power plant in Turkey. Corrosion cases were investigated on the basis of mechanical outage reports at the power plant during 2011-2015. The flow-accelerated corrosion study was based on a specific zone related to the economizer low-pressure connection pipings. It was issued a performanc...
Long Cui; Emily Hoi-Man Wong; Guo Cheng; Manoel Firmato de Almeida; Man-Ting So; Pak-Chung Sham; Stacey S Cherny; Paul Kwong-Hang Tam; Maria-Mercè Garcia-Barceló
2013-01-01
We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs)...
Energy Technology Data Exchange (ETDEWEB)
Umezawa, Osamu [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501 (Japan); Morita, Motoaki [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Tokyo University of Marine Science and Technology, Koto-ku, Tokyo 135-8533 (Japan); Yuasa, Takayuki [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Nippon Steel and Sumitomo Metal, Kashima, 314-0014 (Japan); Morooka, Satoshi [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Tokyo Metropolitan University, Hino, Tokyo 191-0065 (Japan); Ono, Yoshinori; Yuri, Tetsumi; Ogata, Toshio [National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, 305-0047 (Japan)
2014-01-27
Subsurface crack initiation in high-cycle fatigue has been detected as (0001) transgranular facets in titanium alloys at low temperature. The discussion of subsurface crack generation is reviewed, focusing on analyses by neutron diffraction and a full-constraints model under tension mode, as well as crystallographic identification of the facet. The accumulated tensile stress along <0001> may be responsible for initial microcracking on (0001) and for the crack opening.
Directory of Open Access Journals (Sweden)
Chung-Ho Su
2010-12-01
Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
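The cumulative-probability-distribution granulation can be sketched as an equal-mass quantile partition of the universe of discourse, with each observation mapped to a granule index. This is a simplified reading of the CPD approach; the paper's fuzzification details are not reproduced here:

```python
import numpy as np

def cpd_partition(values, n_intervals):
    """Partition the universe of discourse so each interval carries equal
    probability mass, using empirical quantiles (CPD-style granulation)."""
    qs = np.linspace(0.0, 1.0, n_intervals + 1)
    edges = np.quantile(values, qs)
    return list(zip(edges[:-1], edges[1:]))

def linguistic_label(x, intervals):
    """Map an observation to its granule index (a linguistic value)."""
    for i, (lo, hi) in enumerate(intervals):
        if lo <= x <= hi:
            return i
    # out-of-range observations fall into the nearest boundary granule
    return 0 if x < intervals[0][0] else len(intervals) - 1
```

The resulting label sequence is what a rough-set (or fuzzy time-series) rule learner would then consume in place of the raw prices.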
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-06-01
The upcoming deluge of genome data presents significant challenges for storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and supporting efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach.
Directory of Open Access Journals (Sweden)
Akın Avşaroğlu
2017-01-01
Full Text Available The purpose of this study is to reduce and analyse flow-accelerated corrosion in thermal plant heat recovery steam generators. All of these studies were performed at a new and a 16-year-old combined cycle power plant in Turkey. Corrosion cases were investigated on the basis of mechanical outage reports at the power plant during 2011-2015. The flow-accelerated corrosion study was based on a specific zone related to the economizer low-pressure connection pipings, and a performance report was issued. Results and lessons learned from these studies will be used in a preventive manner at all similar plants.
Analyses of an air conditioning system with entropy generation minimization and entransy theory
Yan-Qiu, Wu; Li, Cai; Hong-Juan, Wu
2016-06-01
In this paper, based on the generalized heat transfer law, an air conditioning system is analyzed with the entropy generation minimization and the entransy theory. Taking the coefficient of performance (denoted as COP) and the heat flow rate Q_out released into the room as the optimization objectives, we discuss the applicability of the entropy generation minimization and entransy theory to the optimizations. Five numerical cases are presented. Combining the numerical results and theoretical analyses, we conclude that the optimization applicabilities of the two theories are conditional. If Q_out is the optimization objective, a larger entransy increase rate always leads to larger Q_out, while a smaller entropy generation rate does not. If we take COP as the optimization objective, neither the entropy generation minimization nor the concept of entransy increase is always applicable. Furthermore, we find that the concept of entransy dissipation is not applicable for the discussed cases. Project supported by the Youth Programs of Chongqing Three Gorges University, China (Grant No. 13QN18).
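For reference, for a heat flow Q passing from temperature T_h to T_c, the two measures compared in this abstract take the following standard textbook forms (general entransy-theory definitions, not equations taken from this paper):

```latex
S_{\mathrm{gen}} \;=\; Q\left(\frac{1}{T_c}-\frac{1}{T_h}\right) \;\ge\; 0,
\qquad
\Delta G \;=\; Q\,(T_h - T_c) \;\ge\; 0,
```

where S_gen is the entropy generation rate and ΔG the entransy dissipation; both vanish only for reversible (zero-temperature-difference) heat transfer, which is why each can serve as an irreversibility measure, yet they weight the same temperature gap differently and so need not select the same optimum.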
Directory of Open Access Journals (Sweden)
Jessica Jeyanthi James Antony
2015-12-01
Full Text Available This study was conducted to detect the morphological, histological and molecular differences in the second generation of PVS2-cryopreserved Dendrobium Bobby Messina [DBM] (18-month-old culture) plantlets. Morphological analyses indicated similarities and differences between cryopreserved DBM plantlets and the control stock culture based on selected morphological criteria. Criteria such as root length, number of shoots per explant and shoot length displayed differences, while the other three criteria, leaf diameter, leaf length and PLB size, were similar in cryopreserved plantlets compared to the control stock culture plants. A higher amount of homogeneous cell population and denser cytoplasm were observed in cryopreserved PLBs compared to control stock culture PLBs based on histological analysis. This suggests that a somatic embryogenesis development mechanism takes place during the recovery and regeneration of the cryopreserved PLBs. However, RAPD analyses based on 10 primers indicated that cryopreserved DBM regenerated by the vitrification method generated a total of 20 to 39.9% polymorphic bands compared to the stock culture, indicating potential somaclonal variation. Hence, there is an increased percentage of polymorphic bands in cryopreserved plantlets 18 months post-cryopreservation, compared to the previous report of 10% polymorphic bands in cryopreserved DBM 3 months post-cryopreservation.
Directory of Open Access Journals (Sweden)
CHEOL HO PYEON
2013-02-01
Full Text Available Neutron spectrum analyses of spallation neutrons are conducted at the accelerator-driven system (ADS) facility of the Kyoto University Critical Assembly (KUCA). High-energy protons (100 MeV) obtained from the fixed-field alternating gradient accelerator are injected onto a tungsten target, whereby spallation neutrons are generated. To characterize the spallation neutrons, the reaction rates and the continuous energy distribution of the spallation neutrons are measured by the foil activation method and by an organic liquid scintillator, respectively. Numerical calculations are executed with MCNPX using the JENDL/HE-2007 and ENDF/B-VI libraries to evaluate the reaction rates of activation foils (bismuth and indium) set at the target and the continuous energy distribution of spallation neutrons in front of the target. For the reaction rates obtained by the foil activation method, the C/E values between the experiments and the calculations are found to lie around a relative difference of 10%, except for some reactions. For the continuous energy distribution obtained by the organic liquid scintillator, spallation neutrons are observed up to 45 MeV. From these results, the neutron spectrum information on the spallation neutrons generated at the target is attained successfully by injecting 100 MeV protons onto the tungsten target.
Cui, Long; Wong, Emily Hoi-Man; Cheng, Guo; Firmato de Almeida, Manoel; So, Man-Ting; Sham, Pak-Chung; Cherny, Stacey S; Tam, Paul Kwong-Hang; Garcia-Barceló, Maria-Mercè
2013-01-01
We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family.
Directory of Open Access Journals (Sweden)
Long Cui
Full Text Available We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family.
Cheng, Guo; Firmato de Almeida, Manoel; So, Man-Ting; Sham, Pak-Chung; Cherny, Stacey S.; Tam, Paul Kwong-Hang; Garcia-Barceló, Maria-Mercè
2013-01-01
We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5′-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family. PMID:23840513
Chen, Cao; Xiao, Di; Zhou, Wei; Zhang, Yong-Chan; Shi, Qi; Tian, Chan; Zhang, Jin; Zhou, Chun-Xi; Zhang, Jian-Zhong; Dong, Xiao-Ping
2012-01-01
The shotgun strategy applying tandem mass spectrometry has been widely used to identify proteins that are differentially distributed among diseases, for its high reliability and efficiency. To find out the potential differences in protein profiles in cerebrospinal fluid (CSF) between Creutzfeldt-Jakob disease (CJD) and non-CJD patients, especially in the fraction ranging from 1-10 kDa, the CSF samples of 40 probable sporadic CJD (sCJD) patients, 32 non-CJD cases with dementia and 17 non-CJD cases without dementia were separately pooled and enriched by magnetic-bead-based weak cation exchange chromatography (MB-WCX). After trypsin digestion, each enriched CSF pool was separated and identified by RP-HPLC-ESI-QTOF MS/MS. In total, 42, 53 and 47 protein signals were identified in the pooled CSF fractions below 10 kDa of probable sCJD, non-CJD with dementia and non-CJD without dementia, respectively. Compared with that of probable sCJD, the similarity of the CSF protein profile of non-CJD with dementia (76.2%) was higher than that of non-CJD without dementia (57.1%). Nine CSF proteins were found to be specific to the probable sCJD group. These data may help to select potential biomarkers for the diagnosis of CJD. Additionally, further studies of the small segments of cellular proteins in the CSF of CJD patients may also provide scientific clues for understanding the neuropathogenesis of TSEs.
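The similarity percentages quoted above are consistent with a simple overlap measure: the fraction of the 42 probable-sCJD protein signals also found in each non-CJD group (32/42 ≈ 76.2%, 24/42 ≈ 57.1%). A minimal sketch of such a profile comparison, using hypothetical protein identifiers in place of the real data:

```python
def profile_similarity(reference, other):
    """Fraction of the reference protein profile also present in another profile."""
    return len(reference & other) / len(reference)

# Hypothetical identifiers; the study identified 42, 53 and 47 signals per group.
scjd = {f"P{i}" for i in range(42)}                                        # probable sCJD: 42 signals
dementia = {f"P{i}" for i in range(32)} | {f"D{i}" for i in range(21)}     # 53 signals, 32 shared
no_dementia = {f"P{i}" for i in range(24)} | {f"N{i}" for i in range(23)}  # 47 signals, 24 shared

print(round(profile_similarity(scjd, dementia) * 100, 1))     # 76.2
print(round(profile_similarity(scjd, no_dementia) * 100, 1))  # 57.1
```

The arithmetic reproduces the reported percentages, but the mapping of "similarity" to this particular overlap fraction is an assumption for illustration.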
Leake, Stanley A.; Macy, Jamie P.; Truini, Margot
2016-06-01
reclamation operations within the Kayenta Mine permit boundary since 1973. The KMC part of the proposed project requires approval by the Office of Surface Mining (OSM) of a significant revision of the mine’s permit to operate in accordance with the Surface Mine Control and Reclamation Act (Public Law 95-87, 91 Stat. 445 [30 U.S.C. 1201 et seq.]). The revision will identify coal resource areas that may be used to continue extracting coal at the present rate of approximately 8.2 million tons per year. The Kayenta Mine Complex uses water pumped from the D and N aquifers beneath PWCC’s leasehold to support mining and reclamation activities. Prior to 2006, water from the PWCC well field also was used to transport coal by way of a coal-slurry pipeline to the now-closed Mohave Generating Station. Water usage at the leasehold was approximately 4,100 acre-feet per year (acre-ft/yr) during the period the pipeline was in use, and declined to an average 1,255 acre-ft/yr from 2006 to 2011. The Probable Hydrologic Consequences (PHC) section of the mining and reclamation permit must be modified to project the consequences of extended water use by the mine for the duration of the KMC part of the project, including a post-mining reclamation period. Since 1971, the U.S. Geological Survey (USGS) has conducted the Black Mesa Monitoring Program, which consists of monitoring water levels and water quality in the N aquifer, compiling information on water use by PWCC and tribal communities, maintaining several stream-gaging stations, measuring discharge at selected springs, conducting special studies, and reporting findings.
These data are useful in evaluating the effects on the N aquifer from PWCC and community pumping, and the effects of variable precipitation. The EIS will assess the impacts of continued pumping on the N aquifer, including changes in storage, water quality, and effects on spring and baseflow discharge, by proposed mining through 2044, and during the reclamation process to 2057.
Quantum, classical and semiclassical analyses of photon statistics in harmonic generation
Bajer, J; Bajer, Jiri; Miranowicz, Adam
2001-01-01
In this review, we compare different descriptions of photon-number statistics in harmonic generation processes within quantum, classical and semiclassical approaches. First, we study the exact quantum evolution of harmonic generation by applying numerical methods, including those of Hamiltonian diagonalization and global characteristics. We show explicitly that harmonic generation can indeed serve as a source of nonclassical light. Then, we demonstrate that quasi-stationary sub-Poissonian light can be generated in these quantum processes under conditions corresponding to the so-called no-energy-transfer regime known in classical nonlinear optics. By applying the method of classical trajectories, we demonstrate that the analytical predictions of the Fano factors are in good agreement with the quantum results. On comparing second and higher harmonic generations in the no-energy-transfer regime, we show that the highest noise reduction is achieved in third-harmonic generation with the Fano factor of the ...
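The noise measure used here, the Fano factor F = ⟨(Δn)²⟩/⟨n⟩, equals 1 for coherent (Poissonian) light, with F < 1 signalling sub-Poissonian, nonclassical statistics. A quick numerical illustration (not the paper's calculation) with simulated photocounts:

```python
import math
import random

def fano(counts):
    """Fano factor: variance of the photon number divided by its mean."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

def poisson(lam):
    """Poisson variate via Knuth's multiplication algorithm."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

random.seed(1)
# Coherent light has Poissonian photocounts, so its Fano factor is close to 1.
coherent = [poisson(10.0) for _ in range(20000)]
print(round(fano(coherent), 1))  # close to 1.0
```

A sub-Poissonian source would yield counts with variance below the mean, driving F below unity.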
Analyses of steam generator collector rupture for WWER-1000 using Relap5 code
Energy Technology Data Exchange (ETDEWEB)
Balabanov, E.; Ivanova, A. [Energoproekt, Sofia (Bulgaria)
1995-12-31
The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well-defined operator strategy for this type of accident, as well as to evaluate the possibility of diagnosing the accident and minimizing the radiological impact on the environment.
Energy Technology Data Exchange (ETDEWEB)
Kumaki, Masafumi, E-mail: masafumi.kumaki@riken.jp [Cooperative Major in Nuclear Energy, Waseda University, Shinjuku, Tokyo (Japan); RIKEN, Wako, Saitama (Japan); Ikeda, Shunsuke; Sekine, Megumi; Munemoto, Naoya [RIKEN, Wako, Saitama (Japan); Department of Energy Sciences, Tokyo Institute of Technology, Meguro, Tokyo (Japan); Fuwa, Yasuhiro [RIKEN, Wako, Saitama (Japan); Department of Physics and Astronomy, Kyoto University, Uji, Kyoto (Japan); Cinquegrani, David [American Nuclear Society, University of Michigan, Ann Arbor, Michigan 48109 (United States); Kanesue, Takeshi; Okamura, Masahiro [Collider-Accelerator Department, Brookhaven National Laboratory, Upton, New York 11973 (United States); Washio, Masakazu [Cooperative Major in Nuclear Energy, Waseda University, Shinjuku, Tokyo (Japan)
2014-02-15
At Brookhaven National Laboratory, a laser ion source has been developed to provide heavy ion beams using plasma generated by 1064 nm Nd:YAG laser irradiation of solid targets. The laser energy is transferred to the target material and creates a crater on the surface. However, only part of the material is turned into the plasma state; the rest is considered to be merely vaporized. Since heat propagation through the target material requires longer than the laser irradiation period, which is typically several ns, only a certain depth of the material may contribute to forming the plasma. As a result, the depth is more than 500 nm, because ions of the base material, Al, were detected. On the other hand, comparison of the different carbon thickness cases suggests that the surface carbon layer does not contribute to plasma generation.
Analysing generator matrices G of similar state but varying minimum determinants
Harun, H.; Razali, M. F.; Rahman, N. A. Abdul
2016-10-01
Since Tarokh discovered Space-Time Trellis Codes (STTC) in 1998, considerable effort has been made to improve the performance of the original STTC. One way of achieving enhancement is by focusing on the generator matrix G, which represents the encoder structure for an STTC. Until now, researchers have only concentrated on STTCs of different states in analyzing the performance of the generator matrix G. No effort has been made on different generator matrices G of the same state, the reason being that it is difficult to produce a wide variety of generator matrices G with diverse minimum determinants. In this paper a number of generator matrices G with minimum determinants of four (4), eight (8) and sixteen (16) of the same state (i.e., 4-PSK) have been successfully produced. The performance of the different generator matrices G in terms of bit error rate (BER) versus signal-to-noise ratio (SNR) in a Rayleigh fading environment is compared and evaluated. It is found from the MATLAB simulation that at low SNR (14) there is no significant difference between the BER of these generator matrices G.
Scale/Analytical Analyses of Freezing and Convective Melting with Internal Heat Generation
Energy Technology Data Exchange (ETDEWEB)
Ali S. Siahpush; John Crepeau; Piyush Sabharwall
2013-07-01
Using a scale/analytical analysis approach, we model phase change (melting) for pure materials with constant internal heat generation at small Stefan numbers (approximately one). The analysis considers conduction in the solid phase and natural convection, driven by internal heat generation, in the liquid regime. The model is applied for a constant surface temperature boundary condition where the melting temperature is greater than the surface temperature in a cylindrical geometry. The analysis also considers a constant heat flux boundary condition in a cylindrical geometry. We show the time scales in which conduction and convection heat transfer dominate.
Dynamical Simulation of Probabilities
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention is focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.
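As a toy illustration of the idea that deterministic chaos can stand in for a random number generator (this is not the paper's non-Lipschitz construction), the logistic map at full chaos, x ← 4x(1−x), has an invariant density symmetric about 1/2, so thresholding its iterates yields approximately fair binary outcomes:

```python
def chaotic_bits(x0, n):
    """Generate n pseudo-random bits from the fully chaotic logistic map."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)          # logistic map at r = 4 (chaotic regime)
        bits.append(1 if x > 0.5 else 0)  # symbolic dynamics: threshold at 1/2
    return bits

bits = chaotic_bits(0.1234, 100000)
freq = sum(bits) / len(bits)
print(round(freq, 2))  # close to 0.5
```

The empirical frequency of ones approaches 1/2 for a typical seed, mimicking a fair coin without any stochastic device.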
DEFF Research Database (Denmark)
Romanovsky, G.; Xydis, G.; Mutale, J.
2011-01-01
While there are presently different options for renewable and distributed generation (RES/DG) to participate in the UK electricity market, none of the market options is specifically tailored for such types of generation and in particular the smaller (up to 5 MW) RES/DG. This is because the UK has a number of specific historical, technical and economic reasons that significantly influenced the ability of the smaller size RES/DG to participate in the electricity market and in provision of balancing services in accordance with the UK National Grid requirements. This paper discusses some perspectives and approaches aiming to help stand-alone small size and clusters of RES and DG units to participate in the UK electricity market, drawing on relevant experience from Denmark.
Analyses of MYMIV-induced transcriptome in Vigna mungo as revealed by next generation sequencing.
Ganguli, Sayak; Dey, Avishek; Banik, Rahul; Kundu, Anirban; Pal, Amita
2016-03-01
Mungbean Yellow Mosaic Virus (MYMIV) is the viral pathogen that causes yellow mosaic disease in a number of legumes including Vigna mungo. VM84 is a recombinant inbred line resistant to MYMIV, developed in our laboratory through introgression of the resistance trait from V. mungo line VM-1. Here we present the quality-control-passed transcriptome data of mock-inoculated (control) and MYMIV-infected VM84, which have already been submitted to the Sequence Read Archive (SRX1032950, SRX1082731) of NCBI. QC reports of the FASTQ files were generated by the 'SeqQC V2.2' bioinformatics tool.
Analyses of MYMIV-induced transcriptome in Vigna mungo as revealed by next generation sequencing
Directory of Open Access Journals (Sweden)
Sayak Ganguli
2016-03-01
Full Text Available Mungbean Yellow Mosaic Virus (MYMIV) is the viral pathogen that causes yellow mosaic disease in a number of legumes including Vigna mungo. VM84 is a recombinant inbred line resistant to MYMIV, developed in our laboratory through introgression of the resistance trait from V. mungo line VM-1. Here we present the quality-control-passed transcriptome data of mock-inoculated (control) and MYMIV-infected VM84, which have already been submitted to the Sequence Read Archive (SRX1032950, SRX1082731) of NCBI. QC reports of the FASTQ files were generated by the ‘SeqQC V2.2’ bioinformatics tool.
Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro
2016-02-01
Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne...
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important to analyze renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
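Fourier descriptors of the kind used here as shape features can be obtained by treating a closed boundary as a sequence of complex numbers and taking magnitudes of its discrete Fourier transform; dropping the DC term and normalizing by the first harmonic makes them invariant to translation, rotation and scale. A minimal sketch on a hypothetical contour (not the study's kidney data):

```python
import cmath
import math

def fourier_descriptors(points, k=8):
    """Translation/rotation/scale-invariant Fourier descriptors of a closed contour.

    points: ordered (x, y) boundary samples.
    Returns |F_2..F_{k+1}| / |F_1|; the DC term F_0 is dropped for translation invariance.
    """
    z = [complex(x, y) for x, y in points]
    n = len(z)
    F = [sum(z[m] * cmath.exp(-2j * math.pi * u * m / n) for m in range(n))
         for u in range(n)]
    scale = abs(F[1])
    return [abs(F[u]) / scale for u in range(2, 2 + k)]

# Hypothetical boundary: 64 samples of an ellipse.
contour = [(3 * math.cos(2 * math.pi * t / 64), 2 * math.sin(2 * math.pi * t / 64))
           for t in range(64)]
# Rotating and shifting the shape leaves the descriptors (numerically) unchanged.
rot = [(x * math.cos(0.7) - y * math.sin(0.7) + 5.0,
        x * math.sin(0.7) + y * math.cos(0.7) - 1.0) for x, y in contour]
d1, d2 = fourier_descriptors(contour), fourier_descriptors(rot)
print(all(abs(a - b) < 1e-9 for a, b in zip(d1, d2)))  # True
```

In a recognition system such descriptor vectors would be fed to the SVM classifier; the O(n²) DFT here would normally be replaced by an FFT.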
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Thermodynamic analyses of a biomass-coal co-gasification power generation system.
Yan, Linbo; Yue, Guangxi; He, Boshu
2016-04-01
A novel chemical looping power generation system is presented based on biomass-coal co-gasification with steam. The effects of different key operation parameters, including biomass mass fraction (Rb), steam-to-carbon mole ratio (Rsc), gasification temperature (Tg) and iron-to-fuel mole ratio (Rif), on system performance metrics such as energy efficiency (ηe), total energy efficiency (ηte), exergy efficiency (ηex), total exergy efficiency (ηtex) and carbon capture rate (ηcc) are analyzed. A benchmark condition is set, under which ηte, ηtex and ηcc are found to be 39.9%, 37.6% and 96.0%, respectively. Furthermore, a detailed energy Sankey diagram and exergy Grassmann diagram are drawn for the entire system operating under the benchmark condition. The energy and exergy efficiencies of the units composing the system are also predicted.
Energy Technology Data Exchange (ETDEWEB)
Ganter, J.H.
1996-02-01
This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to these challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technology markets are providing new technologies like commercial off-the-shelf (COTS) software, common datasets, "open" GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.
How well do analyses capture dust-generating winds in the Sahara and Sahel?
Roberts, Alexander; Marsham, John; Knippertz, Peter; Parker, Douglas
2016-04-01
Airborne mineral dust is important for weather, climate and earth-system prediction. Uncertainties in winds, as well as in the land surface, are known to be key to model uncertainties for dust uplift. Recent research has shown that during the summer wet season in the Sahel, strong winds generated by the cold outflow from organized convective systems are an important dust storm mechanism (so-called haboobs), while over the inner Sahara nocturnal low-level jets forming on the pressure gradient around the heat low dominate. Together the Sahel and Sahara are the world's largest dust source. Until now there has been a severe shortage of data for evaluating models for this region. Here, we bring together new observations from the remote Sahara, made during the Fennec project, with Sahelian data from the African Monsoon Multidisciplinary Analysis (AMMA), to provide an unprecedented evaluation of dust-generating winds in the European Centre for Medium-Range Weather Forecasts ERA-Interim (ERA-I) reanalysis. Differences between observations and ERA-I are explored with specific attention to monsoon and non-monsoon influenced regions. The main results are: (1) high-speed winds are lacking in instantaneous ERA-I grid-box mean winds compared to time-averaged wind speed observations; (2) agreement between ERA-I and observations is lower during the monsoon season, even in parts of the Sahara not directly affected by the monsoon; and (3) both the seasonal and diurnal variability are under-represented in ERA-I. ERA-I fails to capture the summertime maximum for monsoon-affected stations and, seasonally, correlations between daily-mean ERA-I and observed winds vary from 0.8 to 0.4, with lower correlations for 3-hourly data. These differences demonstrate that the model used in the production of the ERA-I reanalysis is unable to represent some important dust uplift processes, especially during the monsoon season when moist convection plays a key role, and that the product is not sufficiently...
Burczyk, Jaroslaw; Koralewski, Tomasz E
2005-07-01
Assessment of contemporary pollen-mediated gene flow in plants is important for various aspects of plant population biology, genetic conservation and breeding. Here, through simulations, we compare the two alternative approaches for measuring pollen-mediated gene flow: (i) the NEIGHBORHOOD model, a representative of parentage analyses, and (ii) the recently developed TWOGENER analysis of pollen pool structure. We investigate their properties in estimating the effective number of pollen parents (N(ep)) and the mean pollen dispersal distance (delta). We demonstrate that both methods provide very congruent estimates of N(ep) and delta when the methods' assumptions concerning the shape of the pollen dispersal curve and the mating system follow those used in the data simulations, although the NEIGHBORHOOD model exhibits generally lower variances of the estimates. Violations of the assumptions, especially increased selfing or long-distance pollen dispersal, affect the two methods to different degrees; however, they are still capable of providing comparable estimates of N(ep). The NEIGHBORHOOD model inherently allows estimation of both self-fertilization and outcrossing due to long-distance pollen dispersal; the TWOGENER method, however, is particularly sensitive to inflated selfing levels, which in turn may confound and suppress the effects of distant pollen movement. As a solution, we demonstrate that in the case of TWOGENER it is possible to extract the fraction of intraclass correlation that results from outcrossing only, which seems to be very relevant for measuring pollen-mediated gene flow. The two approaches differ in estimation precision and experimental effort, but they seem to be complementary depending on the main research focus and the type of population studied.
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions:
CP1. µ(U |U) = 1 if U ∈ F′.
CP2. µ(V1 ∪ V2 |U) = µ(V1 |U) + µ(V2 |U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F.
CP3. µ(V |U) = µ(V |X) × µ(X |U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F.
Note that it follows from CP1 and CP2 that µ(· |U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
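On a finite space, the standard definition µ(V|U) = µ(V ∩ U)/µ(U) satisfies CP1-CP3, and this can be checked exhaustively. A small sanity check in exact rational arithmetic (illustrative; not from the paper):

```python
from fractions import Fraction
from itertools import chain, combinations

W = [0, 1, 2]
weight = {0: Fraction(1, 2), 1: Fraction(1, 3), 2: Fraction(1, 6)}  # a probability measure on W

def mu(A):
    return sum((weight[w] for w in A), Fraction(0))

def cond(V, U):
    """Standard conditional probability; defined only when mu(U) > 0, i.e. U in F'."""
    return mu(V & U) / mu(U)

subsets = [frozenset(s)
           for s in chain.from_iterable(combinations(W, r) for r in range(len(W) + 1))]
Fprime = [U for U in subsets if mu(U) > 0]  # conditioning events of positive measure

# CP1: mu(U|U) = 1 for U in F'
assert all(cond(U, U) == 1 for U in Fprime)
# CP2: finite additivity of mu(.|U) over disjoint V1, V2
assert all(cond(V1 | V2, U) == cond(V1, U) + cond(V2, U)
           for U in Fprime for V1 in subsets for V2 in subsets if not (V1 & V2))
# CP3: chain rule for V subset of X subset of U
assert all(cond(V, U) == cond(V, X) * cond(X, U)
           for U in Fprime for X in Fprime for V in subsets if V <= X <= U)
print("CP1-CP3 hold")
```

Using `Fraction` makes every comparison exact, so the check is not vulnerable to floating-point rounding.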
Engel, Erwan; Ratel, Jérémy
2007-06-22
The objective of the work was to assess the relevance, for the authentication of food, of a novel chemometric method developed to correct mass spectrometry (MS) data for instrumental drifts, namely the comprehensive combinatory standard correction (CCSC). Applied to gas chromatography (GC)-MS data, the method consists of analyzing a liquid sample with a mixture of n internal standards and using the best combination of standards to correct the MS signal provided by each compound. The paper focuses on the authentication of the type of feeding in farm animals based on the composition in volatile constituents of their adipose tissues. The first step of the work enabled, on the one hand, verification of the feasibility of converting the adipose tissue sample into the liquid phase required for the use of the CCSC method and, on the other hand, determination of the key parameters of the extraction of the volatile fraction from this liquid phase by dynamic headspace. The second step showed the relevance of CCSC pre-processing of the MS fingerprints generated by dynamic headspace-MS analysis of lamb tissues for the discrimination of animals fed exclusively with pasture (n=8) or concentrate (n=8). When compared with filtering of raw data, internal normalization and correction by a single standard, the CCSC method increased by 17.1-, 3.3- and 1.3-fold, respectively, the number of mass fragments that discriminated the type of feeding. The final step confirmed the advantage of CCSC pre-processing of dynamic headspace-gas chromatography-MS data for revealing molecular tracers of the type of feeding, whose number (n=72) was greater than the numbers obtained with raw data (n=42), internal normalization (n=63) and correction by a single standard (n=57). The relevance of the information gained by using the CCSC method is discussed.
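The core of the combinatory idea, scanning all combinations of the n internal standards and keeping the one that best stabilizes each signal, can be sketched as follows. This is an illustrative reimplementation with made-up numbers, not the authors' code; "best" is taken here as the combination minimizing the coefficient of variation of the corrected signal across replicate injections:

```python
from itertools import combinations
from statistics import mean, pstdev

def geometric_mean(values):
    prod = 1.0
    for v in values:
        prod *= v
    return prod ** (1.0 / len(values))

def best_standard_combo(analyte, standards):
    """Pick the internal-standard combination whose geometric-mean correction
    minimizes the coefficient of variation of the analyte signal over replicates."""
    names = list(standards)
    best = None
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            corrected = [a / geometric_mean([standards[s][i] for s in combo])
                         for i, a in enumerate(analyte)]
            cv = pstdev(corrected) / mean(corrected)
            if best is None or cv < best[1]:
                best = (combo, cv)
    return best

# Hypothetical replicate signals sharing an instrumental drift (1.0, 1.2, 0.8):
drift = [1.0, 1.2, 0.8]
analyte = [10.0 * d for d in drift]
standards = {"IS1": [5.0 * d for d in drift],  # follows the same drift as the analyte
             "IS2": [3.0, 3.0, 3.0]}           # does not
combo, cv = best_standard_combo(analyte, standards)
print(combo, cv)  # ('IS1',) 0.0
```

The standard sharing the analyte's drift is selected, and correcting by it removes the drift entirely in this idealized case.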
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground in 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage of a 380 V bus at an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations with monitoring of the 380 V bus were performed again. A study using stochastic Monte Carlo simulation (SMC) was performed to obtain, at each level of DG, the sag probability curves and the probability density by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis program (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
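The Monte Carlo step can be sketched as repeatedly sampling a random fault and classifying the resulting residual voltage at the monitored bus into sag classes. All parameters below (fault rate, voltage classes, the linear fault-position-to-voltage model) are invented for illustration and are not from the study:

```python
import random

CLASSES = [(0.0, 0.4), (0.4, 0.7), (0.7, 0.9)]  # residual-voltage classes, in p.u. (assumed)
FAULTS_PER_YEAR = 50                             # assumed system-wide fault rate

def residual_voltage(position):
    """Toy model: a fault electrically closer to the consumer gives a deeper sag."""
    return 0.2 + 0.75 * position  # position in [0, 1]; 1 = farthest line end

def simulate_sag_counts(years=10000):
    """Average number of sags per year in each residual-voltage class."""
    counts = [0.0] * len(CLASSES)
    for _ in range(years):
        for _ in range(FAULTS_PER_YEAR):
            v = residual_voltage(random.random())
            for k, (lo, hi) in enumerate(CLASSES):
                if lo <= v < hi:
                    counts[k] += 1
    return [c / years for c in counts]

random.seed(42)
per_class = simulate_sag_counts()
print([round(c, 1) for c in per_class])
```

With this linear model the residual voltage is uniform on [0.2, 0.95], so the expected annual counts per class are about 13.3, 20.0 and 13.3; the empirical averages converge to these as the number of simulated years grows.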
Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran
2016-09-01
This study compared the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts and then transferring the incisal edge outlines onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life-size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. The bite mark analyses using computer-assisted overlay generation were done by digitally matching an overlay to the corresponding bite mark images using Adobe Photoshop®. Another comparison method was superimposing the cast images on the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. During analysis, each precision-determining criterion was given a score in the range 0-3, with higher scores for better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital method is discernible despite the human skin being a poor recording medium for bite marks.
Nomura, Kouji; Nakaji-Hirabayashi, Tadashi; Gemmei-Ide, Makoto; Kitano, Hiromi; Noguchi, Hidenori; Uosaki, Kohei
2014-09-01
Surfaces of both a cover glass and the flat plane of a semi-cylindrical quartz prism were modified with a mixture of positively and negatively charged silane coupling reagents (3-aminopropyltriethoxysilane (APTES) and 3-(trihydroxysilyl)propylmethylphosphonate (THPMP), respectively). The glass surface modified with a self-assembled monolayer (SAM) prepared at a mixing ratio of APTES:THPMP=4:6 was electrically almost neutral and was resistant to non-specific adsorption of proteins, whereas fibroblasts gradually adhered to an amphoteric (mixed) SAM surface probably due to its stiffness, though the number of adhered cells was relatively small. Sum frequency generation (SFG) spectra indicated that total intensity of the OH stretching region (3000-3600 cm(-1)) for the amphoteric SAM-modified quartz immersed in liquid water was smaller than those for the positively and negatively charged SAM-modified quartz prisms and a bare quartz prism in contact with liquid water. These results suggested that water molecules at the interface of water and an amphoteric SAM-modified quartz prism are not strongly oriented in comparison with those at the interface of a lopsidedly charged SAM-modified quartz prism and bare quartz. The importance of charge neutralization for the anti-biofouling properties of solid materials was strongly suggested.
Directory of Open Access Journals (Sweden)
Douglas Blackiston
Full Text Available A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time), which is necessary for operant conditioning assays. The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and to aid other laboratories that do not have the facilities to undertake complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science.
Energy Technology Data Exchange (ETDEWEB)
Ramos, Dorel Soares; Negri, Jean Cesari; Kann, Zevi [Companhia Energetica de Sao Paulo, SP (Brazil); Pereira, Mario Veiga Ferraz [PSR Inc., Rio de Janeiro, RJ (Brazil)
1996-12-31
The paper describes the SAEGET model, developed to analyse thermal power plant generation expansion, and its integration with the set of models currently used for expansion planning in the Brazilian electrical sector. Additionally, some illustrative examples are presented and future developments of the model are proposed. (author) 3 refs., 2 figs., 2 tabs.
Directory of Open Access Journals (Sweden)
Hussein A. Kazem
2013-01-01
Full Text Available This paper presents a method for determining optimal sizes of the PV array, wind turbine, diesel generator, and storage battery installed in a building integrated system. The objective of the proposed optimization is to design a system that can supply the building load demand at minimum cost and maximum availability. The mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. Moreover, the results showed that the optimum sizing ratios (the daily energy generated by the source to the daily energy demand) for the PV array, wind turbine, diesel generator, and battery for a system located in Sohar, Oman, are 0.737, 0.46, 0.22, and 0.17, respectively. A case study represented by a system consisting of a 30 kWp PV array (36%), an 18 kWp wind farm (55%), and a 5 kVA diesel generator (9%) is presented. This system is supposed to power a 200 kWh/day load demand. It is found that the generated energy shares of the PV array, wind farm, and diesel generator are 36%, 55%, and 9%, respectively, while the cost of energy is 0.17 USD/kWh.
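As a minimal numeric illustration of the sizing-ratio definition used above (daily energy generated by a source divided by daily energy demand): the per-source daily energies below are assumed values chosen to match the reported 36%/55%/9% generation shares, not the published optimization results (0.737, 0.46, 0.22), which come from the full cost/availability optimization.

```python
# Daily load demand from the case study.
load_kwh_per_day = 200.0

# Assumed daily energy produced by each source, set from the reported shares.
shares = {"pv": 0.36, "wind": 0.55, "diesel": 0.09}
energy = {k: v * load_kwh_per_day for k, v in shares.items()}  # kWh/day

# Sizing ratio = daily energy generated by the source / daily energy demand.
sizing_ratio = {k: e / load_kwh_per_day for k, e in energy.items()}
```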
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long
2014-11-01
The authors have developed a method to automatically generate non-uniform CFD meshes for image-based human airway models. The sizes of the generated tetrahedral elements vary in both radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
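A wall-distance-based element-size function of the kind described could be sketched as below; the cosine blending and all parameter names are assumptions for illustration, not the authors' actual sizing function:

```python
import math

def element_size(wall_distance, size_wall, size_center, radius):
    """Toy radial grading: element size grows smoothly from its value at the
    airway wall to its value at the lumen centre with distance from the wall."""
    t = min(max(wall_distance / radius, 0.0), 1.0)  # normalised wall distance
    w = 0.5 * (1.0 - math.cos(math.pi * t))         # smooth cosine blend, 0 to 1
    return size_wall + w * (size_center - size_wall)
```

For example, with a fine wall size of 0.1 mm and a centre size of 1.0 mm in an airway of 5 mm radius, elements grade smoothly between the two values.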
Directory of Open Access Journals (Sweden)
Gian Paolo Beretta
2008-08-01
Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
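A toy discrete version of steepest entropy ascent, keeping only the normalization constraint, can be sketched as follows; the framework above also handles additional linear constraints, so this is a deliberate simplification with assumed step size and starting distribution:

```python
import math

def entropy(p):
    """Shannon entropy -sum p_i ln p_i of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p)

def sea_step(p, dt=1e-3):
    """One explicit step of entropy-gradient ascent on the probability simplex,
    projected so that sum(p) stays 1 (toy steepest entropy ascent)."""
    g = [-math.log(pi) - 1.0 for pi in p]  # gradient of the entropy
    lam = sum(g) / len(g)                  # projection enforcing sum(dp) = 0
    return [pi + dt * (gi - lam) for pi, gi in zip(p, g)]

p = [0.7, 0.2, 0.1]
for _ in range(2000):
    p = sea_step(p)
# p relaxes towards the uniform (maximum entropy) distribution.
```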
Gutschow, Christian; The ATLAS collaboration
2016-01-01
The Monte Carlo setups used by ATLAS to model boson+jets and multi-boson processes in 13 TeV pp collisions are described. Comparisons between data and several event generators are provided for key kinematic distributions at 7 TeV, 8 TeV and 13 TeV. Issues associated with sample normalisation and the evaluation of systematic uncertainties are also discussed.
Huri, Emre; Dogantekin, Engin; Hayran, Murvet; Malkan, Umit Yavuz; Ergun, Mine; Firat, Aysegul; Beyazit, Yavuz; Ustun, Huseyin; Kekilli, Murat; Dadali, Mumtaz; Astarci, Muzeyyen; Haznedaroglu, Ibrahim C
2016-01-01
Ankaferd Blood Stopper (ABS), a hemostatic agent of plant origin, has been registered for the prevention of clinical hemorrhages. Currently there are no data regarding the ultrastructural analysis of ABS at the tissue level. The aim of this study is to assess renal tissue effects via scanning electron microscopy (SEM) analyses for ABS and ABS nanohemostat (formed by the combination of self-assembling peptide amphiphile molecules and ABS). SEM experiments were performed with an FEI Nova NanoSEM 230, using the ETD detector in low vacuum mode with 30 keV beam energy. SEM analyses revealed that significant erythroid aggregation is present inside the capillary bed of the renal tissue. However, neither signs of necrosis nor any other sign of tissue damage is evident in the surrounding renal tissue supplied by the microcapillary vasculature. Our study is important for several reasons. Firstly, we used the recently developed ABS nanohemostat, and this study adds valuable information to the literature regarding it. Secondly, this study is the first ultrastructural analysis of ABS performed at the tissue level. Thirdly, we showed that ABS nanohemostat can induce vital erythroid aggregation at the renal tissue level as detected by SEM. Lastly, we found that ABS nanohemostat causes no harm, such as necrosis or other detrimental effects, to the tissues.
Hussein A. Kazem; Tamer Khatib
2013-01-01
This paper presents a method for determining optimal sizes of PV array, wind turbine, diesel generator, and storage battery installed in a building integrated system. The objective of the proposed optimization is to design the system that can supply a building load demand at minimum cost and maximum availability. The mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. Moreover, the r...
Energy Technology Data Exchange (ETDEWEB)
Monniaux, D.
2009-06-15
Software operating critical systems (aircraft, nuclear power plants) should not fail - whereas most computerised systems of daily life (personal computers, ticket vending machines, cell phones) fail from time to time. This is not a simple engineering problem: it has been known, since the works of Turing and Cook, that proving that programs work correctly is intrinsically hard. In order to solve this problem, one needs methods that are, at the same time, efficient (moderate costs in time and memory), safe (all possible failures should be found), and precise (few warnings about nonexistent failures). In order to reach a satisfactory compromise between these goals, one can draw on fields as diverse as formal logic, numerical analysis and 'classical' algorithmics. From 2002 to 2007 I participated in the development of the Astree static analyser. This suggested to me a number of side projects, both theoretical and practical (use of formal proof techniques, analysis of numerical filters...). More recently, I became interested in modular analysis of numerical properties and in the application of constraint solving techniques (semi-definite programming, SAT and SAT modulo theory) to program analysis. (author)
Directory of Open Access Journals (Sweden)
Doddy Kastanya
2017-02-01
Full Text Available In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) reactors utilize on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns can be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model, in which a simulated-annealing-based algorithm is used to find patterns that produce reasonable power distributions.
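A generic simulated-annealing loop of the kind such an approach might build on can be sketched as below; the toy cost function (channel "ages" whose mean should hit a target), the neighbor move, and the cooling schedule are illustrative assumptions, not the *INSTANTAN implementation:

```python
import math
import random

random.seed(1)

def anneal(cost, state, neighbor, t0=1.0, cooling=0.995, steps=2000):
    """Generic simulated annealing: accept worse moves with probability
    exp(-delta/T) and cool the temperature geometrically; track the best state."""
    best = current = state
    t = t0
    for _ in range(steps):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best

# Toy objective: a vector of channel "ages" in [0, 1] whose mean should be 0.5.
target = 0.5
cost = lambda ages: abs(sum(ages) / len(ages) - target)
neighbor = lambda ages: [min(1.0, max(0.0, a + random.uniform(-0.1, 0.1)))
                         for a in ages]

start = [random.random() for _ in range(20)]
result = anneal(cost, start, neighbor)
```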
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Tan, Hui Teng; Lee, Keat Teong; Mohamed, Abdul Rahman
2010-07-01
Recently, second-generation bio-ethanol (SGB), which utilizes readily available lignocellulosic biomass has received much interest as another potential source of liquid biofuel comparable to biodiesel. Thus the aim of this paper is to determine the exergy efficiency and to compare the effectiveness of SGB and palm methyl ester (PME) processes. It was found that the production of bio-ethanol is more thermodynamically sustainable than that of biodiesel as the net exergy value (NExV) of SGB is 10% higher than that of PME. Contrarily, the former has a net energy value (NEV) which is 9% lower than the latter. Despite this, SGB is still strongly recommended as a potential biofuel because SGB production can help mitigate several detrimental impacts on the environment.
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)
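A concrete instance of the probabilistic maps of Chapter 4, row-stochastic matrices composing by matrix product, can be sketched as follows (the matrices are arbitrary examples):

```python
def is_stochastic(m, tol=1e-9):
    """Row-stochastic check: nonnegative entries, each row summing to 1."""
    return all(all(x >= 0 for x in row) and abs(sum(row) - 1.0) < tol
               for row in m)

def compose(a, b):
    """Matrix product of two row-stochastic matrices; probabilistic maps
    compose, and the result is again row-stochastic."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][j] * b[j][l] for j in range(k)) for l in range(m)]
            for i in range(n)]

A = [[0.5, 0.5], [0.2, 0.8]]
B = [[0.9, 0.1], [0.3, 0.7]]
C = compose(A, B)  # the composite probabilistic map
```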
Directory of Open Access Journals (Sweden)
Rita Kéri
2016-12-01
Full Text Available The paper presents a field study that looked at teaching contexts as instances of joint knowledge construction. The study was part of a larger enterprise in the vein of grounded theory, exploring qualitative connections between communication dynamics and evolving cooperation patterns, aiming to provide feedback to theories on the overall relationship between communication and cooperation. This study also involved looking at the joint problem definition and planning in groups of adults with different sociocultural backgrounds. In the kinds of settings selected, participants are likely to start with diverging strategies and axioms used in articulating knowledge. Comparative analyses of formal and extracurricular teaching situations are presented in the paper, and their implications are explained in the conceptual framework of common ground, private experience, and public knowledge products. The focus is on the communicative context, the role that verbal contributions and interpersonal strategies play in jointly framing a problem: how different dimensions of communication complement or interfere with each other to serve the purposes of local and long-term coordination and knowledge production, and meanwhile shape the community. In the preliminary theoretical considerations governing the study, I aimed to develop a perspective that enables the exploration of the types of situations selected, and this has been refined to give meaningful analysis of such situations. I am presenting strategies that simultaneously shape cooperative potential and construct the means that enable joint action and limit its form, involving the creative mobilization of private worlds.
Gielen, Fabrice; Buryska, Tomas; Van Vliet, Liisa; Butz, Maren; Damborsky, Jiri; Prokop, Zbynek; Hollfelder, Florian
2015-01-06
Analysis of concentration dependencies is key to the quantitative understanding of biological and chemical systems. In experimental tests involving concentration gradients such as inhibitor library screening, the number of data points and the ratio between the stock volume and the volume required in each test determine the quality and efficiency of the information gained. Titerplate assays are currently the most widely used format, even though they require microlitre volumes. Compartmentalization of reactions in pico- to nanoliter water-in-oil droplets in microfluidic devices provides a solution for massive volume reduction. This work addresses the challenge of producing microfluidic-based concentration gradients in a way that every droplet represents one unique reagent combination. We present a simple microcapillary technique able to generate such series of monodisperse water-in-oil droplets (with a frequency of up to 10 Hz) from a sample presented in an open well (e.g., a titerplate). Time-dependent variation of the well content results in microdroplets that represent time capsules of the composition of the source well. By preserving the spatial encoding of the droplets in tubing, each reactor is assigned an accurate concentration value. We used this approach to record kinetic time courses of the haloalkane dehalogenase DbjA and analyzed 150 combinations of enzyme/substrate/inhibitor in less than 5 min, resulting in conclusive Michaelis-Menten and inhibition curves. Avoiding chips and merely requiring two pumps, a magnetic plate with a stirrer, tubing, and a pipet tip, this easy-to-use device rivals the output of much more expensive liquid handling systems using a fraction (∼100-fold less) of the reagents consumed in microwell format.
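The Michaelis-Menten relation fitted from such droplet time courses has the standard form v = Vmax[S]/(Km + [S]); a minimal sketch with illustrative parameter values (not the DbjA kinetics measured in the paper):

```python
def michaelis_menten(s, vmax, km):
    """Initial reaction rate for substrate concentration s:
    v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)
```

At [S] = Km the rate is exactly Vmax/2, which is the usual operational definition of Km.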
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
DEFF Research Database (Denmark)
Schneider, Jesper Wiborg; Larsen, Birger; Ingwersen, Peter
2009-01-01
XML documents extracted from the IEEE collection. These data allow the construction of ad-hoc citation indexes, which enables us to carry out the hitherto largest all-author co-citation study. Four ACA are made, combining the different units of analyses with the different matrix generation approaches...... into groupings. Finally, the study also demonstrates the importance of sparse matrices and their potential problems in connection with factor analysis. Conclusion: We can confirm that inclusive all-ACA produce more coherent groupings of authors, whereas the present study cannot clearly confirm previous findings...
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
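The simplest of the scoring rules mentioned above is the Brier score; a minimal sketch for binary events (this is the classical rule, not the loss-function/martingale approach the paper itself proposes):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and binary
    outcomes (1 = event occurred, 0 = it did not); 0 is a perfect score."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
```

A perfect forecaster scores 0; always hedging at 0.5 scores 0.25 regardless of outcomes.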
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
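The stability of the sample mean under the law of large numbers, one of the topics listed above, can be illustrated directly; the uniform distribution and the seed are arbitrary choices:

```python
import random

random.seed(0)

def sample_mean(n):
    """Mean of n uniform(0, 1) draws; by the law of large numbers it
    concentrates around the expected value 1/2 as n grows."""
    return sum(random.random() for _ in range(n)) / n

small = sample_mean(100)      # still noticeably noisy
large = sample_mean(100_000)  # very close to 0.5
```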
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
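Independence on a finite space with uniform probability can be checked directly from the model's definition P(A ∩ B) = P(A)P(B); a textbook illustration with two fair coin tosses (not the paper's specific construction of dependent spaces):

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair coin tosses, with uniform probability.
omega = set(product("HT", repeat=2))

def P(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == "H"}  # first toss is heads
B = {w for w in omega if w[1] == "H"}  # second toss is heads

independent = P(A & B) == P(A) * P(B)  # holds for this pair of events
```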
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Chambers, David W
2005-01-01
Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession.
Tokinaga, Shozo; Ikeda, Yoshikazu
In investments, it is not easy to identify traders' behavior from stock prices, and agent systems may help us. This paper deals with discriminant analyses of stock prices using the multifractality of time series generated via multi-agent systems and interpolation based on Wavelet Transforms. We assume five types of agents, where some agents prefer forecast equations or production rules. It is then shown that the time series of the artificial stock price is a multifractal time series whose features are defined by the Hausdorff dimension D(h). As a result, we see the relationship between the reliability (reproducibility) of multifractality and D(h) given a sufficient number of time series data points. However, since we generally need sufficient samples to estimate D(h), we use interpolations of multifractal time series based on the Wavelet Transform.
Müller, Wolfgang; Kelley, Simon; Villa, Igor
2002-07-01
Three different geochronological techniques (stepwise-heating, laser-ablation 40Ar/39Ar, Rb-Sr microsampling) have been evaluated for dating fault-generated pseudotachylytes sampled along the Periadriatic Fault System (PAF) of the Alps. Because pseudotachylytes are whole-rock systems composed of melt, clast and alteration phases, chemical control from both Ar isotopes (Cl/K, Ca/K ratios) and EMPA analyses is crucial for their discrimination. When applied to stepwise-heating 40Ar/39Ar analyses, this approach yields accurate melt-related ages, even for complex age spectra. The spatial resolution of laser-ablation 40Ar/39Ar analyses is capable of contrasting melt, clast and alteration phases in situ, provided the clasts are not too fine grained, the latter of which results in integrated "mixed" ages without geological information. Elevated Cl/K and Ca/K ratios were found to be an invaluable indicator for the presence of clast admixture or inherited 40Ar. Due to incomplete isotopic resetting during frictional melting, Rb-Sr microsampling dating did not furnish geologically meaningful ages. On the basis of isotopic disequilibria among pseudotachylyte matrix phases, and independent Rb-Sr microsampling dating of cogenetic (ultra)mylonites, the concordant 40Ar/39Ar pseudotachylyte ages are interpreted as formation ages. The investigated pseudotachylytes altogether reveal a Cretaceous to Miocene history for the entire PAF, consistent with independent geological evidence. Individual faults, however, consistently reveal narrower intervals of enhanced activity lasting a few million years. Electronic supplementary material to this paper can be obtained by using the Springer LINK server at http://dx.doi.org/10.1008/s00410-002-0381-6
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Probability-Based Method of Generating the Conflict Trajectories for ATC System
Institute of Scientific and Technical Information of China (English)
苏志刚; 眭聪聪; 吴仁彪
2011-01-01
Two methods are commonly used to test the short-term conflict-alerting capability of an air traffic control (ATC) system. The first is to raise the alert threshold and use real data to check whether the system alerts when the distance between two flights falls below that threshold; this method is not reliable. The second is to simulate flights that will conflict, compute their trajectories, and feed these data to the ATC system to observe its reaction; this is usually too simple to test whether the system can effectively pre-detect a conflict. To address these problems, this paper uses a probabilistic approach to simulate aircraft with a given conflict probability. First, we derive the conflict probability of turning flights from Prandaini's method of conflict-probability estimation for linear flight. Then, by reverse derivation, we obtain the motion parameters of two targets whose conflict probability is preset. Finally, we simulate this pair of targets' tracks and analyse their conflict probability. The simulation results show that the targets' conflict probability is in line with the assumption. The trajectories generated by this algorithm are more realistic, so a more effective assessment of an ATC system's capability of short-term conflict alerting and pre-detection can be provided.
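As a rough illustration of the abstract's idea of checking a pair of simulated targets against a prescribed conflict probability, the sketch below estimates the conflict probability of two straight-line tracks by Monte Carlo. The geometry, units and noise model are invented for the example; the paper's actual derivation (turning flights, reverse derivation of the motion parameters) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def conflict_probability(p1, v1, p2, v2, sigma, threshold,
                         t_max=300.0, n_steps=60, n_samples=20_000):
    """Monte Carlo estimate of the probability that two aircraft on
    straight nominal tracks, each with independent Gaussian position
    error of std `sigma` at every sampled instant, come closer than
    `threshold` at some time in [0, t_max]."""
    t = np.linspace(0.0, t_max, n_steps)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    rel = (np.asarray(p2, float) - np.asarray(p1, float)) + np.outer(t, dv)
    # combined noise of two independent position errors: std sigma*sqrt(2)
    noise = rng.normal(0.0, sigma * np.sqrt(2.0), size=(n_samples, n_steps, 2))
    dist = np.linalg.norm(rel[None, :, :] + noise, axis=2)
    return float(np.mean(dist.min(axis=1) < threshold))

# crossing tracks whose nominal miss distance is close to the 5 NM
# alert threshold (positions in NM, speeds in NM/s -- invented numbers)
p = conflict_probability((0.0, 0.0), (0.12, 0.0),
                         (20.0, 12.0), (0.0, -0.04),
                         sigma=0.5, threshold=5.0)
print(p)
```

Because the nominal miss distance is deliberately placed near the threshold, the estimate lands strictly between 0 and 1, which is the regime such a test generator needs to exercise.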
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
Real Analysis and Probability
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
On Quantum Conditional Probability
Directory of Open Access Journals (Sweden)
Isabel Guerra Bobo
2013-02-01
We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Cluster pre-existence probability
Energy Technology Data Exchange (ETDEWEB)
Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)
2011-10-15
The pre-existence probability of the fragments for the complete binary spectrum of different systems such as {sup 56}Ni, {sup 116}Ba, {sup 226}Ra and {sup 256}Fm is calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, the pre-existence probability is calculated even for negative Q-value systems. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
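The claim that man-made random number generators can be replaced by chaotic dynamics can be illustrated with the simplest chaotic system, the logistic map at full chaos. This is only a toy stand-in: the paper combines chaos with non-Lipschitz dynamics, which is not attempted here.

```python
def chaotic_bits(n, x0=0.3):
    """Derive 0/1 symbols from the fully chaotic logistic map
    x -> 4x(1-x), thresholding at 1/2; the invariant density is
    symmetric about 1/2, so the symbols come out unbiased."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = chaotic_bits(100_000)
print(sum(bits) / len(bits))
```

The empirical frequency of 1s comes out close to 1/2 even though the generator is fully deterministic.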
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Directory of Open Access Journals (Sweden)
VICTORIA EUGENIA VALLEJO
The objective of this study was to evaluate the performance of two tetrazolium indicators, a traditional one (INT) and a new-generation one (XTT), for estimating the density of hydrocarbon (HC) degrading microorganisms in soils using the Most Probable Number (MPN) technique. Ninety-six composite soil samples from the Colombian coffee-growing ecoregion (Ecorregión Cafetera) were analyzed. Degrading microorganisms were recovered on minimal salt medium under an HC-saturated atmosphere, and their HC-degrading capacity was confirmed by successive subcultures in the same medium using diesel as the only carbon source. Counts obtained with the two salts were not significantly different (Student's t test, p < 0.05), but XTT allowed easier visualization of positive wells owing to the solubility of the reduced product. A greater percentage of isolates was obtained using XTT (67%), which suggests that the type of salt is relevant for the recovery of these microorganisms. Additionally, the cell detection limit and the optimal XTT concentration and incubation times for detecting activity were evaluated in microplate format using Acinetobacter sp. An inhibitory effect was observed in the recovery of cultivable cells when XTT
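The Most Probable Number technique used above has a standard maximum-likelihood formulation: the density that makes the observed pattern of positive tubes most likely under Poisson-distributed inocula. A minimal sketch, with a made-up dilution series rather than the study's data:

```python
import math

def mpn(volumes, tubes, positives):
    """Maximum-likelihood Most Probable Number estimate of organism
    density from a dilution series.
    volumes   -- sample amount inoculated per tube at each dilution
    tubes     -- number of tubes (or wells) at each dilution
    positives -- number of positive tubes at each dilution
    Assumes at least one positive and at least one negative tube."""
    def score(lam):
        # derivative of the log-likelihood; strictly decreasing in lam
        return sum(g * v / (1.0 - math.exp(-lam * v)) - n * v
                   for v, n, g in zip(volumes, tubes, positives))
    lo, hi = 1e-9, 1e9
    for _ in range(200):                 # bisection on the score equation
        mid = (lo + hi) / 2.0
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# 3 tubes per dilution at 0.1, 0.01, 0.001 g of soil; 3, 1, 0 positive
# (classical MPN tables give about 43 per g for this 3-1-0 pattern)
print(round(mpn([0.1, 0.01, 0.001], [3, 3, 3], [3, 1, 0]), 1))
```

Bisection suffices here because the score function decreases monotonically from +∞ to the (negative) difference between positive-weighted and total inoculated volume.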
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
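The intermittent updating described in (a) and the retrospective revision in (g) can be caricatured in a few lines: keep a Beta posterior for the Bernoulli parameter and restart it when a recent window of outcomes is inconsistent with the current estimate. This is a toy heuristic for illustration, not the authors' change-point encoding model; the window length and threshold are arbitrary.

```python
import random

def track_probability(outcomes, window=20, threshold=4.0):
    """Toy stepwise estimator of a drifting Bernoulli parameter:
    keep a Beta(a, b) posterior; when the mean of the last `window`
    outcomes sits more than `threshold` standard errors from the
    current estimate, restart the posterior from that window."""
    a, b = 1.0, 1.0
    estimates, recent = [], []
    for x in outcomes:
        a, b = a + x, b + (1 - x)
        recent.append(x)
        if len(recent) > window:
            recent.pop(0)
            p_hat = a / (a + b)
            se = (p_hat * (1.0 - p_hat) / window) ** 0.5
            if abs(sum(recent) / window - p_hat) / se > threshold:
                a = 1.0 + sum(recent)            # restart from recent data
                b = 1.0 + window - sum(recent)
        estimates.append(a / (a + b))
    return estimates

random.seed(1)
data = [int(random.random() < 0.2) for _ in range(300)] + \
       [int(random.random() < 0.8) for _ in range(300)]
est = track_probability(data)
print(round(est[250], 2), round(est[-1], 2))
```

The estimate stays flat through the stationary stretch and then steps to the new level shortly after the hidden parameter jumps, mimicking the stepwise percepts the abstract describes.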
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Energy Technology Data Exchange (ETDEWEB)
Grunewald, Thomas; Finke, Robert; Graetz, Rainer
2010-07-01
Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks with a material combination of unalloyed steel/unalloyed steel and a kinetic impact energy between 3 and 277 Nm could be determined within acceptable statistical tolerance. In addition, the explosiveness of non-oxidized particles at increased temperatures in over-stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. However, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable particle characteristics that correlates with a distinct increase of the ignition probability.
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of {\it environment-assisted invariance} or {\it envariance}, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective ``state of knowledge'' of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the {\it standard definition} based on the `principle of indifference' due to Laplace, and the {\it relative frequency approach} advocated by von Mises. Implications of envariance for the interpretation of quantu...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Energy Technology Data Exchange (ETDEWEB)
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
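A minimal sketch of the kind of estimator discussed: the empirical survival function inside the data range, and an extrapolation beyond it based on the shape of the tail. Here the extrapolation is a simple exponential fit to the largest order statistics, a stand-in for the exponentially weighted estimators the paper compares, with an arbitrary choice of threshold fraction.

```python
import math
import random

def tail_probability(sample, x, top_frac=0.1):
    """Estimate P(X > x): empirical survival function inside the data
    range; beyond the largest observation, extrapolate from an
    exponential fit to the excesses over a high threshold."""
    data = sorted(sample)
    n = len(data)
    if x <= data[-1]:
        return sum(1 for v in data if v > x) / n
    k = max(2, int(top_frac * n))        # number of upper order statistics
    u = data[-k]                         # high threshold
    mean_excess = sum(v - u for v in data[-k:]) / k
    return (k / n) * math.exp(-(x - u) / mean_excess)

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(5000)]
# inside the data range vs. far beyond the largest observation
print(tail_probability(sample, 3.0), tail_probability(sample, 15.0))
```

For exponential data the mean excess over a high threshold estimates the true tail scale, so the extrapolated value tracks the true tail probability well beyond the sample maximum.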
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
Analyses of Generation and Release of Tritium in Nuclear Power Plant
Institute of Scientific and Technical Information of China (English)
黎辉; 梅其良; 付亚茹
2015-01-01
Tritium research, including tritium generation in the reactor core and in the primary coolant, release pathways, tritium chemical forms and release amounts, is a very important part of the environmental assessment of a nuclear power plant. Based on international operating practice, the primary coolant system, auxiliary systems, radwaste system and ventilation system were analysed, and the tritium release pathways and chemical forms were investigated. The results indicate that the theoretical calculation results agree very well with nuclear power plant operating data. The tritium contained in the primary coolant is mainly produced by ternary fission, boron activation in the burnable poison rods, and activation of boron, lithium and deuterium as they pass through the core. The tritium released to the environment is mainly in the form of tritiated water, and the split between liquid and gaseous releases depends mainly on the leakage rate from the primary coolant to the reactor building and auxiliary building.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Superpositions of probability distributions.
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
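The heavy tails produced by superposing Gaussians of random variance can be checked numerically. In this sketch the smearing distribution for v is inverse-gamma, which makes the superposition a (scaled) Student-t; the paper's question of which smearing laws preserve the semigroup property is not addressed, and the parameters are arbitrary.

```python
import random

random.seed(42)

def smeared_gaussian(n, shape=1.5, scale=1.0):
    """Superpose Gaussians of random variance: draw the variance v
    from an inverse-gamma smearing distribution, then x ~ N(0, v)."""
    xs = []
    for _ in range(n):
        v = scale / random.gammavariate(shape, 1.0)   # inverse-gamma draw
        xs.append(random.gauss(0.0, v ** 0.5))
    return xs

plain = [random.gauss(0.0, 1.0) for _ in range(20_000)]
mixed = smeared_gaussian(20_000)
# the superposition puts far more mass beyond |x| = 4 than one Gaussian
print(sum(abs(x) > 4.0 for x in plain), sum(abs(x) > 4.0 for x in mixed))
```

A single unit Gaussian almost never exceeds 4 in absolute value, while the variance mixture does so routinely, which is the memory-like fattening of tails the abstract alludes to.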
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Empirical and Computational Tsunami Probability
Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.
2008-12-01
sources of epistemic uncertainty in the computational analysis are the overall rate of occurrence and the inter-event distribution for landslide sources. From both empirical and computational analyses, tsunami probability as a function of runup (i.e., the tsunami hazard curve) in seismically active ocean basins such as the Pacific can be described by a modified power law, which is similar to size distributions for other natural hazards (earthquakes, landslides, etc.). At present, it is unclear whether the tsunami hazard curve takes the same form in ocean basins that have a much lower rate of tsunami occurrence (e.g., the Atlantic).
DEFF Research Database (Denmark)
Boolsen, Merete Watt
The book explains the fundamental steps of the research process and applies them to selected qualitative analyses: content analysis, Grounded Theory, argumentation analysis, and discourse analysis.
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, given the same data for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of a ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, for a given probability of false alarm and the same parameter-estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system implementing subspace-based detectors will be more effective than one implementing set-based detectors, effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.
Whiting, Alan B
2014-01-01
Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.
Data Interpretation: Using Probability
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Learning unbelievable marginal probabilities
Pitkow, Xaq; Miller, Ken D
2011-01-01
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al., Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1%) of the members of the Sun's birth cluster could still be found within 100 pc of the Sun today.
Probabilities for Solar Siblings
Valtonen, M; Bobylev, V V; Myllari, A
2015-01-01
We have shown previously (Bobylev et al. 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1%) of the members of the Sun's birth cluster could still be found within 100 pc of the Sun today.
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
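A minimal simulation of the "probability theory plus noise" idea (the parameter names and values here are ours, not the authors' stimuli): each remembered outcome is misread with probability d, so a judged probability has mean (1 - 2d)p + d, and the addition-law combination P(A) + P(B) - P(A and B) - P(A or B) cancels the noise term exactly.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_estimate(p, d, n_events=100, n_trials=20_000):
    """Judged probability: a sample frequency in which each remembered
    outcome is flipped (misread) with probability d."""
    events = rng.random((n_trials, n_events)) < p   # true outcomes
    flips = rng.random((n_trials, n_events)) < d    # random misreadings
    return (events ^ flips).mean(axis=1)            # one judgment per trial

d = 0.1
pA, pB, pAB = 0.5, 0.4, 0.2
pAorB = pA + pB - pAB                                # 0.7 by the addition law

terms = [noisy_estimate(p, d).mean() for p in (pA, pB, pAB, pAorB)]
# Each term is biased toward (1 - 2d)p + d, but the combination below has
# expectation zero: the noise cancels, as the model predicts for such
# identities.
identity = terms[0] + terms[1] - terms[2] - terms[3]
```

Identities whose form does not cancel the (1 - 2d)p + d bias would, by the same arithmetic, deviate systematically from probability theory.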
Savage's Concept of Probability
Institute of Scientific and Technical Information of China (English)
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
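For contrast, a one-line numeric instance of the standard Bayes' rule that the paper generalizes away from (the screening numbers below are invented for illustration):

```python
# P(D | +) = P(+ | D) P(D) / P(+), with P(+) from the law of total probability.
p_d = 0.01      # prior probability of disease
p_pos_d = 0.95  # P(positive test | disease), i.e. sensitivity
p_pos_h = 0.05  # P(positive test | healthy), i.e. false-positive rate

p_pos = p_pos_d * p_d + p_pos_h * (1 - p_d)   # total probability of a positive
posterior = p_pos_d * p_d / p_pos             # ≈ 0.161
```

Any alternative inference rule in the continuous family the paper describes would have to reproduce this same posterior from the same inputs.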
Interpretation of Plateau in High-Harmonic Generation
Institute of Scientific and Technical Information of China (English)
程太旺; 李晓峰; 敖淑艳; 傅盘铭
2003-01-01
The plateau in high-harmonic generation is investigated in the frequency domain. The probability density of an electron in an electromagnetic field is obtained by analysing the quantized-field Volkov state. The plateau of high-harmonic generation reflects the spectral density of the electron at the location of the nucleus after above-threshold ionization.
Energy Technology Data Exchange (ETDEWEB)
A. Alsaed
2004-11-18
"The Disposal Criticality Analysis Methodology Topical Report" prescribes an approach to the methodology for performing postclosure criticality analyses within the monitored geologic repository at Yucca Mountain, Nevada. An essential component of the methodology is the "Configuration Generator Model for In-Package Criticality", which provides a tool to evaluate the probabilities of degraded configurations achieving a critical state. The configuration generator model is a risk-informed, performance-based process for evaluating the criticality potential of degraded configurations in the monitored geologic repository. The method uses event tree methods to define configuration classes derived from criticality scenarios and to identify configuration class characteristics (parameters, ranges, etc.). The probabilities of achieving the various configuration classes are derived in part from probability density functions for degradation parameters. The NRC has issued "Safety Evaluation Report for Disposal Criticality Analysis Methodology Topical Report, Revision 0". That report contained 28 open items that required resolution through additional documentation. Of the 28 open items, numbers 5, 6, 9, 10, 18, and 19 were concerned with a previously proposed software approach to the configuration generator methodology and, in particular, the k_eff regression analysis associated with the methodology. However, the use of a k_eff regression analysis is not part of the current configuration generator methodology and, thus, the referenced open items are no longer considered applicable and will not be further addressed.
RANDOM VARIABLE WITH FUZZY PROBABILITY
Institute of Scientific and Technical Information of China (English)
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and the fuzzy resolution theorem, the feasibility condition for a fuzzy-number probability set was given; going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, was given and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP are closed under the fuzzy probability operation; as a result, the foundation is laid for perfecting the fuzzy probability operation method.
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-04-01
In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
Probability landscapes for integrative genomics
Directory of Open Access Journals (Sweden)
Benecke Arndt
2008-05-01
Background: The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute-force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches, as well as the prediction tools, are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decades-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results: We introduce here a concept for such a framework, based entirely on systematic annotation, in terms of probability profiles, of genomic sequence using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion: Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and for the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Kandler, Anne; Shennan, Stephen
2015-12-06
Cultural change can be quantified by temporal changes in frequency of different cultural artefacts and it is a central question to identify what underlying cultural transmission processes could have caused the observed frequency changes. Observed changes, however, often describe the dynamics in samples of the population of artefacts, whereas transmission processes act on the whole population. Here we develop a modelling framework aimed at addressing this inference problem. To do so, we firstly generate population structures from which the observed sample could have been drawn randomly and then determine theoretical samples at a later time t2 produced under the assumption that changes in frequencies are caused by a specific transmission process. Thereby we also account for the potential effect of time-averaging processes in the generation of the observed sample. Subsequent statistical comparisons (e.g. using Bayesian inference) of the theoretical and observed samples at t2 can establish which processes could have produced the observed frequency data. In this way, we infer underlying transmission processes directly from available data without any equilibrium assumption. We apply this framework to a dataset describing pottery from settlements of some of the first farmers in Europe (the LBK culture) and conclude that the observed frequency dynamic of different types of decorated pottery is consistent with age-dependent selection, a preference for 'young' pottery types which is potentially indicative of fashion trends.
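The generate-evolve-resample loop described above can be caricatured as follows (a toy with unbiased copying as the candidate transmission process; all population sizes and counts are ours, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1) Posit a population of artefact types consistent with an observed sample
#    (types 0, 1, 2 with counts 500, 300, 200).
pop = np.repeat([0, 1, 2], [500, 300, 200])

# 2) Evolve it under a candidate transmission process. Unbiased copying:
#    each artefact in the next generation copies a random current one.
for _ in range(50):
    pop = rng.choice(pop, size=pop.size, replace=True)

# 3) Draw a theoretical sample at t2; its type frequencies would then be
#    compared statistically (e.g. via Bayesian inference) with the observed
#    sample at t2 to assess whether the process could have produced the data.
sample_t2 = rng.choice(pop, size=100, replace=False)
freqs = np.bincount(sample_t2, minlength=3) / sample_t2.size
```

Age-dependent selection, the process the authors favour for the LBK pottery data, would replace step 2 with copying probabilities that depend on a type's age.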
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Scenario Probability-based Siting and Sizing of Wind Turbine Generators
Institute of Scientific and Technical Information of China (English)
刘苏云; 王笛; 蒋丹; 周竞; 史静; 丁晓群
2014-01-01
Based on the characteristics of wind velocity, this paper proposes scenario probability-based wind turbine siting and sizing. The objective function is established with the aim of minimizing the total investment cost of the wind turbine generators and the annual cost of energy losses, while taking various constraints into account. The applicability of the planning scheme in a random environment is assessed from the point of view of scenario probability, and an improved particle swarm optimization (PSO) algorithm is used to solve the problem. Finally, the IEEE 33-node system is taken as an example to verify the effectiveness and feasibility of the model and the algorithm.
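The scenario-probability evaluation reduces to weighting each wind scenario's cost by its probability of occurrence; a stylized sketch of the objective for one candidate plan (scenario numbers invented for illustration):

```python
# Each scenario: (probability, annual energy-loss cost, annualized investment
# cost) for one candidate siting/sizing plan, in arbitrary monetary units.
scenarios = [
    (0.2, 120.0, 500.0),   # low-wind scenario
    (0.5, 80.0, 500.0),    # typical-wind scenario
    (0.3, 150.0, 500.0),   # turbulent scenario
]

# Objective to be minimized (e.g. by PSO over candidate plans): the expected
# total of investment and energy-loss costs over the scenario set.
expected_cost = sum(p * (loss + invest) for p, loss, invest in scenarios)
```

An optimizer such as PSO would evaluate this expectation for each candidate siting/sizing plan and keep the cheapest feasible one.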
Prospect evaluation as a function of numeracy and probability denominator.
Millroth, Philip; Juslin, Peter
2015-05-01
This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
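The three probability formats compared in the study can be made concrete (the prize and probability below are illustrative, not the authors' stimuli); they denote the same number, so the normative expected value is identical in each format:

```python
prize = 200.0
formats = {
    "direct ratio": 0.15,
    "relative frequency / 100": 15 / 100,
    "relative frequency / 10,000": 1_500 / 10_000,
}

# Normatively, value x probability gives the same expected value no matter
# how the probability is written; the study's point is that participants'
# judged prices nevertheless differ by format and numeracy.
evs = {label: prize * p for label, p in formats.items()}
```

Denominator neglect would show up here as pricing the "1,500 in 10,000" prospect differently from the "0.15" one despite the equal expected values.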
Directory of Open Access Journals (Sweden)
Alexandre G. de Brevern
2015-01-01
Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.
Directory of Open Access Journals (Sweden)
Guo Li
BACKGROUND: Rapidly growing evidence suggests that microRNAs (miRNAs) are involved in a wide range of cancer malignant behaviours, including radioresistance. Therefore, the present study was designed to investigate miRNA expression patterns associated with radioresistance in NPC. METHODS: The differential expression profiles of miRNAs and mRNAs associated with NPC radioresistance were constructed. The predicted target mRNAs of the miRNAs and their enriched signaling pathways were analyzed via bioinformatic algorithms. Finally, selected miRNAs and pathway-correlated target mRNAs were validated in two NPC radioresistant cell models. RESULTS: 50 known and 9 novel miRNAs with significant differences were identified, and their target mRNAs were narrowed down to 53 nasopharyngeal-/NPC-specific mRNAs. Subsequent KEGG analyses demonstrated that the 53 mRNAs were enriched in 37 signaling pathways. Further qRT-PCR assays confirmed 3 down-regulated miRNAs (miR-324-3p, miR-93-3p and miR-4501), 3 up-regulated miRNAs (miR-371a-5p, miR-34c-5p and miR-1323) and 2 novel miRNAs. Additionally, corresponding alterations of pathway-correlated target mRNAs were observed, including 5 up-regulated mRNAs (ICAM1, WNT2B, MYC, HLA-F and TGF-β1) and 3 down-regulated mRNAs (CDH1, PTENP1 and HSP90AA1). CONCLUSIONS: Our study provides an overview of the miRNA expression profile and the interactions between miRNAs and their target mRNAs, which will deepen our understanding of the important roles of miRNAs in NPC radioresistance.
Energy Technology Data Exchange (ETDEWEB)
Fondeur, F. F.; Fink, S. D.
2011-12-07
A new solvent system, referred to as the Next Generation Solvent (NGS), has been developed at Oak Ridge National Laboratory for the removal of cesium from alkaline solutions in the Caustic Side Solvent Extraction process. NGS is proposed for deployment at MCU and at the Salt Waste Processing Facility. This work investigated the chemical compatibility of NGS with 16 M, 8 M, and 3 M nitric acid, from contact that may occur in handling of analytical samples from MCU or, for 3 M acid, during contactor cleaning operations at MCU. This work shows that reactions occurred between NGS components and the high-molarity nitric acid. Reaction rates are much faster in 8 M and 16 M nitric acid than in 3 M nitric acid. In the case of 16 M and 8 M nitric acid, the acid reacts with the extractant to initially produce organo-nitrate species. The reaction also releases soluble fluorinated alcohols such as tetrafluoropropanol. With longer contact time, the modifier reacts to produce a tarry substance with evolved gases (NOx and possibly CO). Calorimetric analysis of the reaction product mixtures revealed that the organo-nitrate reaction products are not explosive and will not deflagrate.
Directory of Open Access Journals (Sweden)
Smith Derek
2009-01-01
Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high-confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high-confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide database searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level, and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening
Kasai, Chika; Sugimoto, Kazushi; Moritani, Isao; Tanaka, Junichiro; Oya, Yumi; Inoue, Hidekazu; Tameda, Masahiko; Shiraki, Katsuya; Ito, Masaaki; Takei, Yoshiyuki; Takase, Kojiro
2016-01-01
Colorectal cancer (CRC) is the third leading cause of cancer-related deaths in Japan. The etiology of CRC has been linked to numerous factors including genetic mutation, diet, lifestyle, inflammation, and recently, the gut microbiota. However, CRC-associated gut microbiota is still largely unexamined. This study used terminal restriction fragment length polymorphism (T-RFLP) and next-generation sequencing (NGS) to analyze and compare the gut microbiota of Japanese control subjects and Japanese patients with carcinoma in adenoma. Stool samples were collected from 49 control subjects, 50 patients with colon adenoma, and 9 patients with colorectal cancer (3/9 with invasive cancer and 6/9 with carcinoma in adenoma) immediately before colonoscopy; DNA was extracted from each stool sample. Based on T-RFLP analysis, 12 subjects (six control and six carcinoma-in-adenoma subjects) were selected; their samples were used for NGS and species-level analysis. T-RFLP analysis showed no significant differences in bacterial population between control, adenoma and cancer groups. However, NGS revealed that (i) control and carcinoma-in-adenoma subjects had different gut microbiota compositions; (ii) one bacterial genus (Slackia) was significantly associated with the control group and four bacterial genera (Actinomyces, Atopobium, Fusobacterium, and Haemophilus) were significantly associated with the carcinoma-in-adenoma group; and (iii) several bacterial species were significantly associated with each type (control: Eubacterium coprostanoligenes; carcinoma in adenoma: Actinomyces odontolyticus, Bacteroides fragilis, Clostridium nexile, Fusobacterium varium, Haemophilus parainfluenzae, Prevotella stercorea, Streptococcus gordonii, and Veillonella dispar). Gut microbial properties differ between control subjects and carcinoma-in-adenoma patients in this Japanese population, suggesting that gut microbiota is related to CRC prevention and development.
Approximation of Failure Probability Using Conditional Sampling
Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
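The identity underlying this approach, P(F) = P(B) · P(F | B) for a bounding set B ⊇ F whose probability is known analytically, can be sketched in a toy problem. Everything here is an illustrative assumption, not the authors' benchmark: the failure set, the bounding half-plane, and all numbers are chosen so that the bounding probability has a closed form.

```python
import math
import random
from statistics import NormalDist

random.seed(0)
nd = NormalDist()

# Toy failure set F = {x1 + x2 > 4 and sin(x1*x2) > 0} for iid N(0,1) parameters.
# Bounding set B = {x1 + x2 > 4} contains F and has an analytic probability,
# since x1 + x2 ~ N(0, 2).
a = 4 / math.sqrt(2)              # threshold for u = (x1 + x2)/sqrt(2) ~ N(0, 1)
p_B = 1 - nd.cdf(a)               # exact probability of the bounding set (~0.0023)

def failure_indicator():
    """Draw (x1, x2) conditioned on B via rotated coordinates (u, v)."""
    # u is a standard normal truncated to (a, inf); v is an ordinary standard normal.
    u = nd.inv_cdf(nd.cdf(a) + random.random() * (1 - nd.cdf(a)))
    v = random.gauss(0.0, 1.0)
    x1 = (u + v) / math.sqrt(2)
    x2 = (u - v) / math.sqrt(2)
    return 1.0 if math.sin(x1 * x2) > 0 else 0.0

n = 20000
p_F_given_B = sum(failure_indicator() for _ in range(n)) / n
p_F = p_B * p_F_given_B           # P(F) = P(B) * P(F | B)
print(f"P(B) = {p_B:.5f}, estimated P(F) = {p_F:.6f}")
```

Every sample lands inside B, so each one is informative; crude Monte Carlo over the whole plane would waste roughly 99.8% of its samples outside B, which is exactly the inefficiency the conditional scheme removes.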
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Energy Technology Data Exchange (ETDEWEB)
Ferrini, Marcello [GeNERG - DIME/TEC, University of Genova, via all’Opera Pia 15/a, 16145 Genova (Italy); Borreani, Walter [Ansaldo Nucleare S.p.A., Corso F.M. Perrone 25, 16152 Genova (Italy); INFN, Via Dodecaneso 33, 16146 Genova (Italy); Lomonaco, Guglielmo, E-mail: guglielmo.lomonaco@unige.it [GeNERG - DIME/TEC, University of Genova, via all’Opera Pia 15/a, 16145 Genova (Italy); INFN, Via Dodecaneso 33, 16146 Genova (Italy); Magugliani, Fabrizio [Ansaldo Nucleare S.p.A., Corso F.M. Perrone 25, 16152 Genova (Italy)
2016-02-15
Lead-cooled fast reactor (LFR) technology has both a long history and a penchant for innovation. With early work related to its use for submarine propulsion dating to the 1950s, Russian scientists pioneered the development of reactors cooled by heavy liquid metals (HLM). More recently, there has been substantial interest in both critical and subcritical reactors cooled by lead (Pb) or lead–bismuth eutectic (LBE), not only in Russia, but also in Europe, Asia, and the USA. The growing knowledge of the thermal-fluid-dynamic properties of these fluids, and the choice of the LFR as one of the six reactor types selected by the Generation IV International Forum (GIF) for further research and development, have fostered the exploration of new geometries and new concepts aimed at optimizing the key components that will be adopted in the Advanced Lead Fast Reactor European Demonstrator (ALFRED), the 300 MW{sub t} pool-type reactor aimed at proving the feasibility of the design concept adopted for the European Lead-cooled Fast Reactor (ELFR). In this paper, a theoretical and computational analysis is presented of a multi-blade screw pump handling liquid lead, serving as the primary pump for the adopted reference conceptual design of ALFRED. The pump is first analyzed at design operating conditions from the theoretical point of view to determine the optimal geometry according to the velocity triangles, and then modeled with a 3D CFD code (ANSYS CFX). The choice of a 3D simulation is dictated by the need to perform a detailed spatial simulation taking into account the peculiar geometry of the pump as well as the boundary layers and turbulence effects of the flow, which are typically three-dimensional. The use of liquid lead significantly impacts the fluid-dynamic design of the pump because of the key requirement to avoid any erosion effects. These effects have a major impact on the performance, reliability, and lifespan of the pump. Although some erosion-related issues remain to be fully addressed, the results
Probability distribution fitting of schedule overruns in construction projects
Love, P.E.D.; Sing, C.-P.; Wang, X.; Edwards, D.J.; Odeyinka, H.
2013-01-01
The probability of schedule overruns for construction and engineering projects can be ascertained using a ‘best fit’ probability distribution from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data; including the Kolmogorov–...
Gallart, Francesc; Llorens, Pilar; Pérez-Gallego, Nuria; Latron, Jérôme
2016-04-01
The Vallcebre research catchments are located in NE Spain, in a middle-mountain area with a Mediterranean sub-humid climate. Most of the bedrock consists of continental red lutites that are easily weathered into loamy soils. This area was intensely used for agriculture in the past, when most of the sunny gentle hillslopes were terraced. The land was progressively abandoned from the mid-20th century, and most of the fields were converted to meadows or were spontaneously forested. Early studies carried out in the terraced Cal Parisa catchment demonstrated the occurrence of two types of frequently saturated areas: those situated in downslope locations with high topographic index values, and those located in the inner parts of many terraces, where the shallow water table usually outcrops due to the topographical modifications linked to terrace construction. Both the increased extent of saturated areas and the role of a man-made elementary drainage system designed for depleting water from the terraces suggested that terraced areas would induce an enhanced hydrological response during rainfall events when compared with non-terraced hillslopes. The response of 3 sub-catchments, of increasing area and decreasing percentage of terraced area, during a set of major events collected over more than 15 years has been analysed. The results show that storm runoff depths were roughly proportional to precipitation above 30 mm, although the smallest catchment (Cal Parisa), with the highest percentage of terraces, was able to completely buffer rainfall events of 60 mm in one hour without any runoff when antecedent conditions were dry. Runoff coefficients depended on antecedent conditions, and peak discharges were weakly linked to rainfall intensities. Peak lag times, peak runoff rates and recession coefficients were similar in the 3 catchments; values of the first variable were in the range between Hortonian and saturation overland flow, and the last two were in the range of
Hidden Variables or Positive Probabilities?
Rothman, Tony
2001-01-01
Despite claims that Bell's inequalities are based on the Einstein locality condition, or an equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states, the theory cannot produce such a set of probabilities. We examine a paper by Eberhard, who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here; positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac...
Uncertainty quantification approaches for advanced reactor analyses.
Energy Technology Data Exchange (ETDEWEB)
Briggs, L. L.; Nuclear Engineering Division
2009-03-24
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
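The 95/95 criterion quoted above is commonly met with nonparametric order statistics (the first-order, one-sided Wilks formula): if the largest of n independent code runs is taken as the tolerance bound, n must satisfy 1 - 0.95^n >= 0.95. The report itself does not prescribe this particular method; the sketch below is standard background, not a summary of its recommendations.

```python
def wilks_n(coverage=0.95, confidence=0.95):
    """Smallest number of runs n such that the sample maximum bounds the
    `coverage` quantile with at least `confidence` probability
    (first-order, one-sided Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_n(0.95, 0.95))   # 59 runs for the classic 95/95 criterion
print(wilks_n(0.95, 0.99))   # tightening the confidence raises the run count
```

The well-known result is 59 runs for 95/95; raising either the coverage or the confidence increases the required sample size.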
Landau-Zener Probability Reviewed
Valencia, C
2008-01-01
We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
Understanding Students' Beliefs about Probability.
Konold, Clifford
The concept of probability is not an easy one for high school and college students to understand. This paper identifies and analyzes students' alternative frameworks from the viewpoint of constructivism. There have been various interpretations of probability through mathematical history: the classical, frequentist, and subjectivist interpretations.…
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Costello, Fintan
2012-01-01
The systematic biases and errors seen in people's probability judgments are typically taken as evidence that people do not reason about probability using the rules of probability theory. We show the contrary: that these biases are a consequence of people correctly following probability theory, but with random variation or noise affecting the reasoning process. Taking P_E(A) to represent a person's estimate for the probability of some event A, this random variation account predicts that on average P_E(A) + P_E(B) - P_E(A or B) - P_E(A and B) = 0 for all pairs of events A, B, just as required by probability theory. Analysing data from an experiment asking people to estimate such probabilities for a number of pairs A, B, we find striking confirmation of this prediction.
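The zero-expectation prediction is easy to reproduce in simulation. The sketch below is a simplified stand-in for the published model: true probabilities perturbed by symmetric zero-mean noise, with the event probabilities chosen arbitrarily for illustration.

```python
import random

random.seed(1)

# Hypothetical true probabilities satisfying inclusion-exclusion exactly.
pA, pB, pAB = 0.6, 0.5, 0.3
pAorB = pA + pB - pAB            # 0.8

def noisy(p, d=0.15):
    """A noisy estimate: the true probability plus zero-mean noise
    (a simplification of the sampling-based noise in the original model)."""
    return p + random.uniform(-d, d)

trials = 100_000
z = sum(noisy(pA) + noisy(pB) - noisy(pAorB) - noisy(pAB)
        for _ in range(trials)) / trials
print(f"average of P(A) + P(B) - P(A or B) - P(A and B): {z:.4f}")
```

Each individual set of judgments violates the identity, but the violations average out to zero across trials, which is the signature the experiment tests for.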
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Linear Positivity and Virtual Probability
Hartle, J B
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...
Survival probability and ruin probability of a risk model
Institute of Scientific and Technical Information of China (English)
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.
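For orientation, the classical special case mentioned (exponential claims) has a closed-form ruin probability that a simulation can check: with claim rate λ, mean claim μ, and safety loading θ, ψ(u) = exp(-θu/((1+θ)μ))/(1+θ). The sketch below uses the classical compound-Poisson model with a constant premium rate, not the paper's generalized model with random premiums and p-thinned claims; all parameter values are illustrative.

```python
import math
import random

random.seed(2)

lam, mu, theta, u0 = 1.0, 1.0, 0.2, 5.0   # claim rate, mean claim, loading, initial surplus
c = (1 + theta) * lam * mu                 # premium rate with safety loading theta

def ruined(horizon=500.0):
    """Simulate the surplus process up to `horizon`; ruin can only occur
    at claim instants, so step from claim to claim."""
    t, u = 0.0, u0
    while t < horizon:
        w = random.expovariate(lam)        # inter-claim time
        t += w
        u += c * w                         # premiums accrue continuously
        u -= random.expovariate(1 / mu)    # exponential claim of mean mu
        if u < 0:
            return True
    return False

n = 5000
psi_sim = sum(ruined() for _ in range(n)) / n
psi_exact = math.exp(-theta * u0 / ((1 + theta) * mu)) / (1 + theta)
print(f"simulated {psi_sim:.3f} vs exact {psi_exact:.3f}")
```

The finite horizon slightly understates the infinite-horizon ruin probability, but with a positive loading the surplus drifts upward, so late ruin contributes negligibly at this horizon.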
Institute of Scientific and Technical Information of China (English)
无
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Energy Technology Data Exchange (ETDEWEB)
Slentoe, E.; Moeller, F.; Winther, M.; Hjort Mikkelsen, M.
2010-10-15
The report examines, in an integrated form, the energy, emissions, and welfare economic implications of introducing Danish-produced biodiesel, i.e. rapeseed diesel (RME), and first- and second-generation wheat ethanol in two scenarios with low and high rates of blending with fossil-based automotive fuels. Within this project's analytical framework and assumptions, the welfare economic analysis shows that it would be beneficial for society to realize the biofuel scenarios to some extent at oil prices above $100 a barrel, while it will cause losses at oil prices of $65. In all cases, fossil fuel consumption and CO2eq emissions are reduced, the effect of which is priced and included in the welfare economic analysis. The implementation of biofuels in Denmark will be dependent on market price. As it stands now, the market is not favorable to biofuels. The RME currently produced in Denmark is exported to other European countries where there are state subsidies. Subsidies would also be a significant factor in Denmark for achieving objectives for biofuel blending. (ln)
Holographic probabilities in eternal inflation.
Bousso, Raphael
2006-11-10
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Probability Ranking in Vector Spaces
Melucci, Massimo
2011-01-01
The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Local Causality, Probability and Explanation
Healey, Richard A
2016-01-01
In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Probability representation of classical states
Man'ko, OV; Man'ko; Pilyavets, OV
2005-01-01
Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.
The probabilities of unique events.
Directory of Open Access Journals (Sweden)
Sangeet S Khemlani
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
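Why splitting the difference violates the calculus is a one-line check: a conjunction can never be more probable than its less probable conjunct, while the average of two unequal conjuncts always is. The numbers below are hypothetical, not taken from the reported experiments.

```python
# Hypothetical conjunct estimates from the intuitive system.
pA, pB = 0.9, 0.4

intuitive_conj = (pA + pB) / 2   # "split the difference": about 0.65
upper_bound = min(pA, pB)        # the calculus requires P(A and B) <= 0.4

print(intuitive_conj, upper_bound, intuitive_conj > upper_bound)
```

Whenever the two conjunct estimates differ, the midpoint exceeds the smaller one, so the split-the-difference strategy guarantees a conjunction fallacy.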
Diurnal distribution of sunshine probability
Energy Technology Data Exchange (ETDEWEB)
Aydinli, S.
1982-01-01
The diurnal distribution of sunshine probability is essential for the predetermination of average irradiances and illuminances produced by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of sunshine duration available. It is therefore necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of sunshine probability, which is a consequence of a 'side-scene effect' of the clouds, can be calculated. The asymmetric components of sunshine probability, which depend on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Three lectures on free probability
2012-01-01
These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.
47 CFR 1.1623 - Probability calculation.
2010-10-01
... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be... determine their new intermediate probabilities. (g) Multiply each applicant's probability pursuant...
Stochastics introduction to probability and statistics
Georgii, Hans-Otto
2012-01-01
This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.
Institute of Scientific and Technical Information of China (English)
莫达隆
2012-01-01
Stochastic simulation experiments are a powerful tool for teaching probability and statistics. Based on the needs of probability and statistics teaching, this paper highlights the use of the random number generator in the statistical software EViews, gives some corresponding algorithms and procedures, and extends the use of statistical software in mathematical experiments. This instructional design will help deepen students' understanding of the concepts of probability and statistics and improve their hands-on ability.
Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation
Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin
2016-12-01
If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogeneous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
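The SST combination via the total probability theorem can be sketched in a few lines. All names and numbers below are hypothetical placeholders, not data from the study:

```python
import math

# Hypothetical storm catalogue: (annual arrival rate, transposition
# probability onto the catchment, catchment-average depth in mm).
storms = [
    (0.02,  0.10, 250.0),
    (0.01,  0.05, 320.0),
    (0.005, 0.08, 410.0),
]

def annual_exceedance_probability(threshold_mm):
    """Total-probability combination: sum, over storms at least as deep
    as the threshold, of arrival rate x transposition probability, then
    convert the annual rate to an AEP via the Poisson assumption."""
    rate = sum(arrival * trans
               for arrival, trans, depth in storms
               if depth >= threshold_mm)
    return 1.0 - math.exp(-rate)

print(annual_exceedance_probability(300.0))
```

Each storm contributes only if it could deliver the threshold depth, and its contribution is discounted by how unlikely it is to transpose onto the catchment.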
Probability distributions with summary graph structure
Wermuth, Nanny
2010-01-01
A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class are multivariate regression chain graphs. They describe the independences of stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that then result after possible marginalising or conditioning, we use summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences which remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to de...
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Energy Technology Data Exchange (ETDEWEB)
Grunewald, T.; Graetz, R.
2007-09-29
Equipment intended for use in potentially explosive atmospheres must meet the requirements of the European directive 94/9/EC, and the manufacturer's declaration of conformity testifies that it does. The conformity assessment is based on an ignition risk assessment which identifies and estimates the ignition sources. The European standards in the area of directive 94/9/EC (such as EN 1127-1 and EN 13463-1) describe 13 possible ignition sources, mechanically generated sparks being one of them. No statements can be made about the ignition effectiveness, and especially the ignition probability, of mechanically generated sparks for a given kinetic impact energy and a given explosive gas/air mixture; an extensive literature survey confirms this state of affairs, which was and is a problem in drafting and revising standards. Simple ferritic steel is a common construction material for equipment, including non-electrical equipment, intended for use in potentially explosive atmospheres in chemical and mechanical engineering and manufacturing technology. The objective of this study was therefore to obtain statistical ignition probabilities as a function of the kinetic impact energy and the minimum ignition energy of the explosive gas/air mixture. The study was made with impact testing machines of BAM (Federal Institute for Materials Research and Testing) at three kinetic impact energies. The following results were obtained for all the reference gas/air mixtures of the IEC explosion groups (I methane, IIA propane, IIB ethylene, IIC acetylene, hydrogen): 1. It was not possible to generate ignition-capable mechanical sparks at kinetic impact energies below 3 Nm under the test conditions of this study, i.e. the impact kinetics and impact geometry of the impact machines. 2. Single mechanically generated particles were able to act as a dangerous ignition source through oxidation processes at kinetic impact energies of 10 Nm. Furthermore the tests have shown that the
Cluster Membership Probability: Polarimetric Approach
Medhi, Biman J
2013-01-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
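Simultaneous intervals of this kind can be approximated numerically. The sketch below, an illustration under assumed parameters rather than the authors' construction, builds a Monte-Carlo envelope for the order statistics of a standard normal sample and widens the pointwise level until all points fall inside simultaneously with probability at least 1-α:

```python
import random

def simultaneous_envelope(n=20, alpha=0.05, n_sim=2000, seed=1):
    """Monte-Carlo sketch: intervals for the order statistics of a
    standard normal sample of size n, widened (by shrinking the
    pointwise level) until all n points fall inside simultaneously
    with probability at least 1 - alpha."""
    rng = random.Random(seed)
    sims = [sorted(rng.gauss(0.0, 1.0) for _ in range(n))
            for _ in range(n_sim)]
    cols = [sorted(c) for c in zip(*sims)]  # i-th order statistic across sims

    def envelope(point_alpha):
        k = max(0, int(n_sim * point_alpha / 2) - 1)
        return [(c[k], c[n_sim - 1 - k]) for c in cols]

    def coverage(env):
        inside = sum(all(lo <= x <= hi for x, (lo, hi) in zip(s, env))
                     for s in sims)
        return inside / n_sim

    point_alpha = alpha
    while coverage(envelope(point_alpha)) < 1.0 - alpha:
        point_alpha /= 2.0  # widen the bands until simultaneous coverage
    return envelope(point_alpha)

env = simultaneous_envelope()
print(env[0], env[-1])  # bands for the smallest and largest plotted points
```

A production version would calibrate the envelope on an independent simulation set; reusing the same samples, as here, keeps the sketch short.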
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated, and optimal system parameters (damping and natural frequency) are derived such that the possibility of exceeding vibration criteria VC-E and VC-D is less than 0.04.
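The core calculation, the chance that a Gaussian displacement exceeds a criterion, can be sketched as follows; the numbers are illustrative, not taken from the article:

```python
import math

def exceedance_probability(sigma, criterion):
    """Zero-mean Gaussian relative displacement with standard deviation
    sigma: the chance that |displacement| exceeds the vibration
    criterion is 2 * (1 - Phi(criterion / sigma))."""
    z = criterion / sigma
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 2.0 * (1.0 - phi)

# A criterion set at 3 sigma leaves roughly a 0.27% chance of exceedance:
print(exceedance_probability(1.0, 3.0))
```

In the article's setting, sigma would itself be a function of damping and natural frequency, which is what makes the optimisation over system parameters possible.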
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Detonation probabilities of high explosives
Energy Technology Data Exchange (ETDEWEB)
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
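For small alphabets and short words, the exact expected guess count under the first-order (independent-letter) model can be computed directly by sorting word probabilities. A sketch with made-up letter probabilities:

```python
import math
from itertools import product

def average_guesses(letter_probs, word_len):
    """Expected number of guesses when words of length word_len over an
    alphabet with the given letter probabilities (first-order model,
    independent letters) are guessed in decreasing order of probability:
    sum over ranks of rank * P(word at that rank)."""
    words = product(letter_probs, repeat=word_len)
    probs = sorted((math.prod(w) for w in words), reverse=True)
    return sum(rank * p for rank, p in enumerate(probs, start=1))

# Uniform letters: all 4**3 = 64 words are equally likely, so the
# average rank of the correct word is (64 + 1) / 2 = 32.5.
print(average_guesses([0.25] * 4, 3))  # 32.5
print(average_guesses([0.5, 0.3, 0.2], 2))
```

The exponential blow-up of the word list is exactly why the paper's approximations are needed for realistic alphabet and word sizes.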
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, with P much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to sparse systems, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
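The claim about dense systems can be verified by direct counting. Assuming, as the abstract postulates, that every configuration of P indistinguishable balls in L distinguishable boxes is equally likely, the occupancy of a single box follows from a standard stars-and-bars count (a sketch, not the paper's derivation):

```python
from math import comb

def occupancy_distribution(P, L):
    """Treat every configuration of P indistinguishable balls in L
    distinguishable boxes as equally likely; stars-and-bars gives
    C(P - k + L - 2, L - 2) configurations with a fixed box holding
    exactly k balls, out of C(P + L - 1, L - 1) configurations total."""
    total = comb(P + L - 1, L - 1)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

dist = occupancy_distribution(P=20, L=5)
print(dist[0] > dist[1] > dist[2])  # True: the empty box is most probable
```

The occupancy probabilities decrease monotonically in k, which is the "long tail" behaviour the abstract describes.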
Fuzzy Markov chains: uncertain probabilities
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
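A crude way to see how interval-valued (α-cut) transition probabilities propagate to long-run behaviour is to sample crisp matrices inside the intervals. This sketch uses hypothetical two-state intervals and plain power iteration, not the paper's restricted fuzzy matrix multiplication:

```python
import random

def stationary(P, steps=200):
    """Power-iterate a row-stochastic matrix to its stationary vector."""
    n = len(P)
    v = [1.0 / n] * n
    for _ in range(steps):
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
    return v

def stationary_bounds(intervals, n_samples=400, seed=7):
    """Crude envelope for a chain whose transition probabilities are
    only known up to intervals (an alpha-cut of fuzzy numbers): sample
    row-stochastic matrices inside the intervals and track the range
    of each stationary probability."""
    rng = random.Random(seed)
    n = len(intervals)
    lo, hi = [1.0] * n, [0.0] * n
    for _ in range(n_samples):
        P = []
        for row in intervals:
            raw = [rng.uniform(a, b) for a, b in row]
            s = sum(raw)
            P.append([x / s for x in raw])  # renormalise to a valid row
        v = stationary(P)
        lo = [min(a, x) for a, x in zip(lo, v)]
        hi = [max(a, x) for a, x in zip(hi, v)]
    return list(zip(lo, hi))

# Two-state chain with uncertain transition probabilities:
intervals = [[(0.6, 0.8), (0.2, 0.4)],
             [(0.3, 0.5), (0.5, 0.7)]]
print(stationary_bounds(intervals))
```

Sampling only brackets the true α-cut from inside; the paper's restricted fuzzy arithmetic gives the envelope exactly.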
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Probability representations of fuzzy systems
Institute of Scientific and Technical Information of China (English)
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as the Zadeh distribution, the Mamdani distribution, the Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. In addition, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a sound logical foundation and embodies an idea of optimized reasoning, it promises broad application.
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, as opposed to autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important, while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and, the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidian distance function comprising of a sum of the squared difference between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
Knot probabilities in random diagrams
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the Bayesian view, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Logic, Probability, and Human Reasoning
2015-01-01
Johnson-Laird, P.N. (Princeton University, Princeton, NJ); Khemlani, Sangeet S.; Goodwin, Geoffrey P.
Probability and statistics: A reminder
Directory of Open Access Journals (Sweden)
Clément Benoit
2013-07-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Objective probability and quantum fuzziness
Mohrhoff, U
2007-01-01
This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...
Energy Technology Data Exchange (ETDEWEB)
Sensfuss, F.; Ragwitz, M.
2007-06-18
The authors of the contribution under consideration analyse the impact of power generation under the EEG on the electricity price. Under this aspect, the market-value effect, the CO{sub 2} effect and the merit-order effect on the power market have to be distinguished. This contribution presents a detailed analysis of the merit-order effect. The priority feed-in under the EEG reduces the demand for conventional power. Therefore, according to the merit order, the most expensive power plants needed to cover demand are no longer required, and the spot market price falls correspondingly. Because spot market prices are simultaneously the most important price indicator for the whole power market, the EEG should not only reduce prices at the spot market but also produce savings for all customers (leverage effect). The quantification of this effect was performed on the basis of a detailed model of the power market (PowerACE).
Theory of overdispersion in counting statistics caused by fluctuating probabilities
Semkow, T M
1999-01-01
It is shown that random Lexis fluctuations of probabilities, such as the probability of decay or detection, cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and distribution functions for the overdispersed counting statistics are derived. Applications are given to radioactive decay with detection and to more complex experiments, as well as to distinguishing between source and background in the presence of overdispersion. Monte Carlo verifications are provided.
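The effect described in this abstract, variance inflated beyond the Poisson value when the underlying rate itself fluctuates, can be illustrated with a short simulation. This is a generic sketch, not the paper's derivation; the uniform rate distribution and all numbers are arbitrary illustrative choices:

```python
import math
import random
import statistics

random.seed(42)

def poisson(lam):
    """Poisson draw via Knuth's multiplication method (fine for small rates)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

def counts(n, mean_rate, rate_spread):
    """Counts whose rate fluctuates trial to trial (a Lexis scheme):
    each trial first draws a rate uniformly around mean_rate."""
    return [poisson(random.uniform(mean_rate - rate_spread,
                                   mean_rate + rate_spread))
            for _ in range(n)]

fixed = counts(20000, 10.0, 0.0)  # plain Poisson: variance ~ mean
fluct = counts(20000, 10.0, 5.0)  # fluctuating rate: overdispersed

print(statistics.variance(fixed) / statistics.mean(fixed))  # close to 1
print(statistics.variance(fluct) / statistics.mean(fluct))  # well above 1
```

For the mixed case the variance is E[rate] + Var[rate], so the variance-to-mean ratio exceeds 1, which is exactly the overdispersion the paper formalizes through its generating functions.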
Energy Technology Data Exchange (ETDEWEB)
Gouronnec, A.M. [Institut de Radioprotection et de Surete Nucleaire (IRSN), 92 - Clamart (France)
2004-06-15
The olfactometric analyses presented here are applied to industrial odors that can have harmful effects on people. The aim of the olfactometric analyses is to quantify odors, to qualify them, or to attribute a pleasant or unpleasant character to them (the notion of hedonic tone). The aim of this work is first to present the different measurements carried out, the different measurement methods used and the current applications of each of the methods. (O.M.)
Probability and Statistics The Science of Uncertainty (Revised Edition)
Tabak, John
2011-01-01
Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of
Hydrogeologic unit flow characterization using transition probability geostatistics.
Jones, Norman L; Walker, Justin R; Carle, Steven F
2005-01-01
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
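The juxtapositional idea behind transition probability geostatistics can be sketched as a one-dimensional Markov chain over facies indicators. The three facies names and the transition matrix below are hypothetical illustrative values, not numbers from the paper or from the MODFLOW workflow it describes:

```python
import random

random.seed(0)

FACIES = ["gravel", "sand", "clay"]

# Hypothetical upward transition probabilities, biased so that gravel
# tends to be overlain by sand and sand by clay (a fining-upward tendency).
TRANS = {
    "gravel": {"gravel": 0.60, "sand": 0.35, "clay": 0.05},
    "sand":   {"gravel": 0.05, "sand": 0.60, "clay": 0.35},
    "clay":   {"gravel": 0.30, "sand": 0.10, "clay": 0.60},
}

def next_facies(current):
    """Sample the facies of the next cell up from the transition row."""
    r, acc = random.random(), 0.0
    for facies, p in TRANS[current].items():
        acc += p
        if r < acc:
            return facies
    return facies  # guard against floating-point round-off

def simulate_column(n_cells, start="gravel"):
    """One vertical column of indicator values, simulated bottom-up."""
    column = [start]
    for _ in range(n_cells - 1):
        column.append(next_facies(column[-1]))
    return column

col = simulate_column(50)
print(col[:10])
```

In the paper's 3D setting such indicator arrays are then converted to layer elevation and thickness arrays for the Hydrogeologic Unit Flow package; the sketch only shows how a transition matrix encodes juxtapositional tendencies.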
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Hf Transition Probabilities and Abundances
Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...
Gd Transition Probabilities and Abundances
Den Hartog, E A; Sneden, C; Cowan, J J
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...
Cheng, Peter C-H
2011-07-01
The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.
An improved probability mapping approach to assess genome mosaicism
Directory of Open Access Journals (Sweden)
Gogarten J Peter
2003-09-01
Full Text Available Abstract Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure for support, and posterior probability mapping only provides a quick estimation of phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events.
Directory of Open Access Journals (Sweden)
Ondřej Šimpach
2012-12-01
Full Text Available It is estimated that about 40,000-50,000 people in the Czech Republic suffer from celiac disease, a disease of gluten intolerance. At the beginning of the independent Czech Republic, the life expectancy at birth of these people was quite low, because detailed diagnostic knowledge of this disease arrived from abroad only in this period. With increasing age, the probability of death of these people grew faster than that of the total population. The aim of this study is to analyse the probability of death of x-year-old persons during the next five years after the general medical examinations in 1990 and 1995. Both analyses are carried out using LOGIT and PROBIT models, and the hypothesis that the probability of death of an x-year-old person suffering from celiac disease decreased a few years after the new medical knowledge was gained from abroad is confirmed or rejected.
Off-site ignition probability of flammable gases.
Rew, P J; Spencer, H; Daycock, J
2000-01-07
A key step in the assessment of risk for installations where flammable liquids or gases are stored is the estimation of ignition probability. A review of current modelling and data confirmed that ignition probability values used in risk analyses tend to be based on extrapolation of limited incident data or, in many cases, on the judgement of those conducting the safety assessment. Existing models tend to assume that ignition probability is a function of release rate (or flammable gas cloud size) alone and they do not consider location, density or type of ignition source. An alternative mathematical framework for calculating ignition probability is outlined in which the approach used is to model the distribution of likely ignition sources and to calculate ignition probability by considering whether the flammable gas cloud will reach these sources. Data are collated on the properties of ignition sources within three generic land-use types: industrial, urban and rural. These data are then incorporated into a working model for ignition probability in a form capable of being implemented within risk analysis models. The sensitivity of the model results to assumptions made in deriving the ignition source properties is discussed and the model is compared with other available ignition probability methods.
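The source-based framework sketched in this abstract can be caricatured in a few lines: if ignition sources of a land-use type are scattered as a Poisson field, the chance that a flammable cloud covers at least one source that actually fires follows from the Poisson count. This is an illustrative assumed form with made-up densities, not the authors' working model:

```python
import math

def ignition_probability(cloud_area_m2, source_density_per_m2, p_ignite_given_covered):
    """P(at least one ignition) for a Poisson field of ignition sources.

    Illustrative form: sources are uniformly scattered with the given
    density, each firing independently with probability
    p_ignite_given_covered when covered by the flammable cloud.
    """
    expected_active = cloud_area_m2 * source_density_per_m2 * p_ignite_given_covered
    return 1.0 - math.exp(-expected_active)

# Hypothetical source densities per land-use type (sources per m^2),
# echoing the paper's industrial/urban/rural split.
for land_use, density in [("rural", 1e-6), ("urban", 1e-5), ("industrial", 5e-5)]:
    print(land_use, round(ignition_probability(10_000, density, 0.5), 3))
```

The point of the paper's framework, captured here, is that ignition probability grows with cloud size and with the density and activity of local sources, rather than with release rate alone.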
The Inductive Applications of Probability Calculus
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available The Author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The Author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Directory of Open Access Journals (Sweden)
T. Bulteau
2014-11-01
Full Text Available The knowledge of extreme coastal water levels is useful for coastal flooding studies or the design of coastal defences. While deriving such extremes with standard analyses using tide gauge measurements, one often needs to deal with a limited effective duration of observation, which can result in large statistical uncertainties. This is even truer when one faces the issue of outliers, those particularly extreme values distant from the others which increase the uncertainty on the results. In this study, we investigate how historical information, even partial, of past events reported in archives can reduce statistical uncertainties and relativize such outlying observations. A Bayesian Markov chain Monte Carlo method is developed to tackle this issue. We apply this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events, the analysis shows that: (1) integrating historical information in the analysis greatly reduces statistical uncertainties on return levels, (2) Xynthia's water level no longer appears as an outlier, and (3) we could have reasonably predicted the annual exceedance probability of that level beforehand (the predictive probability for 2010 based on data up to the end of 2009 is of the same order of magnitude as the standard estimated probability using data up to the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method to integrate heterogeneous data in such analyses.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Fusion probability in heavy nuclei
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, <PCN>, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. <PCN> for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: <PCN> has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine <PCN>. Approximate boundaries have been obtained from where <PCN> starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of <PCN> from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^alpha) (0<alpha<1), with w(0)=0, w(1/e)=1/e and w(1)=1, which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
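Prelec's one-parameter form is simple enough to verify numerically; a minimal sketch, where alpha = 0.65 is an arbitrary illustrative value:

```python
import math

def prelec_weight(p, alpha=0.65):
    """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha).

    For 0 < alpha < 1 the function overweights small probabilities and
    underweights large ones, with w(0)=0, w(1/e)=1/e and w(1)=1.
    """
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

print(prelec_weight(1 / math.e))     # 1/e for any alpha: the fixed point
print(prelec_weight(0.01) > 0.01)    # small probabilities are overweighted
print(prelec_weight(0.90) < 0.90)    # large probabilities are underweighted
```

The fixed point at p = 1/e holds for every alpha, which is why it is a convenient check on any implementation.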
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
Probable Linezolid-Induced Pancytopenia
Directory of Open Access Journals (Sweden)
Nita Lakhani
2005-01-01
Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5x10^12/L, leukocytes 2.9x10^9/L, platelets 59x10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.
Transit probabilities around hypervelocity and runaway stars
Fragione, G.; Ginsburg, I.
2017-04-01
In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. Along with transits, the Doppler technique remains an invaluable tool for discovering planets. The next generation of spectrographs, such as G-CLEF, promise precision radial velocity measurements. In this paper, we explore the possibility of detecting planets around hypervelocity and runaway stars, which should host a very compact system as a consequence of their turbulent origin. We find that the probability of a multiplanetary transit is 10^-3 ≲ P ≲ 10^-1. We therefore need to observe ~10-1000 high-velocity stars to spot a transit. However, even if transits are rare around runaway and hypervelocity stars, the chances of detecting such planets using radial velocity surveys is high. We predict that the European Gaia satellite, along with TESS and the new-generation spectrographs G-CLEF and ESPRESSO, will spot planetary systems orbiting high-velocity stars.
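The geometric part of such an estimate is easy to reproduce: for a circular orbit the single-planet transit probability is roughly R*/a, and treating the planets as independent gives a crude joint figure (a deliberate simplification, since real multiplanet systems are nearly coplanar, applied here to a made-up compact system, not one from the paper):

```python
R_SUN_M = 6.957e8   # solar radius in metres
AU_M = 1.496e11     # astronomical unit in metres

def transit_probability(r_star_rsun, a_au):
    """Geometric transit probability ~ R_star / a for a circular orbit."""
    return (r_star_rsun * R_SUN_M) / (a_au * AU_M)

# Hypothetical compact three-planet system around a Sun-like star.
axes_au = [0.05, 0.10, 0.20]
p_each = [transit_probability(1.0, a) for a in axes_au]

# Independent-geometry product: a rough lower-end joint probability,
# since near-coplanarity in real systems raises the true value.
p_all = 1.0
for p in p_each:
    p_all *= p

print([round(p, 4) for p in p_each])
print(f"{p_all:.1e}")
```

Each planet individually transits with a few percent probability at these separations, and the naive joint probability is far smaller, which is why coplanarity matters so much for multiplanet transit estimates.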
Sukhov, Vladimir; Sherstneva, Oksana; Surova, Lyubov; Katicheva, Lyubov; Vodeneev, Vladimir
2014-11-01
Electrical signals (action potential and variation potential, VP) caused by environmental stimuli are known to induce various physiological responses in plants, including changes in photosynthesis; however, their functional mechanisms remain unclear. In this study, the influence of VP on photosynthesis in pea (Pisum sativum L.) was investigated and the proton participation in this process analysed. VP, induced by local heating, inactivated photosynthesis and activated respiration, with the initiation of the photosynthetic response connected with inactivation of the photosynthetic dark stage; however, direct VP influence on the light stage was also probable. VP generation was accompanied by pH increases in apoplasts (0.17-0.30 pH unit) and decreases in cytoplasm (0.18-0.60 pH unit), which probably reflected H(+)-ATPase inactivation and H(+) influx during this electrical event. Imitation of H(+) influx using the protonophore carbonyl cyanide m-chlorophenylhydrazone (CCCP) induced a photosynthetic response that was similar to a VP-induced response. Experiments on chloroplast suspensions showed that decreased external pH also induced an analogous response and that its magnitude depended on the magnitude of the pH change. Thus, the present results showed that proton influx into the cell was the probable mechanism of VP's influence on photosynthesis in pea. Potential modes of action for this influence are discussed.
Multivariate Evolutionary Analyses in Astrophysics
Fraix-Burnet, Didier
2011-01-01
The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects by their global similarities, are still little used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results on globular clusters and early-type galaxies to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means cluster analysis.
Network class superposition analyses.
Directory of Open Access Journals (Sweden)
Carl A B Pearson
Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
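The central object here, the transition-by-transition superposition T, can be shown at toy scale. The three 2-node boolean networks below are hypothetical stand-ins for a network class (the paper's actual class obeys the Strong Inhibition rule on the yeast cell-cycle model); T[i][j] is the fraction of class members sending state i to state j:

```python
def to_index(state):
    """Map a 2-node boolean state (a, b) to a row/column index 0..3."""
    return state[0] * 2 + state[1]

# Three hypothetical class members: each is a deterministic update rule
# taking current node values to the next ones. All share the fixed
# point (0, 0), mimicking a shared "primary function".
networks = [
    lambda a, b: (int(a and not b), int(b and not a)),
    lambda a, b: (int(a and b), int(a and b)),
    lambda a, b: (0, int(a and b)),
]

N = 4  # number of states for 2 boolean nodes
T = [[0.0] * N for _ in range(N)]
for net in networks:
    for a in (0, 1):
        for b in (0, 1):
            T[to_index((a, b))][to_index(net(a, b))] += 1 / len(networks)

# Each row of T is a probability distribution over successor states.
for row in T:
    print([round(x, 3) for x in row])
```

Each row sums to 1 because every class member maps each state to exactly one successor; questions like the distribution of point attractors then become questions about the diagonal and spectral structure of T.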
Employment and Wage Assimilation of Male First Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael
2000-01-01
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Employment and Wage Assimilation of Male First-Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael
2001-01-01
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Employment and Wage Assimilation of Male First-Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Conditional probability modulates visual search efficiency.
Cort, Bryan; Anderson, Britt
2013-01-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
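The cue design described in the two records above can be made concrete with a small calculation. In this hypothetical sketch (cue labels and exact conditional values are invented for illustration), each of four equally likely cue combinations carries a strong conditional signal, yet the marginal probability of each target color remains 0.5:

```python
# Hypothetical illustration of the cue design described above: the
# marginal probability of the target color is 0.5, while the
# conditional probability given a cue combination ranges from 0.1 to 0.9.

# P(target is red | cue combination), for four equally likely combinations
p_red_given_cues = {("A", "X"): 0.9, ("A", "Y"): 0.1,
                    ("B", "X"): 0.1, ("B", "Y"): 0.9}

# each cue combination occurs with probability 0.25
p_combo = 0.25

# law of total probability: P(red) = sum over combos of P(red | c) * P(c)
p_red = sum(p * p_combo for p in p_red_given_cues.values())
print(p_red)  # 0.5: the absolute probability alone carries no cue information
```

This is why only participants informed of the cue-color relationship could exploit it: the information lives entirely in the conditional, not the marginal, probabilities.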
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Inferring Beliefs as Subjectively Imprecise Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...... probabilities are indeed best characterized as probability distributions with non-zero variance....
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
The trajectory of the target probability effect.
Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B
2013-05-01
The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Convolutions Induced Discrete Probability Distributions and a New Fibonacci Constant
Rajan, Arulalan; Rao, Vittal; Rao, Ashok
2010-01-01
This paper proposes another constant that can be associated with the Fibonacci sequence. In this work, we look at the probability distributions generated by the linear convolution of the Fibonacci sequence with itself, and by the linear convolution of the symmetrized Fibonacci sequence with itself. We observe that for the distribution generated by the linear convolution of the standard Fibonacci sequence with itself, the variance converges to 8.4721359... . Also, for the distribution generated by the linear convolution of symmetrized Fibonacci sequences, the variance converges in an average sense to 17.1942..., which is approximately twice what we obtain with the standard Fibonacci sequence.
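The first claim is easy to check numerically. The sketch below (not the authors' code) normalizes the self-convolution of the first n Fibonacci numbers into a probability distribution over indices and computes its variance, which approaches 8.4721359... (equal to 4 + 2√5):

```python
def fib(n):
    """First n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def convolve(a, b):
    """Linear (full) convolution of two sequences."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def variance(weights):
    """Variance of the index distribution induced by nonnegative weights."""
    total = sum(weights)
    probs = [w / total for w in weights]
    mean = sum(k * p for k, p in enumerate(probs))
    return sum((k - mean) ** 2 * p for k, p in enumerate(probs))

# distribution from the linear self-convolution of the Fibonacci sequence
v = variance(convolve(fib(60), fib(60)))
print(round(v, 6))  # ≈ 8.472136, i.e. 4 + 2*sqrt(5)
```

Normalizing the self-convolution is equivalent to summing two independent copies of the normalized Fibonacci distribution, whose tail is geometric with ratio 1/φ; this is why the limit is exactly twice 2 + √5.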
On Markov Chains Induced by Partitioned Transition Probability Matrices
Institute of Scientific and Technical Information of China (English)
Thomas KAIJSER
2011-01-01
Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P. Let K denote the set of probability vectors on S. With every partition M of P we can associate a transition probability function PM on K defined in such a way that if p ∈ K and M ∈ M are such that ‖pM‖ > 0, then, with probability ‖pM‖, the vector p is transferred to the vector pM/‖pM‖. Here ‖·‖ denotes the l1-norm. In this paper we investigate the convergence in distribution for Markov chains generated by transition probability functions induced by partitions of transition probability matrices. The main motivation for this investigation is the application of the convergence results obtained to filtering processes of partially observed Markov chains with denumerable state space.
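The induced chain on probability vectors can be sketched in a few lines. In this toy example (the matrices and names are illustrative, not from the paper), a 2×2 stochastic matrix P is split column-wise into two nonnegative matrices; since M1 + M2 = P and P is row-stochastic, the selection weights ‖pM‖ always sum to 1:

```python
import random

# A toy sketch of the construction described above. P is split into a
# partition {M1, M2} of nonnegative matrices with M1 + M2 = P, which
# induces a Markov chain on probability vectors.

P = [[0.5, 0.5],
     [0.3, 0.7]]

M1 = [[0.5, 0.0],    # column-wise partition: M1 + M2 = P
      [0.3, 0.0]]
M2 = [[0.0, 0.5],
      [0.0, 0.7]]
partition = [M1, M2]

def vec_mat(p, M):
    """Row vector times matrix: (pM)_j = sum_i p_i * M[i][j]."""
    return [sum(p[i] * M[i][j] for i in range(len(p)))
            for j in range(len(M[0]))]

def step(p, partition, rng):
    """One transition of the induced chain: choose M with probability
    ||pM||_1, then move to the normalized vector pM / ||pM||_1."""
    images = [vec_mat(p, M) for M in partition]
    norms = [sum(v) for v in images]      # nonnegative, summing to 1
    r, acc = rng.random(), 0.0
    for v, w in zip(images, norms):
        acc += w
        if r <= acc:
            return [x / w for x in v]
    return [x / norms[-1] for x in images[-1]]

rng = random.Random(0)
p = [0.5, 0.5]
for _ in range(5):
    p = step(p, partition, rng)
print(p)  # still a probability vector: nonnegative entries summing to 1
```

This column-wise partition is exactly the filtering situation mentioned at the end of the abstract: each M corresponds to one possible observation, and pM/‖pM‖ is the updated conditional distribution over hidden states.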
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
Efficient calculation of detection probabilities
Energy Technology Data Exchange (ETDEWEB)
Thoreson, Gregory G., E-mail: gthoreson@mail.utexas.ed [University of Texas - Austin, Pickle Research Campus, R-9000, Austin, TX 78712 (United States); Schneider, Erich A. [University of Texas - Austin, Pickle Research Campus, R-9000, Austin, TX 78712 (United States)
2010-04-11
Radiation transport simulations have found wide use as a detector and system design tool for smuggled nuclear material interdiction applications. A major obstacle to the utility of Monte Carlo radiation transport to this class of problems is the computational burden associated with simulating a spanning set of threat scenarios. One common method for circumventing this obstacle models a subset of detailed scenarios which are considered representative of the system. Another simplifies the threat scenarios, enabling many cases to be simulated at the cost of a loss of fidelity. This paper demonstrates a new approach to the problem of modeling a very large scenario set. The scenario is disaggregated into components in which radiation transport may be simulated independently. Green's functions for each submodel are generated, parameterized with respect to major scenario variables, and convolved to create a depiction of the radiation transport within the entire scenario. With this approach, the computation time required to model many different scenarios is greatly reduced. The theoretical basis of this algorithm is presented along with validation results that show it to be comparable in fidelity to more computationally intensive methods, in particular brute-force simulation.
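The disaggregation idea can be illustrated schematically (this is not the authors' code, and all spectra below are invented numbers): simulate each scenario component once, then combine the per-component responses by discrete convolution instead of re-running the full transport simulation for every scenario:

```python
# Schematic sketch of combining independently simulated Green's
# functions for scenario components by discrete convolution.

def convolve(a, b):
    """Discrete linear convolution of two binned responses."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

source = [0.0, 1.0, 0.5]      # source emission, by energy bin (made up)
shield = [0.8, 0.15, 0.05]    # shield transfer (Green's) function (made up)
detector = [0.9, 0.1]         # detector response kernel (made up)

# chain the submodels: source -> shield -> detector
at_detector = convolve(convolve(source, shield), detector)
print(sum(at_detector))  # total response: product of the component totals
```

Because each component is parameterized once, sweeping a scenario variable (say, shield thickness) only requires swapping one kernel and redoing the cheap convolution, not the expensive transport run.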
Statistics and probability with applications for engineers and scientists
Gupta, Bhisham C
2013-01-01
Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob
Digital dice computational solutions to practical probability problems
Nahin, Paul J
2013-01-01
Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
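In the spirit of the book's approach, here is a minimal Monte Carlo sketch (a standard textbook example, not taken from the book itself): estimate the probability that a pair of fair dice sums to 7, whose exact value is 1/6:

```python
import random

# Monte Carlo estimate of P(sum of two fair dice == 7); exact value is 1/6.
rng = random.Random(42)          # fixed seed for reproducibility
trials = 200_000
hits = sum(1 for _ in range(trials)
           if rng.randint(1, 6) + rng.randint(1, 6) == 7)
estimate = hits / trials
print(estimate)  # close to 1/6 ≈ 0.1667
```

The same pattern, replacing the one-line experiment with a more elaborate simulated process, handles problems whose closed-form answers are out of reach.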
On the computability of conditional probability
Ackerman, Nathanael L; Roy, Daniel M
2010-01-01
We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...
Collision strengths and transition probabilities for Co III forbidden lines
Storey, P. J.; Sochi, Taha
2016-07-01
In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared region of spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.
Collision strengths and transition probabilities for Co III forbidden lines
Storey, P J
2016-01-01
In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared region of spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is divided into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size, and quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
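The combination rule in step (v) reduces to a one-line per-pixel formula. A minimal sketch with invented probabilities (the function name and values are illustrative only):

```python
# Per-pixel combination from step (v): the integrated probability is the
# maximum of the release probability and the product of the impact and
# zonal release probabilities. All numbers below are invented.

def integrated_probability(p_release, p_impact, p_zonal):
    return max(p_release, p_impact * p_zonal)

# e.g. a pixel unlikely to release itself but exposed to runout from above
p = integrated_probability(p_release=0.05, p_impact=0.6, p_zonal=0.3)
print(p)  # 0.18: dominated by the impact term
```

Taking the maximum rather than a sum keeps the result a valid probability while letting whichever hazard pathway (release in place, or impact from upslope) dominates at each pixel.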
Cannon, S.H.; Gartner, J.E.; Rupert, M.G.; Michael, J.A.; Rea, A.H.; Parrett, C.
2010-01-01
Empirical models to estimate the probability of occurrence and volume of postwildfire debris flows can be quickly implemented in a geographic information system (GIS) to generate debris-flow hazard maps either before or immediately following wildfires. Models that can be used to calculate the probability of debris-flow production from individual drainage basins in response to a given storm were developed using logistic regression analyses of a database from 388 basins located in 15 burned areas located throughout the U.S. Intermountain West. The models describe debris-flow probability as a function of readily obtained measures of areal burned extent, soil properties, basin morphology, and rainfall from short-duration and low-recurrence-interval convective rainstorms. A model for estimating the volume of material that may issue from a basin mouth in response to a given storm was developed using multiple linear regression analysis of a database from 56 basins burned by eight fires. This model describes debris-flow volume as a function of the basin gradient, aerial burned extent, and storm rainfall. Applications of a probability model and the volume model for hazard assessments are illustrated using information from the 2003 Hot Creek fire in central Idaho. The predictive strength of the approach in this setting is evaluated using information on the response of this fire to a localized thunderstorm in August 2003. The mapping approach presented here identifies those basins that are most prone to the largest debris-flow events and thus provides information necessary to prioritize areas for postfire erosion mitigation, warnings, and prefire management efforts throughout the Intermountain West.
Bell Could Become the Copernicus of Probability
Khrennikov, Andrei
2016-07-01
Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare the developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility of applying the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as a concrete example of a non-Kolmogorovian model of probability, similarly to the Lobachevskian model, the first example of a non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation, an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into Euclidean space with embeddings of non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH test.
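The CHSH test mentioned above can be reproduced numerically. For a spin singlet, quantum mechanics predicts the correlation E(a, b) = -cos(a - b) between measurements at analyser angles a and b; with the standard angle choices, the CHSH combination reaches 2√2, exceeding the bound of 2 that any single Kolmogorov probability model must obey:

```python
import math

# CHSH combination for singlet correlations E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# standard angle choices maximising the quantum violation
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, above the classical bound of 2
```

In the "probability model" reading of the paper, the violation shows that the four correlations cannot all arise as marginals of one joint Kolmogorov distribution, just as Lobachevskian facts cannot be embedded unchanged in Euclidean geometry.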
A model to assess dust explosion occurrence probability.
Hassan, Junaid; Khan, Faisal; Amyotte, Paul; Ferdous, Refaul
2014-03-15
Dust handling poses a potential explosion hazard in many industrial facilities. The consequences of a dust explosion are often severe and similar to those of a gas explosion; however, its occurrence is conditional on the presence of five elements: combustible dust, ignition source, oxidant, mixing and confinement. Dust explosion researchers have conducted experiments to study the characteristics of these elements and generate data on explosibility. These experiments are often costly, but the generated data are of significant value in estimating the probability of a dust explosion occurrence. This paper attempts to use existing information (experimental data) to develop a predictive model to assess the probability of a dust explosion occurrence in a given environment. The proposed model considers six key parameters of a dust explosion: dust particle diameter (PD), minimum ignition energy (MIE), minimum explosible concentration (MEC), minimum ignition temperature (MIT), limiting oxygen concentration (LOC) and explosion pressure (Pmax). A conditional probabilistic approach has been developed and embedded in the proposed model to generate a nomograph for assessing dust explosion occurrence. The generated nomograph provides a quick assessment technique to map the occurrence probability of a dust explosion for a given environment defined by the six parameters.
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
Estimating the concordance probability in a survival analysis with a discrete number of risk groups.
Heller, Glenn; Mo, Qianxing
2016-04-01
A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
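A Harrell-style concordance computation shows where discrete risk scores change things: tied risks in a comparable pair are commonly credited 1/2. This is a simplified illustration on toy data; the paper's own estimators additionally use inverse probability censoring weights, which are omitted here:

```python
# Pairwise concordance for survival data with discrete risk scores.
# Tied risk groups in a comparable pair contribute 1/2 (a common
# convention); toy data only, not the paper's estimator.

def c_index(times, events, risks):
    """Among comparable pairs (the shorter observed time is an event),
    count risk orderings that agree with survival order."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # comparable: subject i has an event before the time of j
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0      # higher risk, shorter survival
                elif risks[i] == risks[j]:
                    concordant += 0.5      # tied discrete risk groups
    return concordant / comparable

times = [2, 4, 6, 8]
events = [1, 1, 0, 1]       # 0 = censored
risks = [3, 2, 2, 1]        # discrete risk-group scores
print(c_index(times, events, risks))  # 0.9
```

With only a handful of risk groups, ties dominate the pair counts, which is precisely why the standard continuous-score estimators need modification.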
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Good and solid introduction to probability theory and stochastic processes; logically organized, with the writing presented in a clear manner; comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.
UT Biomedical Informatics Lab (BMIL) probability wheel
Directory of Open Access Journals (Sweden)
Sheng-Cheng Huang
2016-01-01
Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
Total variation denoising of probability measures using iterated function systems with probabilities
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
Bayesian Probabilities and the Histories Algebra
Marlow, Thomas
2006-01-01
We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming and if they were near the high probability hotspot (probability cuing. In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of targets locations and features.
Non-Boolean probabilities and quantum measurement
Energy Technology Data Exchange (ETDEWEB)
Niestegge, Gerd
2001-08-03
A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)
Data analysis recipes: Probability calculus for inference
Hogg, David W
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.
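The product rule, normalization, and marginalization that the text reviews can be illustrated with a minimal grid computation; the coin-toss setup below is our own toy example, not one from the paper.

```python
import math

# Toy grid-based Bayes update (our example, not the paper's): prior times
# likelihood, normalized by the evidence, then checked for normalization.
thetas = [i / 10 for i in range(11)]        # grid over a coin bias theta
prior = [1 / len(thetas)] * len(thetas)     # flat prior over the grid
k, n = 7, 10                                # data: 7 heads in 10 tosses

def binom_lik(theta, k, n):
    """Binomial likelihood p(data | theta)."""
    return math.comb(n, k) * theta**k * (1 - theta)**(n - k)

unnorm = [p * binom_lik(t, k, n) for p, t in zip(prior, thetas)]
evidence = sum(unnorm)                      # p(data): marginalize theta out
posterior = [u / evidence for u in unnorm]

assert abs(sum(posterior) - 1) < 1e-12      # posteriors must normalize
```

Dividing by the evidence is exactly the normalization rule the text emphasizes: the posterior is dimensionless and sums to one over the grid.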
Probabilities are single-case, or nothing
Appleby, D M
2004-01-01
Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
Real analysis and probability solutions to problems
Ash, Robert P
1972-01-01
Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.
Some New Results on Transition Probability
Institute of Scientific and Technical Information of China (English)
Yu Quan XIE
2008-01-01
In this paper, we study the basic properties of the stationary transition probability of Markov processes on a general measurable space (E, ε), such as continuity, maximum probability, zero point, and positive probability set standardization, and obtain a series of important results such as the Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for studying the stationary tri-point transition probability on a general measurable space (E, ε). Our main tools, such as Egoroff's theorem, the Vitali-Hahn-Saks theorem, and the theory of atomic sets and well-posedness of measure, are also of independent interest.
Melucci, Massimo
2012-01-01
Probabilistic models require the notion of an event space for defining a probability measure. An event space has a probability measure that obeys the Kolmogorov axioms. However, the probabilities observed from distinct sources, such as those of relevance of documents, may not admit a single event space, thus causing some issues. In this article, some results are introduced for determining whether the observed probabilities of relevance of documents admit a single event space. Moreover, an alternative framework of probability is introduced, thus challenging the use of classical probability for ranking documents. Some reflections are offered on the convenience of extending classical probabilistic retrieval toward a more general framework that encompasses these issues.
Probability sampling design in ethnobotanical surveys of medicinal plants
Directory of Open Access Journals (Sweden)
Mariano Martinez Espinosa
2012-12-01
Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such designs do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sample error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
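For readers who want to reproduce the reported sample size, a standard finite-population formula (our assumption about the exact formula the authors used) recovers n = 290 from N = 1179, p = 0.5, d = 0.05 and a 95% confidence level:

```python
import math

# Finite-population sample size for estimating a proportion (textbook
# formula; the paper's exact computation is assumed, not quoted).
def sample_size(N, p=0.5, d=0.05, z=1.96):
    n0 = z**2 * p * (1 - p) / d**2              # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # finite-population correction

print(sample_size(1179))  # -> 290, matching the study's "at least 290 families"
```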
Choice probability for apple juice based on novel processing techniques
DEFF Research Database (Denmark)
Olsen, Nina Veflen; Menichelli, E.; Grunert, Klaus G.
2011-01-01
Within the core of academic consumer research, means-end chain (MEC) theory has been almost ignored. One plausible explanation for this lack of interest may be that studies linking MEC data to choice have been few. In this study, we investigate how values and consequences generated from a previous MEC study can be linked to likelihood of choice. Hypotheses about European consumers' likelihood of choice for novel processed juice are stated and tested in a rating-based conjoint study in Norway, Denmark, Hungary and Slovakia. In the study, consumers' probability of choice for high pressure processed (HPP) juice and pulsed electric field (PEF) juice is compared with their probability of choice for pasteurized juice and freshly produced apple juice, and consumer choices are explained by the values and consequences generated from the MEC study. The study supports, at least partly, that means-end chain structures have…
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.H.
1980-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
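The original Fortran routines are not reproduced in the report excerpt, but the flavor of such general-purpose distribution functions can be sketched with textbook formulas (these are standard expressions, not the USGS code):

```python
import math

# Textbook CDF routines in the spirit of the report's general-purpose
# library (normal via the error function, two-parameter Weibull).
def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def weibull_cdf(x, shape, scale):
    return 1 - math.exp(-((x / scale) ** shape)) if x > 0 else 0.0

assert abs(normal_cdf(0) - 0.5) < 1e-12
# At x = scale, the Weibull CDF equals 1 - 1/e for any shape parameter.
assert abs(weibull_cdf(2.0, shape=1.5, scale=2.0) - (1 - math.exp(-1))) < 1e-12
```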
Multivariate saddlepoint approximations in tail probability and conditional inference
Kolassa, John (DOI: 10.3150/09-BEJ237)
2010-01-01
We extend known saddlepoint tail probability approximations to multivariate cases, including multivariate conditional cases. Our approximation applies to both continuous and lattice variables, and requires the existence of a cumulant generating function. The method is applied to some examples, including a real data set from a case-control study of endometrial cancer. The method contains fewer terms and is easier to implement than existing methods, while showing accuracy comparable to those methods.
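The multivariate method itself is beyond a short sketch, but the univariate Lugannani-Rice approximation it generalizes can be demonstrated in a few lines; the Gamma example below is a standard illustration, not the paper's data.

```python
import math

# Lugannani-Rice saddlepoint tail approximation for X ~ Gamma(a, 1),
# where K(t) = -a*log(1-t), so the saddlepoint equation K'(s) = x is
# solvable in closed form. Exact tail available for integer shape a.
def lr_tail_gamma(a, x):
    s = 1 - a / x                          # saddlepoint: a/(1-s) = x
    K = -a * math.log(1 - s)               # cumulant generating function
    Kpp = a / (1 - s) ** 2                 # K''(s)
    w = math.copysign(math.sqrt(2 * (s * x - K)), s)
    u = s * math.sqrt(Kpp)
    phi = math.exp(-w * w / 2) / math.sqrt(2 * math.pi)
    surv_w = 0.5 * math.erfc(w / math.sqrt(2))   # normal survival at w
    return surv_w + phi * (1 / u - 1 / w)

def exact_tail_gamma(a, x):
    # For integer shape a, P(X > x) is a Poisson partial sum.
    return math.exp(-x) * sum(x**k / math.factorial(k) for k in range(a))

approx, exact = lr_tail_gamma(5, 10.0), exact_tail_gamma(5, 10.0)
assert abs(approx - exact) / exact < 0.01   # within 1% in this example
```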
An Improved Model of Attack Probability Prediction System
Institute of Scientific and Technical Information of China (English)
WANG Hui; LIU Shufen; ZHANG Xinjia
2006-01-01
This paper presents a novel probability generation algorithm to predict attacks from an insider who exploits known system vulnerabilities through executing authorized operations. It differs from most intrusion detection systems (IDSs), which are ineffective against threats from authorized insiders. To deter cracker activities, this paper introduces an improved structure of augmented attack tree and the notion of a "minimal attack tree", and proposes a new generation algorithm for minimal attack trees. This provides a quantitative approach to help system administrators make sound decisions.
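The paper's minimal-attack-tree generation algorithm is not reproduced in the abstract, but the quantitative evaluation it supports can be sketched with the usual AND/OR probability propagation over an attack tree (illustrative structure and values only, not the paper's model):

```python
# Attack-tree success probability, assuming independent leaf events:
# an AND node succeeds iff all children do; an OR node succeeds iff
# at least one child does.
def attack_prob(node):
    kind, rest = node[0], node[1:]
    if kind == "leaf":
        return rest[0]
    probs = [attack_prob(c) for c in rest]
    if kind == "and":
        p = 1.0
        for q in probs:
            p *= q
        return p
    fail = 1.0                      # "or" node: complement of all failing
    for q in probs:
        fail *= 1 - q
    return 1 - fail

# Hypothetical insider scenario: valid credentials AND (weak ACL OR
# unpatched service).
tree = ("and", ("leaf", 0.9), ("or", ("leaf", 0.3), ("leaf", 0.5)))
print(attack_prob(tree))  # 0.9 * (1 - 0.7*0.5) = 0.585
```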
Analytical Study of Thermonuclear Reaction Probability Integrals
Chaudhry, M A; Mathai, A M
2000-01-01
An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Recent Developments in Applied Probability and Statistics
Devroye, Luc; Kohler, Michael; Korn, Ralf
2010-01-01
This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Average Transmission Probability of a Random Stack
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
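The contrast driving the paper, averaging the transmission probability itself versus averaging its logarithm, can be seen in a crude Monte Carlo toy in which slab transmissions multiply incoherently (a simplification of the coherent stack treated in the paper):

```python
import math
import random

# Toy model: the transmission of a stack is the product of random
# single-slab transmissions; incoherent multiplication is our
# simplification of the coherent problem in the paper.
random.seed(1)

def stack_T(n_slabs):
    T = 1.0
    for _ in range(n_slabs):
        T *= random.uniform(0.5, 1.0)   # random per-slab transmission
    return T

samples = [stack_T(10) for _ in range(20000)]
avg_T = sum(samples) / len(samples)
avg_lnT = sum(math.log(t) for t in samples) / len(samples)

# Jensen's inequality: <T> strictly exceeds exp(<ln T>), so the two
# averaging conventions give genuinely different answers.
assert avg_T > math.exp(avg_lnT)
```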
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability…
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Quantifying extinction probabilities from sighting records: inference and uncertainties.
Directory of Open Access Journals (Sweden)
Peter Caley
Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species would be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes.
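A stripped-down version of the constant-sighting-rate reasoning the authors start from can be written as a one-line Bayes update; the model below is our own minimal sketch, far simpler than the paper's density-dependent models.

```python
# Minimal sketch (our construction): posterior probability that a
# population is extinct after t years with no sightings, assuming a
# constant per-year detection probability d for an extant population
# and a prior extinction probability pi0.
def p_extinct(t, d, pi0):
    like_extant = (1 - d) ** t    # extant but missed in every year
    like_extinct = 1.0            # an extinct population is never sighted
    num = pi0 * like_extinct
    return num / (num + (1 - pi0) * like_extant)

print(p_extinct(t=5, d=0.4, pi0=0.5))   # rises toward 1 as t grows
```

Each sighting-free year multiplies the odds of extinction by 1/(1-d), which is why the posterior climbs steadily, and why a misjudged d can badly overstate certainty, echoing the paper's caution.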
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2016-01-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is achieved by first checking such structures in covariant quantum mechanics, and then passing to spin foam models via the general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the Hilbert space of the canonical theory and the relevant quantum logical structure. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize property transitions and causality in this categorical context in connection with presheaves on quantaloids and respectively causal categories. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
Laboratory-Tutorial activities for teaching probability
Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math-phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati…
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
DEFF Research Database (Denmark)
le Fevre Jakobsen, Bjarne
The publication contains exercise materials, texts, PowerPoint presentations and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) in the BA and elective programmes in Dansk/Nordisk, 2010-2011.
Exact capture probability analysis of GSC receivers over Rayleigh fading channel
Nam, Sungsik
2010-01-01
For third generation systems and ultrawideband systems, RAKE receivers have been introduced because of their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of the hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance in terms of power loss. Therefore, to provide a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, defined as the ratio of the captured power (essentially the combined paths' power) to the total available power. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
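Although the paper derives the capture probability in closed form via order statistics, the quantity itself is easy to estimate by simulation; the sketch below assumes i.i.d. unit-mean exponential path powers (Rayleigh fading) and is not the paper's analytical method.

```python
import random

# Monte Carlo estimate of the capture probability: the fraction of total
# multipath power collected by combining the Lc strongest of L i.i.d.
# exponentially distributed path powers.
random.seed(7)

def capture(L, Lc, trials=20000):
    acc = 0.0
    for _ in range(trials):
        powers = sorted((random.expovariate(1.0) for _ in range(L)),
                        reverse=True)
        acc += sum(powers[:Lc]) / sum(powers)   # combined / total power
    return acc / trials

c1, c2 = capture(4, 1), capture(4, 2)
assert 0 < c1 < c2 <= 1   # combining more fingers captures more power
```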
Gebhardt, Volker
2011-01-01
We present an algorithm to generate positive braids of a given length as words in Artin generators with a uniform probability. The complexity of this algorithm is polynomial in the number of strands and in the length of the generated braids. As a byproduct, we describe a finite state automaton accepting the language of lexicographically minimal representatives of positive braids that has the minimal possible number of states, and we prove that its number of states is exponential in the number of strands.
Energy Technology Data Exchange (ETDEWEB)
Wolff, Marc
2011-10-14
This work is devoted to the construction of numerical methods that allow the accurate simulation of inertial confinement fusion (ICF) implosion processes by taking self-generated magnetic field terms into account. In the sequel, we first derive a two-temperature resistive magnetohydrodynamics model and describe the considered closure relations. The resulting system of equations is then split in several subsystems according to the nature of the underlying mathematical operator. Adequate numerical methods are then proposed for each of these subsystems. Particular attention is paid to the development of finite volume schemes for the hyperbolic operator which actually is the hydrodynamics or ideal magnetohydrodynamics system depending on whether magnetic fields are considered or not. More precisely, a new class of high-order accurate dimensionally split schemes for structured meshes is proposed using the Lagrange re-map formalism. One of these schemes' most innovative features is that they have been designed in order to take advantage of modern massively parallel computer architectures. This property can for example be illustrated by the dimensionally split approach or the use of artificial viscosity techniques and is practically highlighted by sequential performance and parallel efficiency figures. Hyperbolic schemes are then combined with finite volume methods for dealing with the thermal and resistive conduction operators and taking magnetic field generation into account. In order to study the characteristics and effects of self-generated magnetic field terms, simulation results are finally proposed with the complete two-temperature resistive magnetohydrodynamics model on a test problem that represents the state of an ICF capsule at the beginning of the deceleration phase. (author)
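The dimensionally split approach can be conveyed by a deliberately tiny example: 2D linear advection advanced by alternating 1D upwind sweeps on a periodic grid. This is our own toy, far from the Lagrange-remap, high-order schemes of the thesis.

```python
# Dimensional splitting toy: advance 2D advection by an x-sweep followed
# by a y-sweep, each a 1D first-order upwind update on a periodic grid.
N, c = 32, 0.5                        # grid size and CFL number per sweep

def sweep_x(u):
    return [[u[i][j] - c * (u[i][j] - u[i][j - 1]) for j in range(N)]
            for i in range(N)]

def sweep_y(u):
    return [[u[i][j] - c * (u[i][j] - u[i - 1][j]) for j in range(N)]
            for i in range(N)]

u = [[0.0] * N for _ in range(N)]
u[8][8] = 1.0                         # initial blob of "mass"
total0 = sum(map(sum, u))
for _ in range(10):                   # one split step = x-sweep then y-sweep
    u = sweep_y(sweep_x(u))
total = sum(map(sum, u))
assert abs(total - total0) < 1e-10    # the split upwind scheme conserves mass
```

Each sweep touches only one index direction, which is what makes the approach map so naturally onto parallel hardware: the rows (or columns) are independent work units.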
Survival probability in patients with liver trauma.
Buci, Skender; Kukeli, Agim
2016-08-01
Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
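Reading a survival probability off a fitted logistic model works as below; the coefficients are invented for illustration and are not the estimates from the 173-patient sample.

```python
import math

# Logistic-model survival probability: p = 1 / (1 + exp(-x'beta)).
# All coefficient values here are hypothetical.
def survival_prob(coefs, x):
    # coefs[0] is the intercept; x holds the patient's predictor values
    z = coefs[0] + sum(b * v for b, v in zip(coefs[1:], x))
    return 1 / (1 + math.exp(-z))

# hypothetical: intercept, trauma grade, days hospitalized, surgery (0/1)
coefs = [3.0, -0.8, 0.05, -0.4]
print(survival_prob(coefs, [2.0, 7.0, 1.0]))   # one hypothetical patient
```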
Basic Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems--as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter.
Advanced Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability…
Tomographic probability representation for quantum fermion fields
Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D
2009-01-01
Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h…
Are All Probabilities Fundamentally Quantum Mechanical?
Pradhan, Rajat Kumar
2011-01-01
The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype examples. Absence of multi-order quantum interference effects in multiple-slit experiments and the Experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.
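The double-slit prototype discussed in the paper comes down to adding amplitudes before squaring; a two-amplitude sketch (the relative phase of 1 radian is an arbitrary choice of ours) makes the interference term explicit:

```python
import cmath
import math

# Two-path interference toy: quantum probability from summed amplitudes
# versus the classical sum of path probabilities.
r = 1 / math.sqrt(2)
a1 = cmath.rect(r, 0.0)       # amplitude via slit 1
a2 = cmath.rect(r, 1.0)       # amplitude via slit 2, relative phase 1 rad

p_quantum = abs(a1 + a2) ** 2                 # |a1 + a2|^2 keeps the cross term
p_classical = abs(a1) ** 2 + abs(a2) ** 2     # sums to 1, no interference

interference = p_quantum - p_classical        # 2*r*r*cos(1) = cos(1)
assert abs(interference - math.cos(1.0)) < 1e-9
```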
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
"This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." - CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari…
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....
Institute of Scientific and Technical Information of China (English)
曹兴; 杜文静; 程林
2012-01-01
A numerical simulation of a heat exchanger with continuous helical baffles was carried out using the commercial code ANSYS CFX 12.0. The study focuses on the effects of the helix angle on flow and heat transfer characteristics, and heat exchanger performance is evaluated by the entropy generation number based on the second law of thermodynamics. The results show that both the shell-side heat transfer coefficient and the pressure drop decrease with increasing helix angle at a given mass flow rate, with the latter decreasing more quickly than the former. The tangential velocity distribution on the shell-side cross section is more uniform with continuous helical baffles than with segmental baffles. In the inner region near the central dummy tube, the axial velocity at a given radial position decreases as the helix angle increases, whereas in the outer region near the shell it increases. The larger the helix angle, the more uniformly the heat exchange quantity is distributed among tubes at different radial positions. At equal shell-side mass flow rates, the proportion of total entropy generation contributed by heat transfer increases with the helix angle, while the entropy generation number decreases as the helix angle increases.
Inclusion probability with dropout: an operational formula.
Milot, E; Courteau, J; Crispino, F; Mailly, F
2015-05-01
In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using inclusion probability theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, which slightly modifies the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI versus likelihood ratio approaches in the context of low template amplifications.
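The paper's dropout-aware formula is not reproduced here, but the classic no-dropout building blocks it extends can be sketched as follows (a minimal illustration; the function names are ours, not the authors'):

```python
def locus_pi(visible_allele_freqs):
    """Classic one-locus probability of inclusion (RMNE), without dropout:
    the chance that a random individual's two alleles are both among the
    visible alleles, i.e. (sum of visible allele frequencies) squared."""
    p = sum(visible_allele_freqs)
    return p ** 2

def cumulative_pi(loci):
    """Multi-locus cumulative PI: the product of per-locus PIs,
    assuming independent loci."""
    prod = 1.0
    for freqs in loci:
        prod *= locus_pi(freqs)
    return prod
```

For example, two loci in which the visible alleles each sum to frequency 0.5 give a cumulative PI of 0.25 * 0.25 = 0.0625.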
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. To estimate the probability of building fires, a prediction model based on logistic regression was used. Fire probabilities calculated from the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem solving methods.
Teaching Elementary Probability Through its History.
Kunoff, Sharon; Pines, Sylvia
1986-01-01
Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
Zordok, Wael A.
2014-08-01
The new solid complexes [VO(CIP)2L]SO4·nH2O, where L = aniline (An), dimethylformamide (DMF), pyridine (Py), or triethylamine (Et3N), were prepared and characterized from the reaction of ciprofloxacin (CIP) with VO(SO4)2·2H2O in ethanol. The isolated complexes have been characterized by their melting points, elemental analysis, IR spectroscopy, magnetic properties, conductance measurements, UV-Vis and 1H NMR spectroscopic methods, and thermal analyses. The results supported the formation of the complexes and indicated that ciprofloxacin reacts as a bidentate ligand bound to the vanadium ion through the pyridone oxygen and one carboxylato oxygen. The activation energies, E*; entropies, ΔS*; enthalpies, ΔH*; and Gibbs free energies, ΔG*, of the thermal decomposition reactions have been derived from thermogravimetric (TGA) and differential thermogravimetric (DTG) curves using the Coats-Redfern and Horowitz-Metzger methods. The lowest-energy model structure of each complex has been proposed using density functional theory (DFT) at the B3LYP/CEP-31G level of theory. The ligand and its metal complexes were also evaluated for antibacterial activity against several bacterial species, such as Bacillus subtilis (B. subtilis), Staphylococcus aureus (S. aureus), Neisseria gonorrhoeae (N. gonorrhoeae), Pseudomonas aeruginosa (P. aeruginosa) and Escherichia coli (E. coli).
Evidence for Truncated Exponential Probability Distribution of Earthquake Slip
Thingbaijam, Kiran K. S.
2016-07-13
Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determine the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
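As a hedged illustration of the distribution discussed above (not the paper's fitting procedure), an exponential truncated to [0, s_max] can be sampled by inverse-CDF, using F(x) = (1 - exp(-lam*x)) / (1 - exp(-lam*s_max)); the parameter names here are assumptions:

```python
import math
import random

def sample_trunc_exp(lam, s_max, rng=random):
    """Inverse-CDF draw from an exponential with rate lam truncated
    to [0, s_max]; every sample respects the hard upper bound s_max
    (the analogue of a maximum-possible-slip constraint)."""
    u = rng.random()
    return -math.log(1.0 - u * (1.0 - math.exp(-lam * s_max))) / lam
```

Because the support is cut at s_max, the sample mean sits below the untruncated mean 1/lam.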
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results." Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem
Characteristic Functions over C*-Probability Spaces
Institute of Scientific and Technical Information of China (English)
王勤; 李绍宽
2003-01-01
Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
De Finetti's contribution to probability and statistics
Cifarelli, Donato Michele; Regazzini, Eugenio
1996-01-01
This paper summarizes the scientific activity of de Finetti in probability and statistics. It falls into three sections: Section 1 includes an essential biography of de Finetti and a survey of the basic features of the scientific milieu in which he took the first steps of his scientific career; Section 2 concerns de Finetti's work in probability: (a) foundations, (b) processes with independent increments, (c) sequences of exchangeable random variables, and (d) contributions which fall within ...
Probability, clinical decision making and hypothesis testing
Directory of Open Access Journals (Sweden)
A Banerjee
2009-01-01
Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research, and hypothesis testing.
Imprecise Probability Methods for Weapons UQ
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Ruin Probability in Linear Time Series Model
Institute of Scientific and Technical Information of China (English)
ZHANG Lihong
2005-01-01
This paper analyzes a continuous-time risk model in which the claim process is described by a linear time series model. Time is discretized stochastically at the claim occurrence times, and Doob's stopping time theorem and martingale inequalities are used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds on the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
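The exponential upper bound referred to above is classically the Lundberg inequality, psi(u) <= exp(-R*u), where the adjustment coefficient R solves lam*(M_X(r) - 1) = c*r. The sketch below (not the paper's linear-time-series bound) finds R by bisection; it assumes the net profit condition c > lam*E[X] and that r_hi lies below the claim-size MGF's singularity:

```python
import math

def adjustment_coefficient(lam, c, mgf, r_hi):
    """Solve the Lundberg equation lam*(M_X(r) - 1) = c*r for the
    positive root R by bisection. lam: claim arrival rate, c: premium
    rate, mgf: claim-size moment generating function."""
    f = lambda r: lam * (mgf(r) - 1.0) - c * r  # negative below R, positive above
    lo, hi = 1e-12, r_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def lundberg_bound(u, R):
    """Exponential upper bound on the infinite-horizon ruin probability."""
    return math.exp(-R * u)
```

With exponential claims of mean 1/2 (MGF 2/(2-r)), lam = 1 and c = 1, the known closed form R = beta - lam/c gives R = 1, which the bisection reproduces.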
Survival probability for open spherical billiards
Dettmann, Carl P.; Rahman, Mohammed R.
2014-12-01
We study the survival probability for long times in an open spherical billiard, extending previous work on the circular billiard. We provide details of calculations regarding two billiard configurations, specifically a sphere with a circular hole and a sphere with a square hole. The constant terms of the long-time survival probability expansions have been derived analytically. Terms that vanish in the long time limit are investigated analytically and numerically, leading to connections with the Riemann hypothesis.
Data analysis recipes: Probability calculus for inference
Hogg, David W.
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...
Representing Uncertainty by Probability and Possibility
DEFF Research Database (Denmark)
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters...
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
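The zero-inflated binomial mixture likelihood described above can be sketched for a finite mixing distribution on p (a simplification for illustration; the article also treats continuous mixtures, where the sum becomes an integral):

```python
from math import comb

def site_likelihood(y, J, psi, p_mix):
    """Zero-inflated binomial mixture likelihood for one site.
    y: detections in J visits; psi: occupancy probability;
    p_mix: list of (weight, p) pairs, a finite mixing distribution
    modeling heterogeneous detection probability."""
    # Binomial probability of y detections, averaged over the mixture on p.
    binom = sum(w * comb(J, y) * p**y * (1 - p)**(J - y) for w, p in p_mix)
    # Zero inflation: an unoccupied site (prob 1 - psi) yields y = 0 for sure.
    return psi * binom + (1.0 - psi) * (1.0 if y == 0 else 0.0)
```

For instance, with psi = 0.5, a single visit (J = 1), p = 0.5 and no detection, the likelihood is 0.5 * 0.5 + 0.5 = 0.75.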
Institute of Scientific and Technical Information of China (English)
林剑; 张向前
2013-01-01
The competence of successors is a key factor affecting the development of family businesses. This paper first constructs a competency model for family business successors, the KAP (Knowledge, Ability and Personality) model, through qualitative study, and then analyses data collected through on-site research using SPSS statistical software and the AMOS structural equation modeling tool for verification. The results show that the empirical findings basically agree with the hypothesized theoretical model, and also indicate that the components of competence differ between successors and the first-generation founders of family businesses. Finally, the paper puts forward concrete suggestions for raising the competence of family business successors, covering preparations before succession, trials during succession, and innovations after succession.
The assessment of low probability containment failure modes using dynamic PRA
Brunett, Acacia Joann
a significant threat to containment integrity. Additional scoping studies regarding the effect of recovery actions on in-vessel hydrogen generation show that reflooding a partially degraded core does not significantly affect hydrogen generation in-vessel, and the NUREG-1150 assumption that insufficient hydrogen is generated in-vessel to produce an energetic deflagration is confirmed. The DET analyses performed in this work show that very late power recovery produces the potential for very energetic combustion events, which are capable of failing containment with a non-negligible probability, and that containment cooling systems have a significant impact on core-concrete attack, and therefore on combustible gas generation ex-vessel. Ultimately, the overall risk of combustion-induced containment failure is low, but its conditional likelihood can have a significant effect on accident mitigation strategies. It is also shown in this work that DETs are particularly well suited to examining low probability events because of their ability to rediscretize CDFs and observe solution convergence.
Directory of Open Access Journals (Sweden)
Takeshi Takeda
2016-01-01
Two tests related to a new safety system for a pressurized water reactor were performed with the ROSA/LSTF (rig of safety assessment/large-scale test facility). The tests simulated cold-leg small-break loss-of-coolant accidents with a 2-inch diameter break, using early steam generator (SG) secondary-side depressurization with or without release of nitrogen gas dissolved in the accumulator (ACC) water. The SG depressurization was initiated by fully opening the depressurization valves in both SGs immediately after a safety injection signal. The pressure difference between the primary and SG secondary sides after actuation of the ACC system was larger in the test with the dissolved gas release than in the test without it. No core uncovery or heatup took place, because of the ACC coolant injection and two-phase natural circulation. Long-term core cooling was ensured by actuation of the low-pressure injection system. The RELAP5 code predicted most of the overall trends of the major thermal-hydraulic responses after a break discharge coefficient for two-phase discharge flow was adjusted, under the assumption that all the dissolved gas is released at the vessel upper plenum.
Fibonacci Sequence, Recurrence Relations, Discrete Probability Distributions and Linear Convolution
Rajan, Arulalan; Rao, Ashok; Jamadagni, H S
2012-01-01
The classical Fibonacci sequence is known to exhibit many fascinating properties. In this paper, we explore the Fibonacci sequence and integer sequences generated by second order linear recurrence relations with positive integer coefficients from the point of view of probability distributions that they induce. We obtain the generalizations of some of the known limiting properties of these probability distributions and present certain optimal properties of the classical Fibonacci sequence in this context. In addition, we also look at the self linear convolution of linear recurrence relations with positive integer coefficients. Analysis of self linear convolution is focused towards locating the maximum in the resulting sequence. This analysis also highlights the influence that the largest positive real root of the "characteristic equation" of the linear recurrence relations with positive integer coefficients has on the location of the maximum. In particular, when the largest positive real root is 2, the locatio...
Outage Probability for Multi-Cell Processing under Rayleigh Fading
Garcia, Virgile; Lebedev, Nikolai
2010-01-01
Multi-cell processing, also called Coordinated Multi-Point (CoMP), is a very promising distributed multi-antenna technique that uses neighbouring cells' antennas. It is expected to be part of next-generation cellular network standards such as LTE-A. Small-cell networks in dense urban environments are mainly limited by interference, and CoMP can exploit this fact to improve cell-edge users' throughput. This paper provides an analytical derivation of the capacity outage probability for CoMP experiencing fast Rayleigh fading. Only the average received power (slowly varying fading) has to be known, and perfect Channel State Information (CSI) is not required. An optimisation of the successfully received data rate is then derived with respect to the number of cooperating stations and the outage probability, illustrated by numerical examples.
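For a single Rayleigh-faded link (a much simpler case than the paper's multi-cell derivation, shown only to fix ideas), the received SNR is exponentially distributed, so the capacity outage probability has the well-known closed form P_out = 1 - exp(-(2^R - 1)/mean_snr):

```python
import math

def outage_probability(rate_bps_hz, mean_snr):
    """Single-link outage under Rayleigh fading: the probability that
    the instantaneous capacity log2(1 + SNR) falls below the target
    rate R, with SNR exponential of mean mean_snr (linear scale)."""
    threshold = 2.0 ** rate_bps_hz - 1.0
    return 1.0 - math.exp(-threshold / mean_snr)
```

At R = 1 bit/s/Hz and 0 dB average SNR (mean_snr = 1), this gives 1 - e^(-1), roughly 0.63; outage grows with the target rate, as expected.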
DEFF Research Database (Denmark)
Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove
2007-01-01
The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits...... the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
Analyses on the Actual Power-generating Capability of Solar Panels in Wenzhou
Institute of Scientific and Technical Information of China (English)
华晓玲; 梁步猛; 吴桂初
2014-01-01
Solar panels made of thin film, monocrystalline silicon, and polycrystalline silicon are introduced and compared, and several methods for achieving maximum power point tracking (MPPT) are summarized. Combined with the solar radiation conditions in Wenzhou, a large amount of data was acquired using the perturb-and-observe algorithm and then analyzed to determine the power-generating capability of the three panel types by calculating their specific power. Experimental results indicate that monocrystalline silicon solar panels always perform best regardless of weather conditions. Of the other two types, thin-film solar panels have the advantage over polysilicon panels when light intensity is high, whereas polysilicon panels outperform thin-film panels when light intensity is weak.
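The perturb-and-observe algorithm mentioned above can be sketched as follows; `measure_pv` is a hypothetical helper standing in for the panel's power measurement, and the step size and iteration count are illustrative:

```python
def perturb_and_observe(measure_pv, v0=0.0, dv=0.1, steps=200):
    """Minimal perturb-and-observe MPPT sketch: nudge the operating
    voltage, keep the perturbation direction while power rises, and
    reverse it when power drops, so the voltage hunts around the
    maximum power point."""
    v, step = v0, dv
    p_prev = measure_pv(v)
    for _ in range(steps):
        v += step
        p = measure_pv(v)
        if p < p_prev:
            step = -step  # overshot the maximum: reverse direction
        p_prev = p
    return v
```

On a toy power curve with a single peak, the returned voltage oscillates within one step of the true maximum power point, which is the characteristic steady-state behavior of P&O.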
Laboratory-tutorial activities for teaching probability
Directory of Open Access Journals (Sweden)
Roger E. Feeley
2006-08-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math-phobic, with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulty learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern-match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum's success.
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference, for the following two reasons. First, probability theory is generally more flexible than causal graphs: besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays, including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.
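The Simpson's paradox structure that the BK-Plot visualizes can be seen in a small worked example; the counts below are the classic kidney-stone-style numbers, used here purely as an illustration and not taken from the paper:

```python
# (successes, total) by stratum of a binary confounder (e.g. severity):
# the treatment wins within each stratum yet loses in the pooled data.
strata = {
    "mild":   {"treat": (81, 87),   "control": (234, 270)},
    "severe": {"treat": (192, 263), "control": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Within-stratum comparison: treatment beats control in both strata.
wins_within = all(
    rate(*g["treat"]) > rate(*g["control"]) for g in strata.values()
)

# Pooled comparison: aggregation reverses the conclusion, because the
# confounder is unevenly distributed across the two arms.
pooled_treat = rate(81 + 192, 87 + 263)     # 273/350
pooled_control = rate(234 + 55, 270 + 80)   # 289/350
```

Here the treatment's within-stratum success rates (about 93% and 73%) both exceed the control's (about 87% and 69%), yet the pooled rates reverse (78% versus about 83%).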
Computing Earthquake Probabilities on Global Scales
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
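The count-to-probability conversion described above can be sketched as follows; the Weibull scale tau and shape beta are hypothetical fitted parameters, not values from the article:

```python
import math

def large_event_probability(n_small, tau, beta):
    """Convert the count of small events since the last large event
    into a probability of the next large event via a Weibull law:
    P = 1 - exp(-(n/tau)**beta). P starts at 0 and rises toward 1
    as the small-event count accumulates."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))
```

The probability is monotonically increasing in the count and bounded by 1, which is the qualitative behavior the method relies on.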
Wavelet Analyses and Applications
Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.
2009-01-01
It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
Veldman, M.; Schelvis-Smit, A.A.M.
2005-01-01
On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,
Contesting Citizenship: Comparative Analyses
DEFF Research Database (Denmark)
Siim, Birte; Squires, Judith
2007-01-01
Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple levels of governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...
Hendriks, M.A.; Luyten, J.W.; Scheerens, J.; Sleegers, P.J.C.; Scheerens, J.
2014-01-01
In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used
Reduced reward-related probability learning in schizophrenia patients
Directory of Open Access Journals (Sweden)
Yılmaz A
2012-01-01
Alpaslan Yilmaz,1,2 Fatma Simsek,2 Ali Saffet Gonul2,3; 1Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; 2Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; 3Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale, but was not related to antipsychotic dose. In conclusion, the present study showed that schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation
ESTIMATION OF INTRUSION DETECTION PROBABILITY BY PASSIVE INFRARED DETECTORS
Directory of Open Access Journals (Sweden)
V. V. Volkhonskiy
2015-07-01
Full Text Available Subject of Research. The paper deals with estimation of the probability of intruder detection by a passive infrared detector under different conditions of velocity and direction, for automated analysis of physical protection system effectiveness. Method. Analytic formulas for detection distance distribution laws, obtained by approximation of experimental histograms, are used. Main Results. The applicability of different distribution laws has been studied: the Rayleigh, Gauss, Gamma, Maxwell and Weibull distributions. Based on walk-test results, experimental histograms of the detection distance probability distribution for passive infrared detectors were approximated by these laws. Conformity of the histograms to the mentioned analytical laws was checked with the χ² fitting criterion for different velocities and directions of intruder movement. The mean and variance of the approximating distribution laws were set equal to the corresponding parameters of the experimental histograms. Approximation accuracy for the above-mentioned laws was evaluated at a significance level of 0.05. According to the χ² criterion, the Rayleigh and Gamma laws correspond most closely to the histograms across the different velocities and directions of intruder movement. Dependences of approximation accuracy on the intrusion conditions have been obtained; they can be used to choose an approximation law for given conditions. Practical Relevance. The analytic formulas for detection probability are usable for modeling the intrusion process and for objective effectiveness estimation of physical protection systems by both developers and users.
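The workflow described in the abstract can be sketched in a few lines: fit a candidate distribution law to detection-distance data by maximum likelihood, then check conformity with a Pearson χ² statistic. The sketch below uses synthetic Rayleigh-distributed distances and a hypothetical scale of 12 m; it is an illustration of the procedure, not the paper's data or exact implementation.

```python
import math, random

def rayleigh_cdf(x, sigma):
    return 1.0 - math.exp(-x * x / (2.0 * sigma * sigma))

random.seed(1)
sigma_true = 12.0  # assumed detection-distance scale in metres (illustrative)
# inverse-CDF sampling: X = sigma * sqrt(-2 ln U), U uniform on (0, 1]
data = [sigma_true * math.sqrt(-2.0 * math.log(1.0 - random.random()))
        for _ in range(2000)]

# Maximum-likelihood estimate of the Rayleigh scale: sigma^2 = mean(x^2) / 2
sigma_hat = math.sqrt(sum(x * x for x in data) / (2 * len(data)))

# Pearson chi-squared statistic over k equal-width bins
k, lo, hi, n = 10, 0.0, max(data), len(data)
chi2 = 0.0
for i in range(k):
    a = lo + i * (hi - lo) / k
    b = lo + (i + 1) * (hi - lo) / k
    observed = sum(1 for x in data if a <= x < b or (i == k - 1 and x >= b))
    if i == k - 1:  # fold the upper tail into the last bin
        expected = n * (1.0 - rayleigh_cdf(a, sigma_hat))
    else:
        expected = n * (rayleigh_cdf(b, sigma_hat) - rayleigh_cdf(a, sigma_hat))
    chi2 += (observed - expected) ** 2 / expected

print(round(sigma_hat, 2), round(chi2, 2))
```

For real walk-test histograms one would repeat this for each candidate law (Gauss, Gamma, Maxwell, Weibull) and compare the χ² values against the critical value at the chosen significance level.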
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
A Thermodynamical Approach for Probability Estimation
Isozaki, Takashi
2012-01-01
The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers over-fitting when insufficient data is available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, where energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small size data.
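The unifying idea can be illustrated with a toy version of the framework (this sketch is an assumption, not the paper's exact formulation): treat the negative log-likelihood as energy and minimize the free energy F(p) = NLL(p) − T·H(p) over the probability simplex. At T = 0 this reduces to maximum likelihood; as T grows, the entropy term pulls the estimate toward the uniform (maximum-entropy) distribution, which avoids the zero-probability over-fitting of small samples.

```python
import math

def softmax(theta):
    m = max(theta)
    e = [math.exp(t - m) for t in theta]
    s = sum(e)
    return [x / s for x in e]

def free_energy(theta, counts, T):
    p = softmax(theta)
    nll = -sum(n * math.log(pi) for n, pi in zip(counts, p))  # "energy"
    entropy = -sum(pi * math.log(pi) for pi in p)
    return nll - T * entropy

def estimate(counts, T, steps=4000, lr=0.05, eps=1e-6):
    """Minimize the free energy by gradient descent on softmax parameters."""
    theta = [0.0] * len(counts)
    for _ in range(steps):
        grad = []
        for j in range(len(theta)):  # central finite-difference gradient
            up, dn = theta[:], theta[:]
            up[j] += eps
            dn[j] -= eps
            grad.append((free_energy(up, counts, T)
                         - free_energy(dn, counts, T)) / (2 * eps))
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return softmax(theta)

counts = [5, 1, 0, 0]            # small sample: ML assigns zero mass twice
p_ml = estimate(counts, T=0.0)   # recovers the empirical frequencies
p_warm = estimate(counts, T=2.0) # smoothed toward uniform by the entropy term
print([round(x, 3) for x in p_ml], [round(x, 3) for x in p_warm])
```

The temperature T plays the role of a smoothing knob: unseen categories receive strictly positive probability once T > 0, without any explicit Bayesian prior.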
Probabilities and Signalling in Quantum Field Theory
Dickinson, Robert; Millington, Peter
2016-01-01
We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.
Channel Capacity Estimation using Free Probability Theory
Ryan, Øyvind
2007-01-01
In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...
Probability, Arrow of Time and Decoherence
Bacciagaluppi, G
2007-01-01
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
A basic course in probability theory
Bhattacharya, Rabi
2016-01-01
This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
A Revisit to Probability - Possibility Consistency Principles
Directory of Open Access Journals (Sweden)
Mamoni Dhar
2013-03-01
Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility, established by different authors at different points in time on the basis of some well-known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and suggests a new principle, since none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace the others in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
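One of the abstract's examples, convergence in probability, is easy to illustrate numerically: the weak law of large numbers says that P(|X̄ₙ − μ| > ε) → 0 as n grows. The sketch below (a generic demonstration, not code from the book) estimates this deviation probability by Monte Carlo for Uniform(0, 1) draws.

```python
import random

random.seed(0)
mu, eps, trials = 0.5, 0.05, 2000

def deviation_prob(n):
    """Monte Carlo estimate of P(|mean of n Uniform(0,1) draws - mu| > eps)."""
    bad = 0
    for _ in range(trials):
        m = sum(random.random() for _ in range(n)) / n
        if abs(m - mu) > eps:
            bad += 1
    return bad / trials

# The deviation probability shrinks toward 0 as the sample size grows
probs = [deviation_prob(n) for n in (10, 100, 1000)]
print(probs)
```

The same experiment with a heavier-tailed distribution converges more slowly, which is the kind of contrast such numerical illustrations make vivid.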
7th High Dimensional Probability Meeting
Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan
2016-01-01
This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
Explosion probability of unexploded ordnance: expert beliefs.
MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G
2008-08-01
This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.
Exact probability distribution functions for Parrondo's games
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
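The capital-dependent game studied here is easy to reproduce by direct simulation (a Monte Carlo sketch with the standard textbook parameterization, not the paper's exact Fourier-transform calculation): game A is a slightly losing coin, game B uses a bad coin when the capital is divisible by 3 and a good coin otherwise, and yet randomly alternating A and B produces a winning drift.

```python
import random

EPS = 0.005  # small bias; standard capital-dependent Parrondo parameters

def play(game, capital, rng):
    """One round of game 'A' or the capital-dependent game 'B'; returns +1 or -1."""
    if game == 'A':
        p = 0.5 - EPS
    else:  # game B: bad coin when capital % 3 == 0, good coin otherwise
        p = (0.1 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return 1 if rng.random() < p else -1

def mean_final_capital(strategy, steps=1000, trials=2000, seed=42):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        capital = 0
        for _ in range(steps):
            game = strategy if strategy in ('A', 'B') else rng.choice('AB')
            capital += play(game, capital, rng)
        total += capital
    return total / trials

a = mean_final_capital('A')
b = mean_final_capital('B')
mix = mean_final_capital('random')
print(round(a, 1), round(b, 1), round(mix, 1))
```

A stationary Markov-chain analysis of the capital modulo 3 gives per-step drifts of about −0.010 for A, −0.009 for B, and +0.016 for the random mixture with these parameters, which the simulation reproduces.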
Atomic transition probabilities of Nd I
Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.
2011-12-01
Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
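The conversion step described in the abstract is simple in form: for an upper level with radiative lifetime τ, each emission line's transition probability is its branching fraction divided by τ, so the A-values sum to the total decay rate 1/τ. The numbers below are assumed for illustration, not Nd I data.

```python
# Hypothetical upper level: radiative lifetime and emission branching fractions
tau = 8.0e-9  # lifetime in seconds (assumed value, not from the Nd I data set)
branching = {'line_a': 0.60, 'line_b': 0.30, 'line_c': 0.10}  # must sum to 1

# Transition probability of each branch: A_k = BF_k / tau
A = {line: bf / tau for line, bf in branching.items()}
print({line: f"{a:.2e}" for line, a in A.items()})

# Sanity check: the A-values sum to the total decay rate 1/tau
assert abs(sum(A.values()) - 1.0 / tau) < 1e-3
```

Unobserved infrared branches matter precisely because they inflate the denominator: any branching fraction missed in the measured spectrum makes every derived A-value too large by the same factor.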
Survival probability in diffractive dijet photoproduction
Klasen, M
2009-01-01
We confront the latest H1 and ZEUS data on diffractive dijet photoproduction with next-to-leading order QCD predictions in order to determine whether a rapidity gap survival probability of less than one is supported by the data. We find evidence for this hypothesis when assuming global factorization breaking for both the direct and resolved photon contributions, in which case the survival probability would have to be E_T^jet-dependent, and for the resolved or in addition the related direct initial-state singular contribution only, where it would be independent of E_T^jet.
Conditional Probabilities and Collapse in Quantum Measurements
Laura, Roberto; Vanni, Leonardo
2008-09-01
We show that by including both the system and the apparatus in the quantum description of the measurement process, and by using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.
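For the standard projective case the recipe reduces to the Lüders rule: the state conditioned on result P is ρ' = PρP / Tr(PρP), and the conditional probability of a consecutive result Q is Tr(Qρ'Q). A minimal qubit sketch (generic quantum mechanics, not the paper's system-plus-apparatus construction):

```python
# Qubit in |+> = (|0>+|1>)/sqrt(2), measured in the z basis with result 0,
# then asked for the conditional probability of +x in a second measurement.
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(a):
    return a[0][0] + a[1][1]

rho = [[0.5, 0.5], [0.5, 0.5]]   # rho = |+><+|
P0 = [[1.0, 0.0], [0.0, 0.0]]    # projector onto |0>

num = matmul(matmul(P0, rho), P0)
p0 = trace(num)                                       # P(result 0) = 1/2
rho_post = [[x / p0 for x in row] for row in num]     # state after result 0: |0><0|

# Conditional probability of +x in a consecutive measurement: Tr(P+ rho' P+)
P_plus = [[0.5, 0.5], [0.5, 0.5]]
p_plus_given_0 = trace(matmul(matmul(P_plus, rho_post), P_plus))
print(p0, p_plus_given_0)  # 0.5 0.5
```

The point of the paper is that when the apparatus is modeled explicitly, the conditioned operator need not coincide with this projected state.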
Harmonic analysis and the theory of probability
Bochner, Salomon
2005-01-01
Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of probability theory stimulated further research into harmonic analysis. Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro
Duelling idiots and other probability puzzlers
Nahin, Paul J
2002-01-01
What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki
Lady luck the theory of probability
Weaver, Warren
1982-01-01
"Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day ― in business, in love affairs, in forecasting the weather or the stock market ― questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa
Probabilities for separating sets of order statistics.
Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E
2010-04-01
Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that a certain number of the smallest statistics come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
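In the special case where both populations share the same continuous distribution, the count from population 1 among the m smallest order statistics is exactly hypergeometric, which gives a convenient sanity check on the general formula. The sketch below (an illustrative check, not the paper's general result) verifies this by Monte Carlo.

```python
import math, random

random.seed(3)
n1, n2, m, j, trials = 6, 8, 5, 2, 200000

# Monte Carlo: draw both samples, sort jointly, count population-1 members
# among the m smallest order statistics
hits = 0
for _ in range(trials):
    pop1 = [(random.random(), 1) for _ in range(n1)]
    pop2 = [(random.random(), 2) for _ in range(n2)]
    smallest = sorted(pop1 + pop2)[:m]
    if sum(1 for _, tag in smallest if tag == 1) == j:
        hits += 1
mc = hits / trials

# Exact hypergeometric probability when the two distributions are identical
exact = math.comb(n1, j) * math.comb(n2, m - j) / math.comb(n1 + n2, m)
print(round(mc, 4), round(exact, 4))
```

With different distribution functions the exchangeability breaks down, and one needs the paper's expression involving both distribution functions instead of the hypergeometric count.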
Concepts of probability in radiocarbon analysis
Directory of Open Access Journals (Sweden)
Bernhard Weninger
2011-12-01
Full Text Available In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.
Fifty challenging problems in probability with solutions
Mosteller, Frederick
1987-01-01
Can you solve the problem of ""The Unfair Subway""? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
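The classic resolution of the subway puzzle is a schedule offset: both lines run at equal headways, but the uptown train arrives just after the downtown one, so only a narrow arrival window catches it. A quick simulation (assumed 10-minute headway and 1-minute offset, matching Marvin's 2-in-20 record) makes the point:

```python
import random

random.seed(11)
headway, offset, trials = 10.0, 1.0, 100000
# downtown trains at minutes 0, 10, 20, ...; uptown trains 1 minute later

uptown = 0
for _ in range(trials):
    t = random.uniform(0.0, headway)  # Marvin's arrival within one cycle
    if t < offset:                    # only a 1-minute window catches uptown
        uptown += 1

rate = uptown / trials
print(rate)  # close to 0.1: he visits his mother about 10% of the time
```

Both trains are equally frequent, yet the probability is far from 50-50, which is exactly the trap the puzzle sets.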
Probability, statistics, and decision for civil engineers
Benjamin, Jack R
2014-01-01
Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and
Probability densities and Lévy densities
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...
Probability in biology: overview of a comprehensive theory of probability in living systems.
Nakajima, Toshiyuki
2013-09-01
Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
Effect of Liquid Penetrant Sensitivity on Probability of Detection
Parker, Bradford H.
2008-01-01
The objective of the task is to investigate the effect of liquid penetrant sensitivity level on probability of crack detection (POD). NASA-STD-5009 currently requires the use of only sensitivity level 4 liquid penetrants. This requirement is based on the fact that the data generated in the NTIAC Nondestructive Evaluation (NDE) Capabilities Data Book was produced using only sensitivity level 4 penetrants. Many NDE contractors supporting NASA Centers routinely use sensitivity level 3 penetrants. Because of the new NASA-STD-5009 requirement, these contractors will have to either shift to sensitivity level 4 penetrants or perform formal POD demonstration tests to qualify their existing process.
A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures
Institute of Scientific and Technical Information of China (English)
李典庆; 张圣坤; 唐文勇
2003-01-01
There exists model uncertainty in the probability of detection when inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new probability of detection model is proposed for the updating of crack size distribution. Furthermore, the theoretical derivation shows that most existing probability of detection models are special cases of the new model. The least squares method is adopted for determining the values of parameters in the new POD model. This new model is also compared with other existing probability of detection models; the results indicate that the new model fits the inspection data better. The new probability of detection model is then applied to the problem of crack size updating for offshore structures. The Bayesian updating method is used to analyze the effect of probability of detection models on the posterior distribution of a crack size. The results show that different probability of detection models generate different posterior distributions of a crack size for offshore structures.
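A common functional form in this setting is the log-odds POD curve, POD(a) = 1 / (1 + exp(−(α + β ln a))), whose parameters can be fitted by least squares after a logit transform of the observed detection frequencies, reducing the fit to simple linear regression. The sketch below uses illustrative numbers and this generic model, not the paper's specific POD model or data.

```python
import math

# Crack depth (mm) vs observed detection frequency from repeated inspections
# (illustrative numbers, not from the paper)
depth = [0.5, 1.0, 1.5, 2.0, 3.0, 4.0]
freq  = [0.08, 0.25, 0.50, 0.70, 0.90, 0.96]

# Log-odds model: logit(POD(a)) = alpha + beta * ln a, fitted by least squares
x = [math.log(a) for a in depth]
y = [math.log(f / (1.0 - f)) for f in freq]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
beta = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        / sum((xi - xbar) ** 2 for xi in x))
alpha = ybar - beta * xbar

def pod(a):
    """Fitted probability of detecting a crack of depth a (mm)."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * math.log(a))))

print(round(alpha, 2), round(beta, 2), round(pod(2.5), 2))
```

A fitted curve of this kind is then the ingredient that enters Bayesian updating: the posterior crack-size distribution after an inspection with no detection is proportional to the prior times 1 − POD(a), so different POD models directly change the posterior.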
Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package
Energy Technology Data Exchange (ETDEWEB)
S.F.A. Deng; M. Saglam; L.J. Gratton
2001-05-23
In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) for qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening; the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.
Analysing Access Control Specifications
DEFF Research Database (Denmark)
Probst, Christian W.; Hansen, René Rydhof
2009-01-01
When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set of credentials needed to reach a certain location in a system. This knowledge allows identification of a set of (inside) actors who have the possibility to commit an insider attack at that location. This has immediate applications in analysing log files, but also nontechnical applications such as identifying possible...
Geiser, Achim
2015-01-01
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...
Comparing coefficients of nested nonlinear probability models
DEFF Research Database (Denmark)
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposi...
Probability of boundary conditions in quantum cosmology
Suenobu, Hiroshi; Nambu, Yasusada
2017-02-01
One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.
Phonotactic Probability Effects in Children Who Stutter
Anderson, Julie D.; Byrd, Courtney T.
2008-01-01
Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…
Comonotonic Book-Making with Nonadditive Probabilities
Diecidue, E.; Wakker, P.P.
2000-01-01
This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the
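The rank-dependent family the abstract refers to can be illustrated with a small sketch. This is not the paper's book-making construction; it only shows the standard rank-dependent (Choquet-style) evaluation, where additive probabilities are replaced by decision weights obtained from a weighting function applied to cumulative probabilities of rank-ordered outcomes. All names and numbers below are illustrative.

```python
# Illustrative sketch (not the paper's construction): rank-dependent models
# replace additive probabilities with decision weights obtained by applying
# a weighting function w to cumulative probabilities of ranked outcomes.

def rank_dependent_value(lottery, u, w):
    """lottery: list of (outcome, probability); u: utility function;
    w: weighting function on [0, 1] with w(0) = 0 and w(1) = 1."""
    ranked = sorted(lottery, key=lambda xp: -xp[0])  # best outcome first
    value, cum_prev = 0.0, 0.0
    for x, p in ranked:
        cum = cum_prev + p
        value += (w(cum) - w(cum_prev)) * u(x)       # comonotonic weight
        cum_prev = cum
    return value

lottery = [(10.0, 0.5), (0.0, 0.5)]
identity = lambda t: t
# With w = identity the model reduces to ordinary expected utility:
print(rank_dependent_value(lottery, identity, identity))        # -> 5.0
# A convex w underweights good ranks (pessimism), lowering the value:
print(rank_dependent_value(lottery, identity, lambda t: t**2))  # -> 2.5
```

With the identity weighting function the evaluation is additive, which is exactly the case de Finetti's original argument covers; nonlinear w is where the comonotonic extension is needed.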
Probability & Statistics: Modular Learning Exercises. Teacher Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Probability & Statistics: Modular Learning Exercises. Student Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Interstitial lung disease probably caused by imipramine.
Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H
2014-01-01
Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects related to ILD and drugs for better management of ILD.
PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY
Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci
2011-01-01
The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.
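The sample-size determination mentioned above can be made concrete with the standard formula for estimating a prevalence with given absolute precision, n = z²·p·(1 − p)/d². This is a common textbook formula, not taken verbatim from the paper; the numbers below are illustrative.

```python
import math

# Standard sample-size formula for estimating a prevalence p with absolute
# precision d at the confidence level given by the normal quantile z
# (a common textbook formula; the inputs below are illustrative).

def sample_size(p_expected, d, z=1.96):
    """Simple random sample size for a prevalence estimate."""
    n = z**2 * p_expected * (1 - p_expected) / d**2
    return math.ceil(n)

# Expected prevalence 20%, desired precision +/-5%, 95% confidence:
print(sample_size(0.20, 0.05))  # -> 246
```

For stratified or cluster designs this n is typically inflated by a design effect, since clustered observations carry less independent information than a simple random sample.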
STRIP: stream learning of influence probabilities
DEFF Research Database (Denmark)
Kutzkov, Konstantin
2013-01-01
cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning...
Rethinking the learning of belief network probabilities
Energy Technology Data Exchange (ETDEWEB)
Musick, R.
1996-03-01
Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
Entanglement Mapping VS. Quantum Conditional Probability Operator
Chruściński, Dariusz; Kossakowski, Andrzej; Matsuoka, Takashi; Ohya, Masanori
2011-01-01
The relation between two methods which construct the density operator on composite system is shown. One of them is called an entanglement mapping and another one is called a quantum conditional probability operator. On the base of this relation we discuss the quantum correlation by means of some types of quantum entropy.
Probable Bright Supernova discovered by PSST
Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.
2016-09-01
A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J.
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
Updating piping probabilities with survived historical loads
Schweckendiek, T.; Kanning, W.
2009-01-01
Piping, also called under-seepage, is an internal erosion mechanism, which can cause the failure of dikes or other flood defence structures. The uncertainty in the resistance of a flood defence against piping is usually large, causing high probabilities of failure for this mechanism. A considerable
Assessing Schematic Knowledge of Introductory Probability Theory
Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley
2005-01-01
The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…
Independent Events in Elementary Probability Theory
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…
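The tacitly assumed statement the abstract alludes to is subtle precisely because pairwise independence does not imply joint independence. The classic two-coin counterexample can be checked exhaustively (this is a standard textbook example, not taken from the article):

```python
from fractions import Fraction
from itertools import product

# Classic counterexample: pairwise independence does not imply joint
# independence. Two fair coins; A = first is heads, B = second is heads,
# C = the two outcomes differ.

omega = list(product([0, 1], repeat=2))  # 4 equally likely outcomes

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == 1
B = lambda w: w[1] == 1
C = lambda w: w[0] != w[1]

both = lambda e, f: (lambda w: e(w) and f(w))
# Each pair satisfies the multiplication rule...
print(P(both(A, B)) == P(A) * P(B))  # True
print(P(both(A, C)) == P(A) * P(C))  # True
print(P(both(B, C)) == P(B) * P(C))  # True
# ...but the triple does not: P(A and B and C) = 0, while P(A)P(B)P(C) = 1/8
print(P(lambda w: A(w) and B(w) and C(w)) == P(A) * P(B) * P(C))  # False
```

This is why joint independence must be defined by requiring the multiplication rule for every subset of the events, not just for pairs.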
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
Statistical physics of pairwise probability models
DEFF Research Database (Denmark)
Roudi, Yasser; Aurell, Erik; Hertz, John
2009-01-01
(No Danish abstract available.) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...
Probability from a Socio-Cultural Perspective
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Applied probability models with optimization applications
Ross, Sheldon M
1992-01-01
Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.
Energy Technology Data Exchange (ETDEWEB)
Geiser, Achim
2015-12-15
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.
Energy Technology Data Exchange (ETDEWEB)
Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies
1996-12-31
The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. Any direct relation between the ash fusion behavior (determined according to the standard method) and, for instance, the alkali metal content was not found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report
Energy Technology Data Exchange (ETDEWEB)
Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang
2009-09-18
Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.
Establishment probability in newly founded populations
Directory of Open Access Journals (Sweden)
Gusset Markus
2012-06-01
Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the "Wissel plot" with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
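The Wissel-plot reading described above can be sketched numerically: given extinction probabilities P0(t) satisfying P0(t) = 1 − c1·e^(−ω1·t), plotting −ln(1 − P0(t)) against t gives a line with slope ω1 and y-intercept −ln(c1). The parameter values below are illustrative, not the wild-dog model's.

```python
import numpy as np

# Sketch of the "Wissel plot" fit (illustrative parameters, not the paper's
# wild-dog model): on the late, linear part of the plot, the slope recovers
# omega1 and the y-intercept recovers -ln(c1), the establishment criterion.

c1_true, omega1_true = 0.7, 0.05
t = np.arange(10, 60)                       # late, linear part of the plot
p0 = 1.0 - c1_true * np.exp(-omega1_true * t)

y = -np.log(1.0 - p0)                       # the Wissel-plot ordinate
slope, intercept = np.polyfit(t, y, 1)      # linear least-squares fit

c1_est = np.exp(-intercept)
print(abs(slope - omega1_true) < 1e-9, abs(c1_est - c1_true) < 1e-9)
```

In practice P0(t) would come from repeated simulation of the stochastic population model, and only the asymptotically linear portion of the plot should enter the fit.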
Digital bowel cleansing for virtual colonoscopy with probability map
Hong, Wei; Qiu, Feng
2010-03-01
Virtual colonoscopy (VC) is a noninvasive method for colonic polyp screening, by reconstructing three-dimensional models of the colon using computerized tomography (CT). Identifying the residual fluid retained inside the colon is a major challenge for 3D virtual colonoscopy using fecal tagging CT data. Digital bowel cleansing aims to segment the colon lumen from a patient abdominal image acquired using an oral contrast agent for colonic material tagging. After removing the segmented residual fluid, the clean virtual colon model can be constructed and visualized for screening. We present a novel automatic method for digital cleansing using a probability map. The random walker algorithm is used to generate the probability map for air (inside the colon), soft tissue, and residual fluid instead of segmenting the colon lumen directly. The probability map is then used to remove residual fluid from the original CT data. The proposed method was tested using VC study data at the National Cancer Institute at NIH. The performance of our VC system for polyp detection has been improved by providing radiologists more detailed information of the colon wall.
Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs
Energy Technology Data Exchange (ETDEWEB)
Khaleel, Mohammad A.; Simonen, Fredric A.
2009-05-01
The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.
Recursive recovery of Markov transition probabilities from boundary value data
Energy Technology Data Exchange (ETDEWEB)
Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)
1994-04-01
In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.
Determination of bounds on failure probability in the presence of hybrid uncertainties
Indian Academy of Sciences (India)
M B Anoop; K Balaji Rao
2008-12-01
A fundamental component of safety assessment is the appropriate representation and incorporation of uncertainty. A procedure for handling hybrid uncertainties in stochastic mechanics problems is presented. The procedure can be used for determining the bounds on failure probability for cases where failure probability is a monotonic function of the fuzzy variables. The procedure is illustrated through an example problem of safety assessment of a nuclear power plant piping component against stress corrosion cracking, considering the stochastic evolution of stress corrosion cracks with time. It is found that the bounds obtained enclose the values of failure probability obtained from probabilistic analyses.
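The key structural property the procedure exploits is monotonicity: when the failure probability is monotonic in each fuzzy variable, its bounds over an alpha-cut box are attained at the box corners determined by the monotonicity directions. The sketch below illustrates this step only; the failure-probability function and interval values are stand-ins, not the paper's piping model.

```python
# Sketch of the interval step for hybrid uncertainty: for a failure
# probability pf that is monotonic in each variable, the bounds over the
# alpha-cut box [lo, hi] lie at corners picked by monotonicity direction.
# The toy pf and numbers below are illustrative assumptions.

def bounds_monotone(pf, lo, hi, increasing):
    """pf: function of a list of variables; increasing[i] is True if pf is
    nondecreasing in variable i over [lo[i], hi[i]]."""
    x_max = [h if inc else l for l, h, inc in zip(lo, hi, increasing)]
    x_min = [l if inc else h for l, h, inc in zip(lo, hi, increasing)]
    return pf(x_min), pf(x_max)

# Toy pf: increasing in stress s = x[0], decreasing in resistance r = x[1]
pf = lambda x: min(1.0, max(0.0, 0.1 * x[0] / x[1]))
lo, hi = [1.0, 2.0], [2.0, 4.0]
lower, upper = bounds_monotone(pf, lo, hi, increasing=[True, False])
print(lower, upper)  # -> 0.025 0.1
```

Repeating this over a family of alpha-cuts yields the membership function of the failure probability, which is why the monotonicity assumption keeps the procedure tractable (two evaluations per cut instead of a global optimization).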
Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods
Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.
2012-01-01
Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…
Reliability Analyses of Groundwater Pollutant Transport
Energy Technology Data Exchange (ETDEWEB)
Dimakis, Panagiotis
1997-12-31
This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed, (1) one using finite element technique to solve the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can do a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP could be used as a tool for assessment purposes and risk analyses, for instance the assessment of the efficiency of a proposed remediation technique or to study the effects of parameter distribution for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in North Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
Energy Technology Data Exchange (ETDEWEB)
Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division
1998-03-01
The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with {sup 14}C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for {sup 14}C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)
Employment and Wage Assimilation of Male First Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael
2000-01-01
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects for unobserved cohort and individual effects and panel selectivity due to missing wage information. The results show that immigrants assimilate partially to Danes, but the assimilation process differs between refugees and non-refugees.
Probability and statistics for particle physics
Mana, Carlos
2017-01-01
This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems to be handled that are difficult to tackle by other procedures. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
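The "basic algorithm" for simulating samples from a simple distribution is commonly inverse-transform sampling: if U ~ Uniform(0,1) and F is a CDF, then F⁻¹(U) has distribution F. The abstract does not specify which algorithm the book uses, so this sketch assumes the inverse-transform method, shown for the Exponential(λ) distribution whose inverse CDF is −ln(1 − u)/λ.

```python
import math
import random

# Inverse-transform sampling sketch (an assumed, standard method): for
# Exponential(lam), the inverse CDF is -ln(1 - u) / lam, so pushing uniform
# draws through it yields exponential samples.

def sample_exponential(lam, n, seed=0):
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

xs = sample_exponential(lam=2.0, n=100_000)
mean = sum(xs) / len(xs)
print(abs(mean - 0.5) < 0.02)  # sample mean close to 1/lam = 0.5 -> True
```

The same recipe works for any distribution with a tractable inverse CDF; when F⁻¹ has no closed form, rejection or Markov chain Monte Carlo methods take over.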
Probability of Boundary Conditions in Quantum Cosmology
Suenobu, Hiroshi
2016-01-01
One of the main interests in quantum cosmology is to determine which type of boundary conditions for the wave function of the universe can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation numerically and evaluate probabilities for an observable representing evolution of the classical universe, especially the number of e-foldings of the inflation. To express boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify them by introducing two real parameters which discriminate boundary conditions and estimate values of these parameters resulting in observationally preferable predictions. We obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation.
Volcano shapes, entropies, and eruption probabilities
Gudmundsson, Agust; Mohajeri, Nahid
2014-05-01
We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
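The two entropy notions the abstract contrasts can be computed directly: the Gibbs-Shannon entropy S = −Σ pᵢ ln pᵢ for a binned (discrete) profile, and the differential entropy of a normal profile, ½·ln(2πe·σ²), which grows with the spread σ. The distributions below are illustrative, not fitted to any volcano.

```python
import math

# Numerical companion to the entropy discussion (illustrative numbers):
# a uniform discrete distribution (flat volcanic field) maximizes the
# Gibbs-Shannon entropy, and a broader normal profile (larger sigma) has
# larger differential entropy.

def shannon_entropy(p):
    """Gibbs-Shannon entropy of a discrete distribution, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def normal_differential_entropy(sigma):
    """Differential entropy of a normal density with std. dev. sigma."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

uniform = [0.25] * 4                  # flat field: maximal uncertainty
peaked = [0.7, 0.1, 0.1, 0.1]         # sharply peaked edifice
print(shannon_entropy(uniform) > shannon_entropy(peaked))  # -> True
# A broader edifice implies larger (relative) uncertainty:
print(normal_differential_entropy(2.0) > normal_differential_entropy(1.0))  # -> True
```

This mirrors the abstract's reading of volcano shape: a flat field corresponds to maximal uncertainty about where the next eruption occurs, while a sharply peaked edifice concentrates that probability.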
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain...
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.
Quantum probabilities: an information-theoretic interpretation
Bub, Jeffrey
2010-01-01
This Chapter develops a realist information-theoretic interpretation of the nonclassical features of quantum probabilities. On this view, what is fundamental in the transition from classical to quantum physics is the recognition that information in the physical sense has new structural features, just as the transition from classical to relativistic physics rests on the recognition that space-time is structurally different than we thought. Hilbert space, the event space of quantum systems, is interpreted as a kinematic (i.e., pre-dynamic) framework for an indeterministic physics, in the sense that the geometric structure of Hilbert space imposes objective probabilistic or information-theoretic constraints on correlations between events, just as the geometric structure of Minkowski space in special relativity imposes spatio-temporal kinematic constraints on events. The interpretation of quantum probabilities is more subjectivist in spirit than other discussions in this book (e.g., the chapter by Timpson)...
Estimation of transition probabilities of credit ratings
Peng, Gan Chew; Hin, Pooi Ah
2015-12-01
The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector mi = (mi1, mi2, ..., mi10) denote the credit ratings of the ten companies in the i-th quarter. The vector mi+1 in the next quarter is modelled as dependent on the vector mi via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability Pkl(i, j) of getting mi+1,j = l given that mi,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability Pkl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
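The count-based analogue of such a transition-probability estimate can be sketched as follows. This is a simple empirical estimator, not the paper's 20-dimensional power-normal mixture model, and the rating sequence is made up:

```python
import numpy as np

def empirical_transition_matrix(ratings, n_states):
    """Estimate P[k][l] = Pr(next rating = l | current rating = k)
    by counting one-step transitions in an observed sequence."""
    counts = np.zeros((n_states, n_states))
    for cur, nxt in zip(ratings[:-1], ratings[1:]):
        counts[cur, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Leave rows of unvisited states as zeros rather than dividing by zero
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)

seq = [0, 0, 1, 1, 1, 2, 1, 0, 1, 2]   # hypothetical quarterly ratings, one company
P = empirical_transition_matrix(seq, 3)
```

Each row of `P` is a conditional distribution over next-quarter ratings; the paper's contribution is to make these probabilities depend smoothly on the joint state of all ten companies rather than on counts alone.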
Earthquake probabilities: theoretical assessments and reality
Kossobokov, V. G.
2013-12-01
It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems for contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated at about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance
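As a concrete illustration of how a choice of model fixes the number in the closing remark, a homogeneous Poisson assumption (one of the "trivial" models the abstract criticizes) gives a definite daily probability for a once-per-century event:

```python
import math

# Under a homogeneous Poisson model, an event expected once per hundred
# years has a constant daily rate, and the probability of occurrence on
# any given day is
rate_per_day = 1.0 / (100 * 365.25)
p_daily = 1.0 - math.exp(-rate_per_day)
```

A different choice of probability space (periodic recurrence, clustering models such as ETAS, etc.) yields a very different number for the same date, which is exactly the arbitrariness the abstract objects to.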
A Bibliography of Generative-Based Grammatical Analyses of Spanish.
Nuessel, Frank H.
One hundred sixty-eight books, articles, and dissertations written between 1960 and 1973 are listed in this bibliography of linguistic studies of the Spanish language within the grammatical theory originated by Noam Chomsky in his "Syntactic Structures" (1957). The present work is divided into two general categories: (1) phonology and (2) syntax…
Improving customer generation by analysing website visitor behaviour
Ramlall, Shalini
2011-01-01
This dissertation describes the creation of a new integrated Information Technology (IT) system that assisted in the collection of data about the behaviour of website visitors as well as sales and marketing data for those visitors who turned into customers. A key contribution to knowledge was the creation of a method to predict the outcome of visits to a website from visitors’ browsing behaviour. A new Online Tracking Module (OTM) was created that monitored visitors’ behaviour while they brow...
Nuclear data uncertainties: I, Basic concepts of probability
Energy Technology Data Exchange (ETDEWEB)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
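The Bayes' theorem entry in the topic list can be illustrated with a small numerical example; the detection rates and base rate below are made up for illustration, not taken from the report:

```python
from fractions import Fraction

# Illustrative numbers: a detector flags a contaminated sample with
# probability 0.95, false-flags a clean one with probability 0.10,
# and 2% of samples are contaminated. Bayes' theorem gives the posterior
# probability that a flagged sample is actually contaminated:
#   P(C|F) = P(F|C) P(C) / [P(F|C) P(C) + P(F|~C) P(~C)]
p_c = Fraction(2, 100)
p_f_given_c = Fraction(95, 100)
p_f_given_not_c = Fraction(10, 100)

posterior = (p_f_given_c * p_c) / (
    p_f_given_c * p_c + p_f_given_not_c * (1 - p_c))
# posterior = 19/117, about 0.16: at this base rate most flags are false alarms
```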
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
Interpreting Prediction Market Prices as Probabilities
Wolfers, Justin; Zitzewitz, Eric
2006-01-01
While most empirical analysis of prediction markets treats prices of binary options as predictions of the probability of future events, Manski (2004) has recently argued that there is little existing theory supporting this practice. We provide relevant analytic foundations, describing sufficient conditions under which prediction market prices correspond with mean beliefs. Beyond these specific sufficient conditions, we show that for a broad class of models prediction market prices are usuall...
Probable Unusual Transmission of Zika Virus
Centers for Disease Control (CDC) Podcasts
2011-05-23
This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event. Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID). Date Released: 5/25/2011.
The Origin of Probability and Entropy
Knuth, Kevin H.
2008-11-01
Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.
Non-signalling Theories and Generalized Probability
Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek
2016-09-01
We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, the approach allows straightforward generalization to more complicated systems.
Probable Cause: A Decision Making Framework.
1984-08-01
draw upon several approaches to the study of causality; specifically, work in attribution theory (Heider, 1958; Kelley, 1973), methodology (Cook...psychology (Michotte, 1946; Piaget, 1974). From our perspective, much of the difficulty in assessing causality is due to the fact that judgments of...that violate probability and statistical theory. We therefore consider such cases because they highlight the various characteristics of each system
Calculating Cumulative Binomial-Distribution Probabilities
Scheuer, Ernest M.; Bowerman, Paul N.
1989-01-01
Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
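The core calculation CUMBIN performs can be sketched directly from the definition. CUMBIN itself is written in C and presumably uses more careful numerics for extreme inputs; this is just the defining sum and its k-out-of-n reliability application:

```python
from math import comb

def cumulative_binomial(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation of the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n_reliability(k, n, p_component):
    """A k-out-of-n system works if at least k of its n components work:
    R = P(X >= k) = 1 - P(X <= k - 1)."""
    return 1.0 - cumulative_binomial(k - 1, n, p_component)

# A 2-out-of-3 system with 0.9-reliable components:
r = k_out_of_n_reliability(2, 3, 0.9)   # 3*0.81*0.1 + 0.729 = 0.972
```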
SureTrak Probability of Impact Display
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
A quantum probability model of causal reasoning.
Trueblood, Jennifer S; Busemeyer, Jerome R
2012-01-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.
Probability of metastable states in Yukawa clusters
Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael
2008-11-01
Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground- and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3]. [1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008); [2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005); [3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008).
A Quantum Probability Model of Causal Reasoning
Trueblood, Jennifer S.; Busemeyer, Jerome R.
2012-01-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747
Bacteria survival probability in bactericidal filter paper.
Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M
2014-05-01
Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action, which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell, and cells that do not undergo the threshold number of collisions are expected to survive.
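The collision-threshold hypothesis can be sketched as a Poisson model: if collisions with biocide-loaded micelles are Poisson-distributed with mean proportional to filter thickness, the survival probability is the probability of staying below the lethal collision count. The mean and threshold values below are illustrative, not fitted values from the paper:

```python
import math

def survival_probability(mean_collisions, lethal_threshold):
    """P(survive) = P(N < k) for N ~ Poisson(lam): the bacterium survives
    if it undergoes fewer than the lethal number of collisions."""
    lam = mean_collisions
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(lethal_threshold))

# Doubling the thickness doubles the expected collisions and sharply
# reduces survival, consistent with the behaviour reported for the
# biocidal paper:
p_one_layer = survival_probability(5.0, 3)
p_two_layers = survival_probability(10.0, 3)
```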
Probability of Default and Default Correlations
Directory of Open Access Journals (Sweden)
Weiping Li
2016-07-01
Full Text Available We consider a system where the asset values of firms are correlated with the default thresholds. We first evaluate the probability of default of a single firm under the correlated-assets assumption. This extends Merton's probability of default of a single firm under the independent-asset-values assumption. At any time, the distance-to-default for a single firm is derived in the system, and this distance-to-default should provide a different measure for credit rating that takes the correlated asset values into consideration. Then we derive a closed formula for the joint default probability and a general closed formula for the default correlation via the correlated multivariate process of the first-passage-time default correlation model. Our structural model encodes the sensitivities of default correlations with respect to the underlying correlation among firms' asset values. Based on our results, we propose a credit risk management approach that differs from the commonly used risk measurement methods by taking default correlations into consideration.
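The single-firm Merton baseline that the paper extends can be sketched as follows. This is the independent-assets case only; the paper's contribution (correlated thresholds, first-passage joint defaults) is not reproduced here, and the inputs are illustrative:

```python
import math
from statistics import NormalDist

def merton_pd(asset_value, debt, mu, sigma, horizon):
    """Merton-style single-firm default probability: default occurs if the
    (lognormal) asset value falls below the debt threshold at the horizon.
    Returns (probability of default, distance-to-default)."""
    dd = (math.log(asset_value / debt)
          + (mu - 0.5 * sigma**2) * horizon) / (sigma * math.sqrt(horizon))
    return NormalDist().cdf(-dd), dd

pd, dd = merton_pd(asset_value=120.0, debt=100.0,
                   mu=0.05, sigma=0.25, horizon=1.0)
```

The distance-to-default `dd` is the number of standard deviations the asset value sits above the default point; in the paper's correlated system this quantity is modified by the co-movement of other firms' assets.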
A quantum probability model of causal reasoning
Directory of Open Access Journals (Sweden)
Jennifer S Trueblood
2012-05-01
Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.
Per Jönsson; Hyun-Kyung Chung
2013-01-01
There exist several codes in the atomic physics community to generate atomic structure and transition probabilities, freely and readily distributed to researchers outside the atomic physics community, in the plasma, astrophysical, or nuclear physics communities. Users take these atomic physics codes to generate the necessary atomic data or modify the codes for their own applications. However, there has been very little effort to validate and verify the data sets generated by non-expert users. [...
Calculating the Probability of Returning a Loan with Binary Probability Models
Directory of Open Access Journals (Sweden)
Julian Vasilev
2014-12-01
Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are established using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner, and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year, and decreases for people born at the end of the year.
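The logit approach described above can be sketched on synthetic data. The predictors, coefficients, and data below are made up for illustration, not the article's loan data, and the fit uses plain gradient ascent rather than a statistics package:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors: contract sum and borrower age.
n = 500
loan_sum = rng.uniform(1_000, 20_000, n)
age = rng.uniform(20, 65, n)
X = np.column_stack([np.ones(n), loan_sum / 10_000, age / 50])
true_beta = np.array([-0.5, 1.2, 0.3])   # assumed: larger sums -> higher repayment odds
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

# Fit the binary logit model by gradient ascent on the average log-likelihood.
beta = np.zeros(3)
for _ in range(10_000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - p) / n

# Predicted repayment probability for a hypothetical 15,000 loan, age 40:
p_repay = 1 / (1 + np.exp(-np.array([1.0, 1.5, 0.8]) @ beta))
```

A probit model differs only in replacing the logistic link with the normal CDF; the article compares both.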
Energy Technology Data Exchange (ETDEWEB)
Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)
2009-02-01
The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
Probability of success for phase III after exploratory biomarker analysis in phase II.
Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver
2017-02-23
The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to get the necessary information about the risks and chances associated with such subgroup selection designs.
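The basic probability-of-success calculation (power weighted by a normal distribution centred on the phase II estimate) has a standard closed form, sketched below. This is the textbook average-power formula, not the paper's subgroup-selection simulation, which requires Monte Carlo:

```python
from statistics import NormalDist

def probability_of_success(theta_hat, se_phase2, se_phase3, alpha=0.025):
    """Average power for a one-sided phase III test: the true effect theta
    is taken as N(theta_hat, se_phase2^2) and the phase III test statistic
    as N(theta / se_phase3, 1). Integrating the power over the prior gives
    PoS = Phi((theta_hat / se3 - z_alpha) / sqrt(1 + (se2 / se3)^2))."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha)
    return nd.cdf((theta_hat / se_phase3 - z_alpha)
                  / (1 + (se_phase2 / se_phase3) ** 2) ** 0.5)

# Illustrative inputs: phase II estimate 0.4 with SE 0.15, phase III SE 0.1.
pos = probability_of_success(theta_hat=0.4, se_phase2=0.15, se_phase3=0.1)
```

Note that the PoS is always pulled toward 0.5 relative to the power evaluated at the point estimate; this shrinkage is the "weighting by the phase II distribution" that produces the negative bias mentioned in the abstract.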
Hyun-Kyung Chung; Per Jönsson; Alexander Kramida
2013-01-01
Atomic structure and transition probabilities are fundamental physical data required in many fields of science and technology. Atomic physics codes are freely available to other community users to generate atomic data for their interest, but the quality of these data is rarely verified. This special issue addresses estimation of uncertainties in atomic structure and transition probability calculations, and discusses methods and strategies to assess and ensure the quality of theoretical atomic...
A cellular automata model with probability infection and spatial dispersion
Institute of Scientific and Technical Information of China (English)
Jin Zhen; Liu Quan-Xing; Mainul Haque
2007-01-01
In this article, we have proposed an epidemic model based on the probability cellular automata theory. The essential mathematical features are analysed with the help of stability theory. We have given an alternative modelling approach for the spatiotemporal system which is more realistic from the practical point of view. A discrete and spatiotemporal approach is shown by using cellular automata theory. It is interesting to note that both the size of the endemic equilibrium and the density of the individuals increase with the increase of the neighbourhood size and infection rate, but the infections decrease with the increase of the recovery rate. The stability of the system around the positive interior equilibrium has been shown by using a suitable Lyapunov function. Finally, experimental data simulation for SARS disease in China in 2003 and a brief discussion are given.
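A minimal probabilistic cellular automaton in the spirit of the model can be sketched as follows. The SIS-style rules, neighbourhood, and parameters are illustrative assumptions, not the article's exact specification:

```python
import random

def step(grid, p_infect, p_recover, rng):
    """One synchronous update on a square lattice with periodic boundaries:
    a susceptible (0) cell is infected with probability p_infect per
    infected von Neumann neighbour; an infected (1) cell recovers with
    probability p_recover."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 1:
                if rng.random() < p_recover:
                    new[i][j] = 0
            else:
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    if grid[(i + di) % n][(j + dj) % n] == 1 \
                            and rng.random() < p_infect:
                        new[i][j] = 1
                        break
    return new

rng = random.Random(42)
g = [[0] * 20 for _ in range(20)]
g[10][10] = 1                          # a single initial infective
for _ in range(30):
    g = step(g, p_infect=0.3, p_recover=0.1, rng=rng)
infected = sum(map(sum, g))
```

Enlarging the neighbourhood or raising `p_infect` grows the endemic level, while raising `p_recover` shrinks it, matching the qualitative behaviour reported in the abstract.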
Market-implied risk-neutral probabilities, actual probabilities, credit risk and news
Directory of Open Access Journals (Sweden)
Shashidhar Murthy
2011-09-01
Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
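A standard way to back out the market-implied probability from a yield spread can be sketched as follows. This is a textbook reduced-form approximation (spread = hazard times loss given default), not the paper's representation; real spreads also embed liquidity and risk premia, which is exactly the wedge between the two probabilities the paper studies:

```python
import math

def risk_neutral_default_prob(spread, recovery, maturity):
    """Approximate risk-neutral default probability over `maturity` years,
    assuming the spread compensates only for expected loss under the
    pricing measure: spread ~ hazard * (1 - recovery), so
    P_default(T) = 1 - exp(-spread / (1 - recovery) * T)."""
    hazard = spread / (1.0 - recovery)
    return 1.0 - math.exp(-hazard * maturity)

# Illustrative inputs: 200 bp spread, 40% recovery, 5-year horizon.
q = risk_neutral_default_prob(spread=0.02, recovery=0.4, maturity=5.0)
```

The corresponding "actual" probability from rating transition data is typically much smaller; the ratio of the two is the quantity whose cross-sectional variation the paper analyses.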
Spacetime quantum probabilities II: Relativized descriptions and Popperian propensities
Mugur-Schächter, M.
1992-02-01
In the first part of this work(1) we have explicated the spacetime structure of the probabilistic organization of quantum mechanics. We have shown that each quantum mechanical state, in consequence of the spacetime characteristics of the epistemic operations by which the observer produces the state to be studied and the processes of qualification of these, brings in a tree-like spacetime structure, a “quantum mechanical probability tree,” that transgresses the theory of probabilities as it now stands. In this second part we develop the general implications of these results. Starting from the lowest level of cognitive action and creating an appropriate symbolism, we construct a “relativizing epistemic syntax,” a “general method of relativized conceptualization” where—systematically—each description is explicitly referred to the epistemic operations by which the observer produces the entity to be described and obtains qualifications of it. The method generates a typology of increasingly complex relativized descriptions where the question of realism admits of a particularly clear pronouncement. Inside this typology the epistemic processes that lie—UNIVERSALLY—at the basis of any conceptualization reveal a tree-like spacetime structure. It appears in particular that the spacetime structure of the relativized representation of a probabilistic description, which transgresses the present-day theory of probabilities, is the general mould of which the quantum mechanical probability trees are only particular realizations. This entails a clear definition of the descriptional status of quantum mechanics. Meanwhile, the recognition of the universal cognitive content of the quantum mechanical formalism opens up vistas toward mathematical developments of the relativizing epistemic syntax. The relativized representation of a probabilistic description leads with inner necessity to a “morphic” interpretation of probabilities that can be regarded as a formalized and
A new estimator of the discovery probability.
Favaro, Stefano; Lijoi, Antonio; Prünster, Igor
2012-12-01
Species sampling problems have a long history in ecological and biological studies, and a number of issues, including the evaluation of species richness, the design of sampling experiments, and the estimation of rare species variety, are to be addressed. Such inferential problems have recently emerged also in genomic applications, however, exhibiting some peculiar features that make them more challenging: specifically, one has to deal with very large populations (genomic libraries) containing a huge number of distinct species (genes) while only a small portion of the library has been sampled (sequenced). These aspects motivate the Bayesian nonparametric approach we undertake, since it allows one to achieve the degree of flexibility typically needed in this framework. Based on an observed sample of size n, focus will be on prediction of a key aspect of the outcome from an additional sample of size m, namely, the so-called discovery probability. In particular, conditionally on an observed basic sample of size n, we derive a novel estimator of the probability of detecting, at the (n+m+1)th observation, species that have been observed with any given frequency in the enlarged sample of size n+m. Such an estimator admits a closed-form expression that can be exactly evaluated. The result we obtain allows us to quantify both the rate at which rare species are detected and the achieved sample coverage of abundant species, as m increases. Natural applications are represented by the estimation of the probability of discovering rare genes within genomic libraries and the results are illustrated by means of two expressed sequence tags datasets.
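The classical benchmark for the discovery probability is Turing's estimator, sketched below. This is a frequentist baseline for the m = 0, unseen-species case only, not the paper's Bayesian nonparametric estimator, which also covers species seen with any given frequency and an additional sample of size m:

```python
from collections import Counter

def turing_new_species_prob(sample):
    """Estimate the probability that the (n+1)th observation is a
    previously unseen species by n1 / n, the fraction of singletons
    (species observed exactly once) in the sample."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# A toy "library" of sequenced genes: g2, g4, g5 are singletons.
genes = ["g1", "g1", "g2", "g3", "g3", "g3", "g4", "g5"]
p_new = turing_new_species_prob(genes)   # 3 singletons / 8 observations
```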
Data analysis & probability task & drill sheets
Cook, Tanya
2011-01-01
For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro
Normativity And Probable Reasoning: Hume On Induction
Tejedor, Chon
2011-01-01
In this article I examine the debate between epistemic and descriptivist interpreters of Hume's discussion of induction and probable reasoning. Epistemic interpreters take Hume to be principally concerned with questions about the authority and epistemic justification of our inductive principles and beliefs. Descriptivist interpreters, by contrast, suggest that Hume's aim is to explain how our beliefs are produced, not to rule on whether...
Elemental mercury poisoning probably causes cortical myoclonus.
Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B
2007-10-15
Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies confirmed that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never before been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury even if they have a progressive illness, as the condition is potentially reversible, as in our patient.
Atomic transition probabilities of Gd i
Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.
2011-05-01
Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd i). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of the first and second spectra lines is also described and demonstrated in this work.
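The conversion described in this abstract rests on the standard relation between emission branching fractions and Einstein A coefficients: for an upper level with radiative lifetime tau, the transition probability of each decay channel is its branching fraction divided by tau. A hedged sketch of the arithmetic (the function name, branching fractions, and lifetime are illustrative, not measured Gd i values):

```python
def transition_probabilities(branching_fractions, lifetime_s):
    """Convert branching fractions of one upper level into absolute
    transition probabilities (Einstein A coefficients, in s^-1)."""
    total = sum(branching_fractions)
    # Branching fractions over all decay channels of a level must sum to 1.
    assert abs(total - 1.0) < 1e-9, "branching fractions must sum to 1"
    return [bf / lifetime_s for bf in branching_fractions]

# Illustrative upper level: three decay channels, 10 ns radiative lifetime.
A = transition_probabilities([0.6, 0.3, 0.1], 10e-9)
print(A)  # approximately [6.0e7, 3.0e7, 1.0e7] s^-1
```

In practice the radiative lifetimes come from the time-resolved laser-induced-fluorescence measurements cited above, and the branching fractions from the Fourier transform spectra.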
Atomic transition probabilities of Er i
Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.
2010-12-01
Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er i) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.
Intermediate Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner--developing sp
Probability of inflation in loop quantum cosmology
Ashtekar, Abhay; Sloan, David
2011-12-01
Inflationary models of the early universe provide a natural mechanism for the formation of large scale structure. This success brings to the forefront the question of naturalness: Does a sufficiently long slow roll inflation occur generically, or does it require careful fine tuning of initial parameters? In recent years there has been considerable controversy on this issue (Hollands and Wald in Gen Relativ Gravit 34:2043, 2002; Kofman et al. in J High Energy Phys 10:057, 2002; Gibbons and Turok in Phys Rev D 77:063516, 2008). In particular, for a quadratic potential, Kofman et al. (J High Energy Phys 10:057, 2002) have argued that the probability of inflation with at least 65 e-foldings is close to one, while Gibbons and Turok (Phys Rev D 77:063516, 2008) have argued that this probability is suppressed by a factor of ~10-85. We first clarify that such dramatically different predictions can arise because the required measure on the space of solutions is intrinsically ambiguous in general relativity. We then show that this ambiguity can be naturally resolved in loop quantum cosmology (LQC) because the big bang is replaced by a big bounce and the bounce surface can be used to introduce the structure necessary to specify a satisfactory measure. The second goal of the paper is to present a detailed analysis of the inflationary dynamics of LQC using analytical and numerical methods. By combining this information with the measure on the space of solutions, we address a sharper question than those investigated in Kofman et al. (J High Energy Phys 10:057, 2002), Gibbons and Turok (Phys Rev D 77:063516, 2008), and Ashtekar and Sloan (Phys Lett B 694:108, 2010): What is the probability of a sufficiently long slow roll inflation which is compatible with the seven year WMAP data? We show that the probability is very close to 1. The material is so organized that cosmologists who may be more interested in the inflationary dynamics in LQC than in the subtleties associated with
Numerical Ultimate Ruin Probabilities under Interest Force
Directory of Open Access Journals (Sweden)
Juma Kasozi
2005-01-01
Full Text Available This work addresses the issue of ruin of an insurer whose portfolio is exposed to insurance risk arising from the classical surplus process. The availability of a positive interest rate in the financial world leads the insurer to invest in a risk-free asset. We derive a linear Volterra integral equation of the second kind and apply a fourth-order block-by-block method in conjunction with the Simpson rule to solve the Volterra equation for the ultimate ruin probability. This probability is arrived at by taking a linear combination of two solutions to the Volterra integral equation. The several numerical examples given show that our results are excellent and reliable.
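The numerical core of the approach above is solving a linear Volterra integral equation of the second kind, u(t) = g(t) + ∫₀ᵗ K(t,s) u(s) ds. As a hedged sketch, here is a simple trapezoidal solver (second-order, not the paper's fourth-order block-by-block/Simpson scheme) applied to a toy kernel rather than the actual ruin equation; all names are illustrative:

```python
import numpy as np

def volterra2_trapezoid(g, K, T, n):
    """Solve u(t) = g(t) + integral_0^t K(t,s) u(s) ds on [0,T], n steps."""
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    u = np.empty(n + 1)
    u[0] = g(t[0])  # the integral vanishes at t = 0
    for i in range(1, n + 1):
        # Trapezoid rule: weight h/2 at the endpoints, h in between.
        s = h * (0.5 * K(t[i], t[0]) * u[0]
                 + sum(K(t[i], t[j]) * u[j] for j in range(1, i)))
        # The unknown u[i] appears in the last trapezoid term; solve for it.
        u[i] = (g(t[i]) + s) / (1.0 - 0.5 * h * K(t[i], t[i]))
    return t, u

# Sanity check: u(t) = exp(t) solves u(t) = 1 + integral_0^t u(s) ds.
t, u = volterra2_trapezoid(lambda x: 1.0, lambda x, s: 1.0, 1.0, 200)
print(abs(u[-1] - np.exp(1.0)))  # small discretization error
```

A block-by-block method of order four would replace the trapezoid weights with Simpson-type weights applied over blocks of nodes, improving the convergence rate from O(h²) to O(h⁴).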
An introduction to probability and statistical inference
Roussas, George G
2003-01-01
"The text is wonderfully written and has the most comprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida. "The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn University, Alabama. * Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications * Chapters 1-8 can be used independently for an introductory course in probability * Provides a substantial number of proofs
Acceleration Detection of Large (Probably) Prime Numbers
Directory of Open Access Journals (Sweden)
Dragan Vidakovic
2013-02-01
Full Text Available In order to avoid unnecessary applications of the Miller-Rabin algorithm to the number in question, we resort to trial division by a few initial prime numbers, since such divisions take less time. How far we should go with such trial division is the question we try to answer in this paper. In theory the matter is fully resolved; in practice, however, that is of little use. We therefore present a solution that is probably irrelevant to theorists, but very useful to those who have spent many nights producing large (probably) prime numbers using their own software.
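The strategy described above, trial division by a few small primes before invoking Miller-Rabin, can be sketched as follows. This is a minimal illustration; the length of the small-prime list (the question the paper studies) and the number of Miller-Rabin rounds are illustrative choices, not the paper's:

```python
import random

SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

def is_probable_prime(n, rounds=20):
    """Cheap trial division by small primes, then the Miller-Rabin test."""
    if n < 2:
        return False
    for p in SMALL_PRIMES:
        if n == p:
            return True
        if n % p == 0:
            return False  # composite detected without Miller-Rabin
    # Write n - 1 = d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness of compositeness
    return True

print(is_probable_prime(2**61 - 1))  # True (a known Mersenne prime)
print(is_probable_prime(2**61 + 1))  # False (divisible by 3, caught by trial division)
```

The trade-off the paper quantifies is exactly how long `SMALL_PRIMES` should be: each extra prime makes composites cheaper to reject but adds a little cost to every candidate that survives.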
The Probability Model of Expectation Disconfirmation Process
Directory of Open Access Journals (Sweden)
Hui-Hsin HUANG
2015-06-01
Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before purchase and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in marketing for managing relationship satisfaction.
Random iteration with place dependent probabilities
Kapica, R
2011-01-01
Markov chains arising from random iteration of functions $S_{\theta}:X\to X$, $\theta \in \Theta$, where $X$ is a Polish space and $\Theta$ is an arbitrary set of indices, are considered. At $x\in X$, $\theta$ is sampled from a distribution $\theta_x$ on $\Theta$, and the $\theta_x$ differ for different $x$. Exponential convergence to a unique invariant measure is proved. This result is applied to the case of random affine transformations on ${\mathbb R}^d$, giving the existence of exponentially attractive perpetuities with place-dependent probabilities.
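The mechanism described in this abstract, iterating randomly chosen maps where the selection probability depends on the current state, can be illustrated with a toy simulation. A hedged sketch: the two affine contractions and the place-dependent probability below are invented for illustration and are not from the paper:

```python
import random

def simulate(x0, steps, maps, prob_of_first):
    """Iterate two maps on the real line, picking maps[0] with a
    probability prob_of_first(x) that depends on the current state x."""
    x = x0
    for _ in range(steps):
        f = maps[0] if random.random() < prob_of_first(x) else maps[1]
        x = f(x)
    return x

# Two affine contractions S_theta(x) = 0.5*x + b with b = +1 or -1.
maps = [lambda x: 0.5 * x + 1.0, lambda x: 0.5 * x - 1.0]
p = lambda x: 1.0 / (1.0 + abs(x))  # place-dependent selection probability

# Both maps send [-2, 2] into itself, so trajectories started at 0 stay there.
samples = [simulate(0.0, 200, maps, p) for _ in range(1000)]
print(all(-2.0 <= x <= 2.0 for x in samples))  # True
```

Because both maps are contractions, the theorem's conclusion, exponential convergence to a unique invariant measure, is plausible here; the paper's contribution is proving it when the probabilities themselves vary with the state.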