WorldWideScience

Sample records for analyses generated probability

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models...

  4. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica]

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector $p=(p_1,\dots,p_d)$. An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
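
    The abstract names the iid and normalization methods without spelling them out; as a rough Python illustration (an editorial sketch, not the paper's implementation), the naive normalization of uniform variates can be contrasted with the flat-Dirichlet construction, which normalizes iid exponential variates and is uniform on the probability simplex:

        import numpy as np

        def prob_vec_normalization(d, rng):
            # Naive method: divide iid uniforms by their sum. Simple, but
            # the resulting vectors are NOT uniform on the probability
            # simplex (they concentrate near its barycenter).
            x = rng.random(d)
            return x / x.sum()

        def prob_vec_iid(d, rng):
            # Flat-Dirichlet construction: normalizing iid exponential
            # variates yields vectors uniform on the simplex.
            x = rng.exponential(1.0, d)
            return x / x.sum()

        rng = np.random.default_rng(42)
        p = prob_vec_iid(4, rng)
        print(p, p.sum())  # non-negative entries summing to 1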

  5. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)]

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  6. Analysing uncertainties: Towards comparing Bayesian and interval probabilities

    Science.gov (United States)

    Blockley, David

    2013-05-01

    Two assumptions, commonly made in risk and reliability studies, have a long history. The first is that uncertainty is either aleatoric or epistemic. The second is that standard probability theory is sufficient to express uncertainty. The purposes of this paper are to provide a conceptual analysis of uncertainty and to compare Bayesian approaches with interval approaches, with an example relevant to research on climate change. The analysis reveals that the categorisation of uncertainty as either aleatoric or epistemic is unsatisfactory for practical decision making. It is argued that uncertainty emerges from three conceptually distinctive and orthogonal attributes, FIR, i.e., fuzziness, incompleteness (epistemic), and randomness (aleatory). Characterisations of uncertainty, such as ambiguity, dubiety and conflict, are complex mixes of interactions in an FIR space. To manage future risks in complex systems it will be important to recognise the extent to which we 'don't know' about possible unintended and unwanted consequences, or unknown-unknowns. In this way we may be more alert to unexpected hazards. The Bayesian approach is compared with an interval probability approach to show one way in which conflict due to incomplete information can be managed.

  7. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    Science.gov (United States)

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically dispersed manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes the SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
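
    The cost of matching relative to maximizing is simple arithmetic: if one outcome occurs with probability p, always predicting it succeeds with probability p, whereas matching succeeds with probability p² + (1-p)², e.g. 0.58 versus 0.70 for p = 0.7. A minimal Python simulation of a generic two-choice task (an editorial sketch, not the SPT itself) makes the gap visible:

        import numpy as np

        rng = np.random.default_rng(0)
        p, n = 0.7, 100_000                  # outcome probability, trials
        outcomes = rng.random(n) < p         # True with probability p

        # Maximizing: always predict the more likely outcome.
        acc_max = outcomes.mean()

        # Probability matching: predict each outcome in proportion to p.
        predictions = rng.random(n) < p
        acc_match = (predictions == outcomes).mean()

        print(f"maximizing ~ {acc_max:.3f}")   # ~ 0.70
        print(f"matching   ~ {acc_match:.3f}")  # ~ p**2 + (1-p)**2 = 0.58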

  8. Sharp Bounds by Probability-Generating Functions and Variable Drift

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten

    2011-01-01

    We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al...
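
    As background for the first technique, the standard probability-generating-function identities (textbook facts, not the paper's specific bounds) are, for an integer-valued runtime $T$:

        G_T(s) = \mathbb{E}[s^T], \qquad \mathbb{E}[T] = G_T'(1), \qquad \operatorname{Var}[T] = G_T''(1) + G_T'(1) - G_T'(1)^2.

    For example, a geometric waiting time with success probability $p$ has $G_T(s) = ps/(1-(1-p)s)$, and hence $\mathbb{E}[T] = G_T'(1) = 1/p$, the familiar expected waiting time.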

  9. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and judgment should affect the degree to which participants' probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in the subadditivity of later probability judgments made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  10. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Usage] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps", it indicates a strong likelihood, usually a positive inference or judgment based on the present situation.

  11. The relative impact of sizing errors on steam generator tube failure probability

    Energy Technology Data Exchange (ETDEWEB)

    Cizelj, L.; Dvorsek, T. [Jozef Stefan Inst., Ljubljana (Slovenia)]

    1998-07-01

    The Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting steam generator tubes made of Inconel 600. This has led to the development and licensing of degradation-specific maintenance approaches, which address the two main failure modes of the degraded piping: tube rupture and excessive leakage through degraded tubes. A methodology for assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out the better performance of degradation-specific over generic approaches in terms of (1) lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates and (3) fewer tubes plugged. A sensitivity analysis was also performed, pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent to the regression models used to correlate the defect size and tube burst pressure. The uncertainties, which can be estimated from in-service inspections, are further analysed in this paper. The defect growth was found to have a significant and to some extent unrealistic impact on the probability of single tube rupture. Since the defect growth estimates were based on past inspection records, they strongly depend on the sizing errors. Therefore, an attempt was made to filter out the sizing errors and to arrive at more realistic estimates of the defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used were obtained from a series of inspection results from the Krsko NPP, with two Westinghouse D-4 steam generators. The results obtained are considered useful in the safety assessment and maintenance of affected steam generators. (author)

  12. Demand and choice probability generating functions for perturbed consumers

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2011-01-01

    ...generating function to be consistent with utility maximization. Within a budget, the convex hull of the demand correspondence is the subdifferential of the demand generating function. The additive random utility discrete choice model (ARUM) is a special case with finite budget sets where utility...

  13. On singular probability densities generated by extremal dynamics

    OpenAIRE

    Garcia, Guilherme J. M.; Dickman, Ronald

    2003-01-01

    Extremal dynamics is the mechanism that drives the Bak-Sneppen model into a (self-organized) critical state, marked by a singular stationary probability density $p(x)$. With the aim of understanding this phenomenon, we study the BS model and several variants via mean-field theory and simulation. In all cases, we find that $p(x)$ is singular at one or more points, as a consequence of extremal dynamics. Furthermore we show that the extremal barrier $x_i$ always belongs to the 'prohibited' inter...
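
    The extremal update rule is short enough to simulate directly; here is a minimal Python sketch of the nearest-neighbour ring version of the Bak-Sneppen model (an editorial illustration, not the authors' code):

        import numpy as np

        def bak_sneppen(n_sites=200, n_steps=200_000, seed=1):
            # Repeatedly locate the smallest 'barrier' and replace it and
            # its two ring neighbours with fresh uniform variates.
            rng = np.random.default_rng(seed)
            x = rng.random(n_sites)
            for _ in range(n_steps):
                i = np.argmin(x)
                for j in (i - 1, i, (i + 1) % n_sites):
                    x[j] = rng.random()
            return x

        x = bak_sneppen()
        # In the stationary state almost all barriers exceed a threshold
        # x_c ~ 0.667; the density below x_c vanishes as n_sites grows.
        print((x > 0.667).mean())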

  14. On the smallest number of generators and the probability of generating an algebra

    CERN Document Server

    Kravchenko, Rostyslav V; Petrenko, Bogdan V

    2010-01-01

    In this paper we study algebraic and asymptotic properties of generating sets of algebras over orders in number fields. Let $A$ be an associative algebra over an order $R$ in an algebraic number field. We assume that $A$ is a free $R$-module of finite rank. We develop a technique to compute the smallest number of generators of $A$. For example, we prove that the ring $M_3(\mathbb{Z})^{k}$ admits two generators if and only if $k\leq 768$. For a given positive integer $m$, we define the density of the set of all ordered $m$-tuples of elements of $A$ which generate it as an $R$-algebra. We express this density as a certain infinite product over the maximal ideals of $R$, and we interpret the resulting formula probabilistically. For example, we show that the probability that 2 random $3\times 3$ matrices generate the ring $M_3(\mathbb{Z})$ is equal to $(\zeta(2)^2 \zeta(3))^{-1}$, where $\zeta$ is the Riemann zeta-function.
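
    The quoted constant is easy to evaluate numerically; a quick check with mpmath, assuming only the formula stated in the abstract:

        from mpmath import mp, zeta

        mp.dps = 15
        prob = 1 / (zeta(2) ** 2 * zeta(3))
        print(prob)  # ~ 0.3075: the probability that two random 3x3
                     # integer matrices generate M_3(Z)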

  15. Banking on a bad bet. Probability matching in risky choice is linked to expectation generation.

    Science.gov (United States)

    James, Greta; Koehler, Derek J

    2011-06-01

    Probability matching is the tendency to match choice probabilities to outcome probabilities in a binary prediction task. This tendency is a long-standing puzzle in the study of decision making under risk and uncertainty, because always predicting the more probable outcome across a series of trials (maximizing) would yield greater predictive accuracy and payoffs. In three experiments, we tied the predominance of probability matching over maximizing to a generally adaptive cognitive operation that generates expectations regarding the aggregate outcomes of an upcoming sequence of events. Under conditions designed to diminish the generation or perceived applicability of such expectations, we found that the frequency of probability-matching behavior dropped substantially and maximizing became the norm.

  16. AN EXPERT BASED INITIAL GENERATION OF GENETIC ALGORITHM WITH ADAPTIVE PROBABILITY APPROACH FOR QUADRATIC OPF

    OpenAIRE

    BHASKAR, Mithun; BENARJI, Mohan; MAHESWARAPU, Sydulu

    2012-01-01

    This paper presents a novel Genetic Algorithm (GA) based solver for the Optimal Power Flow (OPF) problem. The main contrast to other GA-based approaches is that an expert-based initial generation of the population and an adaptive probability approach (variable crossover and mutation probabilities) are adopted in the selection of offspring, together with the roulette wheel technique, which reduces computation time and increases solution quality considerably. Select...

  17. Estimation of Mutation Rates from Fluctuation Experiments via Probability Generating Functions

    OpenAIRE

    Montgomery-Smith, Stephen; Le, Anh; Smith, George; Billstein, Sidney; Oveys, Hesam; Pisechko, Dylan; Yates, Austin

    2016-01-01

    This paper calculates probability distributions modeling the Luria-Delbrück experiment. We show that, by thinking purely in terms of generating functions and using a 'backwards in time' paradigm, formulas describing various situations can be easily obtained. This includes a generating function for Haldane's probability distribution due to Ycart. We apply our formulas to both simulated and real data created by looking at yeast cells acquiring an immunization to the antibiotic canavanine...
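
    The fluctuation experiment itself is easy to mimic; a toy discrete-generation simulation in Python (an editorial sketch with illustrative parameters, not the paper's generating-function machinery) reproduces the heavy-tailed 'jackpot' behaviour that distinguishes pre-existing mutations from induced ones:

        import numpy as np

        def culture(n0=100, generations=15, mu=1e-6, rng=None):
            # One culture: cells double each generation; each newly divided
            # normal cell mutates with probability mu, and mutant lineages
            # breed true from then on.
            rng = rng or np.random.default_rng()
            normals, mutants = n0, 0
            for _ in range(generations):
                normals *= 2
                mutants *= 2
                new = rng.binomial(normals, mu)
                normals -= new
                mutants += new
            return mutants

        rng = np.random.default_rng(7)
        counts = np.array([culture(rng=rng) for _ in range(2000)])
        # Variance far exceeding the mean is the Luria-Delbrueck signature.
        print(counts.mean(), counts.var())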

  18. Resolving conflicts between statistical methods by probability combination: Application to empirical Bayes analyses of genomic data

    CERN Document Server

    Bickel, David R

    2011-01-01

    In the typical analysis of a data set, a single method is selected for statistical reporting even when equally applicable methods yield very different results. Examples of equally applicable methods can correspond to those of different ancillary statistics in frequentist inference and of different prior distributions in Bayesian inference. More broadly, choices are made between parametric and nonparametric methods and between frequentist and Bayesian methods. Rather than choosing a single method, it can be safer, in a game-theoretic sense, to combine those that are equally appropriate in light of the available information. Since methods of combining subjectively assessed probability distributions are not objective enough for that purpose, this paper introduces a method of distribution combination that does not require any assignment of distribution weights. It does so by formalizing a hedging strategy in terms of a game between three players: nature, a statistician combining distributions, and a statistician ...

  19. On the smallest number of generators and the probability of generating an algebra

    OpenAIRE

    Petrenko, Bogdan V.; Mazur, Marcin; Kravchenko, Rostyslav V.

    2012-01-01

    In this paper we study algebraic and asymptotic properties of generating sets of algebras over orders in number fields. Let $A$ be an associative algebra over an order $R$ in an algebraic number field. We assume that $A$ is a free $R$-module of finite rank. We develop a technique to compute the smallest number of generators of $A$. For example, we prove that the ring $M_3(\mathbb{Z})^{k}$ admits two generators if and only if $k\leq 768$. For a given positive integer $m$, we define the density...

  20. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
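
    The Longuet-Higgins distribution referred to is, at leading order, a Gram-Charlier correction to the Gaussian. Quoting the commonly used leading-order form (standard notation, with $\sigma^2$ the elevation variance and $\lambda_3$ the skewness; the paper's exact expression may carry higher-order terms):

        p(\eta) \approx \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{\eta^2}{2\sigma^2}\right) \left[1 + \frac{\lambda_3}{6} H_3\!\left(\frac{\eta}{\sigma}\right)\right], \qquad H_3(z) = z^3 - 3z.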

  1. The development of a weak alpha source for application to an industrial probability generator

    International Nuclear Information System (INIS)

    We have been developing a random number generator and probability generator that utilize the natural random phenomenon of radioactive alpha-decay. The time intervals between electric pulse signals induced in a semiconductor detector are converted to physical random numbers by measuring each interval as clock pulse counts. Since the time interval has an exponential distribution $\rho(t)$ depending on the count rate of the original electric pulse signals, the probability $P(\Delta t)$ is easily defined by $P(\Delta t)=\int_{t_1}^{t_2}\rho\,dt\,/\int_{0}^{\infty}\rho\,dt$. The sealed 100 Bq ²⁴⁴Cm source pieces were supplied by AEA Technology. We selected a pin-diode detector as the alpha detector. The source pieces were subjected to a thermal stress test simulating the bonding process used to mount the source into a small chip of the probability generator. We confirmed by SEM observation that the sealed surface was not changed after the thermal stress was applied. We next plan to replace ²⁴⁴Cm with the natural radioisotope ²¹⁰Pb. As a first step, methods for collecting ²¹⁰Pb from several natural resources, such as aerosol collected in HEPA filters, ground water, and natural radioactive minerals, were surveyed. (author)
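
    For a Poisson pulse train with count rate $\lambda$, the interval density is $\rho(t)=\lambda e^{-\lambda t}$ and the probability defined above has a simple closed form (a standard evaluation, supplied here for clarity):

        P(\Delta t) = \frac{\int_{t_1}^{t_2} \rho(t)\,dt}{\int_{0}^{\infty} \rho(t)\,dt} = e^{-\lambda t_1} - e^{-\lambda t_2},

    so any target probability can be dialed in by choosing the counting window $[t_1, t_2]$.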

  2. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and the several ongoing projects aimed at its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.

  3. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based...
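
    The aggregation step the abstract describes can be illustrated with a ladder of such binary contracts; the prices below are hypothetical, chosen only to show how exceedance prices imply a consensus CDF and hence a piecewise PDF:

        import numpy as np

        # Hypothetical prices of binary contracts "anomaly > x" at several
        # thresholds x (in °C); price ~ consensus probability of exceedance.
        thresholds = np.array([0.45, 0.55, 0.65, 0.75])
        prices     = np.array([0.90, 0.70, 0.35, 0.10])

        cdf = 1.0 - prices                    # implied P(anomaly <= x)
        bin_prob = np.diff(cdf)               # probability mass in each bin
        pdf = bin_prob / np.diff(thresholds)  # piecewise-constant density

        for lo, hi, q in zip(thresholds, thresholds[1:], pdf):
            print(f"({lo:.2f}, {hi:.2f}] °C: density ~ {q:.2f} per °C")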

  4. Fortran code for generating random probability vectors, unitaries, and quantum states

    CERN Document Server

    Maziero, Jonas

    2015-01-01

    The usefulness of generating random configurations is recognized in a variety of contexts, as for instance in the simulation of physical systems, in the verification of bounds and/or ansatz solutions for optimization problems, and in secure communications. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and the several ongoing projects aimed at its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.

  5. On the probability of exceeding allowable leak rates through degraded steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Cizelj, L.; Sorsek, I. [Jozef Stefan Institute, Ljubljana (Slovenia)]; Riesch-Oppermann, H. [Forschungszentrum Karlsruhe (Germany)]

    1997-02-01

    This paper discusses some possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds the predefined acceptable value. The probabilistic approach aims, of course, at a reliable and computationally bearable estimate of the failure probability. A closed-form solution is given for the special case of exponentially distributed individual leak rates. Also, some possibilities for the use of computationally efficient First and Second Order Reliability Methods (FORM and SORM) are discussed. The first numerical example compares the results of the approximate methods with closed-form results; SORM in particular shows acceptable agreement. The second numerical example considers a realistic case of the NPP in Krsko, Slovenia.
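
    For the exponential special case mentioned, a closed form follows from the fact that a sum of n iid exponential leak rates is Erlang distributed; with rate $\lambda$ and allowable total leak $q_a$, the exceedance probability is (a standard identity, possibly differing in notation from the paper's own expression):

        P\!\left(\sum_{i=1}^{n} q_i > q_a\right) = e^{-\lambda q_a} \sum_{k=0}^{n-1} \frac{(\lambda q_a)^k}{k!}.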

  6. ANALYSING SOLAR-WIND HYBRID POWER GENERATING SYSTEM

    OpenAIRE

    Mustafa ENGİN; Metin ÇOLAK

    2005-01-01

    In this paper, a solar-wind hybrid power generating system to be used for security lighting was designed. The hybrid system was installed, and the performance values of the solar cells, wind turbine, battery bank, charge regulators and inverter were measured throughout the whole year. Using the measured values, the overall system efficiency, reliability and demanded energy cost per kWh were calculated, and the percentage of generated energy according to resource was determined. We also include in the paper a discussi...

  7. The probability of generating the symmetric group with a commutator condition

    CERN Document Server

    Birulia, Raman

    2012-01-01

    Let B(n) be the set of pairs of permutations from the symmetric group of degree n with a 3-cycle commutator, and let A(n) be the set of those pairs which generate the symmetric or the alternating group of degree n. We find effective formulas for calculating the cardinalities of both sets. More precisely, we show that #B(n)/n! is a discrete convolution of the partition function and a linear combination of divisor functions, while #A(n)/n! is the product of a polynomial and Jordan's totient function. In particular, it follows that the probability that a pair of random permutations with a 3-cycle commutator generates the symmetric or the alternating group of degree n tends to zero as n tends to infinity, in contrast with Dixon's classical result. Key elements of our proofs are Jordan's theorem from the 19th century, a formula by Ramanujan from the 20th century, and a technique of square-tiled surfaces developed by the French mathematicians Lelievre and Royer at the beginning of the 21st century. This paper...

  8. Probability distribution, conditional dissipation, and transport of passive temperature fluctuations in grid-generated turbulence

    International Nuclear Information System (INIS)

    The evolution of the scalar probability density function (pdf), the conditional scalar dissipation rate, and other statistics including transport properties are studied for passive temperature fluctuations in decaying grid-generated turbulence. The effect of filtering and differentiating the time series is also investigated. For a nonzero mean temperature gradient it is shown that the pdf of the temperature fluctuations has pronounced exponential tails for turbulence Reynolds number (Re_λ) greater than 70, but below this value the pdf is close to Gaussian. The scalar dissipation rate, conditioned on the fluctuations, shows that there is a high expectation of dissipation in the presence of the large, rare fluctuations that produce the exponential tails. Significant positive correlation between the mean square scalar fluctuations and the instantaneous scalar dissipation rate is found when exponential tails occur. The case of temperature fluctuations in the absence of a mean gradient is also studied. Here, the results are less definite because the generation of the fluctuations (by means of fine heated wires) causes an asymmetry in the pdf. The results show, however, that the pdf is close to Gaussian and that the correlation between the mean square temperature fluctuations and the instantaneous scalar dissipation rate is very weak. For the linear profile case, measurements over the range 60 ≤ Re_λ ≤ 1100 show that the dimensionless heat flux Nu is proportional to Re_λ^0.88 and that the transition from a Gaussian pdf to one with exponential tails occurs at Nu ∼ 31, a value close to transitions observed in other recent mixing experiments conducted in entirely different turbulent flows.

  9. Inverse analyses of laser generated dispersive surface waves

    International Nuclear Information System (INIS)

    This paper presents results on the inverse analysis of laser generated surface waves in a epoxy bonded layered specimen. Laser ultrasonic experiments were performed and the acquired surface wave signals were processed in the frequency domain to obtain the dispersion relation of the phase velocity in a epoxy-bonded layered specimen. A computer program for calculating the phase velocity dispersion of general isotropic and/or anisotropic layered media was utilized to explore the influence of the epoxy-bonded layer. Inversions of the bonding layer thickness and the elastic wave velocities of the epoxy layer were investigated. The current results show that the thickness and the elastic wave velocities of the bonding layer can be successfully determined. Simultaneous determination of the thickness and the elastic properties of the bonding layer are currently under investigation by the authors.

  10. Next generation sequencing and comparative analyses of Xenopus mitogenomes

    Directory of Open Access Journals (Sweden)

    Lloyd Rhiannon E

    2012-09-01

    ...protein-coding genes were shown to be under strong negative (purifying) selection, with the genes under the strongest pressure (Complex 4) also being the most highly expressed, highlighting their potentially crucial functions in the mitochondrial respiratory chain. Conclusions: Next generation sequencing of long-PCR amplicons using single-taxon or multi-taxon approaches enabled two new species of Xenopus mtDNA to be fully characterized. We anticipate our complete mitochondrial genome amplification methods to be applicable to other amphibians, and helpful for identifying the most appropriate markers for differentiating species and populations and for resolving phylogenies, a pressing need since amphibians are undergoing drastic global decline. Our mtDNAs also provide templates for conserved primer design and the assembly of RNA and DNA reads following high-throughput "omic" techniques such as RNA- and ChIP-seq. These could help us better understand how processes such as mitochondrial replication and gene expression influence Xenopus growth and development, as well as how they evolved and are regulated.

  11. Estimating Route Choice Models from Stochastically Generated Choice Sets on Large-Scale Networks Correcting for Unequal Sampling Probability

    DEFF Research Database (Denmark)

    Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo

    2015-01-01

    ...is the dependency of the parameter estimates on the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...

  12. Unconventional construction of the brushless homopolar generator probably reveals a predictive weakness in Maxwell's equations

    OpenAIRE

    Ivana, Pavol; Vlcek, Ivan; Ivanova, Marika

    2016-01-01

    Maxwell's dynamic equation for Faraday's law erroneously predicts that, for a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in Faraday's homopolar generator, and that an electric intensity must therefore be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator, which proves that movement of an electrically neutral conductor along the radials of a homogeneous magnetic field does n...

  13. Generation of Transition Probability Data: Can quantity and quality be balanced?

    Science.gov (United States)

    Curry, J. J.; Froese Fisher, C.

    2008-10-01

    The possibility of truly predictive plasma modeling rests on the availability of large quantities of accurate atomic and molecular data. These include a variety of collision cross-sections and radiative transition data. An example of current interest concerns radiative transition probabilities for neutral Ce, an additive in highly-efficient metal-halide lamps. Transition probabilities have been measured for several hundred lines (Bisson et al., JOSA B 12, 193, 1995 and Lawler et al., unpublished), but the number of observed and classified transitions in the range of 340 nm to 1 μm is in excess of 21,000 (Martin, unpublished). Since the prospect for measuring more than a thousand or so of these transitions is rather low, an important question is whether calculation can adequately fill the void. In this case, we are interested only in electric dipole transitions. Furthermore, we require only that the transition probabilities have an average accuracy of ∼20%. We will discuss our efforts to calculate a comprehensive set of transition probabilities for neutral Ce using the Cowan (The Theory of Atomic Structure and Spectra, 1981) and GRASP (Jönsson et al., Comput. Phys. Commun. 176, 559-579, 2007) codes. We will also discuss our efforts to quantify the accuracy of the results.

  14. Unconventional construction of the brushless homopolar generator probably reveals a predictive weakness in Maxwell's equations

    CERN Document Server

    Ivana, Pavol; Ivanova, Marika

    2016-01-01

    Maxwell's dynamic equation for Faraday's law erroneously predicts that, for a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in Faraday's homopolar generator, and that an electric intensity must therefore be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator, which proves that movement of an electrically neutral conductor along the radials of a homogeneous magnetic field does not induce any voltage. A new description of the operation of the Faraday (with brushes) homopolar generator is presented here, as a device that simulates the necessary and sufficient condition for the formation of induction. The brushless homopolar generator, however, meets only the necessary condition, not the sufficient one. This article includes a mathematical analysis showing that the current differential concept of the creation of the rotational intensity vector is an incorrect theoretical notion with minimal impact on the design of kno...

  15. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    Science.gov (United States)

    Hu, Yaogang; Li, Hui; Liao, Xinglin; Song, Erbing; Liu, Haitao; Chen, Z.

    2016-08-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, evaluating early deterioration condition poses a challenge to traditional vibration-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indicator are not unduly disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting early deterioration condition and in tracking further deterioration of the critical components.

  16. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    International Nuclear Information System (INIS)

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
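
    Schematically, the classification step amounts to Bayes' rule over the extreme-deconvolution flux densities; with relative number counts $N$ as priors (our shorthand notation, not necessarily the paper's), the quasar probability for observed fluxes $f$ is

        P(\mathrm{QSO} \mid f) = \frac{N_{\mathrm{QSO}}\, p_{\mathrm{QSO}}(f)}{N_{\mathrm{QSO}}\, p_{\mathrm{QSO}}(f) + N_{\star}\, p_{\star}(f)},

    with redshift posteriors obtained by conditioning the same densities on the observed fluxes.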

  17. Fast dose algorithm for generation of dose coverage probability for robustness analysis of fractionated radiotherapy

    International Nuclear Information System (INIS)

    A fast algorithm is constructed to facilitate dose calculation for a large number of randomly sampled treatment scenarios, each representing a possible realisation of a full treatment with geometric, fraction-specific displacements for an arbitrary number of fractions. The algorithm is applied to construct a dose volume coverage probability map (DVCM) based on dose calculated for several hundred treatment scenarios, to enable the probabilistic evaluation of a treatment plan. For each treatment scenario, the algorithm calculates the total dose by perturbing a pre-calculated dose, separately for the primary and scatter dose components, for the nominal conditions. The ratio of the scenario-specific accumulated fluence and the average fluence for an infinite number of fractions is used to perturb the pre-calculated dose. Irregularities in the accumulated fluence may cause numerical instabilities in the ratio, which is mitigated by regularisation through convolution with a dose pencil kernel. Compared to full dose calculations the algorithm demonstrates a speedup factor of ∼1000. The comparisons to full calculations show a 99% gamma index (2%/2 mm) pass rate for a single highly modulated beam in a virtual water phantom subject to setup errors during five fractions. The gamma comparison shows a 100% pass rate in a moving tumour irradiated by a single beam in a lung-like virtual phantom. DVCM iso-probability lines computed with the fast algorithm, and with full dose calculation for each of the fractions, for a hypo-fractionated prostate case treated with rotational arc therapy were almost indistinguishable. (paper)

  18. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)]; Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States)]; Hennawi, Joseph F. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany)]; McMahon, Richard G. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom)]; Schiminovich, David [Department of Astronomy, Columbia University, New York, NY 10027 (United States)]; Sheldon, Erin S. [Brookhaven National Laboratory, Upton, NY 11973 (United States)]; Brinkmann, Jon [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States)]; Schneider, Donald P., E-mail: jo.bovy@nyu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States)]

    2012-04-10

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques-which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data-and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  19. Photometric redshifts and quasar probabilities from a single, data-driven generative model

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo [New York Univ. (NYU), NY (United States)]; Myers, Adam D. [Univ. of Wyoming, Laramie, WY (United States); Max Planck Inst. for Medical Research, Heidelberg (Germany)]; Hennawi, Joseph F. [Max Planck Inst. for Medical Research, Heidelberg (Germany)]; Hogg, David W. [Max Planck Inst. for Medical Research, Heidelberg (Germany); New York Univ. (NYU), NY (United States)]; McMahon, Richard G. [Univ. of Cambridge (United Kingdom)]; Schiminovich, David [Columbia Univ., New York, NY (United States)]; Sheldon, Erin S. [Brookhaven National Lab. (BNL), Upton, NY (United States)]; Brinkmann, Jon [Apache Point Observatory and New Mexico State Univ., Sunspot, NM (United States)]; Schneider, Donald P. [Pennsylvania State Univ., University Park, PA (United States)]; Weaver, Benjamin A. [New York Univ. (NYU), NY (United States)]

    2012-03-20

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  20. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
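
    The loop structure such an approach implies can be sketched generically in Python (an editorial sketch: the cost below is a made-up smooth function standing in for the COP-based testability cost, and the GA settings are arbitrary):

        import numpy as np

        rng = np.random.default_rng(3)
        N_INPUTS, POP, GENS = 8, 40, 60

        def cost(p):
            # Hypothetical smooth stand-in for the COP-based testability
            # cost; the paper's actual cost derives from circuit detection
            # probabilities.
            return np.sum((p - 0.5) ** 2) + 0.1 * np.sum(np.sin(7 * p) ** 2)

        def roulette(pop, fit):
            # Roulette-wheel selection: lower cost -> larger weight.
            w = fit.max() - fit + 1e-9
            w = w / w.sum()
            i, j = rng.choice(len(pop), size=2, p=w)
            return pop[i], pop[j]

        pop = rng.random((POP, N_INPUTS))    # input signal probabilities
        for _ in range(GENS):
            fit = np.array([cost(p) for p in pop])
            nxt = [pop[fit.argmin()].copy()]     # elitism: keep the best
            while len(nxt) < POP:
                a, b = roulette(pop, fit)
                cut = rng.integers(1, N_INPUTS)  # one-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                mask = rng.random(N_INPUTS) < 0.05  # fixed mutation rate
                child[mask] = rng.random(mask.sum())
                nxt.append(child)
            pop = np.array(nxt)

        best = pop[np.array([cost(p) for p in pop]).argmin()]
        print("best input distribution:", np.round(best, 3))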

  1. Minimization of Handoff Failure Probability for Next-Generation Wireless Systems

    OpenAIRE

    Debabrata Sarddar; Tapas Jana; Souvik Kumar Saha; Joydeep Banerjee; Utpal Biswas; M.K.Naskar

    2010-01-01

    During the past few years, advances in mobile communication theory have enabled the development and deployment of different wireless technologies, complementary to each other. Hence, their integration can realize a unified wireless system that has the best features of the individual networks. Next-Generation Wireless Systems (NGWS) integrate different wireless systems, each of which is optimized for some specific services and coverage area to provide ubiquitous communications to the mobile us...

  2. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X.;

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, evaluating early deterioration condition poses a challenge to traditional vibration-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indicator are not unduly disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method...

  3. Technique for calculating the leak probability of a steam generator due to destruction of the studs of a collector cover

    International Nuclear Information System (INIS)

    An approach for estimating the leak probability of a flanged joint due to the destruction of its fastening studs is described. The approach consists of two stages: in the first, the probability of destroying a single stud is calculated; in the second, the probabilities of the different interpositions of intact and destroyed studs are calculated. The calculation of the leak probability in the area of the collector cover of a PGV-1000 steam generator is used as an example of the developed approach.

  4. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, as random variables, take on a given value, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life...
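
    The final curve-fitting step can be sketched with scipy on made-up data (a hypothetical NOx emission-factor sample; the actual GREET PDFs are fitted to its own unit-level datasets):

        import numpy as np
        from scipy import stats

        # Hypothetical NOx emission-factor samples (g/kWh).
        samples = np.random.default_rng(5).lognormal(mean=-0.5, sigma=0.4, size=500)

        # Fit a candidate distribution and freeze it.
        shape, loc, scale = stats.lognorm.fit(samples, floc=0.0)
        dist = stats.lognorm(shape, loc=loc, scale=scale)

        # Goodness of fit plus the percentiles such PDFs typically report.
        print(stats.kstest(samples, dist.cdf))
        print("P10/P50/P90:", dist.ppf([0.10, 0.50, 0.90]))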

  5. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  6. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
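
    The cumulative-probability-distribution granulation step can be sketched in Python: cut a series into k linguistic intervals at equal-probability quantiles, so that each granule is equally likely under the empirical CDF (an editorial sketch; the paper pairs this with a minimize-entropy partition and rough-set rule induction, which are not reproduced here):

        import numpy as np

        prices = np.cumsum(np.random.default_rng(11).normal(0, 1, 300)) + 100
        k = 5  # e.g. 'very low' .. 'very high'
        edges = np.quantile(prices, np.linspace(0, 1, k + 1))
        labels = np.digitize(prices, edges[1:-1])  # granule index 0..k-1

        for g in range(k):
            lo, hi = edges[g], edges[g + 1]
            print(f"granule {g}: [{lo:7.2f}, {hi:7.2f}]  n={np.sum(labels == g)}")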

  7. Analyses of heterogeneous deformation and subsurface fatigue crack generation in alpha titanium alloy at low temperature

    Energy Technology Data Exchange (ETDEWEB)

    Umezawa, Osamu [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501 (Japan)]; Morita, Motoaki [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Tokyo University of Marine Science and Technology, Koto-ku, Tokyo 135-8533 (Japan)]; Yuasa, Takayuki [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Nippon Steel and Sumitomo Metal, Kashima, 314-0014 (Japan)]; Morooka, Satoshi [Department of Mechanical Engineering and Materials Science, Yokohama National University 79-5 Tokiwadai, Hodogaya, Yokohama, 240-8501, Japan and Now Tokyo Metropolitan University, Hino, Tokyo 191-0065 (Japan)]; Ono, Yoshinori; Yuri, Tetsumi; Ogata, Toshio [National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, 305-0047 (Japan)]

    2014-01-27

    Subsurface crack initiation in high-cycle fatigue has been detected as (0001) transgranular facets in titanium alloys at low temperature. The discussion on subsurface crack generation is reviewed, focusing on analyses by neutron diffraction and a full-constraints model under tension mode, as well as on crystallographic identification of the facets. The tensile stress accumulated along <0001> may be responsible for initial microcracking on (0001) and for crack opening.

  8. Elastic-plastic fracture mechanics analyses of cracked steam generator tubes under internal pressure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeong Keun; Ahn, Min Yong; Moon, Seong In; Chang, Yoon Suk; Kim, Young Jin [Sungkyunkwan Univ., Suwon (Korea, Republic of)]; Hwang, Seong Sik; Kim, Joung Soo [KAERI, Taejon (Korea, Republic of)]

    2005-07-01

    The structural and leakage integrity of a steam generator tube should be maintained during operation even when a crack exists in it. During the past three decades, several limit load solutions have been proposed to resolve this integrity issue. However, for exact load-carrying-capacity estimation of specific components under different conditions, these solutions have to be modified using large amounts of experimental data. The purpose of this paper is to introduce a new burst pressure estimation scheme based on fracture mechanics analyses for steam generator tubes with an axial or circumferential through-wall crack. To do this, closed-form engineering equations were derived to obtain the relevant parameters from three-dimensional elastic-plastic finite element analyses combined with the reference stress method. Also, a series of structural integrity analyses were carried out using the J-integral calculated from the engineering equations together with fracture toughness data. Thereby, in comparison with the experimental data as well as corresponding estimation results from limit load solutions, it was proven that the proposed estimation scheme can be used as an efficient tool for integrity evaluation of cracked steam generator tubes.

  9. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges: storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements, so biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach.

  10. Analyses of Acoustic Streaming Generated by Four Ultrasonic Vibrators in a Vessel

    Science.gov (United States)

    Nakagawa, Masafumi

    2004-05-01

    When ultrasonic waves are applied, the heat transfer at a heated surface in water increases markedly. The origin of this increase is thought to be the agitation effect of cavitation microjets and of acoustic streaming. A method using four vibrators can enhance heat transfer even further. This paper presents the four-vibrator method for ejecting an acoustic stream jet at a selected position in the vessel. Analyses of this method are performed to establish it theoretically and to compare it with an experiment conducted previously. The analyses indicate that the behavior of acoustic streaming generated by the four vibrators in the vessel can be correctly predicted, and they provide a foundation for developing this method for the enhancement of heat transfer.

  11. Life cycle analyses applied to first generation bio-fuels consumed in France

    International Nuclear Information System (INIS)

    This rather voluminous publication reports detailed life cycle analyses for the present bio-fuel channels, also known as first-generation bio-fuels: bio-ethanol, bio-diesel, pure vegetal oils, and oil. After a recall of the general principles adopted for the life-cycle analysis, it reports the modelling of the different channels (agricultural steps, bio-fuel production steps, ethyl tert-butyl ether or ETBE steps, vehicles, animal fats and used vegetal oils, land-use change). It gives synthetic descriptions of the different production routes (methyl ester from different plants, ethanol from different plants). It reports and compares the results obtained in terms of performance

  12. Analyses of an air conditioning system with entropy generation minimization and entransy theory

    Science.gov (United States)

    Yan-Qiu, Wu; Li, Cai; Hong-Juan, Wu

    2016-06-01

    In this paper, based on the generalized heat transfer law, an air conditioning system is analyzed with the entropy generation minimization and the entransy theory. Taking the coefficient of performance (COP) and the heat flow rate Q_out released into the room as the optimization objectives, we discuss the applicability of the entropy generation minimization and entransy theories to the optimizations. Five numerical cases are presented. Combining the numerical results and theoretical analyses, we conclude that the applicability of the two theories is conditional. If Q_out is the optimization objective, a larger entransy increase rate always leads to a larger Q_out, while a smaller entropy generation rate does not. If we take COP as the optimization objective, neither the entropy generation minimization nor the concept of entransy increase is always applicable. Furthermore, we find that the concept of entransy dissipation is not applicable for the discussed cases. Project supported by the Youth Programs of Chongqing Three Gorges University, China (Grant No. 13QN18).
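
    For reference, the two competing quantities discussed above have simple textbook forms for a steady heat flow Q between a hot level T_1 and a cold level T_2; the paper evaluates their generalizations under the generalized heat transfer law:

        \[ \dot{S}_{\mathrm{gen}} = Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right), \qquad \dot{G}_{\mathrm{diss}} = Q\,(T_1 - T_2) \]

    Entropy generation minimization seeks to reduce \dot{S}_{\mathrm{gen}}, while the entransy theory works with the dissipation (or increase) of entransy \dot{G}.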

  14. Genetic Analyses of a Three Generation Family Segregating Hirschsprung Disease and Iris Heterochromia.

    Directory of Open Access Journals (Sweden)

    Long Cui

    Full Text Available We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family.

  15. Genetic Analyses of a Three Generation Family Segregating Hirschsprung Disease and Iris Heterochromia.

    Science.gov (United States)

    Cui, Long; Wong, Emily Hoi-Man; Cheng, Guo; Firmato de Almeida, Manoel; So, Man-Ting; Sham, Pak-Chung; Cherny, Stacey S; Tam, Paul Kwong-Hang; Garcia-Barceló, Maria-Mercè

    2013-01-01

    We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family. PMID:23840513

  16. Trend analyses of the emergency diesel generator problem events in Japanese and U.S. nuclear power plants

    International Nuclear Information System (INIS)

    Up to 2009, the author and a colleague conducted trend analyses of problem events related to main generators, emergency diesel generators, breakers, motors and transformers, the electric components most likely to cause problems in nuclear power plants. Among these components with a high frequency of defect occurrence, the emergency diesel generators had not been analysed for several years. They are very important components, needed to shut down a nuclear reactor safely and to cool it down when the external power supply is lost. Trend analyses were therefore conducted for a second time, covering 80 problem events with emergency diesel generators which occurred in U.S. nuclear power plants in the five years from 2005 through 2009, reported in the Licensee Event Reports (LERs: event reports submitted to the NRC by U.S. nuclear power plants) registered in the nuclear information database of the Institute of Nuclear Safety System, Inc. (INSS), as well as 40 events registered in the Nuclear Information Archives (NUCIA) which occurred in Japanese nuclear power plants in the same period. The trend analyses showed that the frequency of defect occurrence is high in both Japanese and U.S. plants during plant operation and functional tests (that is, defects can be discovered effectively in advance), so that implementation of periodical functional tests during plant operation is an important task for the future. (author)

  17. Hydrologic analyses in support of the Navajo Generating Station–Kayenta Mine Complex environmental impact statement

    Science.gov (United States)

    Leake, Stanley A.; Macy, Jamie P.; Truini, Margot

    2016-06-01

    reclamation operations within the Kayenta Mine permit boundary since 1973. The KMC part of the proposed project requires approval by the Office of Surface Mining (OSM) of a significant revision of the mine’s permit to operate in accordance with the Surface Mine Control and Reclamation Act (Public Law 95-87, 91 Stat. 445 [30 U.S.C. 1201 et seq.]). The revision will identify coal resource areas that may be used to continue extracting coal at the present rate of approximately 8.2 million tons per year. The Kayenta Mine Complex uses water pumped from the D and N aquifers beneath PWCC’s leasehold to support mining and reclamation activities. Prior to 2006, water from the PWCC well field also was used to transport coal by way of a coal-slurry pipeline to the now-closed Mohave Generating Station. Water usage at the leasehold was approximately 4,100 acre-feet per year (acre-ft/yr) during the period the pipeline was in use, and declined to an average 1,255 acre-ft/yr from 2006 to 2011. The Probable Hydrologic Consequences (PHC) section of the mining and reclamation permit must be modified to project the consequences of extended water use by the mine for the duration of the KMC part of the project, including a post-mining reclamation period. Since 1971, the U.S. Geological Survey (USGS) has conducted the Black Mesa Monitoring Program, which consists of monitoring water levels and water quality in the N aquifer, compiling information on water use by PWCC and tribal communities, maintaining several stream-gaging stations, measuring discharge at selected springs, conducting special studies, and reporting findings. These data are useful in evaluating the effects on the N aquifer from PWCC and community pumping, and the effects of variable precipitation. The EIS will assess the impacts of continued pumping on the N aquifer, including changes in storage, water quality, and effects on spring and baseflow discharge, by proposed mining through 2044, and during the reclamation process to 2057

  19. Quantum, classical and semiclassical analyses of photon statistics in harmonic generation

    CERN Document Server

    Bajer, Jiri; Miranowicz, Adam

    2001-01-01

    In this review, we compare different descriptions of photon-number statistics in harmonic generation processes within quantum, classical and semiclassical approaches. First, we study the exact quantum evolution of harmonic generation by applying numerical methods, including those of Hamiltonian diagonalization and global characteristics. We show explicitly that harmonic generation can indeed serve as a source of nonclassical light. Then, we demonstrate that quasi-stationary sub-Poissonian light can be generated in these quantum processes under conditions corresponding to the so-called no-energy-transfer regime known in classical nonlinear optics. By applying the method of classical trajectories, we demonstrate that the analytical predictions of the Fano factors are in good agreement with the quantum results. On comparing second- and higher-harmonic generation in the no-energy-transfer regime, we show that the highest noise reduction is achieved in third-harmonic generation with the Fano factor of the ...
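
    The Fano factor referred to above is the standard photon-number noise measure

        \[ F = \frac{\langle (\Delta n)^2 \rangle}{\langle n \rangle}, \]

    with F = 1 for Poissonian (coherent) light and F < 1 indicating the sub-Poissonian, nonclassical light discussed in the review.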

  20. Analyses of steam generator collector rupture for WWER-1000 using Relap5 code

    Energy Technology Data Exchange (ETDEWEB)

    Balabanov, E.; Ivanova, A. [Energoproekt, Sofia (Bulgaria)

    1995-12-31

    The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the release of primary coolant to the atmosphere, to point out the necessity of a well defined operator strategy for this type of accident, and to evaluate the possibility of diagnosing the accident and minimizing the radiological impact on the environment.

  1. Participation of smaller size renewable generation in the electricity market trade in UK: Analyses and approaches

    DEFF Research Database (Denmark)

    Romanovsky, G.; Xydis, G.; Mutale, J.

    2011-01-01

    While there are presently different options for renewable and distributed generation (RES/DG) to participate in the UK electricity market, none of the market options is specifically tailored for such types of generation and in particular, the smaller (up to 5 MW) RES/DG. This is because the UK ha...

  2. Scale/Analytical Analyses of Freezing and Convective Melting with Internal Heat Generation

    Energy Technology Data Exchange (ETDEWEB)

    Ali S. Siahpush; John Crepeau; Piyush Sabharwall

    2013-07-01

    Using a scale/analytical analysis approach, we model phase change (melting) for pure materials with constant internal heat generation at small Stefan numbers (approximately one). The analysis considers conduction in the solid phase and natural convection, driven by internal heat generation, in the liquid regime. The model is applied to a constant surface temperature boundary condition, where the melting temperature is greater than the surface temperature, in a cylindrical geometry. The analysis also considers a constant heat flux condition (likewise in a cylindrical geometry). We show the time scales on which conduction and convection heat transfer dominate.
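
    For reference, the Stefan number used above is conventionally defined as

        \[ \mathrm{Ste} = \frac{c_p\,\Delta T}{h_{sl}}, \]

    where c_p is the specific heat of the liquid, \Delta T the characteristic temperature difference (here between the melting and surface temperatures), and h_{sl} the latent heat of fusion; the paper's nondimensionalization may use a different characteristic \Delta T.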

  3. Economic analysis of geothermal electricity generation in Germany; Oekonomische Analyse einer geothermischen Stromerzeugung in Deutschland

    Energy Technology Data Exchange (ETDEWEB)

    Frick, S. [Institut fuer Energetik und Umwelt gGmbH, Leipzig (Germany); Huenges, E [GeoForschungsZentrum (GFZ), Potsdam (Germany); Jung, R. [Institut fuer Geowissenschaftliche Gemeinschaftsaufgaben (GGA), Hannover (Germany); Kaltschmitt, M. [Institut fuer Energetik und Umwelt gGmbH, Leipzig (Germany); Technische Univ. Hamburg-Harburg, Hamburg (DE). Inst. fuer Umwelttechnik und Energiewirtschaft (IUE)

    2007-07-01

    Owing to its large resources in Germany, geothermal energy is an option that can contribute notably to the future energy supply. The amendment of the EEG (law on the use of renewables) has therefore drawn much more interest to geothermal electricity generation. Against this background, the objective of this article is to identify the main cost drivers and risks of geothermal power and heat generation under the geological conditions in Germany and to derive recommendations. (orig.)

  4. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  6. Analyses on the Ionization Instability of Non-Equilibrium Seeded Plasma in an MHD Generator

    Science.gov (United States)

    Le, Chi Kien

    2016-06-01

    Recently, research on closed cycle magnetohydrodynamic (MHD) power generation systems has focused on improving the isentropic efficiency and the enthalpy extraction ratio. By reducing the cross-section area ratio of the disk MHD generator, it is believed that a high isentropic efficiency can be achieved at the same enthalpy extraction. In this study, a result concerning the plasma state which takes into account the ionization instability of non-equilibrium seeded plasma is added to the theoretical prediction of the relationship between enthalpy extraction and isentropic efficiency. As a result, an electron temperature which reaches the complete seed ionization state without growth of the ionization instability can be realized at a relatively high seed fraction. However, the upper limit of the power generation performance is suggested to remain lower than the value expected at a low seed fraction. It is also suggested that higher power generation performance may be obtained by realizing an electron temperature range that reaches the complete seed ionization state at a low seed fraction.

  7. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  8. Analysing the statistics of group constants generated by Serpent 2 Monte Carlo code

    International Nuclear Information System (INIS)

    An important topic in Monte Carlo neutron transport calculations is to verify that the statistics of the calculated estimates are correct. Undersampling, non-converged fission source distribution and inter-cycle correlations may result in inaccurate results. In this paper, we study the effect of the number of neutron histories on the distributions of homogenized group constants and assembly discontinuity factors generated using Serpent 2 Monte Carlo code. We apply two normality tests and a so-called “drift-in-mean” test to the batch-wise distributions of selected parameters generated for two assembly types taken from the MIT BEAVRS benchmark. The results imply that in the tested cases the batch-wise estimates of the studied group constants can be regarded as normally distributed. We also show that undersampling is an issue with the calculated assembly discontinuity factors when the number of neutron histories is small. (author)
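
    As an illustration of the kind of checks described, the sketch below applies a Shapiro-Wilk normality test and a simple drift-in-mean check (a linear trend over batch index) to synthetic batch-wise estimates; the paper's exact test statistics are not reproduced here, and the data are placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        batches = rng.normal(loc=1.02, scale=0.003, size=200)  # batch-wise group constant

        w_stat, p_norm = stats.shapiro(batches)   # H0: batch estimates are normal

        # Drift in mean: regress the batch value on the batch index and test slope = 0.
        idx = np.arange(batches.size)
        res = stats.linregress(idx, batches)
        print(f"Shapiro-Wilk p = {p_norm:.3f}, drift slope p = {res.pvalue:.3f}")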

  9. Energetic and Exergetic Analyses of a Direct Steam Generation Solar Thermal Power Plant in Cyprus

    OpenAIRE

    Hamidi, Armita

    2012-01-01

    ABSTRACT: In recent decades, the threat of climate change and other environmental impacts of fossil fuels have reinforced interests in alternative and renewable energy sources for producing electricity. In this regard, solar thermal energy can be utilized in existing power generation plants as replacement for the heat produced by means of fossil fuels. The objective of this study is to investigate the energetic and exergetic feasibility of utilizing a solar thermal power plant in Cyprus. The...

  10. Analysing the automotive power generation system (浅析汽车发电机)

    Institute of Scientific and Technical Information of China (English)

    牟亮

    2016-01-01

    This article briefly introduces the structure of the automotive power generation system and the operating principles of its components. It addresses the problem that, when reading a vehicle electrical schematic, one sees only a single symbol for the power generation system without understanding its internal composition and principles. It provides useful technical reference material on power generation systems for later research work.

  11. Changes in Sexual Behavior and Attitudes Across Generations and Gender Among a Population-Based Probability Sample From an Urbanizing Province in Thailand.

    Science.gov (United States)

    Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro

    2016-02-01

    Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.

  12. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  13. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
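
    To make the shape-feature idea concrete, the sketch below computes Fourier descriptors of closed 2-D contours and trains a three-class SVM on them, mirroring the recognition step described above; the toy lobed contours, parameters and normalization are assumptions for illustration, not the authors' implementation.

        import numpy as np
        from sklearn.svm import SVC

        def fourier_descriptors(contour, n_coeffs=8):
            # contour: (N, 2) ordered boundary points of a closed 2-D shape.
            z = contour[:, 0] + 1j * contour[:, 1]
            mags = np.abs(np.fft.fft(z))[1:n_coeffs + 1]
            return mags / (mags[0] + 1e-12)   # normalize for scale invariance

        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
        X, y = [], []
        for label in range(3):                 # three toy shape classes
            for _ in range(20):
                r = 1.0 + 0.3 * np.cos((label + 2) * t) + 0.03 * rng.standard_normal(t.size)
                contour = np.column_stack((r * np.cos(t), r * np.sin(t)))
                X.append(fourier_descriptors(contour))
                y.append(label)

        clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))
        print(clf.score(np.array(X), np.array(y)))   # training accuracy on the toy set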

  14. Analyses of MYMIV-induced transcriptome in Vigna mungo as revealed by next generation sequencing

    Directory of Open Access Journals (Sweden)

    Sayak Ganguli

    2016-03-01

    Full Text Available Mungbean Yellow Mosaic Virus (MYMIV) is the viral pathogen that causes yellow mosaic disease in a number of legumes including Vigna mungo. VM84 is a recombinant inbred line resistant to MYMIV, developed in our laboratory through introgression of the resistance trait from V. mungo line VM-1. Here we present the quality-control-passed transcriptome data of mock-inoculated (control) and MYMIV-infected VM84, which have already been submitted to the Sequence Read Archive (SRX1032950, SRX1082731) of NCBI. QC reports of the FASTQ files were generated by the ‘SeqQC V2.2’ bioinformatics tool.

  15. Analyses of MYMIV-induced transcriptome in Vigna mungo as revealed by next generation sequencing.

    Science.gov (United States)

    Ganguli, Sayak; Dey, Avishek; Banik, Rahul; Kundu, Anirban; Pal, Amita

    2016-03-01

    Mungbean Yellow Mosaic Virus (MYMIV) is the viral pathogen that causes yellow mosaic disease in a number of legumes including Vigna mungo. VM84 is a recombinant inbred line resistant to MYMIV, developed in our laboratory through introgression of the resistance trait from V. mungo line VM-1. Here we present the quality-control-passed transcriptome data of mock-inoculated (control) and MYMIV-infected VM84, which have already been submitted to the Sequence Read Archive (SRX1032950, SRX1082731) of NCBI. QC reports of the FASTQ files were generated by the 'SeqQC V2.2' bioinformatics tool.

  16. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are the subject of intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems, or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  17. Thermodynamic analyses of a biomass-coal co-gasification power generation system.

    Science.gov (United States)

    Yan, Linbo; Yue, Guangxi; He, Boshu

    2016-04-01

    A novel chemical looping power generation system is presented based on the biomass-coal co-gasification with steam. The effects of different key operation parameters including biomass mass fraction (Rb), steam to carbon mole ratio (Rsc), gasification temperature (Tg) and iron to fuel mole ratio (Rif) on the system performances like energy efficiency (ηe), total energy efficiency (ηte), exergy efficiency (ηex), total exergy efficiency (ηtex) and carbon capture rate (ηcc) are analyzed. A benchmark condition is set, under which ηte, ηtex and ηcc are found to be 39.9%, 37.6% and 96.0%, respectively. Furthermore, detailed energy Sankey diagram and exergy Grassmann diagram are drawn for the entire system operating under the benchmark condition. The energy and exergy efficiencies of the units composing the system are also predicted.

  19. Geospatial analyses and system architectures for the next generation of radioactive materials risk assessment and routing

    Energy Technology Data Exchange (ETDEWEB)

    Ganter, J.H.

    1996-02-01

    This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to these challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technology markets are providing new technologies like commercial off-the-shelf (COTS) software, common datasets, "open" GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.

  20. Generation of a Gaussia luciferase-expressing endotheliotropic cytomegalovirus for screening approaches and mutant analyses.

    Science.gov (United States)

    Falk, Jessica J; Laib Sampaio, Kerstin; Stegmann, Cora; Lieber, Diana; Kropff, Barbara; Mach, Michael; Sinzger, Christian

    2016-09-01

    For many questions in human cytomegalovirus (HCMV) research, assays are desired that allow robust and fast quantification of infection efficiencies under high-throughput conditions. The secreted Gaussia luciferase has been demonstrated as a suitable reporter in the context of a fibroblast-adapted HCMV strain, which however is greatly restricted in the number of cell types to which it can be applied. We inserted the Gaussia luciferase expression cassette into the BAC-cloned virus strain TB40-BAC4, which displays the natural broad cell tropism of HCMV and hence allows application to screening approaches in a variety of cell types including fibroblasts, epithelial, and endothelial cells. Here, we applied the reporter virus TB40-BAC4-IE-GLuc to identify mouse hybridoma clones that preferentially neutralize infection of endothelial cells. In addition, as the Gaussia luciferase is secreted into culture supernatants from infected cells it allows kinetic analyses in living cultures. This can speed up and facilitate phenotypic characterization of BAC-cloned mutants. For example, we analyzed a UL74 stop-mutant of TB40-BAC4-IE-GLuc immediately after reconstitution in transfected cultures and found the increase of luciferase delayed and reduced as compared to wild type. Phenotypic monitoring directly in transfected cultures can minimize the risk of compensating mutations that might occur with extended passaging. PMID:27326666

  1. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools that yield useful knowledge to support marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making about supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprises the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested by using a FoodMart company database; the tests included performance, functionality and results' validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
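
    For readers new to association rules, the sketch below shows the support and confidence measures on a toy basket set in Python; it is a brute-force illustration of the measures only, not the candidate-generation-free algorithm (e.g. FP-Growth) selected in the paper, and the baskets are invented.

        from itertools import combinations

        baskets = [{"bread", "milk"}, {"bread", "beer"}, {"milk", "beer"},
                   {"bread", "milk", "beer"}, {"bread", "milk"}]

        def support(itemset):
            # Fraction of baskets containing every item of the itemset.
            return sum(itemset <= b for b in baskets) / len(baskets)

        items = sorted(set().union(*baskets))
        for a, b in combinations(items, 2):
            s = support({a, b})
            if s >= 0.4:                        # minimum support threshold
                conf = s / support({a})         # confidence of the rule a -> b
                print(f"{a} -> {b}: support={s:.2f}, confidence={conf:.2f}")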

  2. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground in 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage of a 380 V bus serving an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A study using Monte Carlo stochastic simulation (SMC) was performed to obtain, for each DG level, the sag probability curves and the probability density by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Program for Analysis of Simultaneous Faults (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in Java. Data processing was done using the MATLAB software.
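
    The Monte Carlo idea described above can be illustrated as follows: sample random fault events, classify the resulting residual voltage at the monitored bus into sag classes, and scale by the system fault rate to estimate sags per year per class. The voltage model and fault rate below are purely illustrative placeholders, not the ANAFAS results.

        import numpy as np

        rng = np.random.default_rng(42)
        faults_per_year = 25.0                           # assumed system fault rate
        classes = [(0.0, 0.4), (0.4, 0.7), (0.7, 0.9)]   # residual voltage bands (p.u.)

        n = 100_000
        residual = rng.beta(4, 2, size=n)   # stand-in for fault-driven residual voltage
        for lo, hi in classes:
            p = np.mean((residual >= lo) & (residual < hi))
            print(f"class [{lo:.1f}, {hi:.1f}) p.u.: {faults_per_year * p:.2f} sags/yr")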

  3. Analysing the Risks and Challenges of the Pad Device as an Education Tool and Its Influence on Generation Y

    OpenAIRE

    Sun, Jingxuan

    2012-01-01

    Sun, Jingxuan 2012. Analysing the Risks and Challenges of the Pad Device as an Education Tool and Its Influence on Generation Y: Bachelor’s Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 68. Appendices 2. The objective of this research was to explore the use of the pad device as an education tool. Furthermore, the research work also intended to study the impact of the pad device among young students. The advantages of using the pad devices instead of compu...

  4. A comparative study between xerographic, computer-assisted overlay generation and animated-superimposition methods in bite mark analyses.

    Science.gov (United States)

    Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran

    2016-09-01

    The aim of this study was to compare the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts, and the incisal edge outlines were then transferred onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life-size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. The bite mark analyses using computer-assisted overlay generation were done by matching an overlay and the corresponding bite mark images digitally using Adobe Photoshop®. Another comparison method was superimposing the cast images with the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. A score with a range of 0-3 was given during analysis to each precision-determining criterion, and the score increased with better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by the computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital method is discernible despite the human skin being a poor recording medium of bite marks.

  6. A comparative study of first and all-author co-citation counting, and two different matrix generation approaches applied for author co-citation analyses

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg; Larsen, Birger; Ingwersen, Peter

    2009-01-01

    Aim: The present article contributes to the current methodological debate concerning author co-citation analyses (ACA). The study compares two different units of analysis, i.e. first-author versus inclusive all-author co-citation counting, as well as two different matrix generation approaches ... for the seeming advantages of the Drexel approach.

  7. Gene Mutation Profiles in Primary Diffuse Large B Cell Lymphoma of Central Nervous System: Next Generation Sequencing Analyses

    Directory of Open Access Journals (Sweden)

    Milena Todorovic Balint

    2016-05-01

    Full Text Available The existence of a potential primary central nervous system lymphoma-specific genomic signature that differs from the systemic form of diffuse large B cell lymphoma (DLBCL) has been suggested, but is still controversial. We investigated 19 patients with primary DLBCL of central nervous system (DLBCL CNS) using the TruSeq Amplicon Cancer Panel (TSACP) for 48 cancer-related genes. Next generation sequencing (NGS) analyses have revealed that over 80% of potentially protein-changing mutations were located in eight genes (CTNNB1, PIK3CA, PTEN, ATM, KRAS, PTPN11, TP53 and JAK3), pointing to the potential role of these genes in lymphomagenesis. TP53 was the only gene harboring mutations in all 19 patients. In addition, the presence of mutated TP53 and ATM genes correlated with a higher total number of mutations in other analyzed genes. Furthermore, the presence of mutated ATM correlated with poorer event-free survival (EFS) (p = 0.036). The presence of the mutated SMO gene correlated with earlier disease relapse (p = 0.023), inferior event-free survival (p = 0.011) and overall survival (OS) (p = 0.017), while mutations in the PTEN gene were associated with inferior OS (p = 0.048). Our findings suggest that the TP53 and ATM genes could be involved in the molecular pathophysiology of primary DLBCL CNS, whereas mutations in the PTEN and SMO genes could affect survival regardless of the initial treatment approach.

  8. Gene Mutation Profiles in Primary Diffuse Large B Cell Lymphoma of Central Nervous System: Next Generation Sequencing Analyses

    Science.gov (United States)

    Todorovic Balint, Milena; Jelicic, Jelena; Mihaljevic, Biljana; Kostic, Jelena; Stanic, Bojana; Balint, Bela; Pejanovic, Nadja; Lucic, Bojana; Tosic, Natasa; Marjanovic, Irena; Stojiljkovic, Maja; Karan-Djurasevic, Teodora; Perisic, Ognjen; Rakocevic, Goran; Popovic, Milos; Raicevic, Sava; Bila, Jelena; Antic, Darko; Andjelic, Bosko; Pavlovic, Sonja

    2016-01-01

    The existence of a potential primary central nervous system lymphoma-specific genomic signature that differs from the systemic form of diffuse large B cell lymphoma (DLBCL) has been suggested, but is still controversial. We investigated 19 patients with primary DLBCL of central nervous system (DLBCL CNS) using the TruSeq Amplicon Cancer Panel (TSACP) for 48 cancer-related genes. Next generation sequencing (NGS) analyses have revealed that over 80% of potentially protein-changing mutations were located in eight genes (CTNNB1, PIK3CA, PTEN, ATM, KRAS, PTPN11, TP53 and JAK3), pointing to the potential role of these genes in lymphomagenesis. TP53 was the only gene harboring mutations in all 19 patients. In addition, the presence of mutated TP53 and ATM genes correlated with a higher total number of mutations in other analyzed genes. Furthermore, the presence of mutated ATM correlated with poorer event-free survival (EFS) (p = 0.036). The presence of the mutated SMO gene correlated with earlier disease relapse (p = 0.023), inferior event-free survival (p = 0.011) and overall survival (OS) (p = 0.017), while mutations in the PTEN gene were associated with inferior OS (p = 0.048). Our findings suggest that the TP53 and ATM genes could be involved in the molecular pathophysiology of primary DLBCL CNS, whereas mutations in the PTEN and SMO genes could affect survival regardless of the initial treatment approach. PMID:27164089

  9. Value added structures and coordination structures of the decentral power generation. An actor-centered and institution-centered analyses by means of selected case examples; Wertschoepfungs- und Koordinationsstrukturen der dezentralen Stromerzeugung. Eine akteur- und institutionenzentrierte Analyse anhand ausgewaehlter Fallbeispiele

    Energy Technology Data Exchange (ETDEWEB)

    Brocke, Tobias

    2012-07-01

    Against the background of energy and climate policy decisions, decentralized power generation has gained importance in Germany. Previous research on this topic has mostly dealt with technical, legal, environmental and economic issues, as well as potential analyses for certain forms of power generation. In contrast, the contribution under consideration deals with the organizational and coordination structures of decentralized power generation at the local and regional level. In particular, it examines to what extent decentralized power generation results in the formation of localized production linkages. In addition, it examines the importance of the institutional framework as well as the role of the regulatory, political and civil society actors who are affected by distributed power generation.

  10. Detailed phenotypic and molecular analyses of genetically modified mice generated by CRISPR-Cas9-mediated editing.

    Directory of Open Access Journals (Sweden)

    Bijal A Parikh

    Full Text Available The bacterial CRISPR-Cas9 system has been adapted for use as a genome editing tool. While several recent reports have indicated that successful genome editing of mice can be achieved, detailed phenotypic and molecular analyses of the mutant animals are limited. Following pronuclear micro-injection of fertilized eggs with either wild-type Cas9 or the nickase mutant (D10A) and single or paired guide RNA (sgRNA) for targeting of the tyrosinase (Tyr) gene, we assessed genome editing in mice using rapid phenotypic readouts (eye and coat color). Mutant mice with insertions or deletions (indels) in Tyr were efficiently generated without detectable off-target cleavage events. Gene correction of a single nucleotide by homologous recombination (HR) could only occur when the sgRNA recognition sites in the donor DNA were modified. Gene repair did not occur if the donor DNA was not modified because Cas9 catalytic activity was completely inhibited. Our results indicate that allelic mosaicism can occur following CRISPR-Cas9-mediated editing in mice and appears to correlate with sgRNA cleavage efficiency at the single-cell stage. We also show that larger than expected deletions may be overlooked based on the screening strategy employed. An unbiased analysis of all the deleted nucleotides in our experiments revealed that the highest frequencies of nucleotide deletions were clustered around the predicted Cas9 cleavage sites, with slightly broader distributions than expected. Finally, additional analysis of founder mice and their offspring indicate that their general health, fertility, and the transmission of genetic changes were not compromised. These results provide the foundation to interpret and predict the diverse outcomes following CRISPR-Cas9-mediated genome editing experiments in mice.

  11. Child Benefits and Welfare for Current and Future Generations: Simulation Analyses in an Overlapping-Generations Model with Endogenous Fertility(in Japanese)

    OpenAIRE

    Oguro, Kazumasa; SHIMASAWA Manabu; TAKAHATA Junichiro

    2010-01-01

    We constructed an overlapping-generations model with endogenous fertility to analyze the effect of child benefits and pensions on welfare for current and future generations. The following results were obtained. First, when financial sustainability is not taken into account, the best policy to improve the welfare of future generations is to increase child benefits, financed by issuing government debt. On the other hand, when financial sustainability is taken into account, the best policy is to...

  12. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  13. A Novel Numerical Algorithm for Optimal Sizing of a Photovoltaic/Wind/Diesel Generator/Battery Microgrid Using Loss of Load Probability Index

    Directory of Open Access Journals (Sweden)

    Hussein A. Kazem

    2013-01-01

    Full Text Available This paper presents a method for determining the optimal sizes of the PV array, wind turbine, diesel generator, and storage battery installed in a building-integrated system. The objective of the proposed optimization is to design a system that can supply the building load demand at minimum cost and maximum availability. Mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. The results showed that the optimum sizing ratios (the daily energy generated by the source to the daily energy demand) for the PV array, wind turbine, diesel generator, and battery for a system located in Sohar, Oman, are 0.737, 0.46, 0.22, and 0.17, respectively. A case study is presented for a system consisting of a 30 kWp PV array (36%), an 18 kWp wind farm (55%), and a 5 kVA diesel generator (9%), supposed to power a 200 kWh/day load demand. It is found that the generated energy shares of the PV array, wind farm, and diesel generator are 36%, 55%, and 9%, respectively, while the cost of energy is 0.17 USD/kWh.
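
    The loss of load probability (LLP) index at the heart of such sizing studies is simple to state: the fraction of total demand that the hybrid system fails to serve over a reference period. The sketch below evaluates the index on synthetic hourly data; the generation and load series, the battery model, and all numbers are illustrative assumptions, not the paper's algorithm or site data.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 365

# Hypothetical hourly energy series in kWh; a real study would use measured
# solar, wind, and load data for the site.
pv = 12.0 * np.clip(np.sin(np.linspace(0.0, 2.0 * np.pi * 365, hours)), 0.0, None)
wind = rng.gamma(shape=2.0, scale=3.0, size=hours)
load = 200.0 / 24.0 + rng.normal(0.0, 1.0, hours)  # ~200 kWh/day demand

def loss_of_load_probability(gen, demand, battery_kwh):
    """Fraction of total demand the system fails to serve (the LLP index)."""
    soc, deficit = 0.0, 0.0
    for g, d in zip(gen, demand):
        balance = soc + g - d
        if balance >= 0.0:
            soc = min(balance, battery_kwh)  # store surplus, capped by capacity
        else:
            soc, deficit = 0.0, deficit - balance  # -balance is unmet demand
    return deficit / demand.sum()

llp = loss_of_load_probability(pv + wind, load, battery_kwh=34.0)
print(f"LLP = {llp:.3f}")
```

    Sweeping candidate component sizes and keeping the cheapest combination whose LLP stays below a target availability reproduces the basic logic of such sizing algorithms.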

  14. Collection of emanating 222Rn for the preparation of a 210Pb-210Po alpha-source and the building of a mobile random pulse and probability generator utilizing alpha-counting technique

    International Nuclear Information System (INIS)

    A random pulse and probability generator (RPG) has been developed utilizing the detection of alpha-particles as the random signal source. The collection technique for 222Rn emanating from natural uranium ore was examined for preparing highly pure 210Pb-210Po as an alpha source for the RPG. The yield with a trap refrigerated by liquid nitrogen was observed to be above 99% for 222Rn collection. (author)

  15. An automatic generation of non-uniform mesh for CFD analyses of image-based multiscale human airway models

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2014-11-01

    The authors have developed a method to automatically generate non-uniform CFD meshes for image-based human airway models. The sizes of the generated tetrahedral elements vary in both the radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and a smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
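
    The background-mesh idea reduces, for each branch, to a scalar size field that can be evaluated at any point. A minimal sketch of such a field is given below; the smooth blending function and the numeric targets are assumptions for illustration, not the authors' actual size law (which ties the wall element size to local flow rate and airway diameter).

```python
def element_size(wall_distance, radius, size_wall, size_center):
    """Blend the target edge length from the airway wall to the lumen center.

    wall_distance: distance from the wall (0 at the wall)
    radius:        local airway radius
    size_wall:     target edge length on the wall
    size_center:   target edge length at the lumen centerline
    """
    t = min(max(wall_distance / radius, 0.0), 1.0)  # normalized wall distance
    s = 3.0 * t**2 - 2.0 * t**3  # cubic blend keeps the size variation smooth
    return size_wall + (size_center - size_wall) * s

# Example: 2 mm radius branch, boundary-layer elements ten times finer than core
for d in (0.0, 0.5, 1.0, 2.0):
    print(d, round(element_size(d, 2.0, 0.05, 0.5), 3))
```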

  16. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
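
    One classic instance (not necessarily among the article's three) is the hat-check problem: the probability that a random permutation of n items has no fixed point tends to 1/e as n grows. A short simulation makes the limit visible:

```python
import math
import random

def no_fixed_point(n):
    """True if a uniformly random permutation of n items is a derangement."""
    p = list(range(n))
    random.shuffle(p)
    return all(p[i] != i for i in range(n))

trials = 100_000
hits = sum(no_fixed_point(52) for _ in range(trials))
print(f"estimated: {hits / trials:.4f}   1/e = {math.exp(-1):.4f}")
```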

  17. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  18. LMFBR steam generator tube-to-tubesheet weld with dissimilar ferritic steels - qualification tests and inelastic analyses

    International Nuclear Information System (INIS)

    To weld steam-generating tubes of EM12 to a tubesheet provided with a boss of 2.25 Cr. 1 Mo, STEIN INDUSTRIE has developed a process of internal welding without filler metal. The characterization tests carried out on test assemblies have shown the excellent metallurgical quality of welds performed by this process. High-temperature strength tests showed a safety margin compared with the results of the finite element calculations. The hypotheses made for these calculations, which take into account the elastoviscoplastic properties of the materials, and particularly the extension of the properties of 2.25 Cr. 1 Mo to the weld, can therefore be applied to steam-generator sizing calculations. (orig.)

  19. Performance of Monte Carlo Event Generators for the Production of Boson and Multi-Boson States in ATLAS Analyses

    CERN Document Server

    Gutschow, Christian; The ATLAS collaboration

    2016-01-01

    The Monte Carlo setups used by ATLAS to model boson+jets and multi-boson processes in 13 TeV pp collisions are described. Comparisons between data and several event generators are provided for key kinematic distributions at 7 TeV, 8 TeV and 13 TeV. Issues associated with sample normalisation and the evaluation of systematic uncertainties are also discussed.

  20. Thermal-hydraulic safety analyses supporting the steam generator replacement and uprating at Krško nuclear power plant

    OpenAIRE

    Mavko, Borut; Prošek, Andrej

    2015-01-01

    The Krško nuclear power plant has undertaken a major modernization project. The objectives of the project are: long-term stabilization of the plant's operation, uprating of the net electrical power output, higher availability and enhanced safety of the plant. The modernization also requires a thorough safety re-evaluation and therefore new thermal-hydraulic, mechanical and structural analyses. The thermal-hydraulic part of the safety analysis necessary for the steam generator replacement and ...

  1. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  2. Contributions to quantum probability

    International Nuclear Information System (INIS)

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  3. Experimental and thermodynamical analyses of the diesel exhaust vortex generator heat exchanger for optimizing its operating condition

    International Nuclear Information System (INIS)

    In this research, a vortex generator heat exchanger is used to recover exergy from the exhaust of an OM314 diesel engine. Twenty vortex generators with a 30° angle of attack are used to increase heat recovery while keeping the back pressure in the exhaust low. The experiments cover five engine loads (0, 20, 40, 60 and 80% of full load), two exhaust gas fractions (50 and 100%) and four water mass flow rates (50, 40, 30 and 20 g/s). After a thermodynamic analysis of the obtained data, an optimization study based on Central Composite Design (CCD) is performed, because of the complex effect of engine load and water mass flow rate on exergy recovery and irreversibility, to reach the best operating condition. - Highlights: • A vortex generator heat exchanger is used for diesel exhaust heat recovery. • A thermodynamic analysis is performed on the experimental data. • Exergy recovery and irreversibility are calculated for different exhaust gas fractions. • An optimization study is performed using the response surface method
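
    The exergy bookkeeping behind such an analysis is compact: a heat stream's exergy is its enthalpy change discounted by the dead-state temperature T0 acting through the entropy term. The sketch below evaluates recovered exergy and irreversibility for one hypothetical operating point; the temperatures, flow rates, and constant specific heats are stand-in assumptions, not measurements from the OM314 test rig.

```python
import math

T0 = 298.15  # dead-state (ambient) temperature, K

def water_exergy_gain(m_dot, t_in, t_out, cp=4186.0):
    """Exergy rate gained by the cooling water (W); m_dot in kg/s, T in K."""
    return m_dot * cp * ((t_out - t_in) - T0 * math.log(t_out / t_in))

def gas_exergy_drop(m_dot, t_in, t_out, cp=1100.0):
    """Exergy rate released by the exhaust gas (W); cp assumed constant."""
    return m_dot * cp * ((t_in - t_out) - T0 * math.log(t_in / t_out))

# Hypothetical operating point: 40 g/s water, exhaust cooled from 600 K to 450 K
recovered = water_exergy_gain(0.040, 298.15, 340.0)
released = gas_exergy_drop(0.030, 600.0, 450.0)
print(f"recovered {recovered:.0f} W, irreversibility {released - recovered:.0f} W")
```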

  4. Static analysis: from theory to practice; Static analysis of large-scale embedded code, generation of abstract domains; Analyse statique: de la theorie a la pratique; analyse statique de code embarque de grande taille, generation de domaines abstraits

    Energy Technology Data Exchange (ETDEWEB)

    Monniaux, D.

    2009-06-15

    Software operating critical systems (aircraft, nuclear power plants) should not fail - whereas most computerised systems of daily life (personal computer, ticket vending machines, cell phone) fail from time to time. This is not a simple engineering problem: it has been known, since the works of Turing and Cook, that proving that programs work correctly is intrinsically hard. In order to solve this problem, one needs methods that are, at the same time, efficient (moderate costs in time and memory), safe (all possible failures should be found), and precise (few warnings about nonexistent failures). In order to reach a satisfactory compromise between these goals, one must draw on fields as diverse as formal logic, numerical analysis and 'classical' algorithmics. From 2002 to 2007 I participated in the development of the Astree static analyser. This suggested to me a number of side projects, both theoretical and practical (use of formal proof techniques, analysis of numerical filters...). More recently, I became interested in the modular analysis of numerical properties and in the application to program analysis of constraint-solving techniques (semi-definite programming, SAT and SAT modulo theory). (author)

  5. Hierarchical spatiotemporal analyses of reactions using synchrotron radiation and the design of next-generation energy conversion devices

    International Nuclear Information System (INIS)

    To improve the performance of electrochemical devices such as batteries and fuel cells, it is essential to understand reaction hierarchies over wide temporal and spatial ranges. To this end, operando measurement techniques have been developed that enable analysis of the electrode/electrolyte interface of the reaction site, phase transitions of active materials, and macro reactions within real electrodes over various spatial and temporal scales. These analytic techniques pioneer a new way of performing kinetic analysis by introducing axes of space and time into reaction analyses, and are applicable to various types of electrochemical devices. Moreover, a magnesium rechargeable battery featuring the merits of high theoretical energy density, high safety, and easily acquirable raw materials was developed by employing these operando analytic techniques. (author)

  6. Analysing future solid waste generation - Soft linking a model of waste management with a CGE-model for Sweden

    OpenAIRE

    Östblom, Göran; Ljunggren Söderman, Maria; Sjöström, Magnus

    2010-01-01

    Parallel to the efforts of the EU to achieve a significant and overall reduction of waste quantities within the EU, the Swedish parliament enacted an environmental quality objective stating that ‘the total quantity of waste must not increase …’, i.e. an eventual absolute decoupling of waste generation from GDP. The decoupling issue is addressed, in the present paper, by assessing future waste quantities for a number of economic scenarios of the Swedish economy to 2030 with alternative assump...

  7. De novo assembly and next-generation sequencing to analyse full-length gene variants from codon-barcoded libraries.

    Science.gov (United States)

    Cho, Namjin; Hwang, Byungjin; Yoon, Jung-ki; Park, Sangun; Lee, Joongoo; Seo, Han Na; Lee, Jeewon; Huh, Sunghoon; Chung, Jinsoo; Bang, Duhee

    2015-01-01

    Interpreting epistatic interactions is crucial for understanding the evolutionary dynamics of complex genetic systems and unveiling the structure and function of genetic pathways. Although high-resolution mapping of en masse variant libraries enables molecular biologists to address genotype-phenotype relationships, long-read sequencing technology remains indispensable for assessing functional relationships between mutations that lie far apart. Here, we introduce JigsawSeq for multiplexed sequence identification of pooled gene variant libraries by combining a codon-based molecular barcoding strategy and de novo assembly of short-read data. We first validate JigsawSeq on small sub-pools and observe high precision and recall at various experimental settings. With extensive simulations, we then apply JigsawSeq to large-scale gene variant libraries to show that our method can be reliably scaled using next-generation sequencing. JigsawSeq may serve as a rapid screening tool for functional genomics and offer the opportunity to explore evolutionary trajectories of protein variants. PMID:26387459

  8. Uniqueness in ergodic decomposition of invariant probabilities

    OpenAIRE

    Zimmermann, Dieter

    1992-01-01

    We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.

  9. Enhancement in the structure quality of ZnO nanorods by diluted Co dopants: Analyses via optical second harmonic generation

    International Nuclear Information System (INIS)

    We report a systematic study of the effect of cobalt concentration in the growth solution on the crystallization, growth, and optical properties of hydrothermally synthesized Zn1−xCoxO [0 ≤ x ≤ 0.40, where x is the weight (wt.) % of Co in the growth solution] nanorods. A dilute Co concentration of 1 wt. % in the growth solution enhances the bulk crystal quality of the ZnO nanorods, whereas high wt. % leads to distortion of the ZnO lattice that depresses the crystallization, the growth, and the surface structure quality of ZnO. Although the Co concentration in the growth solution varies from 1 to 40 wt. %, the actual doping concentration is limited to 0.28 at. %, owing to the low growth temperature of 80 °C. The enhancement in the crystal quality of the ZnO nanorods at dilute Co concentration in the solution is due to strain relaxation; the strain is significantly higher for ZnO nanorods prepared either without Co or with high wt. % of Co in the growth solution. Second harmonic generation is used to investigate the net dipole distribution of these coatings, which provides detailed information about the bulk and surface structure quality of the ZnO nanorods at the same time. High quality ZnO nanorods are fabricated by a low-temperature (80 °C) hydrothermal synthesis method, and no post-synthesis treatment is needed for further crystallization. This method is therefore advantageous for the growth of high quality ZnO coatings on plastic substrates, which may lead toward applications in flexible electronics

  10. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  11. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
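
    The quadratic (Brier) score is the standard example of the scoring rules the abstract refers to; the martingale-based evaluation the paper develops is not reproduced here. A minimal sketch:

```python
import numpy as np

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and outcomes."""
    p, y = np.asarray(forecasts, float), np.asarray(outcomes, float)
    return float(np.mean((p - y) ** 2))

forecasts = [0.9, 0.7, 0.2, 0.1, 0.6]
outcomes = [1, 1, 0, 0, 1]   # 1 = the predicted event occurred
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
```

    Lower is better; forecasting the base rate every time gives a natural benchmark to beat.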

  12. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  13. Estimating Small Probabilities for Langevin Dynamics

    OpenAIRE

    Aristoff, David

    2012-01-01

    The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...
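
    The inefficiency that motivates such estimators is easy to demonstrate: a naive Monte Carlo estimate of a rare transition wastes almost all of its samples. The sketch below integrates overdamped Langevin dynamics in a double-well potential with the Euler-Maruyama scheme and estimates a barrier-crossing probability directly; the potential, parameters, and acceptance window are illustrative assumptions, and the paper's Girsanov-based estimator is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_V(x):
    """Gradient of the double-well potential V(x) = (x^2 - 1)^2."""
    return 4.0 * x * (x**2 - 1.0)

def transition_prob(x0=-1.0, target=1.0, beta=5.0, dt=1e-3, T=1.0, n=20_000):
    """Naive Monte Carlo estimate of P(X_T near target | X_0 = x0) for
    dX = -grad V(X) dt + sqrt(2/beta) dW (overdamped Langevin dynamics)."""
    x = np.full(n, x0)
    for _ in range(int(T / dt)):
        x += -grad_V(x) * dt + np.sqrt(2.0 * dt / beta) * rng.standard_normal(n)
    return float(np.mean(np.abs(x - target) < 0.2))

print(f"naive estimate of the transition probability: {transition_prob():.4f}")
```

    As beta grows the event becomes exponentially rarer and the naive estimator's relative error blows up, which is exactly the regime that importance-sampling formulas of Girsanov type are designed for.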

  14. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition.

  15. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  16. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  17. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  18. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability - frequentist, propensity, classical, Bayesian, and objective Bayesian - and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability.

  19. Introduction to probability models

    CERN Document Server

    Ross, Sheldon M

    2006-01-01

    Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as random variables.

  20. Molecular contingencies: reinforcement probability.

    Science.gov (United States)

    Hale, J M; Shimp, C P

    1975-11-01

    Pigeons obtained food by responding in a discrete-trials two-choice probability-learning experiment involving temporal stimuli. A given response alternative, a left- or right-key peck, had 11 associated reinforcement probabilities within each session. Reinforcement probability for a choice was an increasing or a decreasing function of the time interval immediately preceding the choice. The 11 equiprobable temporal stimuli ranged from 1 to 11 sec in 1-sec classes. Preference tended to deviate from probability matching in the direction of maximizing; i.e., the percentage of choices of the preferred response alternative tended to exceed the probability of reinforcement for that alternative. This result was qualitatively consistent with probability-learning experiments using visual stimuli. The result is consistent with a molecular analysis of operant behavior and poses a difficulty for molar theories holding that local variations in reinforcement probability may safely be disregarded in the analysis of behavior maintained by operant paradigms. PMID:16811883

  1. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship-ship c...

  2. Risk analysis of reservoir generation dispatching based on probability optimization method

    Institute of Scientific and Technical Information of China (English)

    王丽萍; 张验科; 纪昌明; 郑江涛; 李安强

    2011-01-01

    Risk analysis of hydropower generation scheduling is important for seeking the best scheduling scheme, one that takes both risk and benefit into consideration. A probability optimization method for risk analysis is proposed to avoid the defects of traditional methods, which struggle to meet the risk-calculation needs of medium- and long-term scheduling. A risk evaluation index system for long-term generation scheduling is constructed to quantify risk. Based on the probability optimization method, a stochastic expectation model of generation scheduling risk analysis is established, and a concept of sensitivity analysis of risk factors is introduced to measure the conversion relationships between the various risk indicators. The example results show that the model lets a reservoir accept a bounded level of risk in pursuit of the maximum generation benefit, and that it offers theoretical significance and reference value for the development of scheduling schemes and for decision-making.
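
    The trade-off such a stochastic expectation model formalizes - maximizing expected generation subject to a probabilistic risk index - can be illustrated with a deliberately simplified single-period sketch. The inflow distribution, storage threshold, and energy coefficients below are hypothetical stand-ins, not the paper's index system.

```python
import numpy as np

rng = np.random.default_rng(7)

def evaluate(release, n=100_000):
    """Toy single-period dispatch: energy grows with release, while a larger
    release raises the probability that storage drops below a safe level."""
    inflow = rng.lognormal(mean=3.0, sigma=0.4, size=n)   # random inflow, hm^3
    storage = 50.0 + inflow - release                     # end-of-period storage
    energy = 0.8 * release * 9.81 * 0.1                   # toy head/unit factors
    risk = float(np.mean(storage < 40.0))                 # risk index: P(too low)
    return energy, risk

for release in (20.0, 30.0, 40.0):
    e, r = evaluate(release)
    print(f"release {release:>4}: energy {e:6.1f}, risk P = {r:.3f}")
```

    Scanning candidate releases and keeping those whose risk index stays below a tolerance reproduces, in miniature, the "bounded risk for maximum benefit" behaviour the abstract describes.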

  3. Qubit persistence probability

    International Nuclear Information System (INIS)

    In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state, without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also of Palma, Suominen, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)

  4. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  5. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  6. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of various related topics.

  7. Discriminant analyses of stock prices by using multifractality of time series generated via multi-agent systems and interpolation based on wavelet transforms

    Science.gov (United States)

    Tokinaga, Shozo; Ikeda, Yoshikazu

    In investments, it is not easy to identify traders' behavior from stock prices, and agent systems may help us. This paper deals with discriminant analyses of stock prices using the multifractality of time series generated via multi-agent systems and interpolation based on wavelet transforms. We assume five types of agents, some of whom prefer forecast equations and some production rules. It is then shown that the time series of the artificial stock price emerges as a multifractal time series whose features are characterized by the Hausdorff dimension D(h). As a result, we see the relationship between the reliability (reproducibility) of the multifractality and D(h) given a sufficient number of time series data. Since sufficient samples are generally needed to estimate D(h), we use interpolations of the multifractal time series based on the wavelet transform.

  8. Probability-Based Method of Generating Conflict Trajectories for ATC System

    Institute of Scientific and Technical Information of China (English)

    苏志刚; 眭聪聪; 吴仁彪

    2011-01-01

    Two methods are commonly used to test the short-term conflict alerting capability of an air traffic control (ATC) system. The first is to raise the alerting threshold and use real data to check whether the system alerts when the distance between two flights falls below it; this method is unreliable. The second is to simulate flights that will conflict, compute their trajectories, and feed these data to the ATC system to observe its reaction; this method is usually too simple to test whether the system can effectively pre-detect a conflict. To address these problems, this paper uses a probabilistic approach to simulate aircraft with a given probability of conflict. First, the conflict probability of turning flights is derived from Prandini's method for estimating the conflict probability of linear flight. Then, by reverse derivation, the motion parameters of two targets with a preset conflict probability are obtained. Finally, the tracks of this pair of targets are simulated and their conflict probability is analysed. The simulation results show that the targets' probability of conflict matches the preset value. Because the trajectories generated by this algorithm are more realistic, a more reliable assessment of an ATC system's short-term conflict alerting and pre-detection capability can be obtained.
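
    For the simple straight-line case, a Monte Carlo check of a preset conflict probability takes only a few lines. The crossing geometry, the persistent Gaussian navigation-error model, and the 5 nm separation threshold below are illustrative assumptions, not Prandini's analytical method or the paper's reverse derivation.

```python
import numpy as np

rng = np.random.default_rng(42)

def conflict_probability(p1, v1, p2, v2, sigma=1.0, sep=5.0,
                         horizon=600.0, dt=5.0, n=50_000):
    """Estimate the probability that two straight-flying aircraft come within
    `sep` nm during `horizon` seconds, given a persistent Gaussian position
    error with standard deviation `sigma` nm per axis on each aircraft."""
    e1 = rng.normal(0.0, sigma, (n, 2))   # per-flight navigation error
    e2 = rng.normal(0.0, sigma, (n, 2))
    hit = np.zeros(n, dtype=bool)
    for t in np.arange(0.0, horizon + dt, dt) / 3600.0:   # seconds -> hours
        rel = (p1 - p2) + (v1 - v2) * t + (e1 - e2)
        hit |= np.linalg.norm(rel, axis=1) < sep
    return float(hit.mean())

# Hypothetical crossing geometry (positions in nm, speeds in knots)
p = conflict_probability(np.array([0.0, 0.0]), np.array([480.0, 0.0]),
                         np.array([40.0, -26.0]), np.array([0.0, 240.0]))
print(f"estimated conflict probability: {p:.3f}")
```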

  9. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  10. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  11. Dynamic update with probabilities

    NARCIS (Netherlands)

    J. van Benthem; J. Gerbrandy; B. Kooi

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant process taking place, and observation probabilities of events.

  12. Economy, probability and risk

    Directory of Open Access Journals (Sweden)

    Elena Druica

    2007-05-01

    Full Text Available The science of probabilities has earned a special place because it tries, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not invite polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of the frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications in economics.

  13. Probability Statistics Based Reactive Power Optimization of Distribution Network Containing Intermittent Distributed Generations

    Institute of Scientific and Technical Information of China (English)

    王淳; 高元海

    2014-01-01

    以有功网损期望值最小为优化目标,以节点电压的合格概率大于一定的阈值为约束条件,建立了同时考虑风能、太阳能分布式发电出力和负荷随机波动的配电网无功优化模型。目标函数和约束项中所涉及的概率潮流由一种结合传统解析法的基于全概率公式的计算方法求得。使用化学反应算法对所建优化模型进行求解。在同时接入风能、太阳能分布式电源的33节点和69节点系统上对所提方法进行了验证,得到了具有概率统计意义的最优方案。通过与包括遗传算法(genetic algorithm , GA)、Stud GA(stud genetic algorithm)、生物地理学算法(biogeography based optimization , BBO)和粒子群算法(particle swarm optimization,PSO)在内的多个智能算法对比,验证了所构建的化学反应算法在求解上述无功优化模型时性能更加稳定。%Taking the minimum expectation of active network loss as the optimization objective and the qualified probability of nodal voltage, which larger than a certain threshold, as the constraint, a reactive power optimization model of distribution network, in which the output fluctuation of distributed wind power generation and PV generation as well as the random fluctuation of load are considered simultaneously, is established. The probabilistic power flows involved in objective function and constraints are solved by a complete probability formula based computing method that combines with traditional analytical method. The established reactive power optimization model is solved by chemical reaction optimization (CRO). The proposed model is verified by IEEE 33-bus system and PG&E 69-bus system respectively, to which the distributed PV generation and wind power generation are simultaneously added, and an optimal scheme possessing the meaning of probability statistics is achieved. Comparing the constructed CRO algorithm with other intelligent algorithms, such as genetic

  14. Abstract Models of Probability

    Science.gov (United States)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements were proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly this leads to another concept of randomness, one whose nature differs from the combinatorial-metric conception of Kolmogorov. Apparently, a mismatch between the real type of randomness in some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  15. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  16. The concept of probability

    International Nuclear Information System (INIS)

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  17. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
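
    The chaos half of that idea can be sketched with the logistic map at full chaoticity: its invariant density is symmetric about 1/2, so thresholding the iterates yields approximately fair bits without a man-made random number generator. This only illustrates the principle; it omits the non-Lipschitz dynamics central to the paper, and successive bits are not independent.

```python
def chaotic_bits(n, x=0.61803398875):
    """Generate n pseudo-random bits from the fully chaotic logistic map
    x -> 4x(1 - x), thresholding each iterate at 1/2."""
    bits = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

b = chaotic_bits(100_000)
print(f"fraction of ones: {sum(b) / len(b):.4f}")  # close to 0.5
```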

  18. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  19. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  20. Stochastic Programming with Probability

    CERN Document Server

    Andrieu, Laetitia; Vázquez-Abad, Felisa

    2007-01-01

    In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...

  1. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  4. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  5. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  6. Probability with Roulette

    Science.gov (United States)

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
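
    For example, on an American wheel (38 pockets) every simple bet has the same expected value, -1/19 of the stake, which a few lines of code confirm; the bet list here is just a sample of the usual cases:

```python
from fractions import Fraction

def expected_value(payout, ways, pockets=38):
    """Expected profit per unit staked on a bet that wins `ways` of `pockets`
    equally likely outcomes and pays `payout` to 1."""
    p = Fraction(ways, pockets)
    return p * payout - (1 - p)

for name, payout, ways in [("straight", 35, 1), ("split", 17, 2), ("red", 1, 18)]:
    ev = expected_value(payout, ways)
    print(f"{name:>8}: EV = {ev} = {float(ev):.4f} per unit bet")
```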

  7. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  8. Asteroidal collision probabilities

    Science.gov (United States)

    Bottke, William F., Jr.; Greenberg, Richard

    1993-01-01

    Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.

  9. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  10. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  11. Launch Collision Probability

    Science.gov (United States)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.

  12. Application of the New Generation Tetrazolium Salt (XTT) for the Enumeration of Hydrocarbon Degrading Microorganisms Using the Most Probable Number Method

    Directory of Open Access Journals (Sweden)

    VICTORIA EUGENIA VALLEJO

    Full Text Available The objective of this study was to evaluate the performance of two tetrazolium indicators, a traditional one (INT) and a new generation one (XTT), for the estimation of hydrocarbon (HC) degrading microorganism density using the Most Probable Number (MPN) technique. Ninety-six composite soil samples were taken and analyzed from the Ecorregión Cafetera de Colombia. Degrading microorganisms were recovered in minimum salt medium under a saturated HC atmosphere, and the HC-degrading capacity of the microorganisms was confirmed by successive subcultures in the same medium using diesel as the only carbon source. Counts obtained with the two salts were not significantly different (Student's t test, p < 0.05), but XTT allowed easier visualization of positive wells owing to the solubility of the reduced product. A greater percentage of isolates was obtained using XTT (67%), which suggests that the type of salt is relevant for the recovery of these microorganisms. Additionally, the cell detection limit and the optimal conditions of XTT concentration and incubation times for the detection of activity were evaluated. This evaluation was performed in microplate format for hydrocarbon degrading microorganisms using Acinetobacter sp. An inhibitory effect was observed in the recovery of cultivable cells when XTT ...
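
    The MPN statistic itself is a maximum-likelihood estimate: a well turns positive if it received at least one viable organism, so at inoculum volume v the positive-well probability is 1 - exp(-lambda*v) for density lambda. The sketch below (using SciPy for the one-dimensional optimization) runs on a hypothetical dilution series, not the study's data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mpn(volumes, positives, wells):
    """Maximum-likelihood Most Probable Number estimate (organisms per mL).

    volumes:   sample volume per well at each dilution (mL)
    positives: positive wells observed at each dilution
    wells:     wells inoculated at each dilution
    """
    v, pos, n = (np.asarray(a, float) for a in (volumes, positives, wells))

    def neg_log_lik(log_lam):
        p = 1.0 - np.exp(-np.exp(log_lam) * v)   # P(well positive | density)
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        return -np.sum(pos * np.log(p) + (n - pos) * np.log(1.0 - p))

    res = minimize_scalar(neg_log_lik, bounds=(-10.0, 10.0), method="bounded")
    return float(np.exp(res.x))

# Hypothetical 3-dilution, 8-well series
print(f"MPN = {mpn([10.0, 1.0, 0.1], [8, 5, 1], [8, 8, 8]):.2f} organisms/mL")
```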

  13. Body mass index, exercise, and other lifestyle factors in relation to age at natural menopause: analyses from the breakthrough generations study.

    Science.gov (United States)

    Morris, Danielle H; Jones, Michael E; Schoemaker, Minouk J; McFadden, Emily; Ashworth, Alan; Swerdlow, Anthony J

    2012-05-15

    The authors examined the effect of women's lifestyles on the timing of natural menopause using data from a cross-sectional questionnaire used in the United Kingdom-based Breakthrough Generations Study in 2003-2011. The analyses included 50,678 women (21,511 who had experienced a natural menopause) who were 40-98 years of age at study entry and did not have a history of breast cancer. Cox competing risks proportional hazards models were fitted to examine the relation of age at natural menopause to lifestyle and anthropometric factors. Results were adjusted for age at reporting, smoking status at menopause, parity, and body mass index at age 40 years, as appropriate. All P values were 2-sided. High adult weight (P(trend) < 0.001) and being vegetarian (P < 0.001) were associated with older age at menopause. Neither height nor history of an eating disorder was associated with menopausal age. These findings show the importance of lifestyle factors in determining menopausal age. PMID:22494951

  14. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  15. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  16. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system - they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
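The normalization step behind Born's rule is simple to state concretely. A minimal sketch (the amplitudes are arbitrary placeholders, not tied to the paper's derivation):

```python
# Born's rule p_k ∝ |psi_k|^2 for a finite state vector.
import numpy as np

psi = np.array([1 + 1j, 2, 0.5j])  # unnormalized amplitudes (hypothetical)
p = np.abs(psi) ** 2
p /= p.sum()                        # normalize so probabilities sum to 1
print(p)                            # outcome probabilities under Born's rule
```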

  17. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  18. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  19. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  20. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  1. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
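As a rough illustration of the kernel-smoothing idea (not the paper's censored-data estimator, which exploits generating-function combinatorics), one can smooth observed ratings into a probability vector over the rating scale; all numbers here are hypothetical:

```python
# Kernel-smoothed estimate of rating probabilities on a discrete scale.
import numpy as np

def smoothed_pmf(ratings, support=(1, 2, 3, 4, 5), h=0.75):
    """Gaussian-kernel-smoothed probability of each rating value."""
    r = np.asarray(ratings, dtype=float)
    grid = np.asarray(support, dtype=float)
    # kernel weight of every observation on every support point
    w = np.exp(-0.5 * ((grid[:, None] - r[None, :]) / h) ** 2)
    pmf = w.sum(axis=1)
    return pmf / pmf.sum()          # normalize to a probability vector

print(smoothed_pmf([5, 4, 4, 5, 3]))  # most mass near ratings 4-5
```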

  2. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  3. Logic and probability

    OpenAIRE

    Quznetsov, G. A.

    2003-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  4. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  5. Logic, Truth and Probability

    OpenAIRE

    Quznetsov, Gunn

    1998-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  6. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2014-01-01

    that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  7. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  8. An investigation of the ignition probability and data analysis for the detection of relevant parameters of mechanically generated steel sparks in explosive gas/air-mixtures; Untersuchungen zur Zuendwahrscheinlichkeit und Datenanalyse zur Erfassung der Einflussgroessen mechanisch erzeugter Stahl-Schlagfunktion in explosionsfaehigen Brenngas/Luft-Gemischen

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, Thomas; Finke, Robert; Graetz, Rainer

    2010-07-01

Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks with a material combination of unalloyed steel on unalloyed steel, and with kinetic impact energies between 3 and 277 Nm, could be determined with statistical confidence. In addition, the ignition capability of non-oxidized particles at increased temperatures in over-stoichiometric mixtures was demonstrated. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. However, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable characteristics of the particles that correlates with a distinct increase of the ignition probability.

  9. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex $S_n$ of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical; for example, states are morphisms. We introduce the category $S_n D$ cogenerated by $S_n = \{(x_1, x_2, \ldots, x_n) \in I^n : \sum_{i=1}^{n} x_i \le 1\}$, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within $S_n D$.

  10. After BRCA1 and BRCA2-what next? Multifactorial segregation analyses of three-generation, population-based Australian families affected by female breast cancer.

    Science.gov (United States)

    Cui, J; Antoniou, A C; Dite, G S; Southey, M C; Venter, D J; Easton, D F; Giles, G G; McCredie, M R; Hopper, J L

    2001-02-01

Mutations in BRCA1 and BRCA2 that cause a dominantly inherited high risk of female breast cancer seem to explain only a small proportion of the aggregation of the disease. To study the possible additional genetic components, we conducted single-locus and two-locus segregation analyses, with and without a polygenic background, using three-generation families ascertained through 858 women with breast cancer diagnosed at age <40 years in Australia. Extensive testing for deleterious mutations in BRCA1 and BRCA2, to date, has identified 34 carriers. Our analysis suggested that, after other possible unmeasured familial factors are adjusted for and the known BRCA1 and BRCA2 mutation carriers are excluded, there appears to be a residual dominantly inherited risk of female breast cancer in addition to that derived from mutations in BRCA1 and BRCA2. This study also suggests that there is a substantial recessively inherited risk of early-onset breast cancer. According to the best-fitting model, after excluding known carriers of mutations in BRCA1 and BRCA2, about 1/250 (95% confidence interval [CI] 1/500 to 1/125) women have a recessive risk of 86% (95% CI 69%-100%) by age 50 years and of almost 100% by age 60 years. Possible reasons that our study has implicated a novel strong recessive effect include our inclusion of data on lineal aunts and grandmothers, study of families ascertained through women with early-onset breast cancer, allowance for multiple familial factors in the analysis, and removal of families for whom the cause (i.e., BRCA1 or BRCA2) is known. Our findings may have implications for attempts to identify new breast cancer-susceptibility genes. PMID:11133358

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  13. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  14. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  15. Angles as probabilities

    CERN Document Server

    Feldman, David V

    2008-01-01

    We use a probabilistic interpretation of solid angles to generalize the well-known fact that the inner angles of a triangle sum to 180 degrees. For the 3-dimensional case, we show that the sum of the solid inner vertex angles of a tetrahedron T, divided by 2*pi, gives the probability that an orthogonal projection of T onto a random 2-plane is a triangle. More generally, it is shown that the sum of the (solid) inner vertex angles of an n-simplex S, normalized by the area of the unit (n-1)-hemisphere, gives the probability that an orthogonal projection of S onto a random hyperplane is an (n-1)-simplex. Applications to more general polytopes are treated briefly, as is the related Perles-Shephard proof of the classical Gram-Euler relations.
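The 3-dimensional claim is easy to probe numerically: sample random 2-planes, project the tetrahedron, and count how often the projection is a triangle. A Monte Carlo sketch (the tetrahedron and trial count are arbitrary choices):

```python
# Monte Carlo check of the result quoted above: for a tetrahedron T, the sum
# of its solid inner vertex angles divided by 2*pi should equal the probability
# that an orthogonal projection of T onto a random 2-plane is a triangle.
import numpy as np

rng = np.random.default_rng(0)
V = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0.3, 0.2, 1.]])

def inside(p, a, b, c):
    """True if 2-D point p lies strictly inside triangle abc."""
    def cross(o, u, w):
        return (u[0]-o[0])*(w[1]-o[1]) - (u[1]-o[1])*(w[0]-o[0])
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (s1 > 0) == (s2 > 0) == (s3 > 0)

hits, trials = 0, 20000
for _ in range(trials):
    # random 2-plane: orthonormal frame from the QR factorization of a Gaussian
    q, _ = np.linalg.qr(rng.normal(size=(3, 2)))
    P = V @ q                            # project the 4 vertices to 2-D
    # projection is a triangle iff some vertex falls inside the other three
    if any(inside(P[i], *(P[j] for j in range(4) if j != i)) for i in range(4)):
        hits += 1
print(hits / trials)   # estimate of (sum of solid vertex angles) / (2*pi)
```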

  16. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., a FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  17. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

18. Analyses of Generation and Release of Tritium in Nuclear Power Plant

    Institute of Scientific and Technical Information of China (English)

    黎辉; 梅其良; 付亚茹

    2015-01-01

Tritium research, including tritium generation in the reactor core and in the primary coolant, release pathways, tritium chemical forms, and release amounts, is a very important part of the environmental assessment of a nuclear power plant. Based on international operating experience, the primary coolant system, auxiliary systems, radwaste system, and ventilation system were analysed, and the tritium release pathways and chemical forms were investigated. The results indicate that the theoretical calculations agree very well with nuclear power plant operating data. The tritium contained in the primary coolant is mainly produced by ternary (three-fragment) fission, by boron activation in the burnable poison rods, and by activation of boron, lithium, and deuterium as they pass through the core. The tritium released to the environment is mainly in the form of tritiated water, and the split between liquid and gaseous releases depends mainly on the leakage rate from the primary coolant to the reactor building and auxiliary building.

  19. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.;

    1996-01-01

In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...... discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...

  20. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  1. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t

  2. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

3. The STD/MHD codes - Comparison of analyses with experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc. [for MHD generator flows]

    Science.gov (United States)

    Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.

    1981-01-01

    Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.

  4. Probability Theories and the Justification of Theism

    OpenAIRE

    Portugal, Agnaldo Cuoco

    2003-01-01

In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne's use of Bayes's theorem to justify the belief that there is a God. Swinburne's contribution here lies in the scope of his project and the interpretation he adopts for Bayes's formula, a very important theorem of the probability calculus.

  5. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

In Saudi Arabia, fossil fuel is the main source of power generation. Due to the huge economic and demographic growth, the electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day is used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing generation mix of electricity. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to reach the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the integration variability of the other alternative (non-fossil fuel based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. For instance, by decarbonizing half of its generation mix, Saudi Arabia can release around 0.5 million barrels of oil equivalent per day from 2020. Moreover, the total power generation cost reduction can reach up to around 28% per year from 2030 if Saudi Arabia manages to attain the optimal generation mix introduced in the model (50% of power from renewables and nuclear power plants and 50% from fossil power plants). - Highlights: • We model the current and future power generation situation of Saudi Arabia. • We take into account the integration of the other alternative resources. • We consider different scenarios of power generation structure for the country. • An optimal generation mix can release a considerable amount of oil for export
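The kind of least-cost generation-mix optimization described above can be sketched as a small linear program: minimize generation cost subject to meeting demand and a 50% non-fossil share. This is a toy stand-in for the paper's model, with hypothetical costs, demand, and capacity limits:

```python
# Toy least-cost dispatch with a non-fossil share constraint.
from scipy.optimize import linprog

techs = ["oil", "gas", "nuclear", "solar"]
cost = [80.0, 60.0, 50.0, 45.0]           # $/MWh, hypothetical
demand = 400.0                            # TWh/year, hypothetical

# minimize cost @ x  subject to  sum(x) == demand,  oil + gas <= 0.5 * demand
res = linprog(
    c=cost,
    A_ub=[[1, 1, 0, 0]], b_ub=[0.5 * demand],            # fossil share cap
    A_eq=[[1, 1, 1, 1]], b_eq=[demand],
    bounds=[(0, None), (0, None), (0, 150), (0, 120)],    # capacity limits
)
print(dict(zip(techs, res.x)))
```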

  6. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(−cL^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  7. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  8. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities. PMID:27570097
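The noise-cancelling mechanism can be checked in a few lines: if each stored event flag is misread with probability d, every estimate is biased toward 0.5, yet the addition-law combination P(A) + P(B) − P(A∧B) − P(A∨B) has expectation zero. A simulation sketch under that model (the joint distribution and noise rate are hypothetical):

```python
# "Probability theory plus noise": individual estimates are biased, but the
# addition-law combination cancels the noise in expectation.
import numpy as np

rng = np.random.default_rng(1)
N, d = 10000, 0.15                       # memory samples, read-noise rate
A = rng.random(N) < 0.7                  # true event flags
B = rng.random(N) < 0.4

def estimate(flags):
    """Proportion of flags after each read is flipped with probability d."""
    flips = rng.random(flags.size) < d
    return np.mean(flags ^ flips)

pA, pB = estimate(A), estimate(B)
pAB, pAoB = estimate(A & B), estimate(A | B)
print(pA, pB)                            # each biased toward 0.5
print(pA + pB - pAB - pAoB)              # noise cancels: close to 0
```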

  9. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.

10. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  11. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

The mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy number set was given; going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), and the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, was given and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP are closed under the fuzzy probability operations; as a result, the foundation for perfecting fuzzy probability operation methods is laid.

  12. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-04-01

In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  13. The Logic of Parametric Probability

    CERN Document Server

    Norman, Joseph W

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.
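The reduction of probabilistic-logic questions to linear optimization can be made concrete with Boole's classic problem: given P(A) and P(B), the tightest bounds on P(A and B) come from optimizing over all joint distributions on the four truth assignments. A sketch of that generic formulation (input probabilities are hypothetical, and this is not necessarily the paper's encoding):

```python
# Boole's problem as a linear program: bound P(A and B) given P(A), P(B).
from scipy.optimize import linprog

pA, pB = 0.7, 0.6
# variables: probabilities of the atoms (A,B) = (0,0), (0,1), (1,0), (1,1)
A_eq = [
    [1, 1, 1, 1],    # total probability is 1
    [0, 0, 1, 1],    # P(A)
    [0, 1, 0, 1],    # P(B)
]
b_eq = [1.0, pA, pB]
objective = [0, 0, 0, 1]                         # P(A and B) = P(atom (1,1))
lo = linprog(objective, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
hi = linprog([-c for c in objective], A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
print(lo.fun, -hi.fun)                           # 0.3 <= P(A and B) <= 0.6
```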

  14. Dynamic Estimation of Credit Rating Transition Probabilities

    OpenAIRE

    Berd, Arthur M.

    2009-01-01

    We present a continuous-time maximum likelihood estimation methodology for credit rating transition probabilities, taking into account the presence of censored data. We perform rolling estimates of the transition matrices with exponential time weighting with varying horizons and discuss the underlying dynamics of transition generator matrices in the long-term and short-term estimation horizons.
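The core of the continuous-time approach can be sketched with the standard maximum-likelihood generator estimate, ignoring the censoring and rolling time-weighting refinements the paper adds: each off-diagonal generator entry is the transition count divided by the time spent in the source state, and the t-year transition matrix is the matrix exponential exp(Qt). All counts below are hypothetical:

```python
# Continuous-time Markov estimation of rating transition probabilities.
import numpy as np
from scipy.linalg import expm

states = ["A", "B", "D"]                 # toy rating scale with default D
n_trans = np.array([[0, 12, 1],          # observed i -> j transition counts
                    [8,  0, 4],
                    [0,  0, 0]])         # default is absorbing
time_in = np.array([240.0, 160.0, 1.0])  # firm-years spent in each state

Q = n_trans / time_in[:, None]           # off-diagonal generator entries
np.fill_diagonal(Q, 0.0)
np.fill_diagonal(Q, -Q.sum(axis=1))      # rows of a generator sum to zero

P1 = expm(Q * 1.0)                       # one-year transition probabilities
print(np.round(P1, 3))
```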

  15. How to Read Probability Distributions as Statements about Process

    OpenAIRE

    Frank, Steven A.

    2014-01-01

    Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...

  16. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

In Saudi Arabia, fossil fuel is the main source of power generation. Due to the huge economic and demographic growth, the electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day is used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing generation mix of electricity. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to reach the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the integration variability of the other alternative (non-fossil fuel based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. (authors)

  17. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  18. Interpretation of Plateau in High-Harmonic Generation

    Institute of Scientific and Technical Information of China (English)

    程太旺; 李晓峰; 敖淑艳; 傅盘铭

    2003-01-01

The plateau in high-harmonic generation is investigated in the frequency domain. The probability density of an electron in an electromagnetic field is obtained by analysing the quantized-field Volkov state. The plateau in high-harmonic generation reflects the spectral density of the electron at the location of the nucleus after above-threshold ionization.

  19. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  20. Research And Development Of A Pilot First Generation PGNAA OFF-BELT System For Analysing Of Composition Of Cement And Bauxite Raw Material

    International Nuclear Information System (INIS)

With the purpose of developing a PGNAA system that can operate in the field under considerable changes of temperature and moisture, a multichannel analyzer (MCA 2k) was designed and developed, which operates compatibly with a BGO detector and connects to a computer through a USB 2.0 port. In addition, software for acquiring and displaying the prompt gamma spectrum, with a spectrum stabilization function and convenient data handling, was designed and developed. The first generation PGNAA system was tested under changing temperature conditions in the laboratory. The results show that the prompt gamma spectrum remained stable while the temperature changed; the peak areas of the elements in the samples changed by about 7%. The first generation PGNAA system was also used to analyze cement and bauxite samples, and the results matched those of other analysis methods, such as chemical analysis and INAA, to within about 10%. (author)

  1. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. PMID:25704578

  2. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.

  3. Investigation of probable decays in rhenium isotopes

    International Nuclear Information System (INIS)

Making use of the effective liquid drop model (ELDM), the feasibility of proton, alpha, and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of rhenium in the mass range 150 < A < 200, the half-lives of proton and alpha decays and probable cluster decays are calculated, taking the barrier potential to be the effective liquid drop one, which is the sum of the Coulomb, surface, and centrifugal potentials. The calculated half-lives for proton decay from the various rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttall plots of the probable decays are analysed and their respective slopes and intercepts are evaluated.

4. Microbially influenced corrosion (MIC) analyses of the BNGS-B vacuum building. Report No. 92-185-K. [BNGS (Bruce Nuclear Generating Station)]

    Energy Technology Data Exchange (ETDEWEB)

    Jain, D.K.

    1992-01-01

    Microbially influenced corrosion (MIC) has been found to play a significant role in causing corrosion, especially in those industries which use natural waters. The most significant of the organisms found to cause corrosion are the sulphate-reducing bacteria (SRB), particularly with anoxic deposits or stagnant weirs. In May 1992, the Bruce Nuclear Generating Station B Vacuum Building was inspected for MIC after being in service for 10 years. This report provides results for both on-site MIC inspection and for microbiological analysis of sediments, water, and slime deposits for evidence of MIC bacteria.

  5. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in the probability topic. The study also revealed that motivated students gained from the probability workshop, in that their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  6. In silico and in vitro analyses of the angiotensin-I converting enzyme inhibitory activity of hydrolysates generated from crude barley (Hordeum vulgare) protein concentrates.

    Science.gov (United States)

    Gangopadhyay, Nirupama; Wynne, Kieran; O'Connor, Paula; Gallagher, Eimear; Brunton, Nigel P; Rai, Dilip K; Hayes, Maria

    2016-07-15

    Angiotensin-I-converting enzyme (ACE-I) plays a key role in control of hypertension, and type-2 diabetes mellitus, which frequently co-exist. Our current work utilised in silico methodologies and peptide databases as tools for predicting release of ACE-I inhibitory peptides from barley proteins. Papain was the enzyme of choice, based on in silico analysis, for experimental hydrolysis of barley protein concentrate, which was performed at the enzyme's optimum conditions (60 °C, pH 6.0) for 24 h. The generated hydrolysate was subjected to molecular weight cut-off (MWCO) filtration, following which the non-ultrafiltered hydrolysate (NUFH), and the generated 3 kDa and 10 kDa MWCO filtrates were assessed for their in vitro ACE-I inhibitory activities. The 3 kDa filtrate (1 mg/ml), that demonstrated highest ACE-I inhibitory activity of 70.37%, was characterised in terms of its peptidic composition using mass spectrometry and 1882 peptides derived from 61 barley proteins were identified, amongst which 15 peptides were selected for chemical synthesis based on their predicted ACE-I inhibitory properties. Of the synthesized peptides, FQLPKF and GFPTLKIF were most potent, demonstrating ACE-I IC50 values of 28.2 μM and 41.2 μM respectively. PMID:26948626

  7. A generative inference framework for analysing patterns of cultural change in sparse population data with evidence for fashion trends in LBK culture.

    Science.gov (United States)

    Kandler, Anne; Shennan, Stephen

    2015-12-01

    Cultural change can be quantified by temporal changes in frequency of different cultural artefacts and it is a central question to identify what underlying cultural transmission processes could have caused the observed frequency changes. Observed changes, however, often describe the dynamics in samples of the population of artefacts, whereas transmission processes act on the whole population. Here we develop a modelling framework aimed at addressing this inference problem. To do so, we firstly generate population structures from which the observed sample could have been drawn randomly and then determine theoretical samples at a later time t2 produced under the assumption that changes in frequencies are caused by a specific transmission process. Thereby we also account for the potential effect of time-averaging processes in the generation of the observed sample. Subsequent statistical comparisons (e.g. using Bayesian inference) of the theoretical and observed samples at t2 can establish which processes could have produced the observed frequency data. In this way, we infer underlying transmission processes directly from available data without any equilibrium assumption. We apply this framework to a dataset describing pottery from settlements of some of the first farmers in Europe (the LBK culture) and conclude that the observed frequency dynamic of different types of decorated pottery is consistent with age-dependent selection, a preference for 'young' pottery types which is potentially indicative of fashion trends. PMID:26674195
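The inference loop described above can be sketched schematically: seed a population consistent with the earlier sample, forward-simulate a candidate transmission process (here, neutral unbiased copying), draw a theoretical sample at t2, and score it against the observed sample, ABC-style. Population size, sample sizes, and frequencies are hypothetical placeholders:

```python
# Schematic generative-inference loop with neutral copying as the candidate
# transmission process and an ABC-style acceptance criterion.
import numpy as np

rng = np.random.default_rng(2)
N = 1000                                  # assumed population of artefacts
obs_t1 = np.array([0.5, 0.3, 0.2])        # observed variant frequencies at t1
obs_t2 = np.array([0.6, 0.3, 0.1])        # observed variant frequencies at t2

def simulate_sample(freqs_t1, generations=10, sample_n=100):
    """Neutral Wright-Fisher drift from t1, then a random sample at t2."""
    pop = rng.choice(len(freqs_t1), size=N, p=freqs_t1)
    for _ in range(generations):
        pop = rng.choice(pop, size=N)     # each artefact copies a random model
    counts = np.bincount(rng.choice(pop, size=sample_n),
                         minlength=len(freqs_t1))
    return counts / sample_n

# accept the neutral hypothesis for runs whose sample lies close to obs_t2
dists = [np.abs(simulate_sample(obs_t1) - obs_t2).sum() for _ in range(500)]
print(np.mean(np.array(dists) < 0.15))    # ABC-style acceptance rate
```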

  8. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  9. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies.

    Science.gov (United States)

    de Brevern, Alexandre G; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries. PMID:26125026

  10. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Directory of Open Access Journals (Sweden)

    Alexandre G. de Brevern

    2015-01-01

Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  11. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  12. Probabilities of multiple quantum teleportation

    OpenAIRE

    Woesler, Richard

    2002-01-01

    Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e. for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25 %, i.e., surprisingly, this resul...

  13. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
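
    For reference, the conditional probability at the core of such an analysis is the standard textbook ratio (generic notation, not taken from the record itself): for an ecological impairment A and an observed stressor condition B,

```latex
P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0 .
```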

  14. Genome-wide analyses of radioresistance-associated miRNA expression profile in nasopharyngeal carcinoma using next generation deep sequencing.

    Directory of Open Access Journals (Sweden)

    Guo Li

    Full Text Available BACKGROUND: Rapidly growing evidence suggests that microRNAs (miRNAs) are involved in a wide range of cancer malignant behaviours, including radioresistance. Therefore, the present study was designed to investigate miRNA expression patterns associated with radioresistance in NPC. METHODS: The differential expression profiles of miRNAs and mRNAs associated with NPC radioresistance were constructed. The predicted target mRNAs of miRNAs and their enriched signaling pathways were analyzed via bioinformatical algorithms. Finally, partial miRNAs and pathway-correlated target mRNAs were validated in two NPC radioresistant cell models. RESULTS: 50 known and 9 novel miRNAs with significant difference were identified, and their target mRNAs were narrowed down to 53 nasopharyngeal-/NPC-specific mRNAs. Subsequent KEGG analyses demonstrated that the 53 mRNAs were enriched in 37 signaling pathways. Further qRT-PCR assays confirmed 3 down-regulated miRNAs (miR-324-3p, miR-93-3p and miR-4501), 3 up-regulated miRNAs (miR-371a-5p, miR-34c-5p and miR-1323) and 2 novel miRNAs. Additionally, corresponding alterations of pathway-correlated target mRNAs were observed, including 5 up-regulated mRNAs (ICAM1, WNT2B, MYC, HLA-F and TGF-β1) and 3 down-regulated mRNAs (CDH1, PTENP1 and HSP90AA1). CONCLUSIONS: Our study provides an overview of the miRNA expression profile and the interactions between miRNAs and their target mRNAs, which will deepen our understanding of the important roles of miRNAs in NPC radioresistance.

  15. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte;

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac...
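
    In the standard Dempster-Shafer construction of such lower probabilities (a textbook definition, not notation quoted from the article), a mass assignment m on subsets of the outcome set induces the belief (lower probability) and plausibility (upper probability) of an event A:

```latex
\mathrm{Bel}(A) = \sum_{B \subseteq A} m(B), \qquad
\mathrm{Pl}(A) = 1 - \mathrm{Bel}(\Omega \setminus A), \qquad
\mathrm{Bel}(A) \le \mathrm{Pl}(A),
```

    with m(\emptyset) = 0 and the masses summing to one over all subsets of \Omega.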

  16. Thermal And Spectroscopic Analyses Of Next Generation Caustic Side Solvent Extraction Solvent Contacted With 3, 8, And 16 Molar Nitric Acid

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F.; Fink, S. D.

    2011-12-07

    A new solvent system, referred to as Next Generation Solvent or NGS, has been developed at Oak Ridge National Laboratory for the removal of cesium from alkaline solutions in the Caustic Side Solvent Extraction process. The NGS is proposed for deployment at MCU and at the Salt Waste Processing Facility. This work investigated the chemical compatibility between NGS and 16 M, 8 M, and 3 M nitric acid, from contact that may occur in handling of analytical samples from MCU or, for 3 M acid, during contactor cleaning operations at MCU. This work shows that reactions occurred between NGS components and the high-molarity nitric acid. Reaction rates are much faster in 8 M and 16 M nitric acid than in 3 M nitric acid. In the case of 16 M and 8 M nitric acid, the nitric acid reacts with the extractant to initially produce organo-nitrate species. The reaction also releases soluble fluorinated alcohols such as tetrafluoropropanol. With longer contact time, the modifier reacts to produce a tarry substance with evolved gases (NOx and possibly CO). Calorimetric analysis of the reaction product mixtures revealed that the organo-nitrate reaction products are not explosive and will not deflagrate.

  17. Diagnosis and Repair of Internal Leakage in the Steam Generator of a Hydrogen Production Unit

    Institute of Scientific and Technical Information of China (English)

    马红涛

    2011-01-01

    The problem of localized over-temperature on the tube-side inlet cone shell wall of the reformed-gas steam generator in a new refinery 2 × 10⁴ m³/h (standard conditions) feed-gas hydrogen unit was analyzed in detail. The results show that, under the combined action of lining materials and lining construction quality that did not meet requirements, an imperfect structure and welding procedure for the joints between the heat-exchange tubes and the tube plate, and non-standard operation that caused uncoordinated deformation of the heat-exchange tubes, not only did the local temperature of the inlet cone shell wall exceed its limit, but cracks penetrating the tube-joint welds and connecting the two tube passes and the tube bridges also developed in the inlet-end tube plate. Based on this analysis the equipment was repaired, and preventive and improvement measures against this type of problem were proposed.

  18. Comparison of human gut microbiota in control subjects and patients with colorectal carcinoma in adenoma: Terminal restriction fragment length polymorphism and next-generation sequencing analyses.

    Science.gov (United States)

    Kasai, Chika; Sugimoto, Kazushi; Moritani, Isao; Tanaka, Junichiro; Oya, Yumi; Inoue, Hidekazu; Tameda, Masahiko; Shiraki, Katsuya; Ito, Masaaki; Takei, Yoshiyuki; Takase, Kojiro

    2016-01-01

    Colorectal cancer (CRC) is the third leading cause of cancer-related deaths in Japan. The etiology of CRC has been linked to numerous factors including genetic mutation, diet, life style, inflammation, and recently, the gut microbiota. However, CRC-associated gut microbiota is still largely unexamined. This study used terminal restriction fragment length polymorphism (T-RFLP) and next-generation sequencing (NGS) to analyze and compare the gut microbiota of Japanese control subjects and Japanese patients with carcinoma in adenoma. Stool samples were collected from 49 control subjects, 50 patients with colon adenoma, and 9 patients with colorectal cancer (3/9 with invasive cancer and 6/9 with carcinoma in adenoma) immediately before colonoscopy; DNA was extracted from each stool sample. Based on T-RFLP analysis, 12 subjects (six control and six carcinoma-in-adenoma subjects) were selected; their samples were used for NGS and species-level analysis. T-RFLP analysis showed no significant differences in bacterial population between control, adenoma and cancer groups. However, NGS revealed that i) control and carcinoma-in-adenoma subjects had different gut microbiota compositions; ii) one bacterial genus (Slackia) was significantly associated with the control group and four bacterial genera (Actinomyces, Atopobium, Fusobacterium, and Haemophilus) were significantly associated with the carcinoma-in-adenoma group; and iii) several bacterial species were significantly associated with each group (control: Eubacterium coprostanoligenes; carcinoma in adenoma: Actinomyces odontolyticus, Bacteroides fragilis, Clostridium nexile, Fusobacterium varium, Haemophilus parainfluenzae, Prevotella stercorea, Streptococcus gordonii, and Veillonella dispar). Gut microbial properties differ between control subjects and carcinoma-in-adenoma patients in this Japanese population, suggesting that gut microbiota is related to CRC prevention and development.

  19. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Background: iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high-confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results: We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high-confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide database searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion: We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening
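
    As an illustration of the kind of in-silico digestion such a pipeline performs (a minimal sketch under the common trypsin cleavage rule, not the authors' clustering and trimming code), the following Python fragment cuts a protein sequence into tryptic peptides:

```python
# Trypsin cleaves C-terminal to K or R, except when the next residue is P
# (the common convention; the min_len filter is an illustrative choice).
def tryptic_peptides(protein: str, min_len: int = 6) -> list[str]:
    peptides, start = [], 0
    for i, residue in enumerate(protein):
        next_res = protein[i + 1] if i + 1 < len(protein) else ""
        if residue in "KR" and next_res != "P":
            peptides.append(protein[start:i + 1])
            start = i + 1
    peptides.append(protein[start:])  # C-terminal fragment, may be empty
    return [p for p in peptides if len(p) >= min_len]

print(tryptic_peptides("MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"))
```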

  20. Analysing the role of abandoned agricultural terraces on flood generation in a set of small Mediterranean mountain research catchments (Vallcebre, NE Spain)

    Science.gov (United States)

    Gallart, Francesc; Llorens, Pilar; Pérez-Gallego, Nuria; Latron, Jérôme

    2016-04-01

    The Vallcebre research catchments are located in NE Spain, in a middle mountain area with a Mediterranean sub-humid climate. Most of the bedrock consists of continental red lutites that are easily weathered into loamy soils. This area was intensely used for agriculture in the past, when most of the sunny gentle hillslopes were terraced. The land was progressively abandoned from the mid-20th Century onwards, and most of the fields were converted to meadows or were spontaneously forested. Early studies carried out in the terraced Cal Parisa catchment demonstrated the occurrence of two types of frequently saturated areas: one situated in downslope locations with high topographic index values, and the other located in the inner parts of many terraces, where the shallow water table usually outcrops due to the topographical modifications linked to terrace construction. Both the increased extent of saturated areas and the role of a man-made elementary drainage system designed for depleting water from the terraces suggested that terraced areas would induce an enhanced hydrological response during rainfall events when compared with non-terraced hillslopes. The response of 3 sub-catchments, of increasing area and decreasing percentage of terraced area, during a set of major events recorded over more than 15 years has been analysed. The results show that storm runoff depths were roughly proportional to precipitations above 30 mm, although the smallest catchment (Cal Parisa), with the highest percentage of terraces, was able to completely buffer rainfall events of 60 mm in one hour without any runoff when antecedent conditions were dry. Runoff coefficients depended on antecedent conditions, and peak discharges were weakly linked to rainfall intensities. Peak lag times, peak runoff rates and recession coefficients were similar in the 3 catchments; the first variable values were in the range between Hortonian and saturation overland flow and the two last ones were in the range of

  1. Pretest probability assessment derived from attribute matching

    OpenAIRE

    Hollander Judd E; Diercks Deborah B; Pollack Charles V; Johnson Charles L; Kline Jeffrey A; Newgard Craig D; Garvey J Lee

    2005-01-01

    Abstract Background: Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods: Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possib...
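
    A minimal sketch of the attribute-matching idea (field names and toy data are hypothetical illustrations, not the validated instrument from the study): the pretest probability is simply the observed ACS rate among reference patients whose attributes all match the new patient.

```python
def attribute_match_ptp(patient, reference, attributes):
    # Keep only reference patients identical to this patient on every attribute.
    matches = [r for r in reference
               if all(r[a] == patient[a] for a in attributes)]
    if not matches:
        return float("nan")  # profile absent from the reference database
    # Pretest probability = proportion of matched patients who had ACS.
    return sum(r["acs"] for r in matches) / len(matches)

reference_db = [
    {"age_band": "50-59", "sex": "M", "chest_pain": True, "acs": 1},
    {"age_band": "50-59", "sex": "M", "chest_pain": True, "acs": 0},
    {"age_band": "50-59", "sex": "M", "chest_pain": True, "acs": 0},
]
patient = {"age_band": "50-59", "sex": "M", "chest_pain": True}
print(attribute_match_ptp(patient, reference_db,
                          ["age_band", "sex", "chest_pain"]))  # 0.333...
```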

  2. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model-based probabilities is examined and is used as a bridge between the formal theory and applications.

  3. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  4. Varieties of Belief and Probability

    NARCIS (Netherlands)

    Eijck, D.J.N. van; Ghosh, S.; Szymanik, J.

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for knowledge and belief

  5. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  6. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  7. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula of the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula of the ruin probability are obtained by means of techniques from martingale theory.
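
    In the usual notation for such models (generic definitions, not formulas quoted from the paper), with initial surplus u and surplus process U(t), the ruin and survival probabilities and a Lundberg-type bound read:

```latex
\psi(u) = P\Bigl(\inf_{t \ge 0} U(t) < 0 \;\Big|\; U(0) = u\Bigr),
\qquad \phi(u) = 1 - \psi(u),
\qquad \psi(u) \le e^{-R u},
```

    where R > 0 is the adjustment coefficient appearing in the Lundberg inequality.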

  8. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  9. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
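
    The 95%/95% criterion mentioned here is commonly met with the first-order, one-sided Wilks bound (a standard order-statistics result, not a formula quoted from the report): if n independent code runs are made and the largest result is taken as the bounding value, then

```latex
1 - \gamma^{\,n} \ge \beta, \qquad \gamma = \beta = 0.95
\;\Longrightarrow\; n \ge \frac{\ln(1 - 0.95)}{\ln(0.95)} \approx 58.4,
```

    which gives the familiar minimum of n = 59 runs.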

  10. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; it is, in essence, a statistical method. The probability method developed takes into account the probability distribution of permitted levels of the relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  11. Convergence of simulated annealing by the generalized transition probability

    OpenAIRE

    Nishimori, Hidetoshi; Inoue, Jun-Ichi

    1998-01-01

    We prove weak ergodicity of the inhomogeneous Markov process generated by the generalized transition probability of Tsallis and Stariolo under power-law decay of the temperature. We thus have a mathematical foundation to conjecture convergence of simulated annealing processes with the generalized transition probability to the minimum of the cost function. An explicitly solvable example in one dimension is analyzed in which the generalized transition probability leads to a fast convergence of ...
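
    A compact sketch of the scheme in question (the acceptance rule below is one common form of the Tsallis-Stariolo generalized transition probability, recovering the Metropolis rule exp(-dE/T) as q -> 1; the cost function and parameter values are illustrative assumptions):

```python
import random

def generalized_accept(dE: float, T: float, q: float) -> float:
    # Generalized (Tsallis-type) acceptance probability for an uphill move.
    if dE <= 0:
        return 1.0
    base = 1.0 + (q - 1.0) * dE / T
    return 0.0 if base <= 0.0 else base ** (-1.0 / (q - 1.0))

def anneal(f, x=0.0, q=1.5, T0=1.0, c=0.5, steps=20_000):
    for t in range(1, steps + 1):
        T = T0 / t ** c                    # power-law temperature decay
        y = x + random.gauss(0.0, 1.0)     # candidate move
        if random.random() < generalized_accept(f(y) - f(x), T, q):
            x = y
    return x

# Tilted double well; the global minimum lies near x = 1.
print(anneal(lambda x: (x * x - 1.0) ** 2 - 0.3 * x))
```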

  12. Time Varying Transition Probabilities for Markov Regime Switching Models

    OpenAIRE

    Bazzi, Marco; Blasques, Francisco; Koopman, Siem Jan; Lucas, Andre

    2014-01-01

    We propose a new Markov switching model with time varying probabilities for the transitions. The novelty of our model is that the transition probabilities evolve over time by means of an observation driven model. The innovation of the time varying probability is generated by the score of the predictive likelihood function. We show how the model dynamics can be readily interpreted. We investigate the performance of the model in a Monte Carlo study and show that the model is successful in estim...
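
    A schematic form of such an observation-driven recursion (a generic score-driven specification consistent with the abstract, not the authors' exact model) uses a logistic link for the transition probability and updates its underlying factor with the score of the predictive likelihood:

```latex
p_{t} = \Lambda(f_{t}) = \frac{1}{1 + e^{-f_{t}}},
\qquad
f_{t+1} = \omega + \alpha\, s_{t} + \beta\, f_{t},
```

    where s_t denotes the score of the predictive likelihood at time t.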

  13. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  14. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  15. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  16. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables, which are functions on phase space, are studied. An explicit form of the kernel of the commutative star-product of the tomographic symbols is obtained.

  17. Transition probabilities of Br II

    Science.gov (United States)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  18. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  19. Logical, conditional, and classical probability

    OpenAIRE

    Quznetsov, G. A.

    2005-01-01

    Propositional logic is generalized to the field of real numbers, and a logical function with all the properties of the classical probability function is obtained. The logical analog of the Bernoulli independent-tests scheme is constructed, and the logical analog of the Law of Large Numbers is deduced from the properties of these functions. The logical analog of conditional probability is defined. Consistency is ensured by a model in a suitable variant of nonstandard analysis.

  20. Compliance with endogenous audit probabilities

    OpenAIRE

    Konrad, Kai A.; Lohse, Tim; Qari, Salmai

    2015-01-01

    This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...

  1. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  2. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...

  4. Trajectory probability hypothesis density filter

    OpenAIRE

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  5. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both, probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  6. Socio-economic well-to-wheel analysis of biofuels. Scenarios for rapeseed diesel (RME) and first- and second-generation bioethanol

    Energy Technology Data Exchange (ETDEWEB)

    Slentoe, E.; Moeller, F.; Winther, M.; Hjort Mikkelsen, M.

    2010-10-15

    The report examines, in an integrated form, the energy, emissions and welfare-economic implications of introducing Danish-produced biodiesel, i.e. rapeseed diesel (RME), and first- and second-generation wheat ethanol in two scenarios with low and high rates of blending into fossil-based automotive fuels. Within this project's analytical framework and assumptions, the welfare-economic analysis shows that it would be beneficial for society to realize the biofuel scenarios to some extent at oil prices above $100 a barrel, while it would cause losses at oil prices of $65. In all cases, fossil fuel consumption and CO2-equivalent emissions are reduced; this effect is priced and included in the welfare-economic analysis. The implementation of biofuels in Denmark will depend on market prices, which at present do not favor biofuels. The RME currently produced in Denmark is exported to other European countries where there are state subsidies. Subsidies would also be a significant factor in Denmark for achieving biofuel blending objectives.

  7. Incrementalization of Analyses for Next Generation IDEs

    OpenAIRE

    Kloppenburg, Sven

    2009-01-01

    To support developers in their day-to-day work, Integrated Development Environments (IDEs) incorporate more and more ways to help developers focus on the inherent complexities of developing increasingly larger software systems. The complexity of developing large software systems can be categorized into inherent complexity, which stems from the complexity of the problem domain, and accidental complexity, which stems from the shortcomings of the tools and methods used to tackle the problem. For ex...

  8. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  9. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
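
    The resampling idea can be sketched in a few lines (all numbers and distributions below are illustrative assumptions, not values from the study): estimate the 2-D Pc by Monte Carlo for given inputs, then perturb the uncertain inputs to obtain a distribution of Pc values rather than a point estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_2d(miss, cov, radius, n=50_000):
    # P(relative position in the encounter plane falls within the
    # combined hard-body radius), estimated by sampling.
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < radius)

miss = np.array([120.0, 40.0])            # nominal miss vector [m]
cov = np.array([[90.0**2, 0.0],           # nominal combined covariance
                [0.0, 60.0**2]])

pcs = []
for _ in range(400):                      # resample uncertain inputs
    r = rng.uniform(5.0, 15.0)            # hard-body radius uncertainty
    scale = rng.lognormal(0.0, 0.3)       # covariance realism factor
    pcs.append(pc_2d(miss, scale * cov, r))

print(f"median Pc = {np.median(pcs):.2e}, "
      f"95th percentile = {np.percentile(pcs, 95):.2e}")
```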

  10. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
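
    The simultaneous intervals can be approximated by simulation (a sketch of the general idea, not the paper's exact construction; sample size, alpha, and replication count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha, reps = 50, 0.05, 5_000

# Ordered statistics of many standard normal samples.
sims = np.sort(rng.standard_normal((reps, n)), axis=1)

# Start from the conservative Bonferroni envelope, then tighten the
# per-point level until the joint coverage drops below 1 - alpha.
lo_keep = np.quantile(sims, (alpha / n) / 2, axis=0)
hi_keep = np.quantile(sims, 1 - (alpha / n) / 2, axis=0)
for eps in np.linspace(alpha / n, alpha, 200):
    lo = np.quantile(sims, eps / 2, axis=0)
    hi = np.quantile(sims, 1 - eps / 2, axis=0)
    if np.mean(np.all((sims >= lo) & (sims <= hi), axis=1)) < 1 - alpha:
        break
    lo_keep, hi_keep = lo, hi

# A fresh normal sample should fall inside the envelope ~95% of the time.
sample = np.sort(rng.standard_normal(n))
print(np.all((sample >= lo_keep) & (sample <= hi_keep)))
```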

  11. Estimation of the ignition probability due to mechanically generated impact sparks in explosive gas/air mixtures. Examination of the material combination steel/steel

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, T.; Graetz, R.

    2007-09-29

    Equipment intended for use in potentially explosive atmospheres must meet the requirements of the European directive 94/9/EC. The manufacturer's declaration of conformity testifies that the requirements are met. The conformity assessment is based on the risk (ignition) assessment, which identifies and evaluates the ignition sources. The European standards in the area of directive 94/9/EC (such as EN 1127-1 and EN 13463-1) describe 13 possible ignition sources, mechanically generated sparks being one of them. Statements about the ignition effectiveness, and especially the ignition probability, of mechanically generated sparks for a given kinetic impact energy and a given explosive gas/air mixture have not been possible; an extensive literature survey confirms this state of affairs. This was and is a problem in making and revising standards. Simple ferritic steel is a common construction material for equipment, including non-electrical equipment, intended for use in potentially explosive atmospheres in chemical and mechanical engineering and manufacturing technology. The objective of this study was therefore to obtain statistical ignition probabilities as a function of the kinetic impact energy and the minimum ignition energy of the explosive gas/air mixture. The study was performed with impact testing machines of BAM (Federal Institute for Materials Research and Testing) at three kinetic impact energies. The following results were obtained for all the reference gas/air mixtures of the IEC explosion groups (I methane, IIA propane, IIB ethylene, IIC acetylene, hydrogen): 1. Under the test conditions of this study, i.e. the impact kinetics and impact geometry of the impact machines, it was not possible to generate ignition-capable mechanical sparks at kinetic impact energies below 3 Nm. 2. At kinetic impact energies of 10 Nm, single mechanically generated particles were able to act as a dangerous ignition source through an oxidation process. Furthermore the tests have shown that the

  12. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  13. Born Rule and Noncontextual Probability

    CERN Document Server

    Logiurato, Fabrizio

    2012-01-01

    The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular in relation to the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on decision theory, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasonings, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...

  14. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  15. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are then developed, so that the possibility of exceeding vibration criteria VC-E and VC-D is kept below 0.04.

  16. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system, there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  17. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self-study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  18. Probability as a physical motive

    CERN Document Server

    Martin, P

    2007-01-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  19. Fusion Probability in Dinuclear System

    CERN Document Server

    Hong, Juhee

    2015-01-01

    Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and the transfer of nucleons. In the presence of coupling between the two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has a quasifission barrier and an inner fusion barrier, and the escape rates can be calculated with Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the evaporation residue cross section.

  20. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  1. Pollock on probability in epistemology

    OpenAIRE

    Fitelson, Branden

    2010-01-01

    In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.

  2. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  3. Quantum correlations; quantum probability approach

    OpenAIRE

    Majewski, W A

    2014-01-01

    This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...

  4. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution and the Lukasiewicz distribution, are given. These distributions act as "inner kernels" of fuzzy systems. Furthermore, from properties of the probability distributions of fuzzy systems, it is also demonstrated that the CRI method for constructing fuzzy systems, proposed by Zadeh, is basically reasonable and effective. Besides, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of the construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in the following three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimization of reasoning, it possesses a beautiful vista of application.

  5. Asbestos and Probable Microscopic Polyangiitis

    OpenAIRE

    George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W

    2004-01-01

    Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described and the possible...

  6. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...

  7. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  8. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
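
    The quantity being approximated has a simple exact form for small vocabularies (a toy illustration; the distribution below is an assumption, not the paper's language model): guessing in decreasing order of probability, the average number of guesses is the probability-weighted sum of the guess ranks.

```python
import numpy as np

p = np.array([0.4, 0.25, 0.15, 0.1, 0.06, 0.04])   # toy word probabilities
p_sorted = np.sort(p)[::-1]                        # guess most likely first
ranks = np.arange(1, len(p_sorted) + 1)
print(f"average number of guesses: {np.sum(ranks * p_sorted):.2f}")  # 2.29
```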

  9. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    Science.gov (United States)

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  10. Transition probability and preferential gauge

    OpenAIRE

    Chen, C.Y.

    1999-01-01

    This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of Hamiltonian.

  11. Quantifying Extinction Probabilities from Sighting Records: Inference and Uncertainties

    OpenAIRE

    Peter Caley; Simon C Barry

    2014-01-01

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and surv...
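
    A minimal constant-rate version of such a model (a standard sketch, not the authors' full Bayesian treatment) makes the inference explicit: with sighting rate lambda while extant and no sightings in the t years since the last record,

```latex
P(\text{no sightings} \mid \text{extant}) = e^{-\lambda t},
\qquad
P(\text{extinct} \mid \text{no sightings})
  = \frac{\pi}{\pi + (1 - \pi)\, e^{-\lambda t}},
```

    where pi is the prior probability of extinction and an extinct population is assumed to yield no sightings.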

  12. Ruin probabilities for a regenerative Poisson gap generated risk process

    DEFF Research Database (Denmark)

    Asmussen, Søren; Biard, Romain

    A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G. Otherwise, the claim size distribution is F. Asy...

  13. Digital differential analysers

    CERN Document Server

    Shilejko, A V; Higinbotham, W

    1964-01-01

    Digital Differential Analysers presents the principles, operations, design, and applications of digital differential analyzers, a machine with the ability to present initial quantities and the possibility of dividing them into separate functional units performing a number of basic mathematical operations. The book discusses the theoretical principles underlying the operation of digital differential analyzers, such as the use of the delta-modulation method and function-generator units. Digital integration methods and the classes of digital differential analyzer designs are also reviewed. The te

  14. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term ''flowing interval spacing'' as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled ''Probability Distribution for Flowing Interval Spacing'' (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) ''Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses'' (CRWMS M and O 1999a) and (2) ''Incorporation of Heterogeneity in SZ Flow and Transport Analyses'' (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be
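
    A small sketch of the midpoint-to-midpoint spacing computation described above, with hypothetical borehole intervals (the depths are invented for illustration):

    ```python
    import numpy as np

    # Hypothetical flowing intervals from a borehole flow survey,
    # given as (top_depth_m, bottom_depth_m) pairs.
    intervals = [(412.0, 418.5), (430.2, 431.0), (455.7, 462.3), (490.1, 491.4)]

    midpoints = np.array([(top + bot) / 2.0 for top, bot in sorted(intervals)])
    spacings = np.diff(midpoints)   # flowing interval spacing, midpoint to midpoint

    print(spacings)          # [15.35 28.4  31.75]
    print(spacings.mean())   # input to fitting a spacing distribution
    ```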

  15. Classical Probability and Quantum Outcomes

    Directory of Open Access Journals (Sweden)

    James D. Malley

    2014-05-01

    Full Text Available There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.

  16. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the normative theory, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  17. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  18. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
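
    The Zipf-like relationship the authors report can be checked with a simple log-log fit; the counts below are invented placeholders, not the paper's tabulated frequencies:

    ```python
    import numpy as np

    # Hypothetical knot-type frequencies from a diagram tabulation,
    # sorted so that rank 1 is the most common knot type.
    counts = np.array([125000, 31000, 14000, 8200, 5400, 3900, 2900, 2300])
    probs = counts / counts.sum()
    ranks = np.arange(1, len(probs) + 1)

    # Zipf-like law: log p ≈ a + b log(rank); b is the slope of interest
    b, a = np.polyfit(np.log(ranks), np.log(probs), 1)
    print(f"slope ≈ {b:.2f}")   # roughly linear on log-log axes
    ```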

  19. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
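
    A one-dimensional sketch of the moment-matching idea, assuming SciPy's optimizer in place of the authors' Mathematica package and a univariate skew normal in place of the MSND; the target moments are invented stand-ins for values that would come from the master-equation moment equations:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import skewnorm

    # Target (mean, variance, skewness) -- hypothetical placeholder values
    target = np.array([50.0, 90.0, 0.6])

    def loss(params):
        a, loc, scale = params
        if scale <= 0:
            return np.inf
        m, v, s = skewnorm.stats(a, loc=loc, scale=scale, moments="mvs")
        # squared distance between model moments and target moments
        return float(np.sum((np.array([m, v, s]) - target) ** 2))

    res = minimize(loss, x0=[1.0, 45.0, 10.0], method="Nelder-Mead")
    a, loc, scale = res.x
    print(a, loc, scale)  # skew normal whose first three moments match the targets
    ```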

  20. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  1. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the purpose of making these ideas easier to understand and to apply.

  2. Subjective probability and quantum certainty

    CERN Document Server

    Caves, C M; Schack, R; Caves, Carlton M.; Fuchs, Christopher A.; Schack, Ruediger

    2006-01-01

    In the Bayesian approach to quantum mechanics, probabilities--and thus quantum states--represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Our analysis reveals fundamental differences between our Bayesian approach on the one hand and the Copenhagen interpretation and similar interpretations of quantum states on the other hand. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then show that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that with-certainty predictions derived from such a state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply certainty for a measurement outcome, that outcome would effectively correspond to a preexisting system pr...

  3. The probability of extraterrestrial life

    International Nuclear Information System (INIS)

    Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. Following this idea, there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on other planets somewhere in our galaxy. In this talk we will only use reliable data from scientific observers in order to establish a probability. We will explain the analysis of the physico-chemical principles which allow the evolution of organic molecules on our planet and establish these as the forerunners of life on our planet. On the other hand, the physical processes governing stars, their characteristics and their effects on planets will also be explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space will be given. (Author)

  4. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  5. Tight Bernoulli tail probability bounds

    OpenAIRE

    Dzindzalieta, Dainius

    2014-01-01

    The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universality means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful, for example, in insurance mathematics, for constructing...

  6. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

    Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase)-positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  7. Relative transition probabilities of cobalt

    Science.gov (United States)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  8. Calculational framework for safety analyses of non-reactor nuclear facilities

    International Nuclear Information System (INIS)

    A calculational framework for the consequence analysis of non-reactor nuclear facilities is presented. The analysis framework starts with accident scenarios, which are developed through a traditional hazard analysis, and continues with a probabilistic framework for the consequence analysis. The framework encourages the use of response continua derived from engineering judgment and traditional deterministic engineering analyses. The general approach consists of dividing the overall problem into a series of interrelated analysis cells and then devising Markov-chain-like probability transition matrices for each of the cells. An advantage of this division of the problem is that intermediate outputs (as probability state vectors) are generated at each calculational interface. The series of analyses, when combined, yields risk analysis output. The analysis approach is illustrated through application to two non-reactor nuclear analyses: the Ulysses Space Mission, and a hydrogen burn in the Hanford waste storage tanks
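
    A minimal sketch of the cell-by-cell propagation of a probability state vector through transition matrices; the cells, states, and numbers are illustrative assumptions, not from the report:

    ```python
    import numpy as np

    # Hypothetical analysis cells: each is a row-stochastic transition matrix
    # taking the probability state vector from one stage of the scenario to the next.
    release = np.array([[0.9, 0.1, 0.0],    # states: none / small / large
                        [0.0, 0.7, 0.3],
                        [0.0, 0.0, 1.0]])
    transport = np.array([[1.0, 0.0, 0.0],
                          [0.2, 0.6, 0.2],
                          [0.0, 0.3, 0.7]])

    p0 = np.array([0.8, 0.2, 0.0])   # initial scenario probabilities

    p1 = p0 @ release                # intermediate output of cell 1
    p2 = p1 @ transport              # risk-analysis output of cell 2
    print(p1, p2, p2.sum())          # state vectors stay normalized
    ```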

  9. Conceptualizing Media Generations: the Print-, Online- and Individualized Generations

    DEFF Research Database (Denmark)

    Westlund, Oscar; A Färdigh, Mathias

    2012-01-01

    the generational cohorts into a conceptualization involving three media generations. The print generation (1920s- 1940s) shows high probability (137%) and scored its highest value for reading only printed newspaper (Pearson’s r = .135). The online generation (1950s-1970s) shows high probability (97%) and scored...

  10. Effects of technological progress and climate protection on electric power generation. Analyses using a General Equilibrium Model; Auswirkungen des technologischen Fortschritts und des Klimaschutzes auf die Stromerzeugung. Analysen mit einem Allgemeinen Gleichgewichtsmodell

    Energy Technology Data Exchange (ETDEWEB)

    Zuern, Marcel

    2010-07-01

    The aim of this thesis is to analyse the connection between technological change and the development of global GHG emissions within a quantitative analytic framework. Due to the special importance of the electricity generation sector for the mitigation of CO2, special attention is paid to this sector. The analysis of technological progress, particularly in the power generation sector on a global level, places substantial requirements on the analytic framework. The great number of actors and the interplay of interdependent factors make an analytical solution to the problem impossible; therefore, a quantitative numerical model is necessary in order to analyse technological change on a global level. For the analysis of innovation and technological progress, the sectoral, regional and chronological dimensions have to be considered explicitly. The analysis should take all economic areas into account, because innovations are not restricted to a certain industrial sector or area of the economy but involve the whole economy. Concerning the geographical dimension, innovations are not bound to a single country but spread out over national borders. Adjustments to technological development take time to unfold, and therefore an analytical framework should cover a long-term horizon. The same requirements apply to the regional, geographical and chronological dimensions when analysing measures to reduce GHG emissions. The general equilibrium model (CGE - Computable General Equilibrium) used in this work fulfils all of the requirements listed above. Since the GHG problem is a global one, its analysis demands a model that is appropriate for this level. The structure of CGE models and the use of economic data on the global level provide the appropriate methodology. Since adjustments to measures of climate protection as well as innovations and technological change need time, a dynamic general equilibrium model with a long-term time horizon is used. Further advantages of CGE

  11. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  13. Nonlocality, Bell's Ansatz and Probability

    CERN Document Server

    Kracklauer, A F

    2006-01-01

    Quantum Mechanics lacks an intuitive interpretation, which is the cause of a generally formalistic approach to its use. This in turn has led to a certain insensitivity to the actual meaning of many words used in its description and interpretation. Herein, we analyze carefully the possible meanings of those terms used in analysis of EPR's contention, that Quantum Mechanics is incomplete, as well as Bell's work descendant therefrom. As a result, many inconsistencies and errors in contemporary discussions of nonlocality, as well as in Bell's Ansatz with respect to the laws of probability, are identified. Evading these errors precludes serious conflicts between Quantum Mechanics and Special Relativity and Philosophy.

  14. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  15. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  16. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  17. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  18. A Model of Protocoalition Bargaining with Breakdown Probability

    Directory of Open Access Journals (Sweden)

    Maria Montero

    2015-04-01

    Full Text Available This paper analyses a model of legislative bargaining in which parties form tentative coalitions (protocoalitions) before deciding on the allocation of a resource. Protocoalitions may fail to reach an agreement, in which case they may be dissolved (breakdown) and a new protocoalition may form. We show that agreement is immediate in equilibrium, and the proposer advantage disappears as the breakdown probability goes to zero. We then turn to the special case of apex games and explore the consequences of varying the probabilities that govern the selection of formateurs and proposers. Letting the breakdown probability go to zero, most of the probabilities considered lead to the same ex post pay-off division. Ex ante expected pay-offs may follow a counterintuitive pattern: as the bargaining power of weak players within a protocoalition increases, the weak players may expect a lower pay-off ex ante.

  19. Applications of the Dirichlet distribution to forensic match probabilities.

    Science.gov (United States)

    Lange, K

    1995-01-01

    The Dirichlet distribution provides a convenient conjugate prior for Bayesian analyses involving multinomial proportions. In particular, allele frequency estimation can be carried out with a Dirichlet prior. If data from several distinct populations are available, then the parameters characterizing the Dirichlet prior can be estimated by maximum likelihood and then used for allele frequency estimation in each of the separate populations. This empirical Bayes procedure tends to moderate extreme multinomial estimates based on sample proportions. The Dirichlet distribution can also be employed to model the contributions from different ancestral populations in computing forensic match probabilities. If the ancestral populations are in genetic equilibrium, then the product rule for computing match probabilities is valid conditional on the ancestral contributions to a typical person of the reference population. This fact facilitates computation of match probabilities and tight upper bounds to match probabilities.
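
    A small sketch of the empirical Bayes moderation described above: posterior mean allele frequencies under a Dirichlet prior. The counts and prior parameters are invented for illustration:

    ```python
    import numpy as np

    def posterior_allele_freqs(counts, alpha):
        """Posterior mean allele frequencies for multinomial counts under a
        Dirichlet(alpha) prior: (n_i + alpha_i) / (n + sum(alpha)).
        This moderates extreme sample proportions, as in empirical Bayes."""
        counts = np.asarray(counts, dtype=float)
        alpha = np.asarray(alpha, dtype=float)
        return (counts + alpha) / (counts.sum() + alpha.sum())

    # A rare allele unobserved in a small sample no longer gets frequency 0
    alpha = np.array([4.0, 3.0, 1.0])   # e.g. fitted by ML across populations
    print(posterior_allele_freqs([18, 12, 0], alpha))
    ```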

  20. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  1. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  2. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  3. Transition probabilities for argon I

    International Nuclear Information System (INIS)

    Transition probabilities for ArI lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparison is made with these values. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities for a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and which allows simplified but realistic non-equilibrium calculations for argon plasmas which deviate from local thermodynamic equilibrium (LTE)

  4. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  5. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  6. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  7. Multiple Regression Analyses of the Main Traits of Laser-Induced Mutation in the Onion L2 Generation%激光诱变洋葱L2代主要性状的多重回归分析

    Institute of Scientific and Technical Information of China (English)

    赵彤

    2004-01-01

    Wet seeds of two onion cultivars were irradiated with three doses each of CO2 and He-Ne lasers. The experiment used a randomized complete block design with three replications. Using biostatistical methods, regression analyses and the genetic variation of the main traits of the laser-mutagenized onion L2 generation, such as bulb fresh weight and cross diameter, were examined at the individual-plant level. The results show significant regression relationships between the cross diameter, vertical diameter and single-plant biological yield on the one hand and bulb fresh weight on the other, and that the effect of cross diameter is greater than that of vertical diameter; selection for bulb cross diameter should therefore be emphasized in breeding.

  8. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_p Z_t), as well as with fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections

  9. Avoiding Negative Probabilities in Quantum Mechanics

    CERN Document Server

    Nyambuya, Golden Gadzirayi

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...

  10. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0, w(1/e) = 1/e and w(1) = 1, and which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
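
    The one-parameter Prelec function is easy to evaluate and to check against its fixed points; a short sketch (the choice α = 0.65 is illustrative):

    ```python
    import numpy as np

    def prelec_w(p, alpha):
        """One-parameter Prelec probability weighting function
        w(p) = exp(-(-ln p)**alpha), defined on (0, 1]."""
        p = np.asarray(p, dtype=float)
        return np.exp(-((-np.log(p)) ** alpha))

    p = np.array([0.01, 1 / np.e, 0.5, 0.99])
    print(prelec_w(p, alpha=0.65))
    # fixed points: w(1/e) = 1/e and w(1) = 1 for any alpha;
    # small probabilities are overweighted, large ones underweighted
    ```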

  11. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

    Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10^12/L, leukocytes 2.9×10^9/L, platelets 59×10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  12. Trajectory versus probability density entropy

    Science.gov (United States)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  13. Random medium shiver movement probability and surface subsidence

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Z.; Yin, Z.; Wang, J. [China University of Mining and Technology (China). Beijing Campus

    2000-06-01

    Based on the physical model of random medium, the shiver movement probability has been analysed and the surface movement prediction model under sub-critical extraction condition is obtained. The advantage of this prediction model when compared with the probability integration method is that the movement distance of the inflection point does not need to be determined. And because the rate of surface subsidence is taken directly as the prediction parameter, the prediction accuracy is improved. The results have some reference value for the prediction of surface movement caused by sub-critical extraction. 2 refs., 4 figs., 2 tabs.

  14. Proton cellular influx as a probable mechanism of variation potential influence on photosynthesis in pea.

    Science.gov (United States)

    Sukhov, Vladimir; Sherstneva, Oksana; Surova, Lyubov; Katicheva, Lyubov; Vodeneev, Vladimir

    2014-11-01

    Electrical signals (action potential and variation potential, VP) caused by environmental stimuli are known to induce various physiological responses in plants, including changes in photosynthesis; however, their functional mechanisms remain unclear. In this study, the influence of VP on photosynthesis in pea (Pisum sativum L.) was investigated and the proton participation in this process analysed. VP, induced by local heating, inactivated photosynthesis and activated respiration, with the initiation of the photosynthetic response connected with inactivation of the photosynthetic dark stage; however, direct VP influence on the light stage was also probable. VP generation was accompanied with pH increases in apoplasts (0.17-0.30 pH unit) and decreases in cytoplasm (0.18-0.60 pH unit), which probably reflected H(+) -ATPase inactivation and H(+) influx during this electrical event. Imitation of H(+) influx using the protonophore carbonyl cyanide m-chlorophenylhydrazone (CCCP) induced a photosynthetic response that was similar with a VP-induced response. Experiments on chloroplast suspensions showed that decreased external pH also induced an analogous response and that its magnitude depended on the magnitude of pH change. Thus, the present results showed that proton cellular influx was the probable mechanism of VP's influence on photosynthesis in pea. Potential means of action for this influence are discussed.

  15. Computer Simulations and Problem-Solving in Probability.

    Science.gov (United States)

    Camp, John S.

    1978-01-01

    The purpose of this paper is to present problems (and solutions) from the areas of marketing, population planning, system reliability, and mathematics to show how a computer simulation can be used as a problem-solving strategy in probability. Examples using BASIC and two methods of generating random numbers are given. (Author/MP)
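
    In the spirit of the paper's BASIC examples, here is a sketch in Python of a Monte Carlo simulation for a simple system-reliability question; the component layout and probabilities are invented for illustration:

    ```python
    import random

    def system_reliability(p_a=0.9, p_b=0.8, p_c=0.95, n_trials=100_000):
        """Monte Carlo estimate for a hypothetical system: components A and B
        in parallel, in series with component C. Exact answer:
        (1 - (1 - p_a)(1 - p_b)) * p_c = 0.98 * 0.95 = 0.931."""
        working = 0
        for _ in range(n_trials):
            a = random.random() < p_a
            b = random.random() < p_b
            c = random.random() < p_c
            if (a or b) and c:
                working += 1
        return working / n_trials

    print(system_reliability())   # ≈ 0.931
    ```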

  16. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may effect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P BH(M ZAMS). Although we find that it is difficult to derive a unique P BH(M ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P BH(M ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P BH(M ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  17. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  18. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  19. The Cognitive Substrate of Subjective Probability

    Science.gov (United States)

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  20. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  1. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  2. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  3. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR § 1.1623 (Mass Media Services, General Procedures): Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number...
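
    A sketch of truncation to a fixed number of significant digits, as the rule prescribes; this illustrates the arithmetic only and is not an implementation of the regulation:

    ```python
    import math

    def truncate_sig(x, sig=3):
        """Truncate (not round) a probability to `sig` significant digits,
        e.g. 0.046875 -> 0.0468 with sig=3."""
        if x == 0:
            return 0.0
        exp = math.floor(math.log10(abs(x)))
        factor = 10 ** (sig - 1 - exp)
        return math.trunc(x * factor) / factor

    print(truncate_sig(0.046875))   # 0.0468
    print(truncate_sig(0.66667))    # 0.666 (rounding would give 0.667)
    ```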

  4. On the measurement probability of quantum phases

    OpenAIRE

    Schürmann, Thomas

    2006-01-01

    We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents an unique and continuous transition between macroscopic and microscopic measurement precisions.

  5. Network class superposition analyses.

    Science.gov (United States)

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141
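
    A toy sketch of the superposition matrix T on a small state space; note that trace(T) then gives the expected number of point attractors across the class. The update rules below are invented stand-ins, not Strong Inhibition networks:

    ```python
    import numpy as np

    def class_superposition(update_rules, n_states):
        """Stochastic matrix T averaging the deterministic transition
        matrices of every network in the class: T[s, t] is the fraction
        of networks that send state s to state t."""
        T = np.zeros((n_states, n_states))
        for f in update_rules:
            for s in range(n_states):
                T[s, f(s)] += 1.0
        return T / len(update_rules)

    # Toy class: three update rules on a 4-state space (a 2-node boolean system)
    rules = [lambda s: s,              # identity: every state is fixed
             lambda s: (s + 1) % 4,    # a single 4-cycle, no fixed points
             lambda s: 0]              # everything collapses to state 0

    T = class_superposition(rules, 4)
    print(T.trace())   # expected point attractors per network: (4 + 0 + 1) / 3
    ```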

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  7. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  8. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  9. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Full Text Available Research on forecasting is effectively limited to forecasts that are expressed with clarity; which is to say that the forecasted event must be sufficiently well-defined so that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, then quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately most real world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real world forecasts, and consequently the accuracy of the methods used to generate real world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
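
    Once verbal forecasts are mapped to inferred probabilities, standard scoring rules apply; a sketch using the Brier score, with an invented verbal-to-probability mapping (not the paper's):

    ```python
    import numpy as np

    def brier_score(forecast_probs, outcomes):
        """Mean squared error between probability forecasts and 0/1 outcomes;
        lower is better, and a constant 0.5 forecast scores 0.25."""
        f = np.asarray(forecast_probs, dtype=float)
        o = np.asarray(outcomes, dtype=float)
        return float(np.mean((f - o) ** 2))

    # Hypothetical mapping of imprecise verbal forecasts to probabilities:
    # "very likely" -> 0.85, "even chance" -> 0.5, "unlikely" -> 0.2
    inferred = [0.85, 0.85, 0.5, 0.2, 0.2]
    occurred = [1, 1, 0, 0, 1]
    print(brier_score(inferred, occurred))
    ```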

  10. On Markov Chains Induced by Partitioned Transition Probability Matrices

    Institute of Scientific and Technical Information of China (English)

    Thomas KAIJSER

    2011-01-01

    Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P. Let K denote the set of probability vectors on S. With every partition M of P we can associate a transition probability function PM on K defined in such a way that if p ∈ K and M ∈ M are such that ‖pM‖ > 0, then, with probability ‖pM‖, the vector p is transferred to the vector pM/‖pM‖. Here ‖· ‖ denotes the l1-norm. In this paper we investigate the convergence in distribution for Markov chains generated by transition probability functions induced by partitions of transition probability matrices. The main motivation for this investigation is the application of the convergence results obtained to filtering processes of partially observed Markov chains with denumerable state space.
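
    A minimal sketch of the induced chain on probability vectors, assuming a hypothetical two-state matrix P split into two nonnegative matrices M1 + M2 = P: with probability ‖pM‖ the vector p moves to pM/‖pM‖, mirroring the filtering process for a partially observed Markov chain.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical partition of a 2-state transition matrix P = M1 + M2.
        M1 = np.array([[0.5, 0.0],
                       [0.2, 0.3]])
        M2 = np.array([[0.1, 0.4],
                       [0.1, 0.4]])

        def step(p, partition):
            # With probability ||p M||_1, the vector p moves to p M / ||p M||_1.
            weights = np.array([(p @ M).sum() for M in partition])
            i = rng.choice(len(partition), p=weights)  # weights sum to 1 since M1 + M2 is stochastic
            v = p @ partition[i]
            return v / v.sum()

        p = np.array([0.5, 0.5])
        for _ in range(5):
            p = step(p, [M1, M2])
        print(p)  # a state of the induced chain on the probability simplex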

  11. Employment and Wage assimilation of Male First Generation Immigrants in Denmark

    DEFF Research Database (Denmark)

    Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;

    Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...

  12. Employment and Wage Assimilation of Male First Generation Immigrants in Denmark

    DEFF Research Database (Denmark)

    Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;

    2000-01-01

    Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...

  13. Employment and Wage Assimilation of Male First-generation immigrants in Denmark

    DEFF Research Database (Denmark)

    Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;

    2001-01-01

    Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...

  14. Quantification of digital forensic hypotheses using probability theory

    OpenAIRE

    Overill, RE; Silomon, JAM; Tse, HKS; Chow, KP

    2013-01-01

    The issue of downloading illegal material from a website onto a personal digital device is considered from the perspective of conventional (Pascalian) probability theory. We present quantitative results for a simple model system by which we analyse and counter the putative defence case that the forensically recovered illegal material was downloaded accidentally by the defendant. The model is applied to two actual prosecutions involving possession of child pornography.

  15. Probability and uncertainty in Keynes's The General Theory

    OpenAIRE

    Gillies, D

    2003-01-01

    Book description: John Maynard Keynes is undoubtedly the most influential Western economist of the twentieth century. His emphasis on the nature and role of uncertainty in economic thought is a dominant theme in his writings. This book brings together a wide array of experts on Keynes' thought such as Gay Tulip Meeks, Sheila Dow and John Davis who discuss, analyse and criticise such themes as Keynesian probability and uncertainty, the foundations of Keynes' economics and the relationship b...

  16. Collision strengths and transition probabilities for Co III forbidden lines

    CERN Document Server

    Storey, P J

    2016-01-01

    In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  17. Collision strengths and transition probabilities for Co III forbidden lines

    Science.gov (United States)

    Storey, P. J.; Sochi, Taha

    2016-07-01

    In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  18. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
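
    In the spirit of the book's approach, here is a minimal Monte Carlo sketch of one classic example: estimating the probability that two dice total 7 from a million simulated rolls (exact value 6/36 ≈ 0.1667).

        import random

        # Simulate a million rolls of a pair of dice and count totals of 7.
        trials = 1_000_000
        hits = sum(1 for _ in range(trials)
                   if random.randint(1, 6) + random.randint(1, 6) == 7)
        print(hits / trials)  # should land close to 1/6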

  19. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  20. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size, and quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
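
    Step (v) reduces to a simple per-pixel combination rule. A minimal sketch with hypothetical probability rasters (the release, impact and zonal values below are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical per-pixel probability rasters on a tiny 4 x 4 grid.
        p_release = rng.uniform(0.0, 0.2, size=(4, 4))   # pixel-based release probability
        p_impact = rng.uniform(0.0, 0.8, size=(4, 4))    # impact probability from angle of reach
        p_zonal = 0.5                                    # zonal release probability (constant here)

        # Step (v): the integrated probability is the maximum of the release
        # probability and the product of impact and zonal release probabilities.
        p_integrated = np.maximum(p_release, p_impact * p_zonal)
        print(p_integrated.round(3))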

  1. Pretest probability assessment derived from attribute matching

    Directory of Open Access Journals (Sweden)

    Hollander Judd E

    2005-08-01

    Background: Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS) with a validated logistic regression equation (LRE). Methods: Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimates. Results: In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for the LRE. The attribute matching system categorized 1,670 (24%; 95% CI 23–25%) patients as having a very low PTP. Conclusion: Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE.
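
    The core of attribute matching is an exact-profile lookup in the reference database, with the PTP taken as the outcome rate among the matches. A toy sketch (the attribute names and the four-row database are hypothetical stand-ins, not the study's variables):

        import pandas as pd

        # Hypothetical reference database: one row per prior ED patient, eight
        # binary or banded clinical attributes plus the outcome (1 = ACS confirmed).
        db = pd.DataFrame({
            "age_band": [3, 3, 2, 3], "male": [1, 1, 0, 1], "chest_pain": [1, 1, 1, 0],
            "diaphoresis": [0, 0, 1, 0], "prior_mi": [0, 0, 0, 1], "st_depression": [0, 0, 0, 0],
            "tachycardia": [0, 0, 0, 0], "diabetes": [0, 0, 1, 0], "acs": [0, 1, 0, 0],
        })

        def attribute_match_ptp(db, profile):
            # PTP = ACS rate among database patients with exactly this attribute profile.
            mask = pd.Series(True, index=db.index)
            for attr, value in profile.items():
                mask &= db[attr] == value
            matches = db[mask]
            return matches["acs"].mean() if len(matches) else None

        profile = {"age_band": 3, "male": 1, "chest_pain": 1, "diaphoresis": 0,
                   "prior_mi": 0, "st_depression": 0, "tachycardia": 0, "diabetes": 0}
        print(attribute_match_ptp(db, profile))  # 0.5: one of the two exact matches had ACS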

  2. Elaboration of the methodological referential for life cycle analysis of first generation biofuels in the French context; Elaboration d'un referentiel methodologique pour la realisation d'Analyses de Cycle de Vie appliquees aux biocarburants de premiere generation en France. Rapport final

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    This study was made under the particular context of a strong growth of the biofuels market, and the implication of French and European public authorities, and certain Member States (Germany, Netherlands, UK), in the development of certification schemes for first generation biofuels. The elaboration of such schemes requires a consensus on the methodology to apply when producing Life Cycle Analyses (LCA) of biofuels. To answer this demand, the study built up the methodological referential for biofuels LCAs in order to assess the Greenhouse Gas (GHG) emissions, fossil fuel consumption and local atmospheric pollutant emissions induced by the different biofuel production pathways. The work consisted of methodological engineering, and was accomplished thanks to the participation of all the members of the Technical Committee of the study. An initial bibliographic review on biofuels LCAs allowed the identification of the main methodological issues (listed below). For each point, the impact of the methodological choices on the biofuels environmental balances was assessed by several sensitivity analyses. The results of these analyses were taken into account for the elaboration of the recommendations: - Consideration of the environmental burdens associated with buildings, equipment and their maintenance - Quantification of nitrous oxide (N₂O) emissions from fields - Impact of the Land Use Change (LUC) - Allocation method for the distribution of the environmental impacts of biofuel production pathways between the different products and coproducts generated. Within the framework of this study, we made no distinction in terms of methodological approach between GHG emissions and local pollutant emissions. This results from the fact that the methodological issues cover all the environmental burdens and do not require specific approaches. This executive summary presents the methodological aspects related to biofuels LCAs. The complete report of the study presents in

  3. Bell Could Become the Copernicus of Probability

    Science.gov (United States)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  4. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  5. Cost-benefit-analyses of selected smart grid use cases for the efficient integration of dispersed generation into low voltage grids; Kosten-Nutzen-Betrachtung zur Umsetzung ausgewaehlter Smart-Grid-Massnahmen bei der Integration dezentraler Erzeuger im Niederspannungsnetz

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, Christian; Backes, Juergen; Li, Ran; Schorn, Christian [EnBW Regional AG, Stuttgart (Germany)

    2011-07-01

    The integration of distributed generation into today's medium voltage (MV) and low voltage (LV) grids in Germany requires high investments for grid reinforcement. The implementation of the EU climate targets until 2020 and the environmental concept of the German Federal Government give a clear indication that the grid connection of renewable and dispersed generation will continue at the pace of recent years. This expectation makes it necessary to develop innovative concepts which allow the grid integration of dispersed generation in a more efficient way than pure conventional grid reinforcement. This paper focuses on two relevant use cases of the Smart Grid: the wide-spread use of MV/LV transformers with on-load tap changers, and the regular use of generation management along with compensation payments for the operators of the generation units. Based on two representative model networks and on different scenarios for the integration of renewable resources, the cost of conventional grid expansion is compared to the cost of this type of smart grid. The result is that the Smart Grid option leads to substantial cost reduction. Whereas the transformer control is effective mainly in rural grids where voltage constraints trigger the grid measures, generation management is a universal option, with significantly higher cost when used. The maximum benefit is achieved when combining both measures. The regular use of generation management, however, is today not permissible under the conditions of the German Renewable Resources Energy Act; it is only allowed as a temporary measure until the grid capacity has been extended. This leads to the recommendation to accept generation management as a regular measure for grid operation in the future. It leads to a more efficient grid operation and, when combined with financial compensation, avoids disadvantages for the operators of renewable generation. (orig.)

  6. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
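
    For intuition, the concordance probability can be estimated on uncensored data by counting pairs in which the lower-risk patient survived longer; the sketch below handles discrete risk scores by skipping pairs tied on risk. This is a simplification: the paper's estimators additionally correct for censoring. All numbers are hypothetical.

        from itertools import combinations

        def concordance_discrete(risk, time):
            # Fraction of usable pairs in which the lower-risk patient lived longer.
            # Pairs tied on risk score or survival time are skipped; no censoring here.
            concordant = usable = 0
            for (r1, t1), (r2, t2) in combinations(zip(risk, time), 2):
                if r1 == r2 or t1 == t2:
                    continue
                usable += 1
                if (r1 < r2) == (t1 > t2):
                    concordant += 1
            return concordant / usable

        risk_group = [1, 1, 2, 2, 3, 3]        # discrete risk scores (1 = lowest risk)
        surv_time = [60, 52, 40, 45, 12, 30]   # hypothetical survival times in months
        print(concordance_discrete(risk_group, surv_time))  # 1.0: risk orders survival perfectly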

  7. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logically organized material presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  8. Probability and Quantum Paradigms: the Interplay

    Science.gov (United States)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  9. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    Science.gov (United States)

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  10. Introduction: Research and Developments in Probability Education

    OpenAIRE

    Manfred Borovcnik; Ramesh Kapadia

    2009-01-01

    In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results, and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the world.

  11. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  12. Time and probability in quantum cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)

    1990-10-01

    A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation. (orig.)

  13. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  14. Bayesian logistic betting strategy against probability forecasting

    CERN Document Server

    Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei

    2012-01-01

    We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.

  15. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as continuity, maximum probability, zero point, and positive probability set standardization, and obtain a series of important results such as the Continuity Theorem, Representation Theorem, and Levy Theorem. These results are very useful for studying stationary tri-point transition probability on a general measurable space (E, ε). Our main tools, such as Egoroff's Theorem, Vitali-Hahn-Saks' Theorem, and the theory of atomic sets and well-posedness of measures, are also of independent interest.

  16. Quantum Statistical Mechanics. III. Equilibrium Probability

    OpenAIRE

    Attard, Phil

    2014-01-01

    Given are a first-principles derivation and formulation of the probabilistic concepts that underlie equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.

  17. Predicting most probable conformations of a given peptide sequence in the random coil state.

    Science.gov (United States)

    Bayrak, Cigdem Sevim; Erman, Burak

    2012-11-01

    In this work, we present a computational scheme for finding high probability conformations of peptides. The scheme calculates the probability of a given conformation of the given peptide sequence using the probability distribution of torsion states. Dependence of the states of a residue on the states of its first neighbors along the chain is considered. Prior probabilities of torsion states are obtained from a coil library. Posterior probabilities are calculated by the matrix multiplication Rotational Isomeric States Model of polymer theory. The conformation of a peptide with highest probability is determined by using a hidden Markov model Viterbi algorithm. First, the probability distribution of the torsion states of the residues is obtained. Using the highest probability torsion state, one can generate, step by step, states with lower probabilities. To validate the method, the highest probability state of residues in a given sequence is calculated and compared with probabilities obtained from the Coil Databank. Predictions based on the method are 32% better than predictions based on the most probable states of residues. The ensemble of "n" high probability conformations of a given protein is also determined using the Viterbi algorithm with multistep backtracking. PMID:22955874
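
    The most probable sequence of torsion states under first-neighbour dependence can be recovered with a standard Viterbi recursion, sketched below; the 3-state alphabet, prior frequencies and transition matrix are hypothetical stand-ins for the coil-library statistics.

        import numpy as np

        def viterbi(prior, trans, n_res):
            # Most probable torsion-state sequence for a chain of n_res residues,
            # where each residue's state depends on that of its chain neighbour.
            n = len(prior)
            logp = np.full((n_res, n), -np.inf)
            back = np.zeros((n_res, n), dtype=int)
            logp[0] = np.log(prior)
            for i in range(1, n_res):
                for s in range(n):
                    scores = logp[i - 1] + np.log(trans[:, s])
                    back[i, s] = np.argmax(scores)
                    logp[i, s] = scores[back[i, s]]
            path = [int(np.argmax(logp[-1]))]
            for i in range(n_res - 1, 0, -1):
                path.append(int(back[i, path[-1]]))
            return path[::-1]

        # Hypothetical 3-state torsion alphabet (e.g., helical, extended, other).
        prior = np.array([0.5, 0.3, 0.2])          # coil-library state frequencies
        trans = np.array([[0.7, 0.2, 0.1],         # P(state of residue i+1 | state of residue i)
                          [0.3, 0.5, 0.2],
                          [0.3, 0.3, 0.4]])
        print(viterbi(prior, trans, n_res=6))      # MAP state sequence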

  18. Analytical theory of the probability distribution function of structure formation

    OpenAIRE

    Anderson, Johan; Kim, Eun-Jin

    2009-01-01

    The probability distribution function (PDF) tails of the zonal flow structure formation and the PDF tails of momentum flux by incorporating effect of a shear flow in ion-temperature-gradient (ITG) turbulence are computed in the present paper. The bipolar vortex soliton (modon) is assumed to be the coherent structure responsible for bursty and intermittent events driving the PDF tails. It is found that stronger zonal flows are generated in ITG turbulence than Hasegawa-Mima (HM) turbulence as w...

  19. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  20. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  1. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  2. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  3. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  4. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.

  5. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  6. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  7. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  8. The probability premium: a graphical representation

    NARCIS (Netherlands)

    L.R. Eeckhoudt; R.J.A. Laeven

    2015-01-01

    We illustrate that Pratt’s probability premium can be given a simple graphical representation allowing a direct comparison to the equivalent but more prevalent concept of risk premium under expected utility. We also show that the probability premium’s graphical representation under the dual theory m

  9. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    Science.gov (United States)

    Caley, Peter; Barry, Simon C

    2014-01-01

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes. PMID: 24788945
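
    Under the simplest constant-detection assumption the paper starts from, the posterior extinction probability after a run of sighting-free years has a closed form; the sketch below illustrates it with invented numbers (the paper's Bayesian models are considerably richer, adding density-dependent survival and detection).

        def extinction_posterior(prior_extinct, p_detect, years_unseen):
            # Posterior probability of extinction after years_unseen years with no
            # sightings, assuming a constant annual detection probability if extant.
            p_no_sightings_if_extant = (1 - p_detect) ** years_unseen
            evidence = prior_extinct + (1 - prior_extinct) * p_no_sightings_if_extant
            return prior_extinct / evidence

        # E.g., a 50:50 prior, a 30% annual chance of detecting an extant population,
        # and 8 consecutive years without a confirmed sighting:
        print(extinction_posterior(0.5, 0.3, 8))  # ≈ 0.95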

  10. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    Directory of Open Access Journals (Sweden)

    Peter Caley

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes.

  11. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  12. An Improved Model of Attack Probability Prediction System

    Institute of Scientific and Technical Information of China (English)

    WANG Hui; LIU Shufen; ZHANG Xinjia

    2006-01-01

    This paper presents a novel probability generation algorithm to predict attacks from an insider who exploits known system vulnerabilities through executing authorized operations. It differs from most intrusion detection systems (IDSs) because such IDSs are ineffective at resolving threats from authorized insiders. To deter cracker activities, this paper introduces an improved structure of augmented attack tree and a notion of "minimal attack tree", and proposes a new generation algorithm for minimal attack trees. We can provide a quantitative approach to help system administrators make sound decisions.

  13. New method for estimating low-earth-orbit collision probabilities

    Science.gov (United States)

    Vedder, John D.; Tabor, Jill L.

    1991-01-01

    An unconventional but general method is described for estimating the probability of collision between an earth-orbiting spacecraft and orbital debris. This method uses a Monte Carlo simulation of the orbital motion of the target spacecraft and each discrete debris object to generate an empirical set of distances, each distance representing the separation between the spacecraft and the nearest debris object at random times. Using concepts from the asymptotic theory of extreme order statistics, an analytical density function is fitted to this set of minimum distances. From this function, it is possible to generate realistic collision estimates for the spacecraft.
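
    The empirical step of the method can be sketched as follows; real orbital propagation is replaced here by random positions in a cube, purely to show how the set of minimum distances is collected and its tail summarized.

        import numpy as np

        rng = np.random.default_rng(42)

        # Toy stand-in for the Monte Carlo step: at each sampled epoch, record the
        # distance from the spacecraft to the nearest of n_debris objects. Positions
        # are random points in a 100 x 100 x 100 box, purely for illustration; the
        # real method propagates actual orbits.
        n_epochs, n_debris = 2_000, 500
        craft = rng.uniform(0.0, 100.0, size=(n_epochs, 3))
        debris = rng.uniform(0.0, 100.0, size=(n_epochs, n_debris, 3))
        min_dist = np.linalg.norm(debris - craft[:, None, :], axis=2).min(axis=1)

        # The paper fits an extreme-value density to this empirical set; here we
        # just report the raw tail frequency below a closest-approach threshold.
        threshold = 2.0
        print((min_dist < threshold).mean())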

  14. Analyses on Generator-transformer Protection Configuration of Xiluodu Hydropower Station

    Institute of Scientific and Technical Information of China (English)

    李光耀; 骆佳勇; 封孝松; 龚林平

    2013-01-01

    The configurations of the generator-transformer protections and of the control and monitoring panels in Xiluodu Hydropower Station are briefly introduced, and the complete longitudinal differential protection, unit-transverse differential protection and rotor earth-fault protection are analyzed in detail. The internal fault simulation of the generator-transformer unit undertaken by Tsinghua University shows that the configuration of the generator-transformer main protection in Xiluodu Hydropower Station is scientific and rational, and that dual configuration of the main protection, abnormal-operation protection and backup protection is achieved. The design of the generator-transformer protection in Xiluodu Hydropower Station is simple and reliable, and its operation and maintenance are easy.

  15. Exact capture probability analysis of GSC receivers over Rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2010-01-01

    For third generation systems and ultrawideband systems, RAKE receivers have been introduced due to their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal to noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance in terms of power loss. Therefore, to have a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, which is defined as the ratio of the captured power (essentially the combined paths' power) to that of the total available power. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
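
    A quick Monte Carlo check of the quantity being derived: over i.i.d. Rayleigh fading, path powers are i.i.d. exponential, GSC combines the strongest k of L resolvable paths, and the capture probability is the combined share of total power (the values of L, k and the trial count below are arbitrary).

        import numpy as np

        rng = np.random.default_rng(7)

        # Rayleigh fading amplitudes imply exponentially distributed path powers.
        L, k, trials = 8, 3, 100_000
        powers = rng.exponential(scale=1.0, size=(trials, L))
        top_k = np.sort(powers, axis=1)[:, -k:]          # the k strongest of L paths
        capture = top_k.sum(axis=1) / powers.sum(axis=1) # captured-power ratio per trial
        print(capture.mean())                            # average capture probability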

  16. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) for qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by keff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  17. Sproglig Metode og Analyse

    DEFF Research Database (Denmark)

    le Fevre Jakobsen, Bjarne

    The publication contains exercise materials, texts, PowerPoint presentations and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) in the BA programme and as an elective in Danish/Nordic studies, 2010-2011.

  18. Angular anisotropy representation by probability tables

    International Nuclear Information System (INIS)

    In this paper, we improve point-wise and group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to the group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Building on our experience of cross-section description acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment-based probability tables, Dirac Probability Tables (DPT) and Step-wise Probability Tables (SPT), where the angular probability distribution is represented by Dirac functions or by a step-wise function, respectively. First, we show how the equi-probable cosine representation of point-wise anisotropy can be improved by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (which can induce negative values), and finally we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)
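
    To make the step-wise table concrete: sampling a scattering cosine from an SPT is a two-stage inverse-CDF draw, picking a bin with its tabulated probability and then drawing uniformly within the bin. The bin edges and probabilities below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        # A step-wise probability table: the scattering cosine mu is piecewise
        # uniform on bins of [-1, 1] with the given bin probabilities.
        edges = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
        probs = np.array([0.1, 0.2, 0.3, 0.4])   # must sum to 1

        def sample_mu(n):
            # Pick a bin with its tabulated probability, then draw uniformly inside it.
            bins = rng.choice(len(probs), size=n, p=probs)
            return rng.uniform(edges[bins], edges[bins + 1])

        mu = sample_mu(100_000)
        print(mu.mean())   # forward-peaked table: positive average cosine (≈ 0.25 here)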

  19. Survival probability in patients with liver trauma.

    Science.gov (United States)

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using anatomical and physiological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma to the liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically significant in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, choose the best treatment and manage resources. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
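
    The fitted model yields a patient-level survival probability through the standard logistic link; a minimal sketch follows, in which the coefficients, intercept and predictor coding are hypothetical rather than the paper's estimates.

        import math

        def survival_probability(intercept, coeffs, x):
            # Logistic model: P(survival) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))).
            z = intercept + sum(b * xi for b, xi in zip(coeffs, x))
            return 1.0 / (1.0 + math.exp(-z))

        # Hypothetical coefficients (not the paper's estimates): injury grade,
        # liver-plus-other-organ trauma (0/1), days hospitalized, operative treatment (0/1).
        coeffs = [-0.9, -1.1, 0.05, -0.4]
        patient = [3, 1, 10, 1]   # grade-III injury, multi-organ, 10 days, operated
        print(survival_probability(5.9, coeffs, patient))  # ≈ 0.90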

  20. Mechanical and structural analyses supporting the steam generator replacement and power uprating at the Krško NPP

    OpenAIRE

    Krajnc, Božidar; Župec, Janez

    2000-01-01

    The Krško nuclear power plant (NPP) is one of the last pressurized water reactor NPPs of western design in Europe which has decided to replace the existing steam generators and at the same time perform a power uprating. A comprehensive set of design calculations and safety analyses has been performed to demonstrate: - that the new steam generators are compatible with the existing plant; - that the plant can operate safely and with adequate margins at the uprated power. In this paper only the mechanic...

  1. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained through the involvement of the observer.

  2. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  3. Basic Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems--as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability theory. These key concepts are all presented in the first

  4. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  5. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    Hongyan SUN; Long WANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types (δ-function, asymmetric, and symmetric) by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent, or prior, to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-normal number (in Borel's sense) as the initial value.
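
    The symmetric (uniform) case is easy to visualize numerically. The sketch below pushes an ensemble of random initial values through the surjective tent map and histograms the result; iterating a single orbit for long in floating point is unreliable, since dyadic rounding drives it to 0, so an ensemble with few iterations is used instead.

        import numpy as np

        rng = np.random.default_rng(5)

        def tent(x):
            # Surjective tent map on [0, 1]: T(x) = 1 - |1 - 2x|.
            return 1.0 - np.abs(1.0 - 2.0 * x)

        # The uniform density is invariant for this map, so a uniform ensemble
        # stays (approximately) uniform under iteration.
        x = rng.uniform(0.0, 1.0, size=200_000)
        for _ in range(10):
            x = tent(x)
        hist, _ = np.histogram(x, bins=10, range=(0.0, 1.0), density=True)
        print(hist)   # approximately flat at 1.0: the symmetric-type distribution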

  6. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  7. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  8. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform the approximate multidimensional data analysis considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  9. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects.

  10. The influence of initial beliefs on judgments of probability

    Directory of Open Access Journals (Sweden)

    Erica Catherine Yu

    2012-10-01

    Full Text Available This study aims to investigate whether experimentally-induced prior beliefs affect the processing of evidence, including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of the priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors; consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance, while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.

  11. Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities

    OpenAIRE

    Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina

    2012-01-01

    More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer number. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...

  12. Choosing information variables for transition probabilities in a time-varying transition probability Markov switching model

    OpenAIRE

    Andrew J. Filardo

    1998-01-01

    This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...

  13. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents, such as severe nuclear reactor accidents and catastrophic oil tanker spills, events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report

  14. Probability of spent fuel transportation accidents

    Energy Technology Data Exchange (ETDEWEB)

    McClure, J. D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 × 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.
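
    As a worked example of the arithmetic behind such per-mile rates, the sketch below converts the quoted 5 × 10^-7 accidents per mile into an expected annual count and a probability of at least one accident; the annual mileage figure and the Poisson assumption are hypothetical, not from the report.

```python
import math

rate_per_mile = 5e-7   # per-mile accident rate quoted above
annual_miles = 2.0e5   # hypothetical annual spent fuel shipment mileage

expected_accidents = rate_per_mile * annual_miles
print(f"expected accidents per year: {expected_accidents:.3f}")

# Probability of at least one accident in a year, assuming accidents
# arrive as a Poisson process with this mean.
p_at_least_one = 1 - math.exp(-expected_accidents)
print(f"P(at least one accident per year): {p_at_least_one:.3f}")
```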

  15. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  16. Transition probabilities in superfluid He4

    International Nuclear Information System (INIS)

    The transition probabilities between various states of superfluid helium-4 are found by using the approximation method of Bogolyubov and making use of his canonical transformations for different states of transitions. (author)

  17. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
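
    A minimal sketch of a logistic-regression fire-probability model of the kind described above; the building attributes, coefficients and data are synthetic placeholders, not those of the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical building attributes: [age_years, floor_area_m2, has_fire_alarm]
X = np.column_stack([
    rng.uniform(0, 100, 500),
    rng.uniform(50, 2000, 500),
    rng.integers(0, 2, 500),
])

# Synthetic labels (1 = fire occurred), generated from invented coefficients
logits = 0.02 * X[:, 0] + 0.0005 * X[:, 1] - 1.0 * X[:, 2] - 3.0
y = rng.random(500) < 1 / (1 + np.exp(-logits))

model = LogisticRegression().fit(X, y)

# Predicted fire probability for one hypothetical building,
# the quantity that would be drawn on a probability map
print(model.predict_proba([[60, 800, 0]])[0, 1])
```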

  18. Inclusion probability with dropout: an operational formula.

    Science.gov (United States)

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, and hence slightly modifies the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
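
    For orientation, here is the classical one-locus probability of inclusion without dropout (a random person is included if both of their alleles are among the visible ones), combined across loci via the cumulative probability of inclusion; the paper's dropout-aware formula is not reproduced, and all frequencies are hypothetical.

```python
import math

def pi_one_locus(visible_allele_freqs):
    """Classical RMNE at one locus (no dropout): a random person is
    'included' if both of their alleles are among the visible ones."""
    return sum(visible_allele_freqs) ** 2

def cumulative_pi(loci):
    """Cumulative probability of inclusion across independent loci."""
    return math.prod(pi_one_locus(freqs) for freqs in loci)

# Hypothetical frequencies of the visible alleles at three loci
loci = [[0.10, 0.20], [0.05, 0.15, 0.10], [0.30]]
print(f"CPI = {cumulative_pi(loci):.6f}")
```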

  19. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
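
    A standard first-order relation behind the Rayleigh assumption mentioned above is that the modal maximum of N Rayleigh-distributed individual wave heights is H_max ≈ Hs·sqrt(ln N / 2). The sketch below applies it with hypothetical numbers; it is not the paper's full encounter-probability method.

```python
import math

def expected_max_wave_height(hs, n_waves):
    """Modal maximum of n Rayleigh-distributed individual wave heights:
    the standard first-order approximation H_max ~= Hs * sqrt(ln N / 2)."""
    return hs * math.sqrt(math.log(n_waves) / 2.0)

# Hypothetical storm: significant wave height 8 m, 1000 waves
print(f"H_max ~ {expected_max_wave_height(8.0, 1000):.1f} m")
```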

  20. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  1. Asymmetry of the work probability distribution

    OpenAIRE

    Saha, Arnab; Bhattacharjee, J. K.

    2006-01-01

    We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.

  2. Transition Probability and the ESR Experiment

    Science.gov (United States)

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  3. Transition Probability Estimates for Reversible Markov Chains

    OpenAIRE

    Telcs, Andras

    2000-01-01

    This paper provides transition probability estimates of transient reversible Markov chains. The key condition of the result is the spatial symmetry and polynomial decay of the Green's function of the chain.

  4. The Animism Controversy Revisited: A Probability Analysis

    Science.gov (United States)

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  5. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  6. Avoiding Negative Probabilities in Quantum Mechanics

    OpenAIRE

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless questi...

  7. Probability, clinical decision making and hypothesis testing

    Directory of Open Access Journals (Sweden)

    A Banerjee

    2009-01-01

    Full Text Available Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.

  8. Breakdown Point Theory for Implied Probability Bootstrap

    OpenAIRE

    Lorenzo Camponovo; Taisuke Otsu

    2011-01-01

    This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...

  9. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  10. The Pauli equation for probability distributions

    International Nuclear Information System (INIS)

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  11. The Pauli equation for probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man'ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man'ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it

    2001-04-27

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  12. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  13. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model in which a linear time series model describes the claim process. The time is discretized stochastically at the instants when claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
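
    For context, the classical Lundberg bound is the prototype of the exponential upper bounds mentioned here; it is stated below for the compound Poisson model, whose assumptions differ from this paper's linear time series setting.

```latex
% Classical Lundberg bound (illustrative prototype; compound Poisson
% model, not the linear time series model of the paper):
\[
  \psi(u) \le e^{-Ru}, \qquad u \ge 0,
\]
% where u is the initial surplus and the adjustment coefficient R > 0
% is the positive root of
\[
  \lambda + c r = \lambda\, M_X(r),
\]
% with claim arrival rate \lambda, premium rate c, and claim-size
% moment generating function M_X.
```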

  14. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer......

  15. A case concerning the improved transition probability

    OpenAIRE

    Tang, Jian; Wang, An Min

    2006-01-01

    As is well known, the existing perturbation theory can be applied to calculations of energy, state and transition probability in many quantum systems. In our view, however, there are different paths and methods for improving its calculation precision and efficiency. According to an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a Hydrogen atom in a constant magnetic field. We find the results obt...

  16. Atomic transition probabilities of neutral samarium

    International Nuclear Information System (INIS)

    Absolute atomic transition probabilities from a combination of new emission branching fraction measurements using Fourier transform spectrometer data with radiative lifetimes from recent laser induced fluorescence measurements are reported for 299 lines of the first spectrum of samarium (Sm I). Improved values for the upper and lower energy levels of these lines are also reported. Comparisons to published transition probabilities from earlier experiments show satisfactory and good agreement with two of the four published data sets. (paper)
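
    The combination described above rests on the standard relation A = BF/τ between a line's transition probability, its emission branching fraction BF, and the radiative lifetime τ of the upper level; a short illustration with hypothetical values follows.

```python
def transition_probability(branching_fraction, lifetime_s):
    """Einstein A coefficient from an emission branching fraction and
    the radiative lifetime of the upper level: A = BF / tau."""
    return branching_fraction / lifetime_s

# Hypothetical line: BF = 0.25, upper-level lifetime 50 ns
print(f"A = {transition_probability(0.25, 50e-9):.3e} s^-1")
```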

  17. Validation of fluorescence transition probability calculations

    OpenAIRE

    M. G. Pia (INFN, Sezione di Genova); P. Saracco (INFN, Sezione di Genova); Manju Sudhaka (INFN, Sezione di Genova)

    2015-01-01

    A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimen...

  18. Generalized couplings and convergence of transition probabilities

    OpenAIRE

    Kulik, Alexei; Scheutzow, Michael

    2015-01-01

    We provide sufficient conditions for the uniqueness of an invariant measure of a Markov process as well as for the weak convergence of transition probabilities to the invariant measure. Our conditions are formulated in terms of generalized couplings. We apply our results to several SPDEs for which unique ergodicity has been proven in a recent paper by Glatt-Holtz, Mattingly, and Richards and show that under essentially the same assumptions the weak convergence of transition probabilities actu...

  19. Semiclassical transition probabilities for interacting oscillators

    OpenAIRE

    Khlebnikov, S. Yu.

    1994-01-01

    Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular Minkowskian trajectories have approximate stopping points or, equivalently, are approximately pe...

  20. Country Default Probabilities: Assessing and Backtesting

    OpenAIRE

    Vogl, Konstantin; Maltritz, Dominik; Huschens, Stefan; Karmann, Alexander

    2006-01-01

    We address the problem how to estimate default probabilities for sovereign countries based on market data of traded debt. A structural Merton-type model is applied to a sample of emerging market and transition countries. In this context, only few and heterogeneous default probabilities are derived, which is problematic for backtesting. To deal with this problem, we construct likelihood ratio test statistics and quick backtesting procedures.

  1. Transition probability studies in 175Au

    OpenAIRE

    Grahn, Tuomas; Watkins, H.; Joss, David; Page, Robert; Carroll, R. J.; Dewald, A.; Greenlees, Paul; Hackstein, M.; Herzberg, Rolf-Dietmar; Jakobsson, Ulrika; Jones, Peter; Julin, Rauno; Juutinen, Sakari; Ketelhut, Steffen; Kröll, Th

    2013-01-01

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms...

  2. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  3. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
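
    The sketch below illustrates one such likelihood, assuming a Beta mixing distribution for p: the zero-inflated binomial contribution of a single site, with the detection probability integrated out numerically. The function name and parameter values are hypothetical, not from the article.

```python
from scipy import integrate, stats

def site_likelihood(y, J, psi, a, b):
    """Likelihood for one site: y detections in J visits, occupancy
    probability psi, detection probability p ~ Beta(a, b) integrated out."""
    def integrand(p):
        return stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b)
    binom_part, _ = integrate.quad(integrand, 0, 1)
    if y == 0:
        # Zero-inflation: either unoccupied, or occupied but never detected
        return psi * binom_part + (1 - psi)
    return psi * binom_part

# Hypothetical survey: 3 visits, 1 detection, psi = 0.6, p ~ Beta(2, 5)
print(site_likelihood(1, 3, 0.6, 2.0, 5.0))
```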

  4. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
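
    A minimal sketch of fitting a truncated exponential to slip values using scipy's truncexpon parameterization (the shape b is the truncation point in units of the scale); the data are synthetic and the fitting choices are assumptions, not the authors' procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic "slip" sample (m): exponential truncated at s_max
s_max, scale_true = 10.0, 2.5
slip = stats.truncexpon.rvs(b=s_max / scale_true, scale=scale_true,
                            size=2000, random_state=rng)

# Fit shape (truncation point, via b) and scale; location fixed at 0
b_hat, loc_hat, scale_hat = stats.truncexpon.fit(slip, floc=0)
print(f"fitted max slip ~ {b_hat * scale_hat:.2f} m, scale ~ {scale_hat:.2f} m")
```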

  5. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, both the reliability of the welding process (that no critical defects will be born) and the non-destructive testing (NDT) process (all critical defects will be detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. The possibility of human errors in the NDT process was also taken into account in a simple manner. At this moment there is very little representative data to determine the reliability of welding, and the data on NDT is not well suited for the needs of this study. Therefore the calculations presented here are based on expert judgements and on several assumptions that have not been verified yet. The Bayesian probability model shows the importance of the uncertainty in the estimation of the reliability parameters. The effect of uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters, compared to the binomial probability distribution in the case of known parameter values. In order to reduce the uncertainty, more information is needed on both the reliability of the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in typical test data which is used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)

  6. Probability detection of defects in tubes components and their weldments using non-destructive methods

    International Nuclear Information System (INIS)

    A methodology for estimating the probability of defect detection and the probability of defect absence in tube components and weldments is presented. The suggested approach gives important information and may be applied to the service life assessment of NPP equipment. Concrete examples of the calculation of reliability indexes are given for steam generator tubes, butt-welded joints of steam generator tubes, and welded joints connecting tubes and tube sheets.

  7. The assessment of low probability containment failure modes using dynamic PRA

    Science.gov (United States)

    Brunett, Acacia Joann

    a significant threat to containment integrity. Additional scoping studies regarding the effect of recovery actions on in-vessel hydrogen generation show that reflooding a partially degraded core does not significantly affect hydrogen generation in-vessel, and the NUREG-1150 assumption that insufficient hydrogen is generated in-vessel to produce an energetic deflagration is confirmed. The DET analyses performed in this work show that very late power recovery produces the potential for very energetic combustion events which are capable of failing containment with a non-negligible probability, and that containment cooling systems have a significant impact on core concrete attack, and therefore on combustible gas generation ex-vessel. Ultimately, the overall risk of combustion-induced containment failure is low, but its conditional likelihood can have a significant effect on accident mitigation strategies. It is also shown in this work that DETs are particularly well suited to examine low probability events because of their ability to rediscretize CDFs and observe solution convergence.

  8. Interaction of vanadium (IV) solvates (L) with second-generation fluoroquinolone antibacterial drug ciprofloxacin: Spectroscopic, structure, thermal analyses, kinetics and biological evaluation (L = An, DMF, Py and Et3N)

    Science.gov (United States)

    Zordok, Wael A.

    2014-08-01

    This work reports the preparation and characterization of the new solid complexes [VO(CIP)2L]SO4·nH2O, where L = aniline (An), dimethylformamide (DMF), pyridine (Py) and triethylamine (Et3N), obtained from the reaction of ciprofloxacin (CIP) with VO(SO4)2·2H2O in ethanol. The isolated complexes have been characterized by their melting points, elemental analysis, IR spectroscopy, magnetic properties, conductance measurements, UV-Vis. and 1H NMR spectroscopic methods and thermal analyses. The results supported the formation of the complexes and indicated that ciprofloxacin reacts as a bidentate ligand bound to the vanadium ion through the pyridone oxygen and one carboxylato oxygen. The activation energies, E*; entropies, ΔS*; enthalpies, ΔH*; and Gibbs free energies, ΔG*, of the thermal decomposition reactions have been derived from thermogravimetric (TGA) and differential thermogravimetric (DTG) curves, using the Coats-Redfern and Horowitz-Metzger methods. The lowest energy model structure of each complex has been proposed by using density functional theory (DFT) at the B3LYP/CEP-31G level of theory. The ligand and its metal complexes were also evaluated for their antibacterial activity against several bacterial species, such as Bacillus subtilis (B. subtilis), Staphylococcus aureus (S. aureus), Neisseria gonorrhoeae (N. gonorrhoeae), Pseudomonas aeruginosa (P. aeruginosa) and Escherichia coli (E. coli).
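
    The Coats-Redfern step mentioned above amounts to a linear fit: ln[g(α)/T²] plotted against 1/T has slope -E*/R. The sketch below recovers a known activation energy from synthetic data; the temperature range, noise level and implied g(α) choice are assumptions for illustration only.

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

# Synthetic Coats-Redfern data: y = ln[g(alpha)/T^2] is linear in 1/T
# with slope -E/R (intercept and noise invented for the example).
rng = np.random.default_rng(5)
T = np.linspace(500, 700, 30)        # K, hypothetical decomposition range
E_true = 120e3                       # J/mol
y = 3.0 - E_true / (R * T) + rng.normal(0, 0.05, T.size)

slope, intercept = np.polyfit(1.0 / T, y, 1)
print(f"E* ~ {-slope * R / 1000:.1f} kJ/mol")  # should recover ~120 kJ/mol
```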

  9. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
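
    The counting-to-probability step can be sketched as follows, assuming the Weibull law is applied directly to the small-event count; the calibration parameters here are hypothetical, not from the study.

```python
import math

def weibull_probability(n_small, beta, alpha):
    """Cumulative Weibull law turning a small-event count since the last
    large event into a probability that the next large event is 'due'."""
    return 1.0 - math.exp(-((n_small / beta) ** alpha))

# Hypothetical calibration: characteristic count beta = 100, shape alpha = 1.5
for n in (25, 50, 100, 200):
    print(n, round(weibull_probability(n, 100.0, 1.5), 3))
```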

  10. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  11. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove;

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits...... the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  12. ESTIMATION OF INTRUSION DETECTION PROBABILITY BY PASSIVE INFRARED DETECTORS

    Directory of Open Access Journals (Sweden)

    V. V. Volkhonskiy

    2015-07-01

    Full Text Available Subject of Research. The paper deals with estimation of the probability of intruder detection by passive infrared detectors under different conditions of velocity and direction, for automated analysis of the effectiveness of physical protection systems. Method. Analytic formulas for detection distance distribution laws, obtained by approximating experimental histograms, are used. Main Results. The applicability of different distribution laws has been studied: the Rayleigh, Gauss, Gamma, Maxwell and Weibull distributions. Based on walk test results, experimental histograms of detection distance probability distribution laws for passive infrared detectors were approximated. Conformity of the histograms to the mentioned analytical laws according to the χ2 fitting criterion has been checked for different conditions of velocity and direction of intruder movement. The mean and variance of the approximate distribution laws were set equal to the same parameters of the experimental histograms for the corresponding intruder movement parameters. Approximation accuracy for the above mentioned laws was evaluated at a significance level of 0.05. According to the χ2 fitting criterion, the Rayleigh and Gamma laws correspond most closely to the histograms for different velocities and directions of intruder movement. Dependences of approximation accuracy on the conditions of intrusion have been obtained; they are usable for choosing an approximation law under given conditions. Practical Relevance. The analytic formulas for detection probability are usable for modeling the intrusion process and for objective effectiveness estimation of physical protection systems by both developers and users.
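
    A rough sketch of the comparison described above: fit candidate laws to detection-distance data and score each with a χ2 statistic over histogram bins. The data are synthetic and the binning is an assumption, so this only mirrors the shape of the procedure, not the paper's walk-test data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic detection distances (m), generated from a Rayleigh law
distances = stats.rayleigh.rvs(scale=6.0, size=300, random_state=rng)

observed, edges = np.histogram(distances, bins=10)
n = distances.size

for name, dist, params in [
    ("rayleigh", stats.rayleigh, stats.rayleigh.fit(distances, floc=0)),
    ("gamma", stats.gamma, stats.gamma.fit(distances, floc=0)),
]:
    # Expected bin counts under the fitted law
    expected = n * np.diff(dist.cdf(edges, *params))
    chi2 = np.sum((observed - expected) ** 2 / expected)
    print(f"{name}: chi2 = {chi2:.1f}")  # smaller = closer fit
```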

  13. Reduced reward-related probability learning in schizophrenia patients

    Directory of Open Access Journals (Sweden)

    Yılmaz A

    2012-01-01

    Full Text Available Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale, but that it was not related to antipsychotic dose. In conclusion, the present study showed that the schizophrenia patients had impaired reward-based learning and that this was independent from their medication status. Keywords: reinforcement learning, reward, punishment, motivation

  14. Analyses on the Competence of Successors of Family Business in the Perspective of Trans-generational Succession%代际传承视角下家族企业继任者胜任力分析

    Institute of Scientific and Technical Information of China (English)

    林剑; 张向前

    2013-01-01

    The quality of the competence of successors is a key factor affecting the development of family business. First, this paper constructs a competency model of successors of family business, the KAP (Knowledge, Ability and Personality) model, through qualitative study; it then analyses data collected through on-site research, using SPSS statistical software and the AMOS structural equation modeling tool for verification. The results show that the empirical findings basically tally with the hypothesized theoretical model. They also indicate that there are discrepancies in the components of competence between successors and the founders of family business. Finally, the paper puts forward concrete suggestions, covering preparations before succession, trials during succession and innovations after succession, for promoting the competence of successors of family business.

  15. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  16. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  17. Consistent probabilities in loop quantum cosmology

    CERN Document Server

    Craig, David A

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler-DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce vs. a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation v...

  18. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    A global power balance code employing Monte Carlo techniques had been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
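
    A toy version of such a Monte Carlo "probability of ignition" estimate follows: sample uncertain physics parameters from assumed distributions and count the fraction of trials satisfying an ignition criterion. The two-parameter criterion, distributions and threshold below are invented placeholders standing in for the full power balance.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 100_000

# Hypothetical uncertain inputs: confinement-time multiplier and
# density-profile peaking factor, with invented distributions
h_factor = rng.lognormal(mean=0.0, sigma=0.2, size=n_trials)
peaking = rng.normal(loc=1.5, scale=0.3, size=n_trials)

# Toy ignition criterion standing in for the full power balance:
# ignition if the combined multiplier exceeds a threshold.
ignites = h_factor * peaking > 1.8
print(f"P(ignition) ~ {ignites.mean():.3f}")
```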

  19. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  20. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under......Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between...... certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models....

  1. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility which were established by different authors at different points of time on the basis of some well known consistency principles cannot provide the desired result. That is why the paper discusses some prominent works on transformations between probability and possibility and finally aims to suggest a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested hereby would in turn replace all others that exist in the literature, by providing a reliable estimate of consistency between the two. Furthermore, some properties of entropy of fuzzy numbers are also presented in this article.
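
    One classical transformation discussed in this literature is the ratio-scale rule π_i = p_i / max_j p_j, which maps a probability vector to a normalized possibility distribution. The two-line sketch below implements that earlier proposal, not the article's new principle.

```python
def prob_to_poss(p):
    """Ratio-scale probability-to-possibility transformation
    (a classical proposal): pi_i = p_i / max(p)."""
    m = max(p)
    return [x / m for x in p]

print(prob_to_poss([0.5, 0.3, 0.2]))  # most probable outcome gets possibility 1
```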

  2. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  3. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.

  4. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  5. Match probabilities in racially admixed populations.

    Science.gov (United States)

    Lange, K

    1993-02-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors.
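
    Under Hardy-Weinberg and linkage equilibrium, the product rule referred to above reduces to multiplying per-locus genotype frequencies (p² for a homozygote, 2pq for a heterozygote); a minimal sketch with hypothetical allele frequencies follows.

```python
import math

def genotype_freq(p, q=None):
    """Hardy-Weinberg genotype frequency: p^2 for a homozygote,
    2pq for a heterozygote."""
    return p * p if q is None else 2 * p * q

# Hypothetical profile: two heterozygous loci and one homozygous locus
loci = [genotype_freq(0.1, 0.2), genotype_freq(0.05, 0.3), genotype_freq(0.15)]

# Linkage equilibrium justifies multiplying across loci (the product rule)
match_probability = math.prod(loci)
print(f"match probability = {match_probability:.2e}")
```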

  6. Introduction: Research and Developments in Probability Education

    Directory of Open Access Journals (Sweden)

    Manfred Borovcnik

    2009-10-01

    Full Text Available In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the world and the result of an experiment in electronic communication. For the convenience of international readers, abstracts in Spanish and German have been supplied, as well as hints for navigation to linked electronic materials.

  7. Sampling Quantum Nonlocal Correlations with High Probability

    Science.gov (United States)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩), where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure in the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
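
    Sampling according to the Haar measure on the unit sphere of R^m can be done by normalizing standard Gaussian vectors; the sketch below builds one such correlation matrix γ for hypothetical n and m. It only constructs the object studied above; it does not test locality.

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_sphere(n_vectors, m):
    """Uniform (Haar) samples on the unit sphere of R^m:
    normalized standard Gaussian vectors."""
    g = rng.standard_normal((n_vectors, m))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

n, m = 50, 25           # hypothetical sizes, alpha = m / n = 0.5
u, v = haar_sphere(n, m), haar_sphere(n, m)
gamma = u @ v.T         # gamma[i, j] = <u_i, v_j>
print(gamma.shape, float(np.abs(gamma).max()))
```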

  8. Report sensory analyses veal

    OpenAIRE

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull, pink veal and white veal. The sensory descriptive analyses show that the three groups Young bulls, pink veal and white veal, differ significantly in red colour for the raw meat as well as the baked...

  9. ROSA/LSTF Tests and RELAP5 Posttest Analyses for PWR Safety System Using Steam Generator Secondary-Side Depressurization against Effects of Release of Nitrogen Gas Dissolved in Accumulator Water

    Directory of Open Access Journals (Sweden)

    Takeshi Takeda

    2016-01-01

    Two tests related to a new safety system for a pressurized water reactor were performed with the ROSA/LSTF (rig of safety assessment/large scale test facility). The tests simulated cold leg small-break loss-of-coolant accidents with a 2-inch diameter break using an early steam generator (SG) secondary-side depressurization with or without release of nitrogen gas dissolved in accumulator (ACC) water. The SG depressurization was initiated by fully opening the depressurization valves in both SGs immediately after a safety injection signal. The pressure difference between the primary and SG secondary sides after the actuation of the ACC system was larger in the test with the dissolved-gas release than in the test without it. No core uncovery or heatup took place because of the ACC coolant injection and two-phase natural circulation. Long-term core cooling was ensured by the actuation of the low-pressure injection system. The RELAP5 code predicted most of the overall trends of the major thermal-hydraulic responses after adjusting a break discharge coefficient for two-phase discharge flow, under the assumption that all the dissolved gas is released at the vessel upper plenum.

  10. Conditional Probabilities and Collapse in Quantum Measurements

    Science.gov (United States)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.

  11. Probability, statistics, and decision for civil engineers

    CERN Document Server

    Benjamin, Jack R

    2014-01-01

    Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and…

  12. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that of the smallest statistics a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
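
    For reference, the Benjamini-Hochberg step-up rule mentioned above rejects the k smallest p-values, where k is the largest i with p_(i) <= (i/m)q. A minimal sketch:

        # Benjamini-Hochberg step-up procedure at FDR level q.
        import numpy as np

        def benjamini_hochberg(pvals, q=0.05):
            p = np.asarray(pvals)
            m = p.size
            order = np.argsort(p)
            below = p[order] <= (np.arange(1, m + 1) / m) * q
            k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
            rejected = np.zeros(m, dtype=bool)
            rejected[order[:k]] = True    # reject the k smallest p-values
            return rejected

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.27, 0.60]))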

  13. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  14. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth;

    2003-01-01

    …of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  15. Steering in spin tomographic probability representation

    Science.gov (United States)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.

  16. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of probability theory stimulated further research into harmonic analysis. Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro…

  17. Probability groups as orbits of groups

    International Nuclear Information System (INIS)

    The set of double cosets of a group with respect to a subgroup and the set of orbits of a group with respect to a group of automorphisms have structures which can be studied as multigroups, hypergroups or Pasch geometries. When the subgroup or the group of automorphisms is finite, the multivalued products can be provided with some weightages forming so-called Probability Groups. It is shown in this paper that some abstract probability groups can be realized as orbit spaces of groups. (author)

  18. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall…

  19. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St…

  20. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  1. A structural model of intuitive probability

    CERN Document Server

    Dessalles, Jean-Louis

    2011-01-01

    Though the ability of human beings to deal with probabilities has been put into question, the assessment of rarity is a crucial competence underlying much of human decision-making and is pervasive in spontaneous narrative behaviour. This paper proposes a new model of rarity and randomness assessment, designed to be cognitively plausible. Intuitive randomness is defined as a function of structural complexity. It is thus possible to assign probability to events without being obliged to consider the set of alternatives. The model is tested on Lottery sequences and compared with subjects' preferences.

  2. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation…

  3. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the workings…

  4. Quantum probability and quantum decision making

    CERN Document Server

    Yukalov, V I

    2016-01-01

    A rigorous general definition of quantum probability is given, which is valid for elementary events and for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  5. Transition probability studies in 175Au

    International Nuclear Information System (INIS)

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from the measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms of available systematics as a function of atomic number and aligned angular momentum.

  6. Electric quadrupole transition probabilities for atomic lithium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii, and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  7. Poisson spaces with a transition probability

    OpenAIRE

    Landsman, N. P.

    1997-01-01

    The common structure of the space of pure states $P$ of a classical or a quantum mechanical system is that of a Poisson space with a transition probability. This is a topological space equipped with a Poisson structure, as well as with a function $p: P \times P \to [0,1]$, with certain properties. The Poisson structure is connected with the transition probabilities through unitarity (in a specific formulation intrinsic to the given context). In classical mechanics, where $p(\rho,\sigma)=\delta_{\rho\sigma}$…

  8. What is probability? The importance of probability when dealing with technical risks

    International Nuclear Information System (INIS)

    The book handles the following themes: different aspects of the probability concept, including the mathematical fundamentals; the importance of probability concepts for estimating the effects of various activities; the link between risk and time and the use of concepts for describing this link; the application of the probability concept in various engineering fields; and complementary approaches to the probabilistic safety analysis of systems. figs., tabs., refs

  9. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, M.A.; Luyten, J.W.; Scheerens, J.; Sleegers, P.J.C.; Scheerens, J.

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used…

  10. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants in order to help ensure that the safety of the plant is adequate in all plant operational states.

  11. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  12. Report sensory analyses veal

    NARCIS (Netherlands)

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,

  13. Analyses on the Actual Power-generating Capability of Solar Panels in Wenzhou

    Institute of Scientific and Technical Information of China (English)

    华晓玲; 梁步猛; 吴桂初

    2014-01-01

    In this paper, solar panels made of thin film, monocrystalline silicon and polycrystalline silicon were introduced and compared, and several methods to achieve maximum power point tracking (MPPT) were summarized as well. Combined with the solar radiation conditions in Wenzhou, a large amount of data was acquired using the perturb-and-observe algorithm and then analyzed to determine the power-generating capability by calculating the specific power of each panel. Experimental results indicated that monocrystalline silicon solar panels always perform best, whether light intensity is strong or weak. Between the other two types, thin film solar panels have the advantage over polysilicon solar panels when light intensity is high, while polysilicon solar panels generate more power than thin film panels when light intensity is weak.
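
    The perturb-and-observe method used for the measurements above is a simple feedback loop. A schematic sketch (read_power and set_voltage are hypothetical stand-ins for real panel instrumentation):

        # Schematic perturb-and-observe MPPT: nudge the operating voltage,
        # keep the direction while power rises, reverse it when power falls.
        def perturb_and_observe(read_power, set_voltage, v0, step=0.1, iters=200):
            v, p_prev, direction = v0, read_power(v0), +1
            for _ in range(iters):
                v += direction * step
                p = read_power(v)
                if p < p_prev:
                    direction = -direction
                p_prev = p
                set_voltage(v)
            return v

        # Toy power curve with a maximum near 17 V (purely illustrative):
        v_mpp = perturb_and_observe(lambda v: 80.0 - (v - 17.0) ** 2,
                                    lambda v: None, v0=12.0)
        print(f"settled near {v_mpp:.1f} V")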

  14. Vehicle Detection Based on Probability Hypothesis Density Filter

    Science.gov (United States)

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has improved significantly. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase to detect potential objects and the hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art. PMID:27070621

  15. A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures

    Institute of Scientific and Technical Information of China (English)

    李典庆; 张圣坤; 唐文勇

    2003-01-01

    Model uncertainty exists in the probability of detection when ship structures are inspected with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new POD model is proposed for updating the crack size distribution. The theoretical derivation shows that most existing POD models are special cases of the new model. The least squares method is adopted to determine the values of the parameters in the new POD model, which is then compared with the existing models; the results indicate that the new POD model fits the inspection data better. The new model is applied to the problem of crack size updating for offshore structures, and the Bayesian updating method is used to analyze the effect of the POD model on the posterior distribution of crack size. The results show that different POD models generate different posterior crack size distributions for offshore structures.
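
    As an illustration of the least squares step described above (a hedged sketch: a common log-logistic POD curve and hypothetical inspection data, not the paper's exact model):

        # Fit POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma)) by least squares.
        import numpy as np
        from scipy.optimize import curve_fit

        def pod(a, mu, sigma):
            return 1.0 / (1.0 + np.exp(-(np.log(a) - mu) / sigma))

        crack_sizes = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])       # mm, hypothetical
        detect_rates = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.98]) # hypothetical

        (mu, sigma), _ = curve_fit(pod, crack_sizes, detect_rates, p0=(1.0, 1.0))
        print(f"fitted mu = {mu:.2f}, sigma = {sigma:.2f}")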

  16. Generic Degraded Congiguration Probability Analysis for DOE Codisposal Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-05-23

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  17. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  18. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  19. Probable Bright Supernova discovered by PSST

    Science.gov (United States)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  20. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  1. The Pauli Equation for Probability Distributions

    OpenAIRE

    Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.

    2000-01-01

    The "marginal" distributions for measurable coordinate and spin projection is introduced. Then, the analog of the Pauli equation for spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  2. The Pauli Equation for Probability Distributions

    CERN Document Server

    Mancini, S; Man'ko, V I; Tombesi, P

    2001-01-01

    The "marginal" distributions for measurable coordinate and spin projection is introduced. Then, the analog of the Pauli equation for spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  3. Partially Specified Probabilities: Decisions and Games

    OpenAIRE

    Ehud Lehrer

    2012-01-01

    The paper develops a theory of decision making based on partially specified probabilities. It takes an axiomatic approach using Anscombe and Aumann's (1963) setting, and is based on the concave integral for capacities. This theory is then expanded to interactive models in order to extend Nash equilibrium by introducing the concept of partially specified equilibrium. (JEL C70, D81, D83)

  4. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data…

  5. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    …not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted…

  6. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  7. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  8. Learning a Probability Distribution Efficiently and Reliably

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
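
    The underlying mechanism, inverting a cumulative distribution function over a finite set, can be sketched in a few lines (the basic sampling idea only, not the learning algorithm of the record itself):

        # Inverse-CDF sampling over a finite set: build the cumulative sums
        # once, then binary-search a uniform draw into them.
        import bisect
        import random

        def make_sampler(probs):
            cdf, total = [], 0.0
            for p in probs:
                total += p
                cdf.append(total)
            return lambda: bisect.bisect_right(cdf, random.random() * total)

        sample = make_sampler([0.2, 0.5, 0.3])
        counts = [0, 0, 0]
        for _ in range(10_000):
            counts[sample()] += 1
        print(counts)   # roughly proportional to [0.2, 0.5, 0.3]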

  9. Adiabatic transition probability for a tangential crossing

    OpenAIRE

    Watanabe, Takuya

    2006-01-01

    We consider a time-dependent Schrödinger equation whose Hamiltonian is a $2\times 2$ real symmetric matrix. We study, using an exact WKB method, the adiabatic limit of the transition probability in the case where several complex eigenvalue crossing points accumulate to one real point.

  10. Markov Chains with Stochastically Stationary Transition Probabilities

    OpenAIRE

    Orey, Steven

    1991-01-01

    Markov chains on a countable state space are studied under the assumption that the transition probabilities $(P_n(x,y))$ constitute a stationary stochastic process. An introductory section exposing some basic results of Nawrotzki and Cogburn is followed by four sections of new results.

  11. A real formula for transition probabilities

    Directory of Open Access Journals (Sweden)

    Alessandra Luati

    2007-10-01

    Transition probabilities between states in two-dimensional quantum systems are derived as functions of unit vectors in R^3 instead of state vectors in C^2. This can be done once states and von Neumann measurements acting on C^2 are represented by means of vectors on the unit sphere of R^3.
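
    The classic identity of this kind (a known textbook fact, stated here for illustration rather than taken from the paper) expresses the transition probability between qubit pure states through their Bloch vectors u, v on the unit sphere of R^3: |<psi|phi>|^2 = (1 + u . v)/2.

        # Transition probability from Bloch vectors: |<psi|phi>|^2 = (1 + u.v)/2.
        import numpy as np

        u = np.array([0.0, 0.0, 1.0])   # Bloch vector of |0>
        v = np.array([1.0, 0.0, 0.0])   # Bloch vector of |+>
        print((1.0 + u @ v) / 2.0)      # 0.5, as expected for |<0|+>|^2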

  12. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
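
    The "rote learning" baseline the record argues can be improved upon is relative-frequency estimation of the conditional probability tables. A minimal sketch (variable names illustrative; Laplace smoothing added to avoid zero counts):

        # Estimate P(child | parents) from records by smoothed counting.
        from collections import Counter

        def learn_cpt(records, child, parents, child_vals, alpha=1.0):
            counts, parent_counts = Counter(), Counter()
            for r in records:
                key = tuple(r[p] for p in parents)
                counts[(key, r[child])] += 1
                parent_counts[key] += 1
            def cpt(child_val, parent_vals):
                k = len(child_vals)
                return ((counts[(parent_vals, child_val)] + alpha)
                        / (parent_counts[parent_vals] + alpha * k))
            return cpt

        data = [{"rain": 1, "wet": 1}, {"rain": 1, "wet": 1},
                {"rain": 0, "wet": 0}, {"rain": 0, "wet": 1}]
        p = learn_cpt(data, "wet", ["rain"], child_vals=[0, 1])
        print(p(1, (1,)))   # smoothed estimate of P(wet=1 | rain=1)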

  13. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB-method is a general decomposition…

  14. The albedo effect on neutron transmission probability.

    Science.gov (United States)

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight, this weight being corrected after each collision. For different values of the scattering probability and for different slopes of the inclined part of the channel, we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
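
    A drastically simplified one-dimensional analogue of this Monte Carlo setup (unit mean free path, no vacuum channel, and analog sampling instead of the exponential biasing and statistical weights used in the study):

        # Analog Monte Carlo transmission through a homogeneous slab.
        import math
        import random

        def transmission_probability(thickness, p_scatter, n_histories=100_000):
            transmitted = 0
            for _ in range(n_histories):
                x, mu = 0.0, 1.0                           # depth (mfp), direction cosine
                while True:
                    x += mu * -math.log(random.random())   # free-flight distance
                    if x >= thickness:
                        transmitted += 1                   # escaped through the far face
                        break
                    if x < 0.0 or random.random() > p_scatter:
                        break                              # leaked back, or absorbed
                    mu = 2.0 * random.random() - 1.0       # isotropic rescatter
            return transmitted / n_histories

        print(transmission_probability(thickness=5.0, p_scatter=0.8))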

  15. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  16. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    International Nuclear Information System (INIS)

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  17. Possible future HERA analyses

    CERN Document Server

    Geiser, Achim

    2015-01-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...

  18. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  19. Determination of bounds on failure probability in the presence of hybrid uncertainties

    Indian Academy of Sciences (India)

    M B Anoop; K Balaji Rao

    2008-12-01

    A fundamental component of safety assessment is the appropriate representation and incorporation of uncertainty. A procedure for handling hybrid uncertainties in stochastic mechanics problems is presented. The procedure can be used for determining the bounds on failure probability for cases where failure probability is a monotonic function of the fuzzy variables. The procedure is illustrated through an example problem of safety assessment of a nuclear power plant piping component against stress corrosion cracking, considering the stochastic evolution of stress corrosion cracks with time. It is found that the bounds obtained enclose the values of failure probability obtained from probabilistic analyses.

  20. Statistisk analyse med SPSS

    OpenAIRE

    Linnerud, Kristin; Oklevik, Ove; Slettvold, Harald

    2004-01-01

    This note has its origin in lectures and teaching for third-year students of economics and administration at Høgskolen i Sogn og Fjordane. The note is particularly oriented toward the SPSS instruction in the two courses "OR 685 Marknadsanalyse og merkevarestrategi" and "BD 616 Økonomistyring og analyse med programvare".