WorldWideScience

Sample records for probability measures fpm

  1. DEPENDENCE OF THE MUNICIPALITIES OF MINAS GERAIS IN RELATION TO FPM

    Directory of Open Access Journals (Sweden)

    Wellington de Oliveira Massardi

    2016-03-01

    Full Text Available This research sought to demonstrate the level of dependence of Minas Gerais municipalities on the Municipalities Participation Fund (FPM). To achieve this goal we demonstrated the representativeness of the FPM in the financing structure of the municipalities, measured by dividing the revenue from the FPM by the municipal current revenue. It was found that the level of dependence of the vast majority of municipalities is higher than 50%, i.e., FPM resources represent the main source of municipal funding, especially for municipalities with fewer than 20,000 inhabitants. Regarding geographical location, it was found that the Zona da Mata and Vale do Rio Doce regions have the highest concentration of municipalities with high dependence on the FPM. The average population of these municipalities is 3,202 inhabitants, which leads to the conclusion that dependence on the FPM is directly related to the size of the municipality.
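
    A minimal sketch of the dependence measure described above, i.e. FPM revenue divided by municipal current revenue, with a 50% threshold for high dependence. The revenue figures below are hypothetical, not data from the study.

        def fpm_dependence(fpm_revenue: float, current_revenue: float) -> float:
            """Share of municipal current revenue coming from the FPM transfer."""
            return fpm_revenue / current_revenue

        # (FPM revenue, current revenue) in BRL; both pairs are invented examples
        municipalities = {
            "small town (pop. 3,100)": (8.4e6, 11.2e6),
            "mid-size city (pop. 45,000)": (22.0e6, 95.0e6),
        }
        for name, (fpm, total) in municipalities.items():
            d = fpm_dependence(fpm, total)
            label = "high dependence" if d > 0.5 else "lower dependence"
            print(f"{name}: {d:.0%} -> {label}")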

  2. Fabrication of mechanical system of the FPM capsule puller

    International Nuclear Information System (INIS)

    Sudirdjo, Hari; Prasetya, Hendra

    2000-01-01

    A mechanical system of the FPM capsule puller has been fabricated, whose function is to pull the irradiated FPM capsule. The system consists of a driving motor equipped with a reduction gear, a spindle, and a puller wire. The system has a puller stroke of 700 mm, so that the pulled capsule ends up outside the reactor core. A function test has been performed and shows that the system meets the requirements.

  3. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains a truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions of route travel time. The major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
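
    A hedged sketch of the route-distribution idea described in the abstract: the route travel-time pmf is assembled from an upstream link pmf and a link-to-link conditional pmf, which preserves the upstream-downstream speed correlation that a plain convolution of independent links would discard. All travel times and probabilities below are illustrative.

        import numpy as np

        t = np.array([10, 12, 15])                    # possible link travel times (min)
        p1 = np.array([0.5, 0.3, 0.2])                # P(T1), upstream link pmf

        # P(T2 | T1): row i is the downstream pmf when the upstream time is t[i];
        # a slow upstream segment makes a slow downstream segment more likely
        cond = np.array([[0.7, 0.2, 0.1],
                         [0.2, 0.6, 0.2],
                         [0.1, 0.3, 0.6]])

        route = {}                                    # pmf of the route time T1 + T2
        for i, pi in enumerate(p1):
            for j, pij in enumerate(cond[i]):
                s = int(t[i] + t[j])
                route[s] = route.get(s, 0.0) + pi * pij

        for s in sorted(route):
            print(f"P(route time = {s} min) = {route[s]:.3f}")
        # Under an independence assumption one would instead convolve p1 with the
        # marginal pmf of T2, losing the correlation encoded in cond.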

  4. Verification of maximum radial power peaking factor due to insertion of FPM-LEU target in the core of RSG-GAS reactor

    Energy Technology Data Exchange (ETDEWEB)

    Setyawan, Daddy, E-mail: d.setyawan@bapeten.go.id [Center for Assessment of Regulatory System and Technology for Nuclear Installations and Materials, Indonesian Nuclear Energy Regulatory Agency (BAPETEN), Jl. Gajah Mada No. 8 Jakarta 10120 (Indonesia); Rohman, Budi [Licensing Directorate for Nuclear Installations and Materials, Indonesian Nuclear Energy Regulatory Agency (BAPETEN), Jl. Gajah Mada No. 8 Jakarta 10120 (Indonesia)

    2014-09-30

    Verification of the maximum radial power peaking factor due to insertion of an FPM-LEU target in the core of the RSG-GAS reactor. The radial power peaking factor in the RSG-GAS reactor is a very important parameter for the safety of the reactor during operation. Data on the radial power peaking factor due to the insertion of the Fission Product Molybdenum with Low Enriched Uranium (FPM-LEU) target were reported by PRSG to BAPETEN through the RSG-GAS Safety Analysis Report for FPM-LEU target irradiation. To support the evaluation of the Safety Analysis Report incorporated in the submission, the assessment unit of BAPETEN carried out an independent assessment to verify safety-related parameters in the SAR, including the neutronic aspect. The work includes verification of the maximum radial power peaking factor change due to the insertion of the FPM-LEU target in the RSG-GAS reactor by computational methods using MCNP5 and ORIGEN2. The calculations give a new maximum value of 1.27 for the radial power peaking factor due to the insertion of the FPM-LEU target, smaller than the limit of 1.4 allowed in the SAR.

  5. Meshfree simulation of avalanches with the Finite Pointset Method (FPM)

    Science.gov (United States)

    Michel, Isabel; Kuhnert, Jörg; Kolymbas, Dimitrios

    2017-04-01

    Meshfree methods are the numerical method of choice for applications characterized by strong deformations in conjunction with free surfaces or phase boundaries. In the past the meshfree Finite Pointset Method (FPM) developed by Fraunhofer ITWM (Kaiserslautern, Germany) has been successfully applied to problems in computational fluid dynamics such as water crossing of cars, water turbines, and hydraulic valves. Most recently the simulation of granular flows, e.g. soil interaction with cars (rollover), has also been tackled. This advancement is the basis for the simulation of avalanches. Thanks to the generalized finite difference formulation in FPM, the implementation of different material models is quite simple. We will demonstrate 3D simulations of avalanches based on the Drucker-Prager yield criterion as well as the nonlinear barodesy model. The barodesy model (Division of Geotechnical and Tunnel Engineering, University of Innsbruck, Austria) describes the mechanical behavior of soil by an evolution equation for the stress tensor. The key to successful and realistic simulations of avalanches - apart from the numerical approximation of the occurring differential operators - is the choice of the boundary conditions (slip, no-slip, friction) between the different phases of the flow, as well as the geometry. We will discuss their influence for simplified one- and two-phase flow examples. This research is funded by the German Research Foundation (DFG) and the FWF Austrian Science Fund.

  6. H2 gas pressure calculation of FPM capsule failure at RSG-GAS reactor core

    International Nuclear Information System (INIS)

    Hastuti, Endiah Puji; Sunaryo, Geni Rina

    2002-01-01

    The RSG-GAS reactor has irradiated FPM capsules 236 times; one of them, capsule number 228, failed. One possible root cause of the failure is a radiolysis reaction occurring in the FPM capsule if it is filled with water during irradiation in the reactor core. A safety analysis of the radiolysis reaction in the capsule has been performed. The cumulative hydrogen gas production can raise the pressure in the capsule until mechanical damage occurs. The analysis was done at a reactor power of 10 MW, equivalent to a neutron flux of 0.6929 × 10¹⁴ n/cm²·s and a γ dose rate of 0.63 × 10⁹ rad/hour. The capsule is assumed to be filled with water at the maximum volume, i.e. 176.67 ml. The calculation results show that at nominal flow rate the radiolysis reactions produce hydrogen gas pressures of 494 atm and 19683 atm for γ and neutron radiolysis, respectively. At 5% flow rate the H₂ gas pressures are 723 atm and 25772 atm for γ and neutron radiolysis, respectively. The change of operating conditions due to radiolysis, together with a 'one-way valve' phenomenon, can produce hydrogen gas from water during irradiation in the reactor core and can be one of the root causes of capsule failure. The analysis recommends that FPM capsule preparation must guarantee that no water is present and that there is no possibility of water entering the capsule during irradiation in the core, verified by a more accurate leak test.
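
    A minimal sketch of the final step of such an analysis: the pressure exerted by a given amount of radiolytic hydrogen confined in the capsule, via the ideal gas law. Only the water volume of 176.67 ml comes from the abstract; the hydrogen amount and temperature are placeholder assumptions.

        # Ideal-gas estimate of H2 pressure in a water-filled FPM capsule
        R = 0.082057          # L*atm/(mol*K)
        V = 0.17667           # L, maximum water volume quoted in the abstract
        T = 323.0             # K, assumed capsule temperature (placeholder)
        n_h2 = 3.5            # mol of H2, hypothetical cumulative radiolytic yield

        pressure_atm = n_h2 * R * T / V
        print(f"H2 pressure: {pressure_atm:.0f} atm")   # order-of-magnitude check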

  7. Optimization of FPM system in Barsukovskoye deposit with hydrodynamic modeling and analysis of inter-well interaction

    Science.gov (United States)

    Almukhametova, E. M.; Gizetdinov, I. A.

    2018-05-01

    The development of most deposits in Russia is accompanied by a high water cut of the produced crude. More than 70% of the operating wells of the Barsukovskoye deposit produce water; about 12% of the wells are characterized by a saturated water cut; many wells with high water cut are idle. To optimize the current FPM system of the Barsukovskoye deposit, a calculation method based on a hydrodynamic model was applied, followed by an analysis of the hydrodynamic connectivity between the wells. A plot was selected containing several wells whose water cut runs ahead of the reserve recovery rate, and the injection wells exerting the most influence on the selected producer wells were determined. Then several variants of transforming the FPM system of this plot were considered. The possible cases were analyzed with the hydrodynamic model, followed by a determination of the economic effect of each of them.

  8. An experiment to measure accurately the lifetime of the $D^{0}, D^{\\pm}, F^{\\pm}, \\Lambda_{c}$-charm particles and to study their hadronic production and decay properties

    CERN Multimedia

    2002-01-01

    We propose to use the EHS with the hydrogen bubble chamber HOLEBC equipped with classical optics to accumulate statistics of several hundred fully reconstructed $D^{0}$ and $D^{\\pm}$ and several tens of $F^{\\pm}$ and $\\Lambda_{c}$ decays produced by 360 GeV/c $\\pi^{-}$ and 360 GeV/c proton beams. The main aim of the experiment is to determine accurately the lifetime of these particles. Interesting information will also be obtained on branching ratios, decay modes and hadronic production mechanisms.

  9. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  10. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  11. Measurement of the real time fill-pattern at the Australian Synchrotron

    International Nuclear Information System (INIS)

    Peake, D.J.; Boland, M.J.; LeBlanc, G.S.; Rassool, R.P.

    2008-01-01

    This article describes the development, commissioning and operation of a Fill-Pattern Monitor (FPM) for the Australian Synchrotron that measures the real-time intensity distribution of the electron bunches in the storage ring. Using a combination of an ultra-fast photodiode and a high-speed digitiser, real-time measurement of the fill-pattern at bunch-by-bunch resolution was achieved. The results compare very well with current methods of measuring the fill-pattern, such as a pick-up style detector. In addition, the FPM is fully integrated into the EPICS control system. The data provided by the FPM gives accurate RF bucket position and relative bunch currents over a wide range of stored beam currents, from 0.01 mA in a single bunch to 200 mA total beam current. The FPM monitors the success of an injection attempt into the storage ring and is used in a feedback loop to determine where to target the next injection. Using the FPM a beam top-up mode was successfully tested, resulting in a near constant beam current by periodic targeted injections over an 8 h shift. Results are presented for dynamically topped up real-time injection, where the beam pattern was squared using an intensity-dependent injection algorithm
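
    A hedged sketch of the intensity-dependent injection idea described above: given per-bucket currents read from the FPM, top-up injections target the bucket with the lowest current until the total current reaches the set point. The function names, bucket count, and thresholds are hypothetical, not the actual EPICS interface.

        import numpy as np

        def next_injection_bucket(bunch_currents: np.ndarray) -> int:
            """RF bucket that should receive the next injection (lowest current)."""
            return int(np.argmin(bunch_currents))

        def needs_topup(bunch_currents: np.ndarray, target_total_ma: float = 200.0) -> bool:
            return float(bunch_currents.sum()) < target_total_ma

        fill = np.random.default_rng(0).uniform(0.4, 0.6, size=360)  # mA per bucket, illustrative
        while needs_topup(fill):
            bucket = next_injection_bucket(fill)
            fill[bucket] += 0.5            # one injection shot, hypothetical charge
            # in operation: trigger the injector here, then re-read the FPM
        print(f"fill squared up at {fill.sum():.0f} mA total")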

  12. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  13. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using natural uranium disks of equal diameter. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author)

  14. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac, 'Probability Theory is a measure theory with a soul'. This book, with its choice of proofs, remarks, examples and exercises, has been prepared taking both these aesthetic and practical aspects into account.

  15. Quantum probabilities of composite events in quantum measurements with multimode states

    International Nuclear Information System (INIS)

    Yukalov, V I; Sornette, D

    2013-01-01

    The problem of defining quantum probabilities of composite events is considered. This problem is of great importance for the theory of quantum measurements and for quantum decision theory, which is a part of measurement theory. We show that the Lüders probability of consecutive measurements is a transition probability between two quantum states and that this probability cannot be treated as a quantum extension of the classical conditional probability. The Wigner distribution is shown to be a weighted transition probability that cannot be accepted as a quantum extension of the classical joint probability. We suggest the definition of quantum joint probabilities by introducing composite events in multichannel measurements. The notion of measurements under uncertainty is defined. We demonstrate that the necessary condition for mode interference is the entanglement of the composite prospect together with the entanglement of the composite statistical state. As an illustration, we consider an example of a quantum game. Special attention is paid to the application of the approach to systems with multimode states, such as atoms, molecules, quantum dots, or trapped Bose-condensed atoms with several coherent modes. (paper)

  16. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability and measure theories are presented. Borel mappings of measure spaces into one another and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on extending a measure to the projective limit of measure spaces. The theory of integration is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given.

  17. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Full Text Available Research on forecasting is effectively limited to forecasts that are expressed with clarity; which is to say that the forecasted event must be sufficiently well defined that it can be clearly resolved whether or not the event occurred, and that forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate real-world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
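
    A small sketch of the kind of quantitative scoring the abstract refers to: once vague forecasts have been converted into inferred probabilities, a standard scoring rule such as the Brier score applies directly. The forecasts and outcomes below are invented for illustration.

        def brier_score(forecasts, outcomes):
            """Mean squared difference between forecast probabilities and 0/1 outcomes."""
            return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

        inferred_probs = [0.8, 0.6, 0.3, 0.9]   # probabilities inferred from vague language
        outcomes = [1, 0, 0, 1]                 # whether each forecast event occurred
        print(f"Brier score: {brier_score(inferred_probs, outcomes):.3f}")  # lower is better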

  18. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    Full Text Available The aim of this article is to present approximation results for exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a function of distance, makes it possible to determine the range of the EMERGENCY signal for a pre-set confidence level.
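
    A hedged sketch of the approximation described above: fitting a logistic function to measured reception probability as a function of distance, then reading off the 50% range. The sample measurements and parameter names are invented for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(d, d50, k):
            """Reception probability vs. distance d; d50 is the 50% point."""
            return 1.0 / (1.0 + np.exp(k * (d - d50)))

        distance_km = np.array([10, 20, 30, 40, 50, 60], dtype=float)
        p_received = np.array([0.98, 0.95, 0.85, 0.55, 0.20, 0.05])  # measured fractions

        (d50, k), _ = curve_fit(logistic, distance_km, p_received, p0=[40.0, 0.2])
        print(f"50% reception at ~{d50:.1f} km (steepness k = {k:.2f})")
        # Inverting the fitted curve at a chosen confidence level gives the range.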

  19. Gap probability - Measurements and models of a pecan orchard

    Science.gov (United States)

    Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI

    1992-01-01

    Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels, crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.

  20. Quantum Zeno and anti-Zeno effects measured by transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenxian, E-mail: wxzhang@whu.edu.cn [School of Physics and Technology, Wuhan University, Wuhan, Hubei 430072 (China); Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Kavli Institute for Theoretical Physics China, CAS, Beijing 100190 (China); Kofman, A.G. [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States); Zhuang, Jun [Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); You, J.Q. [Beijing Computational Science Research Center, Beijing 10084 (China); Department of Physics, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Nori, Franco [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States)

    2013-10-30

    Using numerical calculations, we compare the transition probabilities of many spins in random magnetic fields, subject to either frequent projective measurements, frequent phase modulations, or a mix of modulations and measurements. For various distribution functions, we find the transition probability under frequent modulations is suppressed most if the pulse delay is short and the evolution time is larger than a critical value. Furthermore, decay freezing occurs only under frequent modulations as the pulse delay approaches zero. In the large pulse-delay region, however, the transition probabilities under frequent modulations are highest among the three control methods.

  1. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40 ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10⁻¹²³ and ρ_k = 4.3 ρ_m for open universes. For universes that allow only positive curvature, or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  2. The Anatomy of American Football: Evidence from 7 Years of NFL Game Data.

    Directory of Open Access Journals (Sweden)

    Konstantinos Pelechrinis

    Full Text Available How much does a fumble affect the probability of winning an American football game? How balanced should your offense be in order to increase the probability of winning by 10%? These are questions for which the coaching staffs of National Football League teams have a clear qualitative answer. Turnovers are costly; turn the ball over several times and you will certainly lose. Nevertheless, what does "several" mean? How "certain" is "certainly"? In this study, we collected play-by-play data from the past 7 NFL seasons, i.e., 2009-2015, and we built a descriptive model for the probability of winning a game. Despite the fact that our model incorporates simple box score statistics, such as total offensive yards, number of turnovers, etc., its overall cross-validation accuracy is 84%. Furthermore, we combine this descriptive model with a statistical bootstrap module to build FPM (short for Football Prediction Matchup) for predicting future match-ups. The contribution of FPM lies in its simplicity and transparency, which however do not sacrifice the system's performance. In particular, our evaluations indicate that our prediction engine performs on par with the current state-of-the-art systems (e.g., ESPN's FPI and Microsoft's Cortana). The latter are typically proprietary, but based on their publicly described components they are significantly more complicated than FPM. Moreover, their proprietary nature does not allow a head-to-head comparison in terms of the core elements of the systems, but it should be evident that the features incorporated in FPM are able to capture a large percentage of the observed variance in NFL games.
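
    A hedged sketch of a descriptive win-probability model of the kind the abstract describes: a logistic function of simple box-score differentials. The coefficients are illustrative placeholders, not the paper's fitted values.

        import math

        def win_probability(yards: float, turnovers: int,
                            opp_yards: float, opp_turnovers: int) -> float:
            # Placeholder weights: a small credit per net offensive yard and a
            # heavy penalty per net turnover, loosely mimicking the model's shape.
            z = 0.01 * (yards - opp_yards) - 0.8 * (turnovers - opp_turnovers)
            return 1.0 / (1.0 + math.exp(-z))

        # Out-gain the opponent by 50 yards but lose the turnover battle 3-1:
        print(f"P(win) = {win_probability(380, 3, 330, 1):.2f}")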

  3. New measurements of spontaneous transition probabilities for beryllium-like ions

    International Nuclear Information System (INIS)

    Lang, J.; Hardcastle, R.A.; McWhirter, R.W.P.; Spurrett, P.H.

    1986-06-01

    The authors describe measurements of spectral line intensities for pairs of transitions having common upper levels and thus derive the branching ratios of their spontaneous radiative transition probabilities. These are then combined with the results of measurements of the radiative lifetimes of the upper levels by other authors to obtain values of the individual transition probabilities. The results are for transitions in NIV, OV and NeVII and are given with a claimed accuracy of between 7% and 38%. They are compared with theoretically calculated values. For some of the simpler electric dipole transitions good agreement is found. On the other hand, for some of the other transitions, which in certain cases are only possible because of configuration interaction, disparities between the present measurements and theory are as large as a factor of 5. (author)

  4. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  5. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper they introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked in this report by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(²³⁶U(d,pf))/P(²³⁸U(d,pf)), which serves as a surrogate for the known cross-section ratio ²³⁶U(n,f)/²³⁸U(n,f). In addition, the P(²³⁸U(d,d'f))/P(²³⁶U(d,d'f)) ratio, as a surrogate for the ²³⁷U(n,f)/²³⁵U(n,f) cross-section ratio, was measured for the first time in an unprecedented range of excitation energies.

  6. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  7. Evaluating probability measures related to subsurface flow and transport

    International Nuclear Information System (INIS)

    Cawlfield, J.D.

    1991-01-01

    Probabilistic modeling approaches are being used increasingly in order to carry out quantified risk analysis and to evaluate the uncertainty existing in subsurface flow and transport analyses. The work presented in this paper addresses three issues: comparison of common probabilistic modeling techniques, recent results regarding the sensitivity of probability measures to likely changes in the uncertain variables for transport in porous media, and a discussion of some questions regarding fundamental modeling philosophy within a probabilistic framework. Recent results indicate that uncertainty regarding average flow velocity controls the probabilistic outcome, while uncertainty in the dispersivity and diffusion coefficient does not seem very important. Uncertainty of reaction terms is important only at early times in the transport process. Questions are posed regarding (1) the inclusion of macrodispersion in a probabilistic analysis, (2) statistics of flow velocity and (3) the notion of an ultimate probability measure for subsurface flow analyses

  8. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Rényi and Hölder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained to quantify the mutual divergence among two or more probability distributions.

  9. Measurements of transition probabilities in the range from vacuum ultraviolet to infrared

    International Nuclear Information System (INIS)

    Peraza Fernandez, M.C.

    1992-01-01

    In this thesis we describe the design, testing and calibration of different spectrometers to measure transition probabilities from the vacuum ultraviolet to the infrared spectral region. For the infrared measurements we have designed and built a phase-sensitive detection system using an InGaAs photodiode as detector. With this system we have determined the transition probabilities of infrared lines of KrI and XeI, for which no previous measurements were found. In the vacuum ultraviolet spectral region we have designed a 3 m normal-incidence monochromator in which we have installed an optical multichannel analyzer. We have verified its correct operation by obtaining the absorption spectrum of KrI. In the visible region we have obtained the emission spectrum of Al using different spectral sources: a hollow-cathode lamp and an Nd:YAG laser-produced Al plasma. From these spectra we have determined different atomic parameters such as transition probabilities and electron temperatures. (author). 83 refs.

  10. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief introduction to measure theory, and its applications to probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to computational combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of the theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  11. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    International Nuclear Information System (INIS)

    Marshman, Emily; Singh, Chandralekha

    2017-01-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations. (paper)

  12. Slope stability probability classification, Waikato Coal Measures, New Zealand

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, P.; Gillard, G.R.; Moore, T.A. [CRL Energy, PO Box 29-415, Christchurch (New Zealand); Campbell, R.N.; Fergusson, D.A. [Solid Energy North, Private Bag 502, Huntly (New Zealand)

    2001-01-01

    Ferm-classified lithological units have been identified and described in the Waikato Coal Measures in open pits in the Waikato coal region. These lithological units have been classified geotechnically by mechanical tests and discontinuity measurements. Using these measurements, slope stability probability classifications (SSPC) have been quantified based on an adaptation of Hack's [Slope Stability Probability Classification, ITC Delft Publication, Enschede, Netherlands, vol. 43, 1998, 273 pp.] SSPC system, which places less influence on rock quality designation and unconfined compressive strength than previous slope/rock mass rating systems. The Hack weathering susceptibility rating has been modified by using chemical index of alteration values determined from XRF major element analyses. Slaking is an important parameter in slope stability in the Waikato Coal Measures lithologies; hence, a non-subjective method of assessing slaking in relation to the chemical index of alteration has been introduced. Another major component of this adapted SSPC system is the inclusion of rock moisture content effects on slope stability. The main modifications of Hack's SSPC system are the introduction of rock intact strength derived from the modified Mohr-Coulomb failure criterion, which has been adapted for varying moisture content, weathering state and confining pressure. It is suggested that the subjectivity in assessing intact rock strength within broad bands in the initial SSPC system is a major weakness of that system. Initial results indicate a close relationship between rock mass strength values, calculated from rock mass friction angles and rock mass cohesion values derived from two established rock mass classification methods (modified Hoek-Brown failure criteria and MRMR), and the adapted SSPC system. The advantage of the modified SSPC system is that slope stability probabilities based on discontinuity-independent and discontinuity-dependent data and a

  13. Wolf Attack Probability: A Theoretical Security Measure in Biometric Authentication Systems

    Science.gov (United States)

    Une, Masashi; Otsuka, Akira; Imai, Hideki

    This paper will propose a wolf attack probability (WAP) as a new measure for evaluating security of biometric authentication systems. The wolf attack is an attempt to impersonate a victim by feeding “wolves” into the system to be attacked. The “wolf” means an input value which can be falsely accepted as a match with multiple templates. WAP is defined as a maximum success probability of the wolf attack with one wolf sample. In this paper, we give a rigorous definition of the new security measure which gives strength estimation of an individual biometric authentication system against impersonation attacks. We show that if one reestimates using our WAP measure, a typical fingerprint algorithm turns out to be much weaker than theoretically estimated by Ratha et al. Moreover, we apply the wolf attack to a finger-vein-pattern based algorithm. Surprisingly, we show that there exists an extremely strong wolf which falsely matches all templates for any threshold value.
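
    A toy sketch of the WAP definition above: the wolf attack probability is the maximum, over candidate inputs, of the fraction of enrolled templates that a single input falsely matches. The scalar 'features' and threshold matcher below are stand-ins for a real biometric matching algorithm.

        def wolf_attack_probability(candidates, templates, matches) -> float:
            """WAP: max over inputs of the fraction of templates falsely matched."""
            return max(sum(matches(c, t) for t in templates) / len(templates)
                       for c in candidates)

        templates = [0.12, 0.35, 0.48, 0.61, 0.83]   # enrolled users' feature values
        candidates = [0.1, 0.5, 0.9]                 # attacker's candidate "wolves"
        matches = lambda c, t: abs(c - t) < 0.2      # accept if features are close

        print(f"WAP = {wolf_attack_probability(candidates, templates, matches):.2f}")
        # candidate 0.5 matches templates 0.35, 0.48 and 0.61, so WAP = 3/5 = 0.60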

  14. Ventolin Diskus and Inspyril Turbuhaler: an in vitro comparison.

    Science.gov (United States)

    Broeders, M E A C; Molema, J; Burnell, P K P; Folgering, H T M

    2005-01-01

    Dose delivery (total emitted dose, or TED) from dry powder inhalers (DPIs), pulmonary deposition, and the biological effects depend on drug formulation, device, and patient characteristics. The aim of this study was to measure, in vitro, the relationship between parameters of inhalation profiles recorded from patients and the TED and fine particle mass (FPM) of Diskus and Turbuhaler inhalers. For each device, inhalation profiles (IPs) of 25 patients were selected as a representative sample from a wide range of 1500 IPs generated by 10 stable asthmatics, 3 x 16 (mild/moderate/severe) COPD patients, and 15 hospitalized patients with an exacerbation of asthma or COPD. These 25 IPs served as input to the Electronic Lung (a computer-driven inhalation simulator) to determine the particle size distribution from Ventolin Diskus and Inspyril Turbuhaler. The TED and FPM of Diskus and the FPM of Turbuhaler were affected by the peak inspiratory flow (PIF) and not by the slope of the pressure-time curve, inhaled volume, or inhalation time. This flow-dependency was more marked at lower flows (PIF below 40 L/min). TED and FPM of Diskus were significantly higher than those of Turbuhaler [mean (SD) TED, % label claim: 83.5 (13.9) for Diskus vs. 72.5 (11.1) for Turbuhaler (p = 0.004); FPM, % label claim: 36.8 (9.8) vs. 28.7 (7.7)]. In conclusion, TED and FPM of Diskus and FPM of Turbuhaler were affected by PIF, the flow-dependency being greater at PIF values below 40 L/min. Lower PIFs occurred more often with Turbuhaler than with Diskus, since the Turbuhaler has a higher resistivity and requires a substantially higher pressure drop to generate the same flow as the Diskus. TED, dose consistency, and FPM were higher for Diskus than for Turbuhaler. The flow dependency of TED and FPM was substantially influenced by the inhalation profiles when not only profiles of the usual outpatient population were included but also the real outliers from exacerbated patients.

  15. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    Science.gov (United States)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments, reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the experimental data needed to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the ²⁴⁰Pu(alpha, alpha'f) reaction - a surrogate for ²³⁹Pu(n, f) - at a beam energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to

  16. Assembly for the measurement of the most probable energy of directed electron radiation

    International Nuclear Information System (INIS)

    Geske, G.

    1987-01-01

    This invention relates to a setup for the measurement of the most probable energy of directed electron radiation up to 50 MeV. The known energy-range relationship for the absorption of electron radiation in matter is exploited by means of an absorber with two groups of interconnected radiation detectors embedded in it. The most probable electron beam energy is derived from the quotient of the two groups' signals.

  17. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  18. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  19. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis, and based on the so-called "restricted second moment of the probability distribution", can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
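
    A sketch of the 'restricted second moment' idea mentioned above: accumulate σ² over the central part of the measured stress distribution as a function of the cutoff. The synthetic stresses below use a Student-t distribution with 2 degrees of freedom, whose ~|σ|⁻³ tails have the form expected for dislocated crystals; the scale is arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)
        stresses = 50.0 * rng.standard_t(2, size=200_000)   # MPa, synthetic "measurements"

        def restricted_second_moment(samples: np.ndarray, cutoff: float) -> float:
            """Average of sigma^2 over the part of the sample with |sigma| < cutoff."""
            inside = samples[np.abs(samples) < cutoff]
            return float(np.sum(inside ** 2)) / samples.size

        for cutoff in [200.0, 400.0, 800.0, 1600.0]:
            v = restricted_second_moment(stresses, cutoff)
            print(f"v({cutoff:6.0f} MPa) = {v:10.0f} MPa^2")
        # v grows roughly linearly in log(cutoff); in the Groma analysis the slope
        # is proportional to the total dislocation density.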

  20. The distribution function of a probability measure on a space with a fractal structure

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Granero, M.A.; Galvez-Rodriguez, J.F.

    2017-07-01

    In this work we show how to define a probability measure with the help of a fractal structure. One of the keys of this approach is to use the completion of the fractal structure. Then we use the theory of a cumulative distribution function on a Polish ultrametric space and describe it in this context. Finally, with the help of fractal structures, we prove that a function satisfying the properties of a cumulative distribution function on a Polish ultrametric space is a cumulative distribution function with respect to some probability measure on the space. (Author)

  1. Measurement of the spark probability in single gap parallel plate chambers

    International Nuclear Information System (INIS)

    Arefiev, A.; Bencze, Gy.L.; Choumilov, E.; Civinini, C.; Dalla Santa, F.; D'Alessandro, R.; Ferrando, A.; Fouz, M.C.; Golovkin, V.; Kholodenko, A.; Iglesias, A.; Ivochkin, V.; Josa, M.I.; Malinin, A.; Meschini, M.; Misyura, S.; Pojidaev, V.; Salicio, J.M.

    1996-01-01

    We present results on the measurements of the spark probability with CO₂ and a CF₄/CO₂ (80/20) mixture, at atmospheric pressure, using 1.5 mm gas gap parallel plate chambers working at a gas gain ranging from 4.5 × 10² to 3.3 × 10⁴. (orig.)

  2. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  3. Hidden measurements, hidden variables and the volume representation of transition probabilities

    OpenAIRE

    Oliynyk, Todd A.

    2005-01-01

    We construct, for any finite dimension $n$, a new hidden measurement model for quantum mechanics based on representing quantum transition probabilities by the volume of regions in projective Hilbert space. For $n=2$ our model is equivalent to the Aerts sphere model and serves as a generalization of it for dimensions $n \\geq 3$. We also show how to construct a hidden variables scheme based on hidden measurements and we discuss how joint distributions arise in our hidden variables scheme and th...

  4. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  5. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability 0.

  6. Audio Query by Example Using Similarity Measures between Probability Density Functions of Features

    Directory of Open Access Journals (Sweden)

    Marko Helén

    2010-01-01

    Full Text Available This paper proposes a query by example system for generic audio. We estimate the similarity of the example signal and the samples in the queried database by calculating the distance between the probability density functions (pdfs) of their frame-wise acoustic features. Since the features are continuous-valued, we propose to model them using Gaussian mixture models (GMMs) or hidden Markov models (HMMs). The models parametrize each sample efficiently and retain sufficient information for similarity measurement. To measure the distance between the models, we apply a novel Euclidean distance, approximations of the Kullback-Leibler divergence, and a cross-likelihood ratio test. The performance of the measures was tested in simulations where audio samples are automatically retrieved from a general audio database, based on the estimated similarity to a user-provided example. The simulations show that the distance between probability density functions is an accurate measure for similarity. Measures based on GMMs or HMMs are shown to produce better results than those of the existing methods based on simpler statistics or histograms of the features. A good performance with low computational cost is obtained with the proposed Euclidean distance.
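
    A hedged sketch of one family of distances discussed above: since the Kullback-Leibler divergence between two GMMs has no closed form, it can be approximated by Monte Carlo sampling. The 1-D two-component models below are toy stand-ins for the feature pdfs of two audio samples.

        import numpy as np

        rng = np.random.default_rng(0)

        def gmm_logpdf(x, weights, means, stds):
            """Log density of a 1-D Gaussian mixture, evaluated pointwise."""
            comp = [w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
                    for w, m, s in zip(weights, means, stds)]
            return np.log(np.sum(comp, axis=0))

        def gmm_sample(n, weights, means, stds):
            idx = rng.choice(len(weights), size=n, p=weights)
            return rng.normal(np.asarray(means)[idx], np.asarray(stds)[idx])

        gmm_a = ([0.6, 0.4], [0.0, 3.0], [1.0, 0.5])   # model of audio sample A
        gmm_b = ([0.5, 0.5], [0.5, 2.5], [1.0, 0.8])   # model of audio sample B

        x = gmm_sample(20_000, *gmm_a)                 # draw from A, score under both
        kl_ab = float(np.mean(gmm_logpdf(x, *gmm_a) - gmm_logpdf(x, *gmm_b)))
        print(f"KL(A || B) ~ {kl_ab:.3f}")             # symmetrize for a similarity measure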

  7. Evolution of probability measures by cellular automata on algebraic topological Markov chains

    Directory of Open Access Journals (Sweden)

    ALEJANDRO MAASS

    2003-01-01

    Full Text Available In this paper we review some recent results on the evolution of probability measures under cellular automata acting on a full shift. In particular we discuss the crucial role of the attractiveness of maximal measures. We enlarge the context of the results of a previous study of topological Markov chains that are Abelian groups; the shift map is an automorphism of this group. This is carried out by studying the dynamics of Markov measures under a particular additive cellular automaton. Many of these topics were within the focus of Francisco Varela's mathematical interests.

  8. Slope stability probability classification, Waikato Coal Measures, New Zealand

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, P.; Campbell, R.; Fergusson, D.A.; Ferm, J.C.; Gillard, G.R.; Moore, T.A. [CRL Energy Ltd., Christchurch (New Zealand)

    1999-07-01

    Ferm-classified lithological units have been identified and described in the Waikato Coal Measures in open pits in the Waikato coal region. These lithological units have been classified geotechnically with mechanical tests and discontinuity measurements. Using these measurements, slope stability probability classifications (SSPC) have been quantified based on an adaptation of Hack's SSPC system, which places less influence on rock quality designation and unconfined compressive strength than previous rock mass rating systems. An attempt has been made to modify the Hack weathering susceptibility rating by using chemical index of alteration values from XRF major element analysis. Another major component of this adapted SSPC system is the inclusion of rock moisture content effects on slope stability. The paper explains the systematic initial approach of using the adapted SSPC system to classify slope stability in the Waikato open pit coal mines. The XRF major element results obtained for lithologies in the Waikato coal region may be a useful mine management tool to quantify stratigraphic thickness and palaeoweathering from wash drill cuttings. 14 refs., 7 figs., 3 tabs.

  9. Probability measures, Lévy measures and analyticity in time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    2008-01-01

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators, we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  10. Probability Measures, Lévy Measures, and Analyticity in Time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  11. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  12. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
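
    The sampling model in this abstract (a Poisson number of particles, each with lognormal activity) is easy to simulate. The sketch below, with assumed values for w and the geometric standard deviation, draws from the resulting Poisson-mixed k-sum; it illustrates the shape of the intake distribution rather than the paper's analytical expression.

    ```python
    # Sketch of the sampling model (assumed w and geometric standard deviation):
    # the number of particles collected is Poisson(w) and each particle's activity
    # is lognormal, so the measured activity is a Poisson-mixed k-sum of lognormals.
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_activity(w, gsd=2.5, n=20_000):
        """Draw n realisations of the summed activity on the sampler."""
        sigma = np.log(gsd)                      # lognormal shape from the GSD
        counts = rng.poisson(w, size=n)          # particles collected per sample
        return np.array([rng.lognormal(0.0, sigma, size=k).sum() for k in counts])

    act = sample_activity(w=5.0)
    print("mean summed activity:", act.mean())
    print("P(no particles collected):", np.mean(act == 0))  # ~ exp(-w)
    ```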

  13. The relationship between freezing point of milk and milk components and its changes during lactation in Czech Pied and Holstein cows

    Directory of Open Access Journals (Sweden)

    Gustav Chládek

    2005-01-01

Full Text Available The freezing point of milk (FPM) is an instant indicator of violated technological quality of raw milk, especially of dilution. FPM can also vary due to numerous effects associated with changes in milk composition and milk characteristics. Besides the effects of season, phase of lactation, breed, milk yield, sub-clinical mastitis etc., the impacts of nutrition and dietary or metabolic disorders are the most significant and the most frequent (GAJDŮŠEK, 2003). FPM is a relatively stable physical characteristic and, due to osmotically active elements, it ranges from –0.510 to –0.535 °C (HANUŠ et al., 2003b). Recently ŠUSTOVÁ (2001) studied the freezing point of milk in pool samples; she observed seasonal changes in FPM of mixed milk and the effect of different diets on FPM values. KOLOŠTA (2003) looked into the effect of grazing season on FPM. HANUŠ et al. (2003a) analysed possible effects of handling of milk components on FPM. The aim of this work was to describe the relationship between FPM and milk components and the impact of breed, number and phase of lactation on FPM. We analysed 328 milk samples in total, of which 137 samples were of Czech Pied cows and 191 samples of Holstein cows. The effect of number and phase of lactation was evaluated for both breeds together. The greatest coefficients of correlation in total were found between FPM and lactose content (r = 0.600) and solids non-fat (r = 0.523). Lower coefficients of correlation were found between FPM and milk fat content (r = 0.235), milk protein content (r = 0.260) and urea concentration (r = 0.256). These coefficients were considerably lower in Holstein cows than in Czech Pied cows. The coefficients of correlation between FPM and number and phase of lactation and somatic cell count were insignificant. The total mean value of FPM was –0.534 °C. Breed statistically significantly (P<0.01) affected FPM (+0.006 °C in C breed) and milk fat content (+0.19 % in H

  14. Measurement of vacancy transfer probability from K to L shell using ...

    Indian Academy of Sciences (India)

Vol. 73, No. 4 — journal of physics — October 2009, pp. 711–718. Measurement of vacancy transfer probability from K to L shell using K-shell fluorescence yields. Ö SÖĞÜT, E BÜYÜKKASAP, A KÜÇÜKÖNDER and T TARAKÇIOĞLU. Department of Physics, Faculty of Science and Letters, Kahramanmaras Sütçü ...

  15. α-Cut method based importance measure for criticality analysis in fuzzy probability-based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and an area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability-based fault tree analysis (FPFTA) has been recently developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked to the results generated by the four well-known importance measures in conventional fault tree analysis. The results
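
    To make the α-cut machinery concrete, here is a small Python sketch of α-cut interval arithmetic on triangular fuzzy probabilities, with a two-event AND gate and a crude average-midpoint defuzzification standing in for the paper's area defuzzification technique. The fuzzy numbers and the Birnbaum-style importance below are illustrative assumptions, not the paper's exact measure.

    ```python
    # Toy α-cut interval arithmetic on triangular fuzzy probabilities (all numbers
    # assumed): a two-event AND gate, propagated by α-cut multiplication, with a
    # Birnbaum-style importance via α-cut subtraction and a simplified defuzzifier.
    import numpy as np

    def alpha_cut(tri, alpha):
        """α-cut interval [lo, hi] of a triangular fuzzy number (a, m, b)."""
        a, m, b = tri
        return np.array([a + alpha * (m - a), b - alpha * (b - m)])

    def interval_mul(x, y):
        p = np.array([x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]])
        return np.array([p.min(), p.max()])

    def interval_sub(x, y):
        return np.array([x[0] - y[1], x[1] - y[0]])

    def defuzzify(intervals):
        """Average interval midpoint over the α-cuts (simplified defuzzifier)."""
        return float(np.mean([(lo + hi) / 2 for lo, hi in intervals]))

    alphas = np.linspace(0.0, 1.0, 11)
    pA, pB = (0.01, 0.02, 0.04), (0.02, 0.03, 0.05)  # fuzzy basic-event probabilities

    cuts_A = [alpha_cut(pA, a) for a in alphas]
    cuts_B = [alpha_cut(pB, a) for a in alphas]
    top = [interval_mul(ca, cb) for ca, cb in zip(cuts_A, cuts_B)]  # A AND B

    # Importance of A: top event with A forced to fail minus the nominal top event.
    imp_A = [interval_sub(cb, t) for cb, t in zip(cuts_B, top)]

    print("defuzzified top-event probability:", defuzzify(top))
    print("defuzzified importance of A:", defuzzify(imp_A))
    ```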

  16. A probability measure for random surfaces of arbitrary genus and bosonic strings in 4 dimensions

    International Nuclear Information System (INIS)

Albeverio, S.; Høegh-Krohn, R.; Paycha, S.; Scarlatti, S.

    1989-01-01

We define a probability measure describing random surfaces in R^D, 3 ≤ D ≤ 13, parametrized by compact Riemann surfaces of arbitrary genus. The measure involves the path space measure for scalar fields with exponential interaction in 2 space-time dimensions. We show that it gives a mathematical realization of Polyakov's heuristic measure for bosonic strings. (orig.)

  17. Equações de referência para a predição da força de preensão manual em brasileiros de meia idade e idosos Reference equations for predicting handgrip strength in Brazilian middle-aged and elderly subjects

    Directory of Open Access Journals (Sweden)

    Rômulo Dias Novaes

    2009-09-01

Full Text Available The aim of the present study was to evaluate normal values of dominant (D) and non-dominant (ND) upper limb handgrip strength (HGS) in asymptomatic middle-aged and elderly subjects and to establish reference equations for predicting HGS. Fifty-four volunteers (51.9% men) aged >50 years were enrolled. Weight, height, and both arms' circumferences were measured, and the body mass index was calculated. Mechanical dynamometry was used to measure D-HGS and ND-HGS. The self-reported level of regular physical activity was assessed by the Baecke questionnaire. D-HGS was higher than ND-HGS for both sexes and in all age groups (p<0.05). Significant correlations were found between HGS and age, height, weight, and arm circumferences. The best reference equations were the following: D-HGS (kgf) = 39.996 - (0.382 x age, years) + (0.174 x weight, kg) + (13.628 x sex; men=1, women=0) (Adjusted R²=0.677); and ND-HGS (kgf) = 44.968 - (0.420 x age, years) + (0.110 x weight, kg) + (9.274 x sex; men=1, women=0) (Adjusted R²=0.546). The consistent difference found between dominant and non-dominant HGS requires the use of specific normative data for each hand. Hence easily-obtained attributes such as age, height, weight, arm circumference and sex can predict HGS expected values for asymptomatic older adults.
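
    The reference equations reported in the abstract can be transcribed directly into code. The following sketch implements them verbatim (units: kgf; sex coded men = 1, women = 0); only the example inputs are illustrative.

    ```python
    # Direct transcription of the reference equations reported in the abstract
    # (HGS in kgf; sex coded men = 1, women = 0; D = dominant, ND = non-dominant).
    def predicted_hgs(age_years: float, weight_kg: float, sex: int):
        d_hgs = 39.996 - 0.382 * age_years + 0.174 * weight_kg + 13.628 * sex
        nd_hgs = 44.968 - 0.420 * age_years + 0.110 * weight_kg + 9.274 * sex
        return d_hgs, nd_hgs

    # Illustrative inputs: a 65-year-old man weighing 78 kg.
    d, nd = predicted_hgs(65, 78, 1)
    print(f"predicted D-HGS: {d:.1f} kgf, ND-HGS: {nd:.1f} kgf")
    ```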

  18. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  19. Measurement of D-T neutron penetration probability spectra for iron ball shell systems

    International Nuclear Information System (INIS)

    Duan Shaojie

    1998-06-01

The D-T neutron penetration probability spectra were measured for the series of iron ball shell samples used in the experiments, and the penetration curves are presented. Because the detector was close to the samples, the measured results were approximately corrected and compared with those in the literature; the former are shown to be compatible with the latter within the experimental error

  20. First simultaneous measurement of fission and gamma probabilities of 237U and 239Np via surrogate reactions

    Directory of Open Access Journals (Sweden)

    Marini P.

    2016-01-01

Full Text Available Fission and gamma decay probabilities of 237U and 239Np have been measured, for the first time simultaneously in dedicated experiments, via the surrogate reactions 238U(3He,4He) and 238U(3He,d), respectively. While a good agreement between our data and neutron-induced data is found for fission probabilities, gamma decay probabilities are several times higher than the corresponding neutron-induced data for each studied nucleus. We study the role of the different spin distributions populated in the surrogate and neutron-induced reactions. The compound nucleus spin distribution populated in the surrogate reaction is extracted from the measured gamma-decay probabilities, and used as an input parameter in the statistical model to predict fission probabilities to be compared to our data. A strong disagreement between our data and the prediction is obtained. Preliminary results from an additional dedicated experiment confirm the observed discrepancies, indicating the need for a better understanding of the formation and decay processes of the compound nucleus.

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Comment on "Measurements without probabilities in the final state proposal"

    Science.gov (United States)

    Cohen, Eliahu; Nowakowski, Marcin

    2018-04-01

    The final state proposal [G. T. Horowitz and J. M. Maldacena, J. High Energy Phys. 04 (2004) 008, 10.1088/1126-6708/2004/04/008] is an attempt to relax the apparent tension between string theory and semiclassical arguments regarding the unitarity of black hole evaporation. Authors Bousso and Stanford [Phys. Rev. D 89, 044038 (2014), 10.1103/PhysRevD.89.044038] analyze thought experiments where an infalling observer first verifies the entanglement between early and late Hawking modes and then verifies the interior purification of the same Hawking particle. They claim that "probabilities for outcomes of these measurements are not defined" and therefore suggest that "the final state proposal does not offer a consistent alternative to the firewall hypothesis." We show, in contrast, that one may define all the relevant probabilities based on the so-called ABL rule [Y. Aharonov, P. G. Bergmann, and J. L. Lebowitz, Phys. Rev. 134, B1410 (1964), 10.1103/PhysRev.134.B1410], which is better suited for this task than the decoherence functional. We thus assert that the analysis of Bousso and Stanford cannot yet rule out the final state proposal.
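
    The ABL rule invoked in this comment assigns probabilities to intermediate measurement outcomes given pre- and post-selected states. A minimal numpy sketch, with an arbitrary single-qubit example, follows; the states and projectors are illustrative choices, not those of the thought experiments discussed.

    ```python
    # Minimal numpy sketch of the ABL rule: for pre-selected |psi> and
    # post-selected |phi>, the probability of intermediate outcome j is
    # |<phi|P_j|psi>|^2 / sum_k |<phi|P_k|psi>|^2. The states and projectors
    # below are arbitrary single-qubit choices for illustration.
    import numpy as np

    def abl_probabilities(psi, phi, projectors):
        amps = np.array([phi.conj() @ (P @ psi) for P in projectors])
        w = np.abs(amps) ** 2
        return w / w.sum()

    psi = np.array([1.0, 1.0]) / np.sqrt(2)   # pre-selected state |+>
    phi = np.array([1.0, 0.0])                # post-selected state |0>
    P0 = np.diag([1.0, 0.0])                  # projector onto |0>
    P1 = np.diag([0.0, 1.0])                  # projector onto |1>

    print(abl_probabilities(psi, phi, [P0, P1]))   # -> [1. 0.]
    ```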

  3. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  4. Link importance incorporated failure probability measuring solution for multicast light-trees in elastic optical networks

    Science.gov (United States)

    Li, Xin; Zhang, Lu; Tang, Ying; Huang, Shanguo

    2018-03-01

The light-tree-based optical multicasting (LT-OM) scheme provides a spectrum- and energy-efficient method to accommodate emerging multicast services. Some studies focus on survivability technologies for LTs against a fixed number of link failures, such as single-link failure. However, few studies consider failure probability constraints when building LTs. It is worth noting that the links of an LT play roles of different importance under failure scenarios. When calculating the failure probability of an LT, the importance of each of its links should be considered. We design a link importance incorporated failure probability measuring solution (LIFPMS) for multicast LTs under an independent failure model and a shared risk link group failure model. Based on the LIFPMS, we put forward the minimum failure probability (MFP) problem for the LT-OM scheme. Heuristic approaches are developed to address the MFP problem in elastic optical networks. Numerical results show that the LIFPMS provides an accurate metric for calculating the failure probability of multicast LTs and enhances the reliability of the LT-OM scheme while accommodating multicast services.
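
    To illustrate the idea of weighting link failures by their importance, here is a toy Python sketch for a small hypothetical light-tree under independent link failures; the tree, the failure probabilities, and the "fraction of destinations disconnected" weighting are all assumptions for demonstration, not the paper's exact LIFPMS.

    ```python
    # Toy sketch (hypothetical tree and numbers): failure probability of a
    # multicast light-tree under independent link failures, plus a simple
    # importance weighting by the share of destinations each link disconnects.
    tree = {
        # link: (failure probability, destinations lost if this link fails)
        "root-a": (0.010, 3),   # trunk link feeding all three destinations
        "a-d1":   (0.005, 1),
        "a-d2":   (0.005, 1),
        "a-d3":   (0.008, 1),
    }
    n_dest = 3

    # Unweighted: the tree fails if any link fails.
    p_ok = 1.0
    for p, _ in tree.values():
        p_ok *= 1.0 - p
    print(f"P(at least one link fails): {1.0 - p_ok:.4f}")

    # Importance-weighted metric: trunk links count more than leaf links.
    weighted = sum(p * lost / n_dest for p, lost in tree.values())
    print(f"importance-weighted failure metric: {weighted:.4f}")
    ```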

  5. Measurement of K-electron capture probability in the decay of 87Y

    International Nuclear Information System (INIS)

    Prasad, N.V.S.V.; Murty, G.S.K.; Rao, M.V.S.C.; Sastry, D.L.

    1993-01-01

The K-electron capture probability for the 1/2- to 3/2- transition in the decay of 87Y to the 873.0 keV level in the daughter 87Sr was measured for the first time using an x-γ summing method. The experimental P_K value was found to be 0.911 ± 0.047, in agreement with the theoretical value of 0.878. (author)

  6. Measurements of excited-state-to-excited-state transition probabilities and photoionization cross-sections using laser-induced fluorescence and photoionization signals

    International Nuclear Information System (INIS)

    Shah, M.L.; Sahoo, A.C.; Pulhani, A.K.; Gupta, G.P.; Dikshit, B.; Bhatia, M.S.; Suri, B.M.

    2014-01-01

    Laser-induced photoionization and fluorescence signals were simultaneously observed in atomic samarium using Nd:YAG-pumped dye lasers. Two-color, three-photon photoionization and two-color fluorescence signals were recorded simultaneously as a function of the second-step laser power for two photoionization pathways. The density matrix formalism has been employed to analyze these signals. Two-color laser-induced fluorescence signal depends on the laser powers used for the first and second-step transitions as well as the first and second-step transition probability whereas two-color, three-photon photoionization signal depends on the third-step transition cross-section at the second-step laser wavelength along with the laser powers and transition probability for the first and second-step transitions. Two-color laser-induced fluorescence was used to measure the second-step transition probability. The second-step transition probability obtained was used to infer the photoionization cross-section. Thus, the methodology combining two-color, three-photon photoionization and two-color fluorescence signals in a single experiment has been established for the first time to measure the second-step transition probability as well as the photoionization cross-section. - Highlights: • Laser-induced photoionization and fluorescence signals have been simultaneously observed. • The density matrix formalism has been employed to analyze these signals. • Two-color laser-induced fluorescence was used to measure the second-step transition probability. • The second-step transition probability obtained was used to infer the photoionization cross-section. • Transition probability and photoionization cross-section have been measured in a single experiment

  7. Measurement of the resonance escape probability; Mesure de l'absorption resonnante

    Energy Technology Data Exchange (ETDEWEB)

    Anthony, J P; Bacher, P; Lheureux, L; Moreau, J; Schmitt, A P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

The average cadmium ratio in natural uranium rods has been measured using natural uranium disks of equal diameter. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author)

  8. Production of 147Eu for gamma-ray emission probability measurement

    International Nuclear Information System (INIS)

    Katoh, Keiji; Marnada, Nada; Miyahara, Hiroshi

    2002-01-01

Gamma-ray emission probability is one of the most important decay parameters of a radionuclide, and many researchers are working to improve its accuracy. The certainties of γ-ray emission probabilities for neutron-rich nuclides are being improved little by little, but those for proton-rich nuclides are still insufficient. Europium-147, which decays by electron capture or β+-particle emission, is a proton-rich nuclide, and the γ-ray emission probabilities evaluated by Mateosian and Peker have large uncertainties; they referred to only one report concerning γ-ray emission probabilities. Our final purpose is to determine precise γ-ray emission probabilities of 147Eu from disintegration rates and γ-ray intensities by using a 4πβ-γ coincidence apparatus. Impurity nuclides strongly affect the determination of the disintegration rate; therefore, a highly pure 147Eu source is required. This short note describes the most suitable energy for 147Eu production through the 147Sm(p, n) reaction. (author)

  9. Effects of illumination on image reconstruction via Fourier ptychography

    Science.gov (United States)

    Cao, Xinrui; Sinzinger, Stefan

    2017-12-01

The Fourier ptychographic microscopy (FPM) technique provides high-resolution images by combining a traditional imaging system, e.g. a microscope or a 4f-imaging system, with a multiplexing illumination system, e.g. an LED array, and numerical image processing for enhanced image reconstruction. In order to numerically combine images that are captured under varying illumination angles, an iterative phase-retrieval algorithm is often applied. However, in practice, the performance of the FPM algorithm degrades due to imperfections of the optical system, image noise caused by the camera, etc. To eliminate the influence of the aberrations of the imaging system, an embedded pupil function recovery (EPRY)-FPM algorithm has been proposed [Opt. Express 22, 4960-4972 (2014)]. In this paper, we study how the performance of the FPM and EPRY-FPM algorithms is affected by imperfections of the illumination system, using both numerical simulations and experiments. The investigated imperfections include varying and non-uniform intensities and wavefront aberrations. Our study shows that aberrations of the illumination system significantly affect the performance of both the FPM and EPRY-FPM algorithms. Hence, in practice, aberrations in the illumination system have a significant influence on the resulting image quality.
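
    The forward model underlying FPM, in which each illumination angle shifts the object spectrum before pupil filtering and intensity detection, can be sketched in a few lines of numpy. The array size, pupil radius, and toy object below are arbitrary demonstration values; real FPM then reconstructs the high-resolution spectrum iteratively from such images (with EPRY additionally recovering the pupil).

    ```python
    # Minimal numpy sketch of the FPM forward model: each LED angle shifts the
    # object spectrum, the objective pupil low-pass filters it, and the camera
    # records intensity. Array size, pupil radius and the toy object are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 128
    obj = np.exp(1j * rng.uniform(0, 0.5, (N, N)))    # toy complex object
    spectrum = np.fft.fftshift(np.fft.fft2(obj))

    y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    pupil = (x**2 + y**2) <= 20**2                    # circular low-NA pupil

    def low_res_image(shift_x, shift_y):
        """Intensity image for one LED, i.e. one spectrum shift (in pixels)."""
        shifted = np.roll(spectrum, (shift_y, shift_x), axis=(0, 1))
        field = np.fft.ifft2(np.fft.ifftshift(shifted * pupil))
        return np.abs(field) ** 2

    # A 3x3 grid of illumination angles; FPM stitches their spectra together
    # iteratively, and EPRY-FPM additionally recovers the pupil function.
    images = [low_res_image(sx, sy) for sx in (-15, 0, 15) for sy in (-15, 0, 15)]
    print(len(images), images[0].shape)
    ```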

  10. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables — dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
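
    A short sketch of the Łukasiewicz operations that drive the proposed upgrade may help: Boolean indicators embed as a special case, while "fractions" of events become admissible. The encoding below is a textbook presentation of the truncated operations, not code from the paper.

    ```python
    # Standard Łukasiewicz operations on [0, 1]-valued events (a textbook
    # presentation): truncated sum, its dual, and the complement.
    import numpy as np

    def luk_or(a, b):    # a (+) b = min(1, a + b)
        return np.minimum(1.0, a + b)

    def luk_and(a, b):   # a (.) b = max(0, a + b - 1)
        return np.maximum(0.0, a + b - 1.0)

    def luk_not(a):      # complement
        return 1.0 - a

    # Classical {0, 1}-valued indicators embed as a special case ...
    a = np.array([0.0, 1.0, 1.0])
    b = np.array([0.0, 0.0, 1.0])
    print(luk_or(a, b), luk_and(a, b), luk_not(a))  # Boolean OR/AND/NOT recovered

    # ... while genuine "fractions" of events are now admissible.
    print(luk_or(0.3, 0.5), luk_and(0.3, 0.9))      # 0.8 and ~0.2
    ```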

  11. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe wall using probability of detection (POD). Thicknesses of pipes are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by human factors of the inspectors and include some errors, because the inspectors have different experiences and frequency of inspections. In order to ensure reliability for inspection results, first, POD evaluates experimental results of pipe-wall thickness inspection. We verify that the results have differences depending on inspectors including qualified inspectors. Second, two human factors that affect POD are indicated. Finally, it is confirmed that POD can identify the human factors and ensure reliability for pipe-wall thickness inspections. (author)

  12. The reward probability index: design and validation of a scale measuring access to environmental reward.

    Science.gov (United States)

    Carvalho, John P; Gawrysiak, Michael J; Hellmuth, Julianne C; McNulty, James K; Magidson, Jessica F; Lejuez, C W; Hopko, Derek R

    2011-06-01

    Behavioral models of depression implicate decreased response-contingent positive reinforcement (RCPR) as critical toward the development and maintenance of depression (Lewinsohn, 1974). Given the absence of a psychometrically sound self-report measure of RCPR, the Reward Probability Index (RPI) was developed to measure access to environmental reward and to approximate actual RCPR. In Study 1 (n=269), exploratory factor analysis supported a 20-item two-factor model (Reward Probability, Environmental Suppressors) with strong internal consistency (α=.90). In Study 2 (n=281), confirmatory factor analysis supported this two-factor structure and convergent validity was established through strong correlations between the RPI and measures of activity, avoidance, reinforcement, and depression (r=.65 to .81). Discriminant validity was supported via smaller correlations between the RPI and measures of social support and somatic anxiety (r=-.29 to -.40). Two-week test-retest reliability was strong (r=.69). In Study 3 (n=33), controlling for depression symptoms, hierarchical regression supported the incremental validity of the RPI in predicting daily diary reports of environmental reward. The RPI represents a parsimonious, reliable, and valid measure that may facilitate understanding of the etiology of depression and its relationship to overt behaviors. Copyright © 2011. Published by Elsevier Ltd.

  13. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  14. Evaluation of fundamental parameters method for biological materials and soil analysis by energy dispersive x-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Holynska, B; Muia, L M; Maina, D M

    1987-01-01

    Two methods of determination of trace elements in plant materials, viz. the fundamental parameters method (FPM) and the empirical method with the use of standard samples, were compared. Hay CRM and fresh tea leaves were used in measurements. Good agreement was achieved for the determination of a number of elements by both methods. Also Soil-7 Certified Reference Material (CRM) was analysed using emission-transmission method for absorption correction and FPM for concentration determination. The agreement with CRM was found to be reasonably good for several elements.

  15. Measurement of the Mis-identification Probability of τ Leptons from Hadronic Jets and from Electrons

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

Measurements of the mis-identification probability of QCD jets and electrons as hadronically decaying τ leptons using tag-and-probe methods are described. The analyses are based on 35 pb−1 of proton-proton collision data, taken by the ATLAS experiment at a center-of-mass energy of √s = 7 TeV. The mis-identification probabilities range between 10% and 0.1% for QCD jets, and about (1-2)% for electrons. They depend on the identification algorithm chosen, the pT and the number of prongs of the τ candidate, and on the amount of pile-up present in the event.

  16. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  17. Optimum policies for a system with general imperfect maintenance

    International Nuclear Information System (INIS)

    Sheu, S.-H.; Lin, Y.-B.; Liao, G.-L.

    2006-01-01

This study considers periodic preventive maintenance policies which maximize the availability of a repairable system with major repair at failure. Three types of preventive maintenance are performed, namely imperfect preventive maintenance (IPM), perfect preventive maintenance (PPM) and failed preventive maintenance (FPM). The probability that preventive maintenance is perfect depends on the number of imperfect maintenances conducted since the previous renewal cycle, and the probability that preventive maintenance remains imperfect is non-increasing. The optimum preventive maintenance time that maximizes availability is derived. Various special cases are considered. A numerical example is given
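
    A small simulation makes the renewal structure concrete. In the hedged sketch below, the chance of a perfect PM grows with the number of non-perfect PMs since the last renewal (a non-decreasing rule, as the abstract requires), and a failed PM occurs with a fixed small probability; all numerical values are invented for illustration, not taken from the paper.

    ```python
    # Hedged simulation sketch (all numbers invented): PM epochs until renewal,
    # where a PM is perfect with probability p_perfect(k) after k non-perfect PMs
    # since the last renewal, and fails outright with a fixed small probability.
    import random

    random.seed(42)
    P_FAILED_PM = 0.05            # assumed probability of a failed PM (FPM)

    def p_perfect(k):
        """Assumed non-decreasing chance of a perfect PM (PPM)."""
        return min(1.0, 0.2 + 0.15 * k)

    def epochs_until_renewal():
        k, epochs = 0, 0
        while True:
            epochs += 1
            if random.random() < P_FAILED_PM:
                k += 1                                  # FPM: attempt failed
            elif random.random() < p_perfect(k):
                return epochs                           # PPM: system renewed
            else:
                k += 1                                  # IPM: partial restoration

    runs = [epochs_until_renewal() for _ in range(10_000)]
    print("mean PM epochs per renewal cycle:", sum(runs) / len(runs))
    ```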

  18. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  19. Measurement of K-electron capture probability in the decay of [sup 87]Y

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, N.V.S.V.; Murty, G.S.K.; Rao, M.V.S.C.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Inter University Consortium for DAE Facilities, Calcutta (India))

    1993-04-01

The K-electron capture probability for the 1/2- to 3/2- transition in the decay of 87Y to the 873.0 keV level in the daughter 87Sr was measured for the first time using an x-γ summing method. The experimental P_K value was found to be 0.911 ± 0.047, in agreement with the theoretical value of 0.878. (author).

  20. Measurement on K-electron capture probability in the decay of 97Ru

    International Nuclear Information System (INIS)

    Kalayani, V.D.M.L.; Vara Prasad, N.V.S.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L.; Chintalapudi, S.N.

    1999-01-01

The K-electron capture probabilities of two strong allowed transitions, 5/2+ → 5/2+ and 5/2+ → 7/2+, were measured in the decay of 97Ru employing the X-γ internal summing technique. The two experimental P_K values were found to be 0.884 ± 0.046 and 0.886 ± 0.018, in agreement with the theoretical values 0.878 and 0.878, respectively. The theoretical values are seen to be insensitive to Q_EC values above 200 keV

  1. Evaluation of fundamental parameters method for biological materials and soil analysis by energy dispersive x-ray spectrometry

    International Nuclear Information System (INIS)

    Holynska, B.; Muia, L.M.; Maina, D.M.

    1987-01-01

    Two methods of determination of trace elements in plant materials, viz. the fundamental parameters method (FPM) and the empirical method with the use of standard samples, were compared. Hay CRM and fresh tea leaves were used in measurements. Good agreement was achieved for the determination of a number of elements by both methods. Also Soil-7 Certified Reference Material (CRM) was analysed using emission-transmission method for absorption correction and FPM for concentration determination. The agreement with CRM was found to be reasonably good for several elements. (author)

  2. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  3. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  4. Effect of flecainide on atrial fibrillatory rate in a large animal model with induced atrial fibrillation

    DEFF Research Database (Denmark)

    Hesselkilde, Eva Z.; Carstensen, Helena; Haugaard, Maria M.

    2017-01-01

    caused a decrease in AFR in all animals and restored sinus rhythm in the animals with induced AF. In the control animals, AFR increased from 269 ± 36 fpm to a plateau of 313 ± 14 fpm before decreasing to 288 ± 28 fpm during the last 10% of the AF episodes preceding spontaneous conversion (P 

  5. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  6. Planning of technical flood retention measures in large river basins under consideration of imprecise probabilities of multivariate hydrological loads

    Directory of Open Access Journals (Sweden)

    D. Nijssen

    2009-08-01

    Full Text Available As a result of the severe floods in Europe at the turn of the millennium, the ongoing shift from safety oriented flood control towards flood risk management was accelerated. With regard to technical flood control measures it became evident that the effectiveness of flood control measures depends on many different factors, which cannot be considered with single events used as design floods for planning. The multivariate characteristics of the hydrological loads have to be considered to evaluate complex flood control measures. The effectiveness of spatially distributed flood control systems differs for varying flood events. Event-based characteristics such as the spatial distribution of precipitation, the shape and volume of the resulting flood waves or the interactions of flood waves with the technical elements, e.g. reservoirs and flood polders, result in varying efficiency of these systems. Considering these aspects a flood control system should be evaluated with a broad range of hydrological loads to get a realistic assessment of its performance under different conditions. The consideration of this variety in flood control planning design was one particular aim of this study. Hydrological loads were described by multiple criteria. A statistical characterization of these criteria is difficult, since the data base is often not sufficient to analyze the variety of possible events. Hydrological simulations were used to solve this problem. Here a deterministic-stochastic flood generator was developed and applied to produce a large quantity of flood events which can be used as scenarios of possible hydrological loads. However, these simulations imply many uncertainties. The results will be biased by the basic assumptions of the modeling tools. In flood control planning probabilities are applied to characterize uncertainties. The probabilities of the simulated flood scenarios differ from probabilities which would be derived from long time series

  7. Fluorinated phenmetrazine "legal highs" act as substrates for high-affinity monoamine transporters of the SLC6 family.

    Science.gov (United States)

    Mayer, Felix P; Burchardt, Nadine V; Decker, Ann M; Partilla, John S; Li, Yang; McLaughlin, Gavin; Kavanagh, Pierce V; Sandtner, Walter; Blough, Bruce E; Brandt, Simon D; Baumann, Michael H; Sitte, Harald H

    2018-05-15

A variety of new psychoactive substances (NPS) are appearing in recreational drug markets worldwide. NPS are compounds that target various receptors and transporters in the central nervous system to achieve their psychoactive effects. Chemical modifications of existing drugs can generate NPS that are not controlled by current legislation, thereby providing legal alternatives to controlled substances such as cocaine or amphetamine. Recently, 3-fluorophenmetrazine (3-FPM), a derivative of the anorectic compound phenmetrazine, appeared on the recreational drug market and adverse clinical effects have been reported. Phenmetrazine is known to elevate extracellular monoamine concentrations by an amphetamine-like mechanism. Here we tested 3-FPM and its positional isomers, 2-FPM and 4-FPM, for their abilities to interact with plasma membrane monoamine transporters for dopamine (DAT), norepinephrine (NET) and serotonin (SERT). We found that 2-, 3- and 4-FPM inhibit uptake mediated by DAT and NET in HEK293 cells with potencies comparable to cocaine (IC50 values 80 μM). Experiments directed at identifying transporter-mediated reverse transport revealed that FPM isomers induce efflux via DAT, NET and SERT in HEK293 cells, and this effect is augmented by the Na+/H+ ionophore monensin. Each FPM evoked concentration-dependent release of monoamines from rat brain synaptosomes. Hence, this study reports for the first time the mode of action for 2-, 3- and 4-FPM and identifies these NPS as monoamine releasers with marked potency at catecholamine transporters implicated in abuse and addiction. This article is part of the Special Issue entitled 'Designer Drugs and Legal Highs.' Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness of the station's infrastructure layout and plan of operation. However, these three methods do not take the timetable at the station into consideration. Therefore, two methods are introduced in this paper, making it possible to estimate the robustness of different timetables at a station or different infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time...
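
    The buffer-time idea behind the advanced method can be illustrated with a deliberately simple assumption: if the leading train's delay is exponentially distributed, the probability that it propagates across a buffer of b minutes is the survival probability P(D > b) = exp(-b / mean delay). The sketch below uses this assumption; the paper's probability distribution is more general.

    ```python
    # Illustrative assumption (not the paper's model): exponentially distributed
    # primary delays, so a delay propagates across a buffer time b with
    # probability P(D > b) = exp(-b / mean_delay).
    import math

    def propagation_probability(buffer_min, mean_delay_min):
        return math.exp(-buffer_min / mean_delay_min)

    for b in (0.5, 1.0, 2.0, 3.0):
        p = propagation_probability(b, mean_delay_min=1.5)
        print(f"buffer {b:.1f} min -> P(propagated delay) = {p:.3f}")
    ```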

  9. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  10. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
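
    The central effect, that setting a threshold from estimated parameters inflates the realised failure frequency above its nominal level, can be checked by Monte Carlo. The sketch below assumes lognormal losses (so the log-losses are normal) and a 1% nominal failure probability; the sample size and seed are arbitrary.

    ```python
    # Monte Carlo sketch (assumed lognormal losses, arbitrary sample size/seed):
    # the threshold is set at the estimated 99th percentile from n_data points,
    # and the realised failure probability is evaluated against the true law.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    mu, sigma, eps, n_data = 0.0, 1.0, 0.01, 50
    z = norm.ppf(1 - eps)

    failures = []
    for _ in range(20_000):
        log_losses = rng.normal(mu, sigma, n_data)
        mu_hat, s_hat = log_losses.mean(), log_losses.std(ddof=1)
        threshold = mu_hat + z * s_hat            # estimated (1 - eps) quantile
        # True probability that a new log-loss exceeds the chosen threshold.
        failures.append(norm.sf((threshold - mu) / sigma))

    print(f"nominal failure probability:  {eps:.4f}")
    print(f"expected failure frequency:   {np.mean(failures):.4f}")  # > eps
    ```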

  11. Multiple-event probability in general-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von-Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times, with an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse

  12. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

A simple model is proposed to predict the behavior of Probabilities of Collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
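
    A minimal version of the parabola fit is easy to write down. The sketch below fits a downward-opening parabola to a made-up history of log10 P_c values and reads off the predicted peak; the Bayesian use of prior conjunction data described above is omitted.

    ```python
    # Sketch of the first-order model (made-up Pc history): fit a downward-opening
    # parabola to log10 Pc versus time and read off the predicted peak.
    import numpy as np

    t = np.array([-5.0, -4.0, -3.0, -2.5, -2.0])        # days to closest approach
    log_pc = np.array([-7.2, -6.1, -5.4, -5.2, -5.1])   # log10 Pc history

    a, b, c = np.polyfit(t, log_pc, deg=2)              # log10 Pc ~ a t^2 + b t + c
    t_peak = -b / (2.0 * a)
    peak = np.polyval([a, b, c], t_peak)

    print(f"predicted peak at t = {t_peak:.2f} days, Pc ~ 10^{peak:.2f}")
    ```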

  13. Measurement of transition probabilities in Kr II UV and visible spectral lines

    International Nuclear Information System (INIS)

    Mar, S; Val, J A del; RodrIguez, F; Pelaez, R J; Gonzalez, V R; Gonzalo, A B; Castro, A de; Aparicio, J A

    2006-01-01

This work reports an extensive collection of 120 atomic transition probabilities of Kr II lines in the spectral region 350-720 nm, all of them measured in an emission experiment. For many of them, these are the first data to the authors' knowledge. Relative intensity measurements have been obtained on a pulsed discharge lamp and the absolute A_ki values have been calculated by considering the available data from the literature as reference for the plasma temperature diagnosis. The excitation temperature (14 000-28 000 K) has been determined by using the Boltzmann-plot method. The plasma electron density (0.2-0.8 x 10^23 m^-3) has been determined by two-wavelength interferometry. This work extends a previous one already published by our laboratory [1, 2]. Comparisons have also been made with previous literature values

  14. Feasible pickup from intact ossicular chain with floating piezoelectric microphone.

    Science.gov (United States)

    Kang, Hou-Yong; Na, Gao; Chi, Fang-Lu; Jin, Kai; Pan, Tie-Zheng; Gao, Zhen

    2012-02-22

Many microphones have been developed to meet the implantability requirements of a totally implantable cochlear implant (TICI). However, a biocompatible one that does not destroy the intactness of the ossicular chain is still under investigation. Such an implantable floating piezoelectric microphone (FPM) has been manufactured and shows efficient electroacoustic performance in in vitro tests at our lab. We examined whether it picks up sound sensitively from the intact ossicular chain and whether it could be an optimal implantable microphone. Animal controlled experiment: five adult cats (eight ears) were sacrificed as the model to test the electroacoustic performance of the FPM. Three groups were studied: (1) the experiment group (on malleus): the FPM glued onto the handle of the malleus of the intact ossicular chain; (2) negative control group (in vivo): the FPM only hung in the tympanic cavity; (3) positive control group (Hy-M30): a HiFi commercial microphone placed close to the experiment ear. The testing speaker played pure tones in order from 0.25 to 8.0 kHz. The FPM inside the ear and the HiFi microphone simultaneously picked up the acoustic vibration, which was recorded as .wav files for analysis. The FPM transduced acoustic vibration sensitively and flatly, as in the in vitro test, across frequencies above 2.0 kHz, but inefficiently below 1.0 kHz because of its overloading mass. Although the HiFi microphone performed more efficiently than the FPM, there was no significant difference at 3.0 kHz and 8.0 kHz. It is feasible to develop such an implantable FPM for future TICI and TIHA systems, provided that improvements in micro-electromechanical systems and piezoelectric ceramic material technology are applied to reduce its weight and minimize its size.

  15. Feasible pickup from intact ossicular chain with floating piezoelectric microphone

    Directory of Open Access Journals (Sweden)

    Kang Hou-Yong

    2012-02-01

Full Text Available Abstract Objectives Many microphones have been developed to meet the implantability requirements of a totally implantable cochlear implant (TICI). However, a biocompatible one that does not destroy the intactness of the ossicular chain is still under investigation. Such an implantable floating piezoelectric microphone (FPM) has been manufactured and shows efficient electroacoustic performance in in vitro tests at our lab. We examined whether it picks up sound sensitively from the intact ossicular chain and whether it could be an optimal implantable microphone. Methods Animal controlled experiment: five adult cats (eight ears) were sacrificed as the model to test the electroacoustic performance of the FPM. Three groups were studied: (1) the experiment group (on malleus): the FPM glued onto the handle of the malleus of the intact ossicular chain; (2) negative control group (in vivo): the FPM only hung in the tympanic cavity; (3) positive control group (Hy-M30): a HiFi commercial microphone placed close to the experiment ear. The testing speaker played pure tones in order from 0.25 to 8.0 kHz. The FPM inside the ear and the HiFi microphone simultaneously picked up the acoustic vibration, which was recorded as .wav files for analysis. Results The FPM transduced acoustic vibration sensitively and flatly, as in the in vitro test, across frequencies above 2.0 kHz, but inefficiently below 1.0 kHz because of its overloading mass. Although the HiFi microphone performed more efficiently than the FPM, there was no significant difference at 3.0 kHz and 8.0 kHz. Conclusions It is feasible to develop such an implantable FPM for future TICI and TIHA systems, provided that improvements in micro-electromechanical systems and piezoelectric ceramic material technology are applied to reduce its weight and minimize its size.

  16. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.

  17. Measurement on K-electron capture probability in the decay of ⁹⁷Ru

    Energy Technology Data Exchange (ETDEWEB)

    Kalayani, V.D.M.L.; Vara Prasad, N.V.S.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L. [Swami Jnanananda Laboratories for Nuclear Research, Andhra University, Visakhapatnam (India)]; Chintalapudi, S.N. [Inter University Consortium for DAE Facilities, Calcutta (India)]

    1999-08-01

    The K-electron capture probabilities of two strong allowed transitions, 5/2⁺ → 5/2⁺ and 5/2⁺ → 7/2⁺, were measured in the decay of ⁹⁷Ru employing the X–γ internal summing technique. The two experimental P_K values were found to be 0.884 ± 0.046 and 0.886 ± 0.018, in agreement with the theoretical values of 0.878 and 0.878, respectively. The theoretical values are seen to be insensitive to Q_EC values above 200 keV.

  18. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability closely corresponds to the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  19. A note on iterated function systems with discontinuous probabilities

    International Nuclear Information System (INIS)

    Jaroszewska, Joanna

    2013-01-01

    Highlights: ► A certain iterated function system with discontinuous probabilities is discussed. ► Existence of an invariant measure via the Schauder–Tychonov theorem is established. ► Asymptotic stability of the system under examination is proved. -- Abstract: We consider an example of an iterated function system with discontinuous probabilities. We prove that it possesses an invariant probability measure. We also prove that it is asymptotically stable provided the probabilities are positive.
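
    To make the object under study concrete, the following is a minimal sketch of a place-dependent iterated function system (the two affine maps, the probability function, and its discontinuity point are illustrative assumptions, not the system analyzed in the paper); iterating the chain and collecting the tail samples approximates the invariant measure when the system is asymptotically stable.

```python
import random

# Two affine contractions on [0, 1].
w = [lambda x: x / 2, lambda x: x / 2 + 1 / 2]

def p1(x):
    """Place-dependent probability of choosing w[0]; discontinuous at x = 1/2."""
    return 0.3 if x < 0.5 else 0.8

def iterate_ifs(x0=0.25, n=100_000, burn_in=1_000):
    """Run the chain; the empirical distribution of the tail samples
    approximates the invariant probability measure (when it exists)."""
    x, samples = x0, []
    for i in range(n):
        x = w[0](x) if random.random() < p1(x) else w[1](x)
        if i >= burn_in:
            samples.append(x)
    return samples

samples = iterate_ifs()
print(f"empirical mean under the approximate invariant measure: "
      f"{sum(samples) / len(samples):.3f}")
```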

  20. Measuring sensitivity in pharmacoeconomic studies. Refining point sensitivity and range sensitivity by incorporating probability distributions.

    Science.gov (United States)

    Nuijten, M J

    1999-07-01

    The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, to deal with uncertainty in economic studies. The original method was refined by incorporating probability distributions, which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables, and the overall sensitivity measure was based on a subjectively chosen range, which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method: the response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes which better reflect the real uncertainty in economic studies.
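
    As a hedged sketch of the bootstrap ingredient described above (the data values and the toy cost model are assumptions; the paper's Markov model of depression is not reproduced), a probability distribution for a fixed input variable can be built from a handful of data points and propagated through the model:

```python
import random

# Hypothetical observed values of the response rate (a fixed input variable).
observed_response_rates = [0.52, 0.61, 0.48, 0.55, 0.58, 0.50]

def bootstrap_distribution(data, n_boot=10_000):
    """Resample the limited data with replacement to approximate the
    probability distribution of the input variable's mean."""
    n = len(data)
    return [sum(random.choices(data, k=n)) / n for _ in range(n_boot)]

def cost_model(response_rate, per_diem=300.0):
    """Toy stand-in for the economic model: cost falls as response rises."""
    return per_diem * 30 * (1.0 - response_rate)

costs = sorted(cost_model(r) for r in bootstrap_distribution(observed_response_rates))
lo, hi = costs[int(0.025 * len(costs))], costs[int(0.975 * len(costs))]
print(f"95% interval for cost driven by response-rate uncertainty: [{lo:.0f}, {hi:.0f}]")
```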

  1. Focusing on a Probability Element: Parameter Selection of Message Importance Measure in Big Data

    OpenAIRE

    She, Rui; Liu, Shanyun; Dong, Yunquan; Fan, Pingyi

    2017-01-01

    Message importance measure (MIM) is applicable to characterizing the importance of information in big-data scenarios, similar to entropy in information theory. In fact, MIM with a variable parameter can affect the characterization of a distribution. Furthermore, by choosing an appropriate parameter of MIM, it is possible to emphasize the message importance of a certain probability element in a distribution. Therefore, parametric MIM can play a vital role in anomaly detection of big data.
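
    A minimal sketch, assuming the exponential parametric form L(p, ϖ) = log Σᵢ pᵢ·exp(ϖ(1 − pᵢ)) used in the related MIM literature (an assumption here, not a quotation from this paper), shows how the parameter shifts emphasis onto a low-probability element:

```python
import math

def mim(p, w):
    """Parametric message importance measure:
    L(p, w) = log(sum_i p_i * exp(w * (1 - p_i))) -- assumed form."""
    return math.log(sum(pi * math.exp(w * (1.0 - pi)) for pi in p))

p = [0.90, 0.09, 0.01]  # a distribution with one rare (minority) event
for w in (0.5, 2.0, 10.0):
    print(f"w = {w:4.1f}: MIM = {mim(p, w):.3f}")
# As w grows, the rare element's term p_i * exp(w * (1 - p_i)) dominates,
# which is why parameter selection controls which element is emphasized.
```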

  2. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
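
    For orientation, the classical one-sample P-P plot that the generalized construction extends can be sketched as follows (the normal null distribution and the sample are assumptions for illustration):

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=200))        # sample under test
F0 = stats.norm.cdf(x)                   # hypothesized CDF at the data points
Fn = np.arange(1, len(x) + 1) / len(x)   # empirical CDF at the same points

plt.plot(F0, Fn, ".", label="P-P points")
plt.plot([0, 1], [0, 1], "--", label="perfect fit")
plt.xlabel("hypothesized CDF F0")
plt.ylabel("empirical CDF Fn")
plt.legend()
plt.show()
```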

  3. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
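
    The inverted-S shape can be illustrated with the one-parameter Tversky-Kahneman weighting function, a standard form with the same qualitative behavior (not necessarily the parametric family fitted in this study):

```python
def tk_weight(p, gamma=0.6):
    """w(p) = p^g / (p^g + (1 - p)^g)^(1/g); gamma < 1 yields the
    inverted-S shape: low p overweighted, high p underweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.05, 0.25, 0.50, 0.75, 0.95):
    print(f"p = {p:.2f} -> w(p) = {tk_weight(p):.3f}")
# p = 0.05 maps to roughly 0.13 (overweighting); p = 0.95 to roughly 0.78.
```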

  4. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  5. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in the study of elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so that using transition probabilities W instead of probability fluxes Π when calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to real states and partly to virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on the system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions, which is based on the notion of a probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one, in principle, to choose the correct theory of quantum transitions on the basis of experimental data. (author)

  6. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of probability aggregates...

  7. Data preprocessing methods for robust Fourier ptychographic microscopy

    Science.gov (United States)

    Zhang, Yan; Pan, An; Lei, Ming; Yao, Baoli

    2017-12-01

    Fourier ptychographic microscopy (FPM) is a recently developed computational imaging technique that achieves gigapixel images with both high resolution and a large field-of-view. In the current FPM experimental setup, the dark-field images acquired under high-angle illuminations are easily overwhelmed by stray light and background noise due to the low signal-to-noise ratio, significantly degrading the achievable resolution of the FPM approach. We provide an overall and systematic data preprocessing scheme to enhance FPM's performance, which involves sampling analysis, underexposed/overexposed treatments, background noise suppression, and stray light elimination. It is demonstrated experimentally, with both a US Air Force (USAF) 1951 resolution target and biological samples, that the benefit of noise removal by these methods far outweighs the defect of the accompanying signal loss, as part of the lost signal can be compensated by the improved consistency among the captured raw images. In addition, the reported nonparametric scheme can be flexibly combined with existing state-of-the-art algorithms, facilitating a stronger noise-robust capability of the FPM approach in various applications.
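
    One ingredient, background suppression for a dark-field frame, can be sketched generically as follows (a plain thresholding scheme under the assumption of a signal-free border; not the authors' exact treatment):

```python
import numpy as np

def suppress_background(raw, border=20, k=2.0):
    """Estimate the background level from the frame border (assumed free of
    signal) and keep only pixels above mean + k * std, background-subtracted."""
    edge = np.concatenate([raw[:border].ravel(), raw[-border:].ravel(),
                           raw[:, :border].ravel(), raw[:, -border:].ravel()])
    level, spread = edge.mean(), edge.std()
    return np.where(raw > level + k * spread, raw - level, 0.0)

# Demo on a synthetic frame: a weak spot over a noisy background.
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 5.0, size=(256, 256))
frame[120:136, 120:136] += 40.0
print(f"pixels kept after suppression: {(suppress_background(frame) > 0).sum()}")
```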

  8. Method to Calculate Accurate Top Event Probability in a Seismic PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Woo Sik [Sejong Univ., Seoul (Korea, Republic of)

    2014-05-15

    ACUBE (Advanced Cutset Upper Bound Estimator) calculates the top event probability and importance measures from cutsets by dividing the cutsets into major and minor groups depending on their probability: cutsets with higher probability are placed in the major group, the rest in the minor group, and the major cutsets are converted into a Binary Decision Diagram (BDD). ACUBE calculates the top event probability and importance measures of the major group exactly via the BDD, calculates these measures for the minor group with an approximation such as MCUB, and combines the two results. The algorithm is useful for decreasing the conservatism caused by approximating the top event probability and importance measure calculations over the given cutsets. By applying the ACUBE algorithm to seismic PSA cutsets, the accuracy of the top event probability and importance measures can be significantly improved. This study shows that careful attention should be paid, and an appropriate method provided, in order to avoid significant overestimation of the top event probability. Owing to these strengths, ACUBE has become a vital tool for calculating a more accurate CDF from seismic PSA cutsets than the conventional probability calculation method.
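
    The divide-and-combine idea admits a compact sketch under the assumption of independent basic events; here inclusion-exclusion stands in for ACUBE's BDD step on the major group, and the two group results are combined MCUB-style (the event probabilities and cutsets are illustrative):

```python
from itertools import combinations

def cutset_prob(events, p):
    """Probability of a conjunction of independent basic events."""
    prod = 1.0
    for e in events:
        prod *= p[e]
    return prod

def exact_union(cutsets, p):
    """Inclusion-exclusion over a small 'major' group (stand-in for the BDD)."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            total += (-1) ** (k + 1) * cutset_prob(set().union(*combo), p)
    return total

def mcub(cutsets, p):
    """Min-cut upper bound for the 'minor' group: 1 - prod(1 - P(cutset))."""
    prod = 1.0
    for cs in cutsets:
        prod *= 1.0 - cutset_prob(cs, p)
    return 1.0 - prod

p = {"A": 0.2, "B": 0.3, "C": 0.1, "D": 0.05, "E": 0.01}
cutsets = sorted([{"A", "B"}, {"A", "C"}, {"B", "C"}, {"D"}, {"D", "E"}],
                 key=lambda cs: cutset_prob(cs, p), reverse=True)
major, minor = cutsets[:3], cutsets[3:]  # split by cutset probability
p_top = 1.0 - (1.0 - exact_union(major, p)) * (1.0 - mcub(minor, p))
print(f"combined top event probability: {p_top:.4f}")
```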

  9. The correlation of defect distribution in collisional phase with measured cascade collapse probability

    International Nuclear Information System (INIS)

    Morishita, K.; Ishino, S.; Sekimura, N.

    1995-01-01

    The spatial distributions of atomic displacement at the end of the collisional phase of cascade damage processes were calculated using the computer simulation code MARLOWE, which is based on the binary collision approximation (BCA). The densities of atomic displacement were evaluated in the high-density regions (HDRs) of cascades in several pure metals (Fe, Ni, Cu, Ag, Au, Mo and W). They were compared with the measured cascade collapse probabilities reported in the literature, where TEM observations were carried out using thin metal foils irradiated by low-dose ions at room temperature. We found that there exists a minimum or "critical" value of the atomic displacement density for an HDR to collapse into a TEM-visible vacancy cluster. The critical densities are generally independent of the cascade energy in the same metal. Furthermore, the material dependence of the critical densities can be explained by the difference in vacancy mobility at the melting temperature of the target materials. This critical density calibration, extracted from the ion-irradiation experiments and the BCA simulations, is applied to the estimation of cascade collapse probabilities in metals irradiated by fusion neutrons. (orig.)

  10. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  11. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics...

  12. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  13. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  14. Survey of loading performance of currently available types HEPA filters under in-service conditions

    International Nuclear Information System (INIS)

    Gunn, C.A.; McDonough, J.B.

    1981-01-01

    Atmospheric dust loading tests were conducted on various industrial-grade High Efficiency Particulate Air filters. The filters tested were the European Style, Super-Flow, Standard US Design, and a Super-Pak. Filters were installed on the roof of a 3-story building. Test flows were set at a media velocity of 5 FPM (1.52 meters per min), and the results show that filter life varies from 8.8 to 12.7 months. In addition, tests were conducted on the European Style filter at media velocities of 5.6 and 2.6 FPM. On the filter tested at 5.6 FPM, an abrupt change in life was observed at 4 months. After more than 1 year of operation at the lower velocity of 2.6 FPM, the pressure rise with time is still very slow.

  15. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  16. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended probability...

  17. Half-life measurements and photon emission probabilities of frequently applied radioisotopes

    International Nuclear Information System (INIS)

    Schoetzig, U.; Schrader, H.

    1998-09-01

    It is among the duties of the PTB 'Radioactivity' department to determine the activity of radioactive sources and to publish their specific decay data, also called "standards", so that users of such sources may calibrate their equipment, e.g. photon detectors, accordingly. Further data required for proper calibration are the photon emission probabilities per decay, P(E), at the relevant photon energy E. The emission rate R(E) is derived from the activity A via the relation R(E) = A × P(E), and the half-lives of decay, T1/2, together with the standards are used for determining the time of measurement. The calibration quality is essentially determined by these two parameters and the uncertainties involved. The PTB 'Radioactivity' department therefore publishes recommended decay data elaborated and used by the experts at PTB. The tabulated data are either measured at PTB or critically selected from data compilations of other publication sources. The tabulated decay data presented here are intended to serve as a source of reference for laboratory work and should be used in combination with the comprehensive data collections available (see the bibliography of this document: 86BRFI, 91TECD, 96FI, Nuclear Data Sheets, e.g. 98ND84). (orig./CB)
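
    The calibration arithmetic is straightforward; a minimal worked example (the nuclide parameters below are illustrative assumptions, not PTB-recommended values):

```python
import math

A0 = 5.0e4          # activity at the reference time, in Bq (assumed)
P_E = 0.85          # photon emission probability per decay at energy E (assumed)
T_HALF_DAYS = 30.0  # half-life (assumed)

def emission_rate(t_days):
    """R(E) = A(t) * P(E), with A(t) = A0 * exp(-ln2 * t / T1/2)."""
    a_t = A0 * math.exp(-math.log(2.0) * t_days / T_HALF_DAYS)
    return a_t * P_E

print(f"R(E) at t = 0:    {emission_rate(0.0):.0f} photons/s")
print(f"R(E) at t = 60 d: {emission_rate(60.0):.0f} photons/s")
```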

  18. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
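
    The estimation procedure lends itself to a short sketch (the class boundaries, the Gaussian error model and the number of measurements per series are illustrative assumptions, not the paper's calibration):

```python
import random

CLASS_BOUNDARIES = [0.2, 0.4, 0.6, 0.8]  # assumed index values separating classes

def classify(index):
    """Status class = number of boundaries the index exceeds."""
    return sum(index > b for b in CLASS_BOUNDARIES)

def p_misclassification(true_index, sigma, n_meas=3, n_mc=100_000):
    """Fraction of Monte-Carlo runs in which the mean of error-prone
    measurements falls outside the class of the error-free index."""
    true_class = classify(true_index)
    miss = 0
    for _ in range(n_mc):
        mean = sum(random.gauss(true_index, sigma) for _ in range(n_meas)) / n_meas
        miss += classify(mean) != true_class
    return miss / n_mc

# An index near a class boundary is far more vulnerable to measurement error:
print(p_misclassification(0.61, sigma=0.05))  # near a boundary -> high risk
print(p_misclassification(0.70, sigma=0.05))  # mid-class -> low risk
```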

  19. Disentangling multiple pressures on fish assemblages in large rivers.

    Science.gov (United States)

    Zajicek, Petr; Radinger, Johannes; Wolter, Christian

    2018-06-15

    European large rivers are exposed to multiple human pressures and maintained as waterways for inland navigation. However, little is known about the dominance and interactions of multiple pressures in large rivers, and inland navigation in particular has been ignored in multi-pressure analyses so far. We determined the response of ten fish population metrics (FPM, related to densities of diagnostic guilds and biodiversity) to 11 prevailing pressures, including navigation intensity, at 76 sites in eight European large rivers. Thereby, we aimed to derive indicative FPM for the most influential pressures that can serve for fish-based assessments. Pressures' influences, impacts and interactions were determined for each FPM using bootstrapped regression tree models. Increased flow velocity, navigation intensity and the loss of floodplains had the highest influences on guild densities and biodiversity. Interactions between navigation intensity and loss of floodplains and between navigation intensity and increased flow velocity were most frequent, each affecting 80% of the FPM. Further, increased sedimentation, channelization, organic siltation, the presence of artificial embankments and the presence of barriers had strong influences on at least one FPM. Thereby, each FPM was influenced by up to five pressures. However, some diagnostic FPM could be derived: species richness, the Shannon and Simpson indices, the Fish Region Index and the lithophilic and psammophilic guilds specifically indicate rhithralisation of the potamal region of large rivers. Lithophilic, phytophilic and psammophilic guilds indicate disturbance of shoreline habitats through both (i) wave action induced by passing vessels and (ii) hydromorphological degradation of the river channel that comes along with inland navigation. In European large rivers, inland navigation constitutes a highly influential pressure that adds on top of the prevailing hydromorphological degradation. Therefore, river management has to consider...

  20. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  1. To Measure Probable Physical Changes On The Earth During Total Solar Eclipse Using Geophysical Methods

    International Nuclear Information System (INIS)

    Gocmen, C.

    2007-01-01

    When the total solar eclipse came into question, people connected the eclipse with the earthquake of 17.08.1999. We reasoned that if any physical parameters change on the Earth during a total solar eclipse, we could measure this change, and so we carried out the project 'To Measure Probable Physical Changes On The Earth During Total Solar Eclipse Using Geophysical Methods'. We made gravity, magnetic and self-potential measurements at Konya and Ankara during the total solar eclipse (29 March 2006), the day before the eclipse, and the day after. The measurements continued for three days, twenty-four hours a day at Konya and during the daytime in Ankara. Bogazici University Kandilli Observatory gave us magnetic values in Istanbul, and we compared these with our magnetic values. The Turkish State Meteorological Service sent us temperature and air pressure observations for the three days in Konya and Ankara. We interpreted all of them.

  2. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  3. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)

  4. Tumor control probability after a radiation of animal tumors

    International Nuclear Information System (INIS)

    Urano, Muneyasu; Ando, Koichi; Koike, Sachiko; Nesumi, Naofumi

    1975-01-01

    The tumor control and regrowth probability of animal tumors irradiated with a single x-ray dose were determined using a spontaneous C3H mouse mammary carcinoma. The cellular radiation sensitivity of the tumor cells and the tumor control probability of the tumor were examined by the TD50 and TCD50 assays, respectively. Tumor growth kinetics were measured by counting the percentage of labelled mitoses and by measuring the growth curve. A mathematical analysis of the tumor control probability was made from these results. The formula proposed accounts for cell population kinetics (the division probability model), cell sensitivity to radiation, and the number of tumor cells. (auth.)
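
    The standard Poisson formulation underlying such analyses can be sketched as follows; the paper's proposed formula additionally accounts for population kinetics and the division probability model, so this is only the simplest special case with assumed parameter values:

```python
import math

def tcp(dose_gy, n_clonogens, d0_gy=3.0):
    """Poisson tumor control probability TCP = exp(-N * SF(D)),
    with single-hit survival SF(D) = exp(-D / D0); D0 is assumed."""
    surviving = n_clonogens * math.exp(-dose_gy / d0_gy)
    return math.exp(-surviving)

for dose in (40, 50, 60, 70):
    print(f"D = {dose} Gy: TCP = {tcp(dose, n_clonogens=1e7):.3f}")
# The characteristic sigmoid dose-response emerges from the Poisson statistics.
```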

  5. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  6. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  7. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly, to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  8. PAHs concentration and toxicity in organic solvent extracts of atmospheric particulate matter and sea sediments.

    Science.gov (United States)

    Ozaki, Noriatsu; Takeuchi, Shin-ya; Kojima, Keisuke; Kindaichi, Tomonori; Komatsu, Toshiko; Fukushima, Takehiko

    2012-01-01

    The concentration of polycyclic aromatic hydrocarbons (PAHs) and the toxicity to marine bacteria (Vibrio fischeri) were measured for the organic solvent extracts of sea sediments collected from an urban watershed area (Hiroshima Bay) of Japan and compared with the concentrations and toxicity of atmospheric particulate matter (PM). In atmospheric PM, the PAHs concentration was highest in fine particulate matter (FPM) collected during cold seasons. The concentrations in sea sediments were 0.01-0.001 times those of atmospheric PM. 1/EC50 was 1-10 L·g⁻¹ PM for atmospheric PM and 0.1-1 L·g⁻¹ dry solids for sea sediments. These results imply that toxic substances from atmospheric PM are diluted several tens or hundreds of times in sea sediments. The ratio of 1/EC50 to PAHs concentration ((1/EC50)/16PAHs) was stable for all sea sediments (0.1-1 L·μg⁻¹ 16PAHs) and was of the same order of magnitude as that of FPM and coarse particulate matter (CPM). The ratio for sediments collected from the west was more similar to that of CPM, while that from the east was more similar to FPM, possibly because of hydraulic differences among water bodies. PAHs concentration pattern analyses (principal component analysis and isomer ratio analysis) were conducted, and the results showed that the PAHs pattern in sea sediments was quite different from that of FPM and CPM. Comparison with previously conducted PAHs analyses suggested that biomass burning residues comprised a major portion of these other sources.

  9. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    Directory of Open Access Journals (Sweden)

    Zhi-song Chen

    2012-12-01

    Full Text Available The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project, considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordination mechanism via a revenue-sharing contract. The results suggest the following: (1) owing to water shortage supplements and the excess water sale policy provided by the FPM, the optimal ordering quantity of water resources is less than it would be without the FPM, and the optimal profits of the whole supply chain, the supplier, and the external distributor are higher than they would be without the FPM; (2) wholesale pricing and supplementary wholesale pricing with SCB are higher than without SCB, and the optimal profits of the whole supply chain, the supplier, and the external distributor are higher than they would be without SCB; and (3) considering SCB and introducing the FPM help increase the optimal profits of the whole supply chain, the supplier, and the external distributor, and improve the efficiency of water resources usage.

  10. Measurement of low energy neutrino absorption probability in thallium 205

    International Nuclear Information System (INIS)

    Freedman, M.S.

    1986-01-01

    A major aspect of the P-P neutrino flux determination using thallium 205 is the very difficult problem of experimentally demonstrating the neutrino reaction cross section with about 10% accuracy. One will soon be able to completely strip the electrons from atomic thallium 205 and to maintain the bare nucleus in this state in the heavy storage ring to be built at GSI Darmstadt. This nucleus can decay by emitting a beta-minus particle into the bound K-level of the daughter lead 205 ion as the only energetically open decay channel, (plus, of course, an antineutrino). This single channel beta decay explores the same nuclear wave functions of initial and final states as does the neutrino capture in atomic thallium 205, and thus its probability or rate is governed by the same nuclear matrix elements that affect both weak interactions. Measuring the rate of accumulation of lead 205 ions in the circulating beam of thallium 205 ions gives directly the cross section of the neutrino capture reaction. The calculations of the expected rates under realistic experimental conditions will be shown to be very favorable for the measurement. A special calibration experiment to verify this method and check the theoretical calculations will be suggested. Finally, the neutrino cross section calculation based on the observed rate of the single channel beta-minus decay reaction will be shown. Demonstrating bound state beta decay may be the first verification of the theory of this very important process that influences beta decay rates of several isotopes in stellar interiors, e.g., Re-187, that play important roles in geologic and cosmologic dating and nucleosynthesis. 21 refs., 2 figs

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...

  12. Indoor Measurements of Environmental Tobacco Smoke Final Report to the Tobacco Related Disease Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Apte, Michael G.; Gundel, Lara A.; Dod, Raymond L.; Russell, Marion L.; Singer, Brett C.; Sohn, Michael D.; Sullivan, Douglas P.; Chang, Gee-Minn; Sextro, Richard G.

    2004-03-02

    , quickly adsorbed on unconditioned surfaces so that nicotine concentrations in these rooms remained very low, even during smoking episodes. These findings suggest that using nicotine as a tracer of ETS particle concentrations may yield misleading concentration and/or exposure estimates. The results of the solanesol analyses were compromised, apparently by exposure to light during collection (lights in the chambers were always on during the experiments). This may mean that the use of solanesol as a tracer is impractical in "real-world" conditions. In the final phase of the project we conducted measurements of ETS particles and tracers in three residences occupied by smokers who had joined a smoking cessation program. As a pilot study, its objective was to improve our understanding of how ETS aerosols are transported in a small number of homes (and thus, whether limiting smoking to certain areas has an effect on ETS exposures in other parts of the building). As with the chamber studies, we examined whether measurements of various chemical tracers, such as nicotine, solanesol, FPM and UVPM, could be used to accurately predict ETS concentrations and potential exposures in "real-world" settings, as has been suggested by several authors. The ultimate goal of these efforts, and a future larger multiple-house study, is to improve the basis for estimating ETS exposures to the general public. Because we only studied three houses, no firm conclusions can be developed from our data. However, the results for the ETS tracers are essentially the same as those for the chamber experiments. The use of nicotine was problematic as a marker for ETS exposure. In the smoking areas of the homes, nicotine appeared to be a suitable indicator; however, in the non-smoking regions, nicotine behavior was very inconsistent. The other tracers, UVPM and FPM, provided a better basis for estimating ETS exposures in the "real world". The use of

  13. Rapid slowing of the atrial fibrillatory rate after administration of AZD7009 predicts conversion of atrial fibrillation

    DEFF Research Database (Denmark)

    Aunes, Maria; Egstrup, Kenneth; Frison, Lars

    2014-01-01

    BACKGROUND: Effects on the atrial fibrillatory rate (AFR) were studied during infusion with the combined potassium and sodium channel blocker AZD7009. METHODS AND RESULTS: Patients with persistent atrial fibrillation (AF) were randomized to AZD7009 or placebo. Thirty-five patients converted to sinus rhythm (SR) and were matched to 35 non-converters. The mean AFR before conversion was 231 fibrillations per minute (fpm), having decreased by 41%; in non-converters, it was 296 fpm at the end of infusion, having decreased by 26%. The rate of decrease was greater in converters at 5 min, -88 vs. -66 fpm (p=0.02), and at 10 min, -133 vs. -111 fpm (p=0.048). The AFR-SD and the exponential decay decreased. A small left atrial area was the only baseline predictor of conversion to SR. CONCLUSIONS: AZD7009 produced a significantly more rapid decrease of the AFR in converters than in non-converters.

  14. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  15. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    International Nuclear Information System (INIS)

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  16. Growth characteristics and biomass production of kenaf | Tahery ...

    African Journals Online (AJOL)

    Parameters of height, diameter and internode were measured at four to six regular intervals of 10 to 15 days, while the biomass production parameters of dry one-meter stalk mass (DMSM), defoliated plant mass (DPM), one-meter stalk mass (MSM) and fresh plant mass (FPM) were measured at harvest time. There was no ...

  17. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  18. Computer code structure for evaluation of fire protection measures and fighting capability at nuclear plants

    International Nuclear Information System (INIS)

    Anton, V.

    1997-01-01

    In this work, a computer code structure for evaluating Fire Protection Measures (FPM) and Fire Fighting Capability (FFC) at Nuclear Power Plants (NPP) is presented. It allows one to evaluate the category to which a given NPP belongs (satisfactory (s), needs further evaluation (n), unsatisfactory (u)) as a self-check in view of an IAEA inspection. This possibility of self-assessment follows from IAEA documents. Our approach is based on international experience gained in this field and stated in IAEA recommendations. As an illustration, we used FORTRAN programming language statements to make clear the structure of the computer code for the problem at hand. The computer program can be designed so that literal messages in English and Romanian are displayed beside the percentage assessments. (author)
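
    A minimal sketch of the categorization logic described (the percentage thresholds and the bilingual messages are assumptions for illustration; the original code is written in FORTRAN):

```python
MESSAGES = {  # assumed wording of the English/Romanian messages
    "s": ("satisfactory", "satisfacator"),
    "n": ("needs further evaluation", "necesita evaluare suplimentara"),
    "u": ("unsatisfactory", "nesatisfacator"),
}

def categorize(score_percent):
    """Map a percentage assessment of an FPM/FFC item to a category."""
    if score_percent >= 80.0:
        return "s"
    return "n" if score_percent >= 50.0 else "u"

items = {"fire barriers": 92.0, "detection system": 65.0, "drills": 30.0}
for item, score in items.items():
    cat = categorize(score)
    en, ro = MESSAGES[cat]
    print(f"{item}: {score:.0f}% -> {cat} ({en} / {ro})")
```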

  19. Analysis of femtosecond pump-probe photoelectron-photoion coincidence measurements applying Bayesian probability theory

    Science.gov (United States)

    Rumetshofer, M.; Heim, P.; Thaler, B.; Ernst, W. E.; Koch, M.; von der Linden, W.

    2018-06-01

    Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals provoked by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of overrated background subtraction. Coincidence measurements additionally suffer from false coincidences, requiring long data-acquisition times to keep erroneous signals at an acceptable level. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the interesting excited-state spectrum from pump-probe and pump-only measurements. This approach allows us to treat background and false coincidences consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach allows us to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The probabilistic approach is thoroughly scrutinized by challenging mock data. The application to pump-probe coincidence measurements on acetone molecules enables quantitative interpretations
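
    The core inference step can be illustrated with a toy Poisson model in which pump-probe counts contain signal plus background and pump-only counts contain background alone; a grid posterior then replaces naive subtraction (a drastic simplification of the paper's full treatment of coincidences):

```python
import numpy as np

def signal_posterior(n_pp, n_po, grid):
    """Posterior for the signal rate s, given pump-probe counts
    n_pp ~ Poisson(s + b) and pump-only counts n_po ~ Poisson(b),
    marginalizing the background rate b over the same grid
    (flat priors on s >= 0 and b >= 0 -- positivity built in)."""
    s, b = np.meshgrid(grid, grid, indexing="ij")
    loglike = n_pp * np.log(s + b) - (s + b) + n_po * np.log(b) - b
    post = np.exp(loglike - loglike.max())
    post_s = post.sum(axis=1)  # marginalize over the background rate
    return post_s / post_s.sum()

grid = np.linspace(0.1, 40.0, 400)
posterior = signal_posterior(n_pp=25, n_po=12, grid=grid)
print(f"posterior mean signal: {(grid * posterior).sum():.1f} counts "
      f"(naive subtraction: {25 - 12})")
```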

  20. Path probability of stochastic motion: A functional approach

    Science.gov (United States)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.

  1. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

    Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work has been the relationship between shake-off probabilities, target atomic number and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence-shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  2. Methodology for assessing the probability of corrosion in concrete structures on the basis of half-cell potential and concrete resistivity measurements.

    Science.gov (United States)

    Sadowski, Lukasz

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential E_corr and concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
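
    A sketch of combining the two measurements into a qualitative assessment at one grid point (the potential bands follow ASTM C876 and the resistivity bands are common rules of thumb; both are assumptions here, not the paper's calibration):

```python
def corrosion_assessment(e_corr_mv, resistivity_kohm_cm):
    """Combine half-cell potential (mV vs. Cu/CuSO4) and concrete
    resistivity into a qualitative corrosion-probability class."""
    if e_corr_mv < -350:        # ASTM C876: > 90% corrosion probability
        potential_class = "high"
    elif e_corr_mv < -200:      # uncertain band
        potential_class = "intermediate"
    else:                       # < 10% corrosion probability
        potential_class = "low"

    if resistivity_kohm_cm < 10:       # low resistivity favors high rates
        rho_class = "high"
    elif resistivity_kohm_cm < 20:
        rho_class = "intermediate"
    else:
        rho_class = "low"

    order = ["low", "intermediate", "high"]
    # Report the more pessimistic of the two indications.
    return max(potential_class, rho_class, key=order.index)

print(corrosion_assessment(-420, 8))   # -> high
print(corrosion_assessment(-250, 25))  # -> intermediate
```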

  3. Methodology for Assessing the Probability of Corrosion in Concrete Structures on the Basis of Half-Cell Potential and Concrete Resistivity Measurements

    Directory of Open Access Journals (Sweden)

    Lukasz Sadowski

    2013-01-01

    Full Text Available In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.

  4. K-shell ionization probability in energetic nearly symmetric heavy-ion collisions

    International Nuclear Information System (INIS)

    Tserruya, I.; Schmidt-Boecking, H.; Schuch, R.

    1977-01-01

    Impact-parameter-dependent K-x-ray emission probabilities for the projectile and target atoms have been measured in 35 MeV Cl on Cl, Cl on Ti and Cl on Ni collisions. The sum of the projectile plus target K-shell ionization probabilities is taken as a measure of the total 2pσ ionization probability. The 2pπ-2pσ rotational coupling model is in clear disagreement with the present results. On the other hand, the sum of probabilities is reproduced both in shape and absolute magnitude by the statistical model for inner-shell ionization. The K-shell ionization probability of the higher-Z collision partner is well described by this model, including the 2pσ-1sσ vacancy-sharing probability calculated as a function of the impact parameter. (author)

  5. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
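
    The task COVAL automates can also be approximated by Monte Carlo sampling: draw the input variables from their distributions, push them through the function, and read off the distribution of the result. The load/resistance model below is a made-up stand-in for the reliability application mentioned, not COVAL's actual transformation algorithm.

```python
# Monte Carlo sketch of the COVAL task: the distribution of a function of
# random variables, obtained by sampling rather than numerical transformation.
# The load/resistance model is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
load = rng.normal(100.0, 15.0, n)                   # applied random load
resistance = rng.lognormal(np.log(160.0), 0.1, n)   # structural resistance

margin = resistance - load                          # function of the variables
print("P(failure) = P(margin < 0) ≈", (margin < 0).mean())
print("5th/50th/95th percentiles of margin:", np.percentile(margin, [5, 50, 95]))
```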

  6. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  7. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  8. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G [Nuclear Safety Inspectorate HSK, (Switzerland)

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).

  9. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  10. The first permanent molar: spontaneous eruption after a five-year failure.

    Science.gov (United States)

    Mistry, Vinay N; Barker, Christopher S; James Spencer, R

    2017-09-01

    It is rare for a first permanent molar (FPM) to temporarily exhibit clinical features of failure of eruption, followed by regeneration of full eruptive capacity 5 years later. Indeterminate failure of eruption (IFE) is a diagnosis of exclusion in which the distinction between primary failure of eruption (PFE) and mechanical failure of eruption (MFE) is unclear, including cases in which the patient is too young for the distinction to be made. An 11-year-old girl attended the orthodontic clinic at Mid Yorkshire Hospitals NHS Trust regarding an unerupted lower right FPM. Her medical and dental trauma history was unremarkable. She presented with a Class II division 2 malocclusion in the mixed dentition, with all other FPMs fully erupted. This report documents that an unerupted FPM in an 11-year-old patient may still have the eruptive potential to become functional within the dentition. The period spent monitoring the FPM's outcome prior to surgical intervention avoided an operation under general anaesthetic and potentially unnecessary orthodontic treatment, as the tooth subsequently erupted without treatment. © 2017 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Impact parameter dependence of inner-shell ionization probabilities

    International Nuclear Information System (INIS)

    Cocke, C.L.

    1974-01-01

    The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed

  12. Sufficient Statistics for Divergence and the Probability of Misclassification

    Science.gov (United States)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, namely the one which results from the transformation x = Bz, where B is a k × n matrix of rank k with k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using variable x is greater than or equal to the probability of misclassification computed using variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the mentioned results about the divergence and probability of misclassification are derived. Finally it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.

  13. Quantum correlations in terms of neutrino oscillation probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Alok, Ashutosh Kumar, E-mail: akalok@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Banerjee, Subhashish, E-mail: subhashish@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Uma Sankar, S., E-mail: uma@phy.iitb.ac.in [Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-08-15

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of the neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply Bell's inequality violation.

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  15. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. A discussion on the origin of quantum probabilities

    International Nuclear Information System (INIS)

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-01

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: • Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. • We apply Cox's method to the lattice of subspaces of the Hilbert space. • We obtain a derivation of quantum probabilities which includes mixed states. • The method presented in this work is susceptible to generalization. • It includes quantum mechanics and classical mechanics as particular cases.

  18. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  19. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  20. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  1. Absolute transition probabilities of 5s-5p transitions of Kr I from interferometric measurements in LTE-plasmas

    International Nuclear Information System (INIS)

    Kaschek, K.; Ernst, G.K.; Boetticher, W.

    1984-01-01

    Absolute transition probabilities of nine 5s-5p transitions of Kr I have been evaluated by using the hook method. The plasma was produced in a shock tube. The population density of the 5s-levels was calculated, under the assumption of LTE, from the electron density and the ground state number measured by means of a dual wavelength interferometer. An evaluation is given which proves the validity of the LTE assumption. (orig.)

  2. The determination of transition probabilities with an inductively-coupled plasma discharge

    International Nuclear Information System (INIS)

    Nieuwoudt, G.

    1984-03-01

    The 27 MHz inductively-coupled plasma discharge (ICP) is used for the determination of relative transition probabilities of the 451, 459 and 470 nm argon spectral lines. The temperature of the argon plasma is determined with hydrogen as the thermometric species, because of its accurately known transition probabilities (approximately 1% uncertainty). The relative transition probabilities of the specific argon spectral lines were determined by substituting the measured spectral radiances thereof, together with the hydrogen temperature, into the two-line equation of temperature measurement.

  3. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
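
    A quick simulation illustrates one classic instance: the probability that a shuffled deck leaves no card in its original position (a derangement) converges to 1/e. This example is ours, not necessarily one of the three discussed in the article.

```python
# Simulate the derangement probability: a random permutation of n items has
# no fixed point with probability tending to 1/e as n grows.
import math
import numpy as np

rng = np.random.default_rng(1)
n, trials = 52, 100_000
hits = sum(1 for _ in range(trials)
           if not np.any(rng.permutation(n) == np.arange(n)))
print("simulated:", hits / trials, "  1/e =", 1 / math.e)
```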

  4. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
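
    The adjustment described above, dividing the number of individuals sampled by an estimated capture probability, can be sketched as follows; the logistic coefficients are invented for illustration and are not those estimated in the study.

```python
# Sketch of the abundance adjustment: divide the number of fish sampled by a
# modeled capture probability. Coefficients are hypothetical.
import numpy as np

def capture_probability(length_mm, passes, depth_m,
                        b0=-1.2, b_len=0.004, b_pass=0.5, b_depth=-0.8):
    """Logistic model of cumulative capture probability (illustrative)."""
    z = b0 + b_len * length_mm + b_pass * passes + b_depth * depth_m
    return 1.0 / (1.0 + np.exp(-z))

n_sampled = 37
p = capture_probability(length_mm=180, passes=3, depth_m=0.6)
print(f"capture probability ≈ {p:.2f}, abundance estimate ≈ {n_sampled / p:.0f}")
```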

  5. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    Science.gov (United States)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
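
    The zero-inflated Beta model at the heart of this approach can be sketched directly: with probability pi the response is exactly zero, and otherwise it follows a Beta(a, b) distribution. The parameters below are illustrative, not fitted to conjunction data.

```python
# Sketch of the zero-inflated Beta model: point mass at zero with probability
# pi, otherwise Beta(a, b). Parameters are illustrative.
import numpy as np
from scipy import stats

pi, a, b = 0.3, 2.0, 5.0
rng = np.random.default_rng(2)

def rzib(size):
    """Draw from the zero-inflated Beta distribution."""
    u = rng.random(size)
    x = stats.beta.rvs(a, b, size=size, random_state=rng)
    return np.where(u < pi, 0.0, x)

def zib_loglike(x):
    """Log-likelihood of a sample under the zero-inflated Beta model."""
    zero = x == 0
    return (zero.sum() * np.log(pi)
            + np.sum(np.log1p(-pi) + stats.beta.logpdf(x[~zero], a, b)))

sample = rzib(10_000)
print("fraction of zeros:", (sample == 0).mean(), " loglike:", zib_loglike(sample))
```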

  6. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Parallel computation of multigroup reactivity coefficient using iterative method

    Science.gov (United States)

    Susmikanti, Mike; Dewayatna, Winter

    2013-09-01

    One of the research activities supporting the commercial radioisotope production program is safety research on the irradiation of FPM (Fission Product Molybdenum) targets. An FPM target is a stainless-steel tube in which layers of high-enriched uranium are superimposed; the tube is irradiated to obtain fission products, which are widely used in kit form in nuclear medicine. Irradiating FPM tubes in the reactor core, however, can disturb core performance, one such disturbance arising from changes in flux or reactivity. A method is therefore needed for calculating safety margins under the configuration changes that occur over the life of the reactor, and making the code faster becomes an absolute necessity. The neutron safety margin for the research reactor can be re-evaluated without modifying the full reactivity calculation, which is an advantage of using the perturbation method. The criticality and flux in a multigroup diffusion model were calculated at various irradiation positions for several uranium contents. This model involves complex computation. Several parallel algorithms with iterative methods have been developed for solving large, sparse matrix systems. The Black-Red Gauss-Seidel iteration and the parallel power iteration method can be used to solve the multigroup diffusion equation system and to calculate the criticality and the reactivity coefficient. In this research, a code for reactivity calculation, one element of the safety analysis, was developed with parallel processing; the calculation can be done more quickly and efficiently by exploiting the parallelism of a multicore computer. The code was applied to calculate the safety limits of irradiated FPM targets with increasing uranium content.
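
    The power iteration named above can be sketched for a generic dominant-eigenvalue problem; in a multigroup diffusion code the dominant eigenvalue plays the role of the criticality factor. The matrix below is a random nonnegative stand-in, not an actual diffusion operator.

```python
# Power iteration for the dominant eigenvalue of A x = k x. With nonnegative
# entries the dominant eigenvalue is positive (Perron-Frobenius), so the norm
# ratio converges to it.
import numpy as np

def power_iteration(A, tol=1e-10, max_iter=10_000):
    x = np.ones(A.shape[0])
    k = 0.0
    for _ in range(max_iter):
        y = A @ x
        k_new = np.linalg.norm(y)
        x, k_err, k = y / k_new, abs(k_new - k), k_new
        if k_err < tol:
            break
    return k, x

rng = np.random.default_rng(3)
A = rng.random((50, 50))   # random nonnegative stand-in matrix
k, x = power_iteration(A)
print("dominant eigenvalue:", k)
print("check against eigvals:", np.max(np.abs(np.linalg.eigvals(A))))
```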

  8. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  9. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
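
    In Python, the same probability-machine idea is available through scikit-learn's random forest, whose averaged tree votes estimate P(y = 1 | x); the paper itself points to R packages. The data below are synthetic, with a known ground-truth probability for checking the estimates.

```python
# Random forest as a "probability machine": predict_proba averages tree votes
# to estimate P(y = 1 | x). Synthetic data with known ground truth.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 3))
true_p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))   # ground-truth P(y=1|x)
y = rng.random(2000) < true_p

forest = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                                random_state=0).fit(X, y)
p_hat = forest.predict_proba(X)[:, 1]
print("mean absolute error of probability estimates:",
      np.mean(np.abs(p_hat - true_p)))
```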

  10. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
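
    A bare-bones version of the concordance computation for discrete risk groups can be sketched as follows, ignoring censoring (which the paper's estimators handle) and excluding pairs with tied risk scores. The data are hypothetical.

```python
# Concordance probability for discrete risk groups, censoring ignored:
# a pair is concordant when the higher-risk patient has the shorter survival.
import itertools
import numpy as np

def concordance(risk, time):
    conc = total = 0
    for i, j in itertools.combinations(range(len(risk)), 2):
        if risk[i] == risk[j]:
            continue                      # discrete scores: skip tied pairs
        total += 1
        if (risk[i] > risk[j]) == (time[i] < time[j]):
            conc += 1
    return conc / total

risk = np.array([1, 1, 2, 2, 3, 3])       # hypothetical risk groups
time = np.array([9.0, 7.5, 6.0, 8.0, 2.0, 3.5])
print("concordance probability ≈", concordance(risk, time))
```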

  11. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called the driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to the use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS contributes positively to collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way of evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
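
    The two-pass logic can be illustrated with a small simulation comparing a probability-matching observer with a deterministic maximum a posteriori (MAP) observer on repeated identical trials; all parameters are invented.

```python
# Two-pass consistency: a probability-matching observer redraws its response
# on each pass, a MAP observer always repeats itself.
import numpy as np

rng = np.random.default_rng(5)
n_trials = 100_000
post = rng.beta(2, 2, n_trials)   # posterior P("signal") on each repeated trial

def matching(p):   # respond "signal" with probability equal to the posterior
    return rng.random(n_trials) < p

def map_rule(p):   # deterministic MAP observer
    return p > 0.5

for name, rule in [("matching", matching), ("MAP", map_rule)]:
    pass1, pass2 = rule(post), rule(post)
    print(name, "two-pass agreement:", (pass1 == pass2).mean())
```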

  13. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)

  14. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on an ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on an ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  16. The Bayesian count rate probability distribution in measurement of ionizing radiation by use of a ratemeter

    Energy Technology Data Exchange (ETDEWEB)

    Weise, K.

    2004-06-01

    Recent metrological developments concerning measurement uncertainty, founded on Bayesian statistics, give rise to a revision of several parts of the DIN 25482 and ISO 11929 standard series. These series stipulate detection limits and decision thresholds for ionizing-radiation measurements. Parts 3 and 4, respectively, deal with measurements by use of linear-scale analogue ratemeters. A normal frequency distribution of the momentary ratemeter indication for a fixed count rate value is assumed. The actual distribution, which is first calculated numerically by solving an integral equation, differs considerably from the normal distribution, although the latter approximates it for sufficiently large values of the count rate to be measured. As is shown, this similarly holds true for the Bayesian probability distribution of the count rate for sufficiently large measured values indicated by the ratemeter. This distribution follows from the first one mentioned by means of the Bayes theorem. Its expectation value and variance are needed for the standards to be revised on the basis of Bayesian statistics. Simple expressions are given in the present standards for estimating these parameters and for calculating the detection limit and the decision threshold. As is also shown, the same expressions can similarly be used as sufficient approximations by the revised standards if, roughly, the indicated value exceeds the reciprocal of the ratemeter relaxation time constant. (orig.)
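
    A minimal conjugate sketch shows the Bayesian count-rate posterior for a simple counting measurement (the ratemeter case treated in the standards is more involved because of the relaxation time constant): with a flat prior on the rate, n counts in time t give a Gamma(n + 1, rate t) posterior. The numbers are illustrative.

```python
# Conjugate Bayesian posterior for a Poisson count rate with a flat prior:
# n counts in time t -> Gamma(shape = n + 1, rate = t) posterior on the rate.
import numpy as np
from scipy import stats

n_counts, t = 12, 60.0                      # hypothetical measurement
posterior = stats.gamma(a=n_counts + 1, scale=1.0 / t)

print("posterior mean rate:", posterior.mean(), "counts/s")
print("95% credible interval:", posterior.ppf([0.025, 0.975]))
print("normal approximation mean/std:", n_counts / t, np.sqrt(n_counts) / t)
```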

  17. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  18. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye

  19. Zirconium and Yttrium (p, d) Surrogate Nuclear Reactions: Measurement and determination of gamma-ray probabilities: Experimental Physics Report

    Energy Technology Data Exchange (ETDEWEB)

    Burke, J. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hughes, R. O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, J. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scielzo, N. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Casperson, R. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ressler, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Saastamoinen, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ota, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ross, T. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Austin, R. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rapisarda, G. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-21

    This technical report documents the surrogate reaction method and experimental results used to determine the desired neutron-induced cross sections of 87Y(n,g) and the known 90Zr(n,g) cross section. This experiment was performed at the STARLiTeR apparatus located at the Texas A&M Cyclotron Institute using the K150 Cyclotron, which produced a 28.56 MeV proton beam. The proton beam impinged on Y and Zr targets to produce the nuclear reactions 89Y(p,d)88Y and 92Zr(p,d)91Zr. Both particle singles data and particle-gamma ray coincidence data were measured during the experiment. These data were used to determine the γ-ray probability as a function of energy for these reactions. The results for the γ-ray probabilities as a function of energy for both these nuclei are documented here. For completeness, extensive tabulated and graphical results are provided in the appendices.

  20. Transition probabilities and radiative decay constants of the excited levels of Ne

    International Nuclear Information System (INIS)

    Wosinski, L.

    1981-01-01

    Transition probabilities for eight optical transitions between the 3p and 3d neon levels have been measured by the ''plasma transparency method''. The transition probabilities are placed on an absolute scale by use of the recently reported values for the 4p→3s transitions. The measurements of induced changes in populations allowed the determination of the ratios of the radiative decay constants for the 4p and 3d levels. The experimental results are compared with the theoretically calculated transition probabilities of Murphy and Lilly. (author)

  1. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  2. The Intersection Probability of Brownian Motion and SLEκ

    Directory of Open Access Journals (Sweden)

    Shizhong Zhou

    2015-01-01

    Full Text Available By using the excursion measure Poisson kernel method, we obtain a second-order differential equation for the intersection probability of Brownian motion and SLEκ. Moreover, we find a transformation such that the second-order differential equation transforms into a hypergeometric differential equation. Then, by solving the hypergeometric differential equation, we obtain the explicit formula of the intersection probability for the trace of the chordal SLEκ and planar Brownian motion started from distinct points in the upper half-plane H.

  3. The relationship between operating cash flow per share and portfolio default probability

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2014-03-01

    Full Text Available One of the primary duties of depositary banks is to protect themselves against any possibility of bankruptcy. This requires the identification and measurement of risks, including default risk, which is important given the nature of the activities of banks. This paper presents an empirical investigation of the relationship between default probability and several financial figures, including operating cash flow, liabilities and return on equity. The proposed study uses historical data of twenty-two firms listed on the Tehran Stock Exchange over the period 2008-2012. Default probability, as the dependent variable, is measured by the method developed by Moody's KMV Company. The study uses a linear regression model to examine the relationship between default probability and the independent variables. The results of the present study suggest an inverse relationship between operating cash flow per share and return on equity, on the one hand, and default probability on the other. In addition, there was a direct relationship between the log of facilities and default probability. However, there was no relationship between net sales and default probability.

  4. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, statistical and econometric models are used to establish some of the influencing factors. The main approach is concerned with applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
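
    A sketch of such a binary logistic model on synthetic data follows; the field names echo the article, but the data-generating coefficients are invented to mimic the reported directions of effect (higher sums and greater remoteness increase the estimated repayment probability, later birth months decrease it).

```python
# Binary logistic model of loan repayment on synthetic data. Field names
# mirror the article; coefficients and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 5000
sum_given = rng.uniform(1, 50, n)            # contract sum, thousands
remoteness = rng.uniform(0, 500, n)          # distance of borrower, km
birth_month = rng.integers(1, 13, n)

# hypothetical data-generating process echoing the reported effects
z = -1.0 + 0.04 * sum_given + 0.003 * remoteness - 0.05 * birth_month
y = rng.random(n) < 1 / (1 + np.exp(-z))     # 1 = loan returned

X = np.column_stack([sum_given, remoteness, birth_month])
model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients:", model.coef_[0])
print("P(repay | 30k, 250 km, born in March) =",
      model.predict_proba([[30, 250, 3]])[0, 1])
```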

  5. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  6. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. The mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  7. Probabilities and Shannon's Entropy in the Everett Many-Worlds Theory

    Directory of Open Access Journals (Sweden)

    Andreas Wichert

    2016-12-01

    Full Text Available Following a controversial suggestion by David Deutsch that decision theory can solve the problem of probabilities in the Everett many-worlds theory, we suggest that the probabilities are induced by Shannon's entropy, which measures the uncertainty of events. We argue that a rational person prefers certainty to uncertainty due to the fundamental biological principle of homeostasis.

  8. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.

  9. Bremsstrahlung emission probability in the α decay of 210Po

    International Nuclear Information System (INIS)

    Boie, Hans-Hermann

    2009-01-01

    A high-statistics measurement of bremsstrahlung emitted in the α decay of 210 Po has been performed. The measured differential emission probabilities, which could be followed up to γ-energies of ≈ 500 keV, allow for the first time a serious test of various model calculations of bremsstrahlung-accompanied α decay. It is shown that corrections to the α-γ angular correlation due to the interference between the electric dipole and quadrupole amplitudes and due to the relativistic character of the process have to be taken into account. With the experimentally derived angular correlation, the measured energy-differential bremsstrahlung emission probabilities show excellent agreement with the fully quantum mechanical calculation. (orig.)

  10. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  11. Measurement of the ionization probability of the 1s sigma molecular orbital in half a collision at zero impact parameter

    International Nuclear Information System (INIS)

    Chemin, J.F.; Andriamonje, S.; Guezet, D.; Thibaud, J.P.; Aguer, P.; Hannachi, F.; Bruandet, J.F.

    1984-01-01

    We have measured, for the first time, the ionization probability P(1sσ) of the 1sσ molecular orbital on the way into a nuclear reaction (in half a collision at zero impact parameter) in a near-symmetric collision, 58Ni + 54Fe at 230 MeV, leading to a highly excited compound nucleus 112Xe, which decays first by sequential emission of charged particles and then by sequential emission of gamma rays. The determination of P(1sσ) is based on the coincidence measurement between X-rays and γ-rays, and the Doppler shift method is used to discriminate between the ''atomic'' and ''nuclear'' X-rays.

  12. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  13. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
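
    The post-processing step described in this record reduces to a frequency count across equally likely realizations. Below is a minimal sketch of that step, with synthetic lognormal fields standing in for the geostatistical simulations (grid size, threshold, and risk tolerance are illustrative, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins for N equally likely geostatistical realizations of a
# contaminant concentration field on a 50 x 50 grid of parcels.
n_real, nx, ny = 200, 50, 50
realizations = rng.lognormal(mean=2.0, sigma=0.8, size=(n_real, nx, ny))

# Probability map: fraction of realizations exceeding a clean-up threshold.
threshold = 15.0                      # illustrative clean-up level
prob_exceed = (realizations > threshold).mean(axis=0)

# Parcels flagged for remediation at a chosen risk tolerance.
flagged = prob_exceed > 0.10
print(f"parcels with P(exceed) > 0.10: {flagged.sum()} of {nx * ny}")
```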

  14. Performance evaluation of mobile downflow booths for reducing airborne particles in the workplace.

    Science.gov (United States)

    Lo, Li-Ming; Hocker, Braden; Steltz, Austin E; Kremer, John; Feng, H Amy

    2017-11-01

    Compared to other common control measures, the downflow booth is a costly engineering control used to contain airborne dust or particles. The downflow booth provides unidirectional filtered airflow from the ceiling, entraining released particles away from the workers' breathing zone, and delivers contained airflow to a lower level exhaust for removing particulates by filtering media. In this study, we designed and built a mobile downflow booth that is capable of quick assembly and easy size change to provide greater flexibility and particle control for various manufacturing processes or tasks. An experimental study was conducted to thoroughly evaluate the control performance of downflow booths used for removing airborne particles generated by the transfer of powdered lactose between two containers. Statistical analysis compared particle reduction ratios obtained from various test conditions including booth size (short, regular, or extended), supply air velocity (0.41 and 0.51 m/s or 80 and 100 feet per minute, fpm), powder transfer location (near or far from the booth exhaust), and inclusion or exclusion of curtains at the booth entrance. Our study results show that only short-depth downflow booths failed to protect the worker performing powder transfer far from the booth exhausts. Statistical analysis shows that better control performance can be obtained with supply air velocity of 0.51 m/s (100 fpm) than with 0.41 m/s (80 fpm) and that use of curtains for downflow booths did not improve their control performance.

  15. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  16. Measurement on K-electron capture probabilities in the decay of 183Re and 168Tm

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, N.V.S.V.; Rao, M.V.S.C.; Reddy, S.B.; Satyanarayana, G.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Swami Jnanananda Labs. for Nuclear Research); Murty, G.S.K. (UNDNJ, Newark, NJ (United States). Dept. of Radiology); Chintalapudi, S.N. (Inter University Consortium for DAE Facilities, Calcutta (India))

    1994-03-01

    The K-electron capture probabilities for the 5/2+ to 3/2- transition in the electron capture decay of 183Re to the 208.805 keV level in the daughter 183W, and for the 3(+) to 3- and 3(+) to 4- transitions in the electron capture decay of 168Tm to the 1541.4 keV and 1093.0 keV levels, respectively, in the daughter 168Er, were measured for the first time using an X-γ summing method. The experimental P_K values are reported in this paper, together with those due to theory, and discussed. (Author).

  17. Measuring inequity aversion in a heterogeneous population using experimental decisions and subjective probabilities

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2008-01-01

    We combine choice data in the ultimatum game with the expectations of proposers elicited by subjective probability questions to estimate a structural model of decision making under uncertainty. The model, estimated using a large representative sample of subjects from the Dutch population, allows

  18. Test of the X(5) symmetry in 156Dy and 178Os by measurement of electromagnetic transition probabilities

    International Nuclear Information System (INIS)

    Moeller, O.

    2005-01-01

    This work reports results from two recoil-distance Doppler-shift lifetime measurements of excited states in 156Dy and 178Os. The experiments were carried out at the GASP spectrometer of the Laboratori Nazionali di Legnaro in combination with the Cologne plunger apparatus. The main purpose of the experiments was to test the predictions of the X(5) critical point symmetry in these two nuclei. In 156Dy and 178Os, 29 lifetimes of excited states were derived using the differential-decay-curve method. In weaker reaction channels the nuclei 155Dy, 157Dy and 177Os were populated; in these nuclei 32 additional lifetimes were measured, most of them for the first time. In order to calculate absolute transition probabilities from the measured lifetimes of the first excited band in 156Dy, essential branching ratios were derived from the measured data with a very small systematic error. The results for 156Dy and 178Os confirm the consistency of an X(5) description in these nuclei. A comparison with the well-established X(5)-like nuclei in the N=90 isotones shows agreement with the X(5) description of at least the same quality. (orig.)

  19. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, knowledge of probability has evolved significantly and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the distribution most relevant to statistical analysis.
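
    As a minimal illustration of the "probability distribution" concept reviewed here, the sketch below queries a normal distribution; the blood-pressure numbers are invented for the example:

```python
from scipy.stats import norm

# A continuous distribution assigns probabilities to ranges of values,
# e.g. systolic blood pressure modeled as N(mean=120, sd=15) (invented).
bp = norm(loc=120, scale=15)

print(bp.cdf(140))                # P(X <= 140)
print(bp.sf(140))                 # P(X > 140), the survival function
print(bp.cdf(130) - bp.cdf(110))  # P(110 <= X <= 130)
print(bp.ppf(0.975))              # value below which 97.5% of X falls
```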

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
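
    For the multinomial logit special case, the CPGF is the log-sum-exp of the utilities and its gradient is the softmax, i.e. the logit choice probabilities. A quick numerical check of that gradient property (the utility values are made up):

```python
import numpy as np

def cpgf_logit(u):
    """Logit choice-probability generating function: log-sum-exp of utilities."""
    m = u.max()  # subtract the max for numerical stability
    return m + np.log(np.exp(u - m).sum())

u = np.array([1.0, 0.5, -0.2])  # illustrative systematic utilities

# Analytic gradient of the CPGF: the softmax, i.e. logit choice probabilities.
p_analytic = np.exp(u - cpgf_logit(u))

# Central-difference gradient as a check that grad(CPGF) = probabilities.
eps = 1e-6
p_numeric = np.array([
    (cpgf_logit(u + eps * e) - cpgf_logit(u - eps * e)) / (2 * eps)
    for e in np.eye(len(u))
])

print(p_analytic.round(3), np.allclose(p_analytic, p_numeric))
```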

  1. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  2. Effect of grinding conditions on the fatigue life of titanium 5Al-2.5Sn alloy

    Science.gov (United States)

    Rangaswamy, P.; Terutung, H.; Jeelani, S.

    1991-01-01

    An investigation into the effect of grinding conditions on the fatigue life of titanium 5Al-2.5Sn is presented. Damage to surface integrity and changes in the residual stresses distribution are studied to assess changes in fatigue life. A surface grinding machine, operating at speeds ranging from 2000 to 6000 fpm and using SiC wheels of grit sizes 60 and 120, was used to grind flat subsize specimens of 0.1-in. thickness. After grinding, the specimens were fatigued at a chosen stress and compared with the unadulterated material. A standard profilometer, a microhardness tester, and a scanning electron microscope were utilized to examine surface characteristics and measure roughness and hardness. Increased grinding speed in both wet and dry applications tended to decrease the fatigue life of the specimens. Fatigue life increased markedly at 2000 fpm under wet conditions, but then decreased at higher speeds. Grit size had no effect on the fatigue life.

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  5. Probability-density-function characterization of multipartite entanglement

    International Nuclear Information System (INIS)

    Facchi, P.; Florio, G.; Pascazio, S.

    2006-01-01

    We propose a method to characterize and quantify multipartite entanglement for pure states. The method hinges upon the study of the probability density function of bipartite entanglement and is tested on an ensemble of qubits in a variety of situations. This characterization is also compared to several measures of multipartite entanglement

  6. Defining Baconian Probability for Use in Assurance Argumentation

    Science.gov (United States)

    Graydon, Patrick J.

    2016-01-01

    The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.

  7. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields.

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  9. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probabilities of various components are varied between 0 and 1, or when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by an order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value; with American failure probability data, however, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
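
    The kind of sensitivity reported here can be illustrated with a toy model (not the NRC's simplified PSA): if core damage requires an initiating event plus failure of both trains of a redundant system, core damage frequency is quadratic in the pump failure probability. All numbers below are illustrative:

```python
# Toy PSA sketch; numbers are illustrative, not from the study.
IE_FREQ = 1e-2  # initiating events per reactor-year

def core_damage_freq(p_pump):
    """Initiating event AND failure of both redundant motor-driven pumps
    (independence assumed; common-cause failures ignored)."""
    return IE_FREQ * p_pump ** 2

for p in (1e-3, 1e-2, 1e-1, 1.0):
    print(f"p_pump = {p:g}: CDF = {core_damage_freq(p):.2e} per year")
# The quadratic dependence shows why increases in pump failure probability
# can raise core damage frequency sharply, echoing finding (1) above.
```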

  10. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
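
    The Bayes' theorem obstacle the authors describe can be written out directly. The sketch below applies plain Bayes' rule with assumed values for the two normally unknown inputs (the population rate of copying and the detection power), which is precisely what the article's estimator avoids having to assume:

```python
def posterior_copying(p_value, prior=0.01, power=0.60):
    """P(copying | statistic at least as extreme as observed), by Bayes' rule.
    'prior' is the population rate of copying and 'power' the probability of
    such an extreme statistic given copying; both values here are invented."""
    num = prior * power
    return num / (num + (1 - prior) * p_value)

print(posterior_copying(1e-4))  # ~0.98: small p-value, high posterior
print(posterior_copying(1e-2))  # ~0.38: same prior, weaker evidence
```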

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  13. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of the fact that there is, in general, no analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  14. Moving beyond probabilities – Strength of knowledge characterisations applied to security

    International Nuclear Information System (INIS)

    Askeland, Tore; Flage, Roger; Aven, Terje

    2017-01-01

    Many security experts avoid the concept of probability when assessing risk and vulnerabilities. Their main argument is that meaningful probabilities cannot be determined and they are consequently not useful for decision-making and security management. However, to give priority to some measures and not others, the likelihood dimension needs to be addressed in some way; the question is how. One approach receiving attention recently is to add strength of knowledge judgements to the probabilities and probability intervals generated. The judgements provide a qualitative labelling of how strong the knowledge supporting the probability assignments is. Criteria for such labelling have been developed, but not for a security setting. The purpose of this paper is to develop such criteria specific to security applications and, using some examples, to demonstrate their suitability. - Highlights: • The concept of probability is often avoided in security risk assessments. • We argue that the likelihood/probability dimension needs to be somehow addressed. • Probabilities should be supplemented by qualitative strength-of-knowledge scores. • Such criteria specific to security applications are developed. • Two examples are used to demonstrate the suitability of the suggested criteria.

  15. Bremsstrahlung emission probability in the α decay of 210Po

    Energy Technology Data Exchange (ETDEWEB)

    Boie, Hans-Hermann

    2009-06-03

    A high-statistics measurement of the bremsstrahlung emitted in the α decay of 210Po has been performed. The measured differential emission probabilities, which could be followed up to γ-energies of about 500 keV, allow for the first time a serious test of various model calculations of the bremsstrahlung-accompanied α decay. It is shown that corrections to the α-γ angular correlation due to the interference between the electric dipole and quadrupole amplitudes, and due to the relativistic character of the process, have to be taken into account. With the experimentally derived angular correlation, the measured energy-differential bremsstrahlung emission probabilities show excellent agreement with the fully quantum mechanical calculation. (orig.)

  16. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  17. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.
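
    A minimal sketch of the Kaplan-Meier product-limit estimate mentioned here, with invented follow-up data (event = 1 marks a death, 0 a censored observation):

```python
import numpy as np

times  = np.array([3, 5, 5, 8, 12, 16, 16, 20, 24, 30])  # months of follow-up
events = np.array([1, 1, 0, 1,  0,  1,  1,  0,  1,  0])  # 1=event, 0=censored

surv = 1.0
for t in np.unique(times[events == 1]):          # step only at event times
    at_risk = np.sum(times >= t)                 # still under observation
    deaths = np.sum((times == t) & (events == 1))
    surv *= 1 - deaths / at_risk                 # product-limit update
    print(f"t = {t:>2}: S(t) = {surv:.3f}")
```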

  18. Κ-electron capture probability in 167Tm

    International Nuclear Information System (INIS)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L.

    1990-01-01

    The Κ-electron capture probability in the decay of 167 Tm for the first-forbidden transition 1/2 + →3/2 - was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P Κ value is found to be 0.835±0.029, in agreement with the theoretical value of 0.829. (author)

  19. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...

  20. Determination of the Quantity of I-135 Released from the AGR Experiment Series

    International Nuclear Information System (INIS)

    Scates, Dawn M.; Walter, John B.; Reber, Edward L.; Sterbentz, James W.; Petti, David A.

    2014-01-01

    ... gas stream will appear to decay with the parent half-life. This equilibrium condition enables the determination of the amount of 135I released from the fuel particles by measurement of the 135mXe at the FPM following reactor shutdown. In this paper, the 135I released will be reported and compared to similar releases for the noble gases, as well as the unexpected finding of 131I deposition from intentional impure-gas injection into capsule 11 of experiment AGR-3/4. (author)

  1. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'Physics and Fundamental Questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: radioactivity and probability; statistical and quantum fluctuations; quantum mechanics as a generalized probability theory; probability and the irrational efficiency of mathematics; can we foresee the future of the universe?; chance, eventuality and necessity in biology; and how to manage weak risks? (A.C.)

  2. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  3. Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation

    International Nuclear Information System (INIS)

    Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto

    2015-01-01

    It is very important to grasp failure probability and failure mode appropriately in order to carry out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed while varying the stress range, stress ratio, stress components, and the threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, the stress ratio (mean stress condition), and the threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance may be applicable from the viewpoint of risk to the plant. (author)

  4. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Students' understanding of probability concepts has been investigated from various perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate, and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand. Keywords: Perceived Understanding, Probability Concepts, Rasch Measurement Model. DOI: dx.doi.org/10.22342/jme.61.1

  5. The risk of major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    Leveque, Francois

    2013-01-01

    Before the Fukushima accident, eight major accidents had already occurred in nuclear power plants, a number higher than that expected by experts and rather close to that suggested by public risk perception. The author discusses how to understand these differences and reconcile observations, the objective probability of accidents and subjective assessments of risk; why experts have been over-optimistic; whether public opinion is irrational regarding nuclear risk; and how to measure risk and its perception. He thus addresses and discusses the following issues: risk calculation (cost, calculated frequency of major accidents, bias between the number of observed accidents and model predictions); perceived probabilities and aversion to disasters (probability perception biases, perception biases unfavourable to nuclear power); and the Bayesian contribution and its application (the Bayes-Laplace law, statistics, the choice of an a priori probability, prediction of the next event, the probability of a core meltdown tomorrow).
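
    The Bayes-Laplace machinery the book applies can be sketched in two lines: with a uniform prior, observing k events in n trials gives the "rule of succession" predictive probability (k + 1)/(n + 2) for the next trial. The reactor-year figure below is a rough placeholder, not a number from the book:

```python
def rule_of_succession(k, n):
    """Bayes-Laplace predictive probability of an event on the next trial,
    after observing k events in n trials, starting from a uniform prior."""
    return (k + 1) / (n + 2)

# Illustrative only: 8 major accidents over roughly 14,000 reactor-years.
p = rule_of_succession(8, 14_000)
print(f"P(major accident in a given reactor-year) ~ {p:.1e}")
```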

  6. Introduction to tensorial resistivity probability tomography

    OpenAIRE

    Mauriello, Paolo; Patella, Domenico

    2005-01-01

    The probability tomography approach developed for the scalar resistivity method is here extended to the 2D tensorial apparent resistivity acquisition mode. The rotational invariant derived from the trace of the apparent resistivity tensor is considered, since it gives on the datum plane anomalies confined above the buried objects. Firstly, a departure function is introduced as the difference between the tensorial invariant measured over the real structure and that computed for a reference uni...

  7. Absolute Kr I and Kr II transition probabilities

    International Nuclear Information System (INIS)

    Brandt, T.; Helbig, V.; Nick, K.P.

    1982-01-01

    Transition probabilities for 11 KrI and 9 KrII lines between 366.5 and 599.3nm were obtained from measurements with a wall-stabilised arc at atmospheric pressure in pure krypton. The population densities of the excited krypton levels were calculated under the assumption of LTE from electron densities measured by laser interferometry. The uncertainties for the KrI and the KrII data are 15 and 25% respectively. (author)

  8. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach to understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of the future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products; quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control; a centered process with Cpk of 1.33 will statistically produce less than about 63 defective units per million, which is equivalent to an acceptance probability of >99.99%.
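
    The Cpk-to-defect-rate statement can be checked directly under the normality assumption: for a centered process, each specification limit lies 3*Cpk standard deviations from the mean, so the two-sided defect rate follows from the normal tail:

```python
from scipy.stats import norm

def defects_per_million(cpk):
    """Two-sided defect rate for a centered normal process: each spec limit
    sits 3*Cpk standard deviations away from the mean."""
    return 2 * norm.sf(3 * cpk) * 1e6

print(defects_per_million(4 / 3))  # ~63 ppm, i.e. acceptance > 99.99%
print(defects_per_million(1.00))   # ~2700 ppm for comparison
```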

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  10. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  11. Effects Of Local Oscillator Errors On Digital Beamforming

    Science.gov (United States)

    2016-03-01

    ...processor; EF, element factor; EW, electronic warfare; FFM, flicker frequency modulation; FOV, field-of-view; FPGA, field-programmable gate array; FPM, flicker... frequencies and also more difficult to measure [15]. 2. Flicker frequency modulation. The source of flicker frequency modulation (FFM) is attributed to a physical resonance mechanism of an oscillator or issues controlling electronic components. Some oscillators might not show FFM noise, which might...

  12. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters of the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
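
    A sketch of the one-parameter Prelec function as reconstructed above; the curvature value α = 0.65 is illustrative:

```python
import numpy as np

def prelec_w(p, alpha=0.65):
    """Prelec (1998) weighting: w(p) = exp(-(-ln p)^alpha), 0 < alpha < 1.
    Overweights small probabilities, underweights large ones, with fixed
    points w(0) = 0, w(1/e) = 1/e, and w(1) = 1."""
    p = np.asarray(p, dtype=float)
    return np.exp(-((-np.log(p)) ** alpha))

for p in (0.01, 1 / np.e, 0.5, 0.99):
    print(f"w({p:.3f}) = {prelec_w(p):.3f}")
# w(0.010) ~ 0.067 (overweighted); w(1/e) = 1/e exactly; w(0.990) ~ 0.951.
```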

  13. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  14. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  15. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found (1) that neither high nor low associates to the original word and (2) that neither synonyms nor antonyms showed differential learning curves subsequent to original learning when the probability for the following event was shifted to .20. In a second study when feedback, in the form of knowledge of results, was withheld, there was a clear-cut similarity of predictions to the originally trained word and the synonyms of both high and low association value and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  16. A new method for estimating the probable maximum hail loss of a building portfolio based on hailfall intensity determined by radar measurements

    Science.gov (United States)

    Aller, D.; Hohl, R.; Mair, F.; Schiesser, H.-H.

    2003-04-01

    Extreme hailfall can cause massive damage to building structures. For the insurance and reinsurance industry it is essential to estimate the probable maximum hail loss of their portfolio. The probable maximum loss (PML) is usually defined with a return period of 1 in 250 years. Statistical extrapolation has a number of critical points, as historical hail loss data are usually only available from some events while insurance portfolios change over the years. At the moment, footprints are derived from historical hail damage data. These footprints (mean damage patterns) are then moved over a portfolio of interest to create scenario losses. However, damage patterns of past events are based on the specific portfolio that was damaged during that event and can be considerably different from the current spread of risks. A new method for estimating the probable maximum hail loss to a building portfolio is presented. It is shown that footprints derived from historical damages are different to footprints of hail kinetic energy calculated from radar reflectivity measurements. Based on the relationship between radar-derived hail kinetic energy and hail damage to buildings, scenario losses can be calculated. A systematic motion of the hail kinetic energy footprints over the underlying portfolio creates a loss set. It is difficult to estimate the return period of losses calculated with footprints derived from historical damages being moved around. To determine the return periods of the hail kinetic energy footprints over Switzerland, 15 years of radar measurements and 53 years of agricultural hail losses are available. Based on these data, return periods of several types of hailstorms were derived for different regions in Switzerland. The loss set is combined with the return periods of the event set to obtain an exceeding frequency curve, which can be used to derive the PML.
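
    The final combination step described here amounts to sorting the scenario losses and accumulating the annual frequencies of the storm types that produce them; a sketch with invented numbers:

```python
import numpy as np

# Invented scenario set: per-event loss (million CHF) and annual frequency
# of the generating hailstorm type (1 / return period).
losses = np.array([5.0, 12.0, 30.0, 45.0, 80.0, 150.0, 220.0, 400.0])
freqs = np.array([0.5, 0.2, 0.1, 0.05, 0.02, 0.01, 0.005, 0.002])

order = np.argsort(losses)[::-1]       # largest loss first
exceed_freq = np.cumsum(freqs[order])  # annual frequency of exceeding loss
loss_sorted = losses[order]

# PML at 1-in-250 years: largest loss still exceeded at least 1/250 per year.
pml = loss_sorted[exceed_freq >= 1 / 250][0]
print(f"250-year PML ~ {pml:.0f} million")
```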

  17. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for the prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment, for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density and the probability distributions of wind speeds... To assess the causation probability, i.e. to study the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...

  18. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  19. K-electron capture probability in 171Lu

    International Nuclear Information System (INIS)

    Mishra, N.R.; Vara Prasad, N.V.S.; Chandrasekhara Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L.; Chintalapudi, S.N.

    1999-01-01

    The K-electron capture probability in the decay of 171 Lu to the 835.06 keV level of the daughter nucleus 171 Yb is measured to be 0.822 ± 0.027 involving two transitions, in agreement with the theoretical value 0.833. The experimental value is seen to be consistent with the mass prediction of the relationship due to Wapstra and Bos. (author)

  20. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  1. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  2. AN INTRODUCTION TO THE MESHFREE FINITE POINTSET METHOD

    Directory of Open Access Journals (Sweden)

    Jorge Mauricio Ruiz V.

    2016-08-01

    This work offers a short, simple introduction to the meshfree method known as the finite pointset method (FPM). The key ingredients of the method are described: generation of the point set, search for neighbouring points, approximation of spatial derivatives by the moving least squares method, and the solution of the resulting systems of ordinary differential equations. As an application of the FPM, the viscous and inviscid Burgers equations are solved. The numerical solutions are compared with the analytical solution, and a convergence analysis of the method is carried out via numerical experimentation. MATLAB routines for the fundamental steps of the FPM are provided, which can be used to solve more complicated problems.
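
    The heart of the FPM, estimating spatial derivatives at a point from scattered neighbours by moving least squares, can be sketched in one dimension with a linear basis (a toy stand-in for the MATLAB routines the paper provides; the Gaussian kernel and smoothing length are illustrative choices):

```python
import numpy as np

def mls_derivative(x0, x_nbrs, f_nbrs, h):
    """Weighted least-squares fit of f(x) ~ a0 + a1*(x - x0) over the
    neighbour set; a1 approximates f'(x0). Gaussian weights with
    smoothing length h, a common choice in finite pointset methods."""
    dx = x_nbrs - x0
    w = np.exp(-((dx / h) ** 2))                # Gaussian kernel weights
    A = np.column_stack([np.ones_like(dx), dx])
    AtW = A.T * w                               # apply weights column-wise
    a = np.linalg.solve(AtW @ A, AtW @ f_nbrs)  # weighted normal equations
    return a[1]

# Scattered neighbours around x0 = 0.3, sampling f(x) = sin(x).
rng = np.random.default_rng(1)
x = 0.3 + 0.1 * (rng.random(12) - 0.5)
print(mls_derivative(0.3, x, np.sin(x), h=0.05), "vs", np.cos(0.3))
```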

  3. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  4. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    In recent years, more and more wireless communications systems have been required to provide a positioning measurement as well. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by the other users in the system. This MAI is commonly managed by a power control mechanism, and yet it has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism in which a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research has shown that implementing a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems and analyze the TOA estimation performance in a generalized CDMA system employing the probability control mechanism, in which the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its own transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous CDMA system.

  5. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  6. A measurement error approach to assess the association between dietary diversity, nutrient intake, and mean probability of adequacy.

    Science.gov (United States)

    Joseph, Maria L; Carriquiry, Alicia

    2010-11-01

    Collection of dietary intake information requires time-consuming and expensive methods, making it inaccessible to many resource-poor countries. Quantifying the association between simple measures of usual dietary diversity and usual nutrient intake/adequacy would allow inferences to be made about the adequacy of micronutrient intake at the population level for a fraction of the cost. In this study, we used secondary data from a dietary intake study carried out in Bangladesh to assess the association between 3 food group diversity indicators (FGI) and calcium intake; and the association between these same 3 FGI and a composite measure of nutrient adequacy, mean probability of adequacy (MPA). By implementing Fuller's error-in-the-equation measurement error model (EEM) and simple linear regression (SLR) models, we assessed these associations while accounting for the error in the observed quantities. Significant associations were detected between usual FGI and usual calcium intakes, when the more complex EEM was used. The SLR model detected significant associations between FGI and MPA as well as for variations of these measures, including the best linear unbiased predictor. Through simulation, we support the use of the EEM. In contrast to the EEM, the SLR model does not account for the possible correlation between the measurement errors in the response and predictor. The EEM performs best when the model variables are not complex functions of other variables observed with error (e.g. MPA). When observation days are limited and poor estimates of the within-person variances are obtained, the SLR model tends to be more appropriate.
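
    One reason a naive SLR can understate such associations is classical attenuation: measurement error in the predictor biases the slope toward zero, which an errors-in-the-variables correction undoes. A simulated illustration (synthetic data, not the Bangladesh study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
fgi_true = rng.normal(4.0, 1.0, n)                     # usual dietary diversity
mpa = 0.10 + 0.12 * fgi_true + rng.normal(0, 0.05, n)  # true linear relation
fgi_obs = fgi_true + rng.normal(0, 1.0, n)             # error from few recall days

# Naive SLR slope is attenuated by the reliability ratio
# var(true) / (var(true) + var(error)) = 1 / (1 + 1) = 0.5 here.
slope_naive = np.cov(fgi_obs, mpa)[0, 1] / np.var(fgi_obs, ddof=1)
slope_corrected = slope_naive / 0.5

print(f"naive: {slope_naive:.3f}, corrected: {slope_corrected:.3f}, true: 0.120")
```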

  7. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  8. Device to detect the presence of a pure signal in a discrete noisy signal, measured at a constant average noise rate, with a probability of false detection lower than a predetermined value

    International Nuclear Information System (INIS)

    Poussier, E.; Rambaut, M.

    1986-01-01

    Detection consists of measuring a counting rate. A probability of false detection is associated with this counting rate and with an estimated average noise rate. Detection also consists in comparing the false-detection probability with a predetermined false-detection rate; the comparison can use tabulated values. Application is made to corpuscular radiation detection [fr
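
    A sketch of the decision rule this abstract describes, assuming Poisson counting statistics: pick the smallest count threshold whose false-detection probability under the noise-only model stays below the preassigned rate. All numbers are hypothetical.

        from scipy.stats import poisson

        def detection_threshold(noise_rate, live_time, p_false_max):
            # Smallest k with P(N >= k | noise only) <= p_false_max,
            # where N ~ Poisson(noise_rate * live_time).
            mu = noise_rate * live_time
            k = int(poisson.isf(p_false_max, mu))
            while poisson.sf(k - 1, mu) > p_false_max:
                k += 1
            return k, poisson.sf(k - 1, mu)

        # Hypothetical: 2 counts/s background, 10 s measurement, 1e-3 false rate
        print(detection_threshold(2.0, 10.0, 1e-3))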

  9. The case of escape probability as linear in short time

    Science.gov (United States)

    Marchewka, A.; Schuss, Z.

    2018-02-01

    We derive rigorously the short-time escape probability of a quantum particle from its compactly supported initial state, which has a discontinuous derivative at the boundary of the support. We show that this probability is linear in time, which seems to be a new result. The novelty of our calculation is the inclusion of the boundary layer of the propagated wave function formed outside the initial support. This result has applications to the decay law of the particle, to the Zeno behaviour, quantum absorption, time of arrival, quantum measurements, and more.

  10. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  11. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
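
    For a single mean-value constraint over discrete outcomes, the maximum entropy assignment sketched above takes a Gibbs form p_i ∝ exp(−λ·E_i); one line of root-finding fixes λ. The outcome values and target mean below are hypothetical.

        import numpy as np
        from scipy.optimize import brentq

        values = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical outcome values
        target_mean = 1.2                          # constraint: <value> = 1.2

        def constrained_mean(lam):
            z = -lam * values
            z -= z.max()                           # stabilize the exponentials
            w = np.exp(z)
            return (values * w).sum() / w.sum()

        lam = brentq(lambda l: constrained_mean(l) - target_mean, -50.0, 50.0)
        p = np.exp(-lam * (values - values.min()))
        p /= p.sum()
        print(p, p @ values)   # max-entropy distribution matching the mean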

  12. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    Science.gov (United States)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor determining the time to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We extend this model to multiple systems. The analysis is conducted on a personal computer to provide portability. The model is also flexible and can be easily implemented under different situations.
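
    The defining product is easy to make concrete. A toy sketch with a Gaussian-fluctuation detection model and independent band/direction/time alignment factors; every number is hypothetical.

        from scipy.stats import norm

        def p_detection(snr_db, threshold_db, sigma_db=3.0):
            # Chance the received SNR exceeds the detection threshold,
            # with Gaussian dB fluctuations (toy model).
            return norm.sf(threshold_db - snr_db, scale=sigma_db)

        def p_coincidence(freq_overlap, direction_overlap, time_overlap):
            # Receiver tuned to the right band, direction and time;
            # independence of the three alignments is assumed.
            return freq_overlap * direction_overlap * time_overlap

        poi = p_detection(12.0, 10.0) * p_coincidence(0.3, 0.25, 0.6)
        print(poi)   # probability of intercept for one scan cycle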

  13. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC on increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small for any probability. For a set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of arranging in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
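
    Of the three coupled components, the Green-Ampt step is easy to isolate. Under ponded conditions, cumulative infiltration satisfies the implicit relation F − ψΔθ·ln(1 + F/(ψΔθ)) = K·t; the sketch below solves it by root-finding, with hypothetical soil parameters (ASMC enters through the moisture deficit Δθ).

        import numpy as np
        from scipy.optimize import brentq

        K, psi, dtheta = 1.0, 11.0, 0.3   # conductivity (cm/h), suction head (cm),
                                          # moisture deficit: hypothetical values

        def cum_infiltration(t_hours):
            # Green-Ampt: F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t
            g = lambda F: F - psi * dtheta * np.log(1.0 + F / (psi * dtheta)) - K * t_hours
            return brentq(g, 1e-9, 1e4)

        for t in (0.25, 0.5, 1.0, 2.0):
            F = cum_infiltration(t)
            f = K * (1.0 + psi * dtheta / F)   # infiltration capacity (cm/h)
            print(t, round(F, 2), round(f, 2))
        # A wetter antecedent state (smaller dtheta) lowers F, raising Hortonian runoff.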

  14. Transition Probabilities in ¹⁸⁹Os

    Energy Technology Data Exchange (ETDEWEB)

    Malmskog, S G; Berg, V; Baecklin, A

    1970-02-15

    The level structure of ¹⁸⁹Os has been studied from the decay of ¹⁸⁹Ir (13.3 days) produced in proton spallation at CERN and mass separated in the ISOLDE on-line facility. The gamma-ray spectrum has been recorded both with a high-resolution Si(Li) detector and Ge(Li) detectors. Three previously unreported transitions were observed, defining a new level at 348.5 keV. Special attention was given to the low-energy level band structure. Several multipolarity mixing ratios were deduced from measured L-subshell ratios which, together with measured level half-lives, gave absolute transition probabilities. The low-level decay properties are discussed in terms of the Nilsson model with the inclusion of Coriolis coupling.

  15. Lower first permanent molars: developing better predictors of spontaneous space closure.

    Science.gov (United States)

    Teo, Terry Kuo-Yih; Ashley, Paul Francis; Derrick, Donald

    2016-02-01

    First permanent molars (FPMs) of poor prognosis are often planned for extraction at an 'ideal time' so that second permanent molars (SPMs) erupt favourably to replace them. However, for lower FPM extractions, timing is not an accurate predictor of success. The aim of this study was to identify additional radiographic factors that could better predict the degree of spontaneous space closure of the lower SPM following FPM extraction. Data from a previous study of 127 lower SPMs from 66 patients were re-analysed by incorporating additional radiographic factors. These included the calcification stage of the bifurcation of the SPM, the position of the second premolar, mesial angulation of the SPM in relation to the FPM, and the presence of the third permanent molar. Results were analysed using ordered logistic regression. Only 58 per cent of FPMs extracted at the 'ideal time' (SPM development at Demirjian stage E) had complete space closure. The best outcomes resulted from a combination of SPMs not at Demirjian development stage G, together with the presence of mesial angulation of the SPM and the presence of the third permanent molar, where 85 per cent of those cases had complete space closure. Apart from the extraction timing of the FPM, consideration must also be given to the presence of the third permanent molar and the angulation of the SPM in order to ensure a reliable degree of spontaneous space closure of the lower SPM. © The Author 2015. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  16. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  17. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    Science.gov (United States)

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.

  18. Communicating through Probabilities: Does Quantum Theory Optimize the Transfer of Information?

    Directory of Open Access Journals (Sweden)

    William K. Wootters

    2013-08-01

    Full Text Available A quantum measurement can be regarded as a communication channel, in which the parameters of the state are expressed only in the probabilities of the outcomes of the measurement. We begin this paper by considering, in a non-quantum-mechanical setting, the problem of communicating through probabilities. For example, a sender, Alice, wants to convey to a receiver, Bob, the value of a continuous variable, θ, but her only means of conveying this value is by sending Bob a coin in which the value of θ is encoded in the probability of heads. We ask what the optimal encoding is when Bob will be allowed to flip the coin only a finite number of times. As the number of tosses goes to infinity, we find that the optimal encoding is the same as what nature would do if we lived in a world governed by real-vector-space quantum theory. We then ask whether the problem might be modified, so that the optimal communication strategy would be consistent with standard, complex-vector-space quantum theory.

  19. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  20. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  1. Probability distributions of placental morphological measurements and origins of variability of placental shapes.

    Science.gov (United States)

    Yampolsky, M; Salafia, C M; Shlakhter, O

    2013-06-01

    While the mean shape of the human placenta is round with a centrally inserted umbilical cord, significant deviations from this ideal are fairly common and may be clinically meaningful. Traditionally, they are explained by trophotropism. We have proposed a hypothesis explaining typical variations in placental shape by randomly determined fluctuations in the growth process of the vascular tree. It has recently been reported that umbilical cord displacement in a birth cohort has a log-normal probability distribution, which indicates that the displacement between an initial point of origin and the centroid of the mature shape is a result of accumulation of random fluctuations of the dynamic growth of the placenta. To confirm this, we investigate statistical distributions of other features of placental morphology. In a cohort of 1023 births at term, digital photographs of placentas were recorded at delivery. Excluding cases with velamentous cord insertion or missing clinical data left 1001 (97.8%) for which placental surface morphology features were measured. Best-fit statistical distributions for them were obtained using EasyFit. The best-fit distributions of umbilical cord displacement, placental disk diameter, area, perimeter, and maximal radius calculated from the cord insertion point are of heavy-tailed type, similar in shape to log-normal distributions. This is consistent with a stochastic origin of deviations of placental shape from normal. Deviations of placental shape descriptors from average have heavy-tailed distributions similar in shape to log-normal. This evidence points away from trophotropism and towards a spontaneous stochastic evolution of the variants of placental surface shape features. Copyright © 2013 Elsevier Ltd. All rights reserved.
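
    The distribution comparison the authors ran in EasyFit can be reproduced in outline with scipy: fit candidate families by maximum likelihood and compare AIC. The data below are synthetic stand-ins, not the cohort measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        disp = rng.lognormal(mean=1.0, sigma=0.5, size=1000)  # synthetic "displacements"

        shape, loc, scale = stats.lognorm.fit(disp, floc=0)   # heavy-tailed candidate
        mu, sigma = stats.norm.fit(disp)                      # light-tailed candidate

        aic_ln = 4 - 2 * stats.lognorm.logpdf(disp, shape, loc, scale).sum()
        aic_n = 4 - 2 * stats.norm.logpdf(disp, mu, sigma).sum()
        print(aic_ln, aic_n)   # lower AIC: log-normal wins on heavy-tailed data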

  2. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²−1)-dimensional volume and (N²−2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7×10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  3. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  5. Alpha-particle emission probabilities of ²³⁶U obtained by alpha spectrometry.

    Science.gov (United States)

    Marouli, M; Pommé, S; Jobbágy, V; Van Ammel, R; Paepen, J; Stroh, H; Benedik, L

    2014-05-01

    High-resolution alpha-particle spectrometry was performed with an ion-implanted silicon detector in vacuum on a homogeneously electrodeposited ²³⁶U source. The source was measured at different solid angles subtended by the detector, varying between 0.8% and 2.4% of 4π sr, to assess the influence of coincidental detection of alpha-particles and conversion electrons on the measured alpha-particle emission probabilities. Additional measurements were performed using a bending magnet to eliminate conversion electrons, the results of which coincide with normal measurements extrapolated to an infinitely small solid angle. The measured alpha emission probabilities for the three main peaks - 74.20 (5)%, 25.68 (5)% and 0.123 (5)%, respectively - are consistent with literature data, but their precision has been improved by at least one order of magnitude in this work. © 2013 Published by Elsevier Ltd.
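
    The extrapolation step can be illustrated in two lines: fit the apparent emission probability against the subtended solid angle and read off the intercept at Ω → 0, where alpha-electron coincidence summing vanishes. The numbers below are invented for illustration, not the reported measurements.

        import numpy as np

        # Apparent main-peak emission probability (%) vs. detector solid angle
        # (% of 4*pi sr); values are hypothetical.
        omega = np.array([0.8, 1.2, 1.6, 2.0, 2.4])
        p_app = np.array([74.28, 74.32, 74.37, 74.41, 74.45])

        slope, intercept = np.polyfit(omega, p_app, 1)
        print(intercept)   # summing-free estimate at zero solid angle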

  6. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
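
    The convolution behind an exact bundle-compliance chart is a product of per-item Bernoulli PGFs, ∏ᵢ((1−pᵢ) + pᵢz); multiplying the coefficient arrays gives the exact pmf of the compliance count. The compliance probabilities below are hypothetical.

        import numpy as np

        def bundle_count_pmf(compliance_probs):
            # Coefficient of z**k in prod_i ((1 - p_i) + p_i * z) is P(K = k).
            pmf = np.array([1.0])                      # PGF of an empty bundle
            for p in compliance_probs:
                pmf = np.convolve(pmf, [1.0 - p, p])   # multiply in one item's PGF
            return pmf

        pmf = bundle_count_pmf([0.95, 0.90, 0.85, 0.99])  # hypothetical 4-item bundle
        print(pmf)       # exact P(K = 0), ..., P(K = 4)
        print(pmf[-1])   # probability of full-bundle compliance

    Exact tail probabilities from this pmf can then set control limits without series approximations.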

  7. Stochastics: introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  8. Snell Envelope with Small Probability Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Del Moral, Pierre, E-mail: Pierre.Del-Moral@inria.fr; Hu, Peng, E-mail: Peng.Hu@inria.fr [Universite de Bordeaux I, Centre INRIA Bordeaux et Sud-Ouest and Institut de Mathematiques de Bordeaux (France); Oudjane, Nadia, E-mail: Nadia.Oudjane@edf.fr [EDF R and D Clamart (France)

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, numerical tests confirm the practical interest of this approach.
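
    Stripped of the rare-event machinery, the Snell envelope is the backward recursion Uₙ = max(Zₙ, E[Uₙ₊₁ | Fₙ]). A plain binomial-tree sketch for an American put follows; all market parameters are hypothetical, and this is the textbook construction, not the paper's particle scheme.

        import numpy as np

        S0, strike, r, u, d, N = 100.0, 100.0, 0.01, 1.1, 0.9, 50
        q = (np.exp(r) - d) / (u - d)     # risk-neutral up-move probability
        disc = np.exp(-r)                 # one-step discount factor

        S = S0 * u ** np.arange(N + 1) * d ** np.arange(N, -1, -1)
        U = np.maximum(strike - S, 0.0)   # Z_N: payoff at maturity
        for n in range(N - 1, -1, -1):
            S = S[1:] / u                                      # prices at step n
            cont = disc * (q * U[1:] + (1 - q) * U[:-1])       # E[U_{n+1} | F_n]
            U = np.maximum(np.maximum(strike - S, 0.0), cont)  # Snell envelope
        print(U[0])   # value of the optimally stopped payoff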

  9. K-electron capture probability in ¹⁶⁷Tm

    Energy Technology Data Exchange (ETDEWEB)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Variable Energy Cyclotron Centre, Calcutta (India))

    1990-07-01

    The K-electron capture probability in the decay of ¹⁶⁷Tm for the first-forbidden transition 1/2⁺ → 3/2⁻ was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P_K value is found to be 0.835 ± 0.029, in agreement with the theoretical value of 0.829. (author).

  10. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisson...

  12. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game 'read your mind.' It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, and mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first-order sensitivity method of the probability of failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  14. Truth, possibility and probability: new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  15. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  17. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

    This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by the probability of detection. The effects of real part geometries on sound propagation are accounted for, and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed that can detect flaws with attributes ranging over an extensive class. The detection decision is considered to be a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude-thresholding decision rule. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given

  18. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  19. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
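
    The two competing representations are easy to state computationally: pattern probabilities are frequencies of whole triplets, while transitional probabilities are row-normalized bigram counts, as in this sketch (the tone stream is a hypothetical stand-in for the stimulus sequences).

        import numpy as np

        def transition_matrix(seq, states=('H', 'L')):
            # Row-stochastic matrix of P(next tone | current tone) from bigrams.
            idx = {s: i for i, s in enumerate(states)}
            counts = np.zeros((len(states), len(states)))
            for a, b in zip(seq, seq[1:]):
                counts[idx[a], idx[b]] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        # Mostly H-L-H standards with one L-L-L deviant: L->L transitions
        # occur only inside the deviant, so their estimated probability is low.
        stream = list('HLH' * 40 + 'LLL' + 'HLH' * 40)
        print(transition_matrix(stream))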

  20. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  1. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  2. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  3. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Directory of Open Access Journals (Sweden)

    Michael R W Dawson

    Full Text Available Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
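
    A minimal version of the matching result is reproducible with a delta-rule perceptron whose linear output converges to the expected reward of each cue; the cue reward probabilities below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)
        reward_p = np.array([0.2, 0.5, 0.8])   # hypothetical reward likelihoods
        w = np.zeros(3)                        # one weight per input unit
        lr = 0.01

        for _ in range(60000):
            cue = rng.integers(3)                       # present one cue in isolation
            x = np.zeros(3); x[cue] = 1.0
            y = w @ x                                   # linear output unit
            r = float(rng.random() < reward_p[cue])     # stochastic reward
            w += lr * (r - y) * x                       # delta rule
        print(w)   # each weight settles near its cue's reward probability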

  6. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  7. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
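
    A plain Wald SPRT for Bernoulli data, with boundaries from Wald's approximations; the constraint the paper studies (a fixed sum of error probabilities) is imposed here simply by splitting the sum evenly, which is an illustrative choice rather than the paper's optimization.

        import numpy as np

        def sprt_bernoulli(data, p0, p1, alpha, beta):
            # Wald boundaries: accept H0 below a, accept H1 above b.
            a = np.log(beta / (1.0 - alpha))
            b = np.log((1.0 - beta) / alpha)
            llr = 0.0
            for n, x in enumerate(data, start=1):
                llr += np.log((p1 if x else 1.0 - p1) / (p0 if x else 1.0 - p0))
                if llr <= a:
                    return 'accept H0', n
                if llr >= b:
                    return 'accept H1', n
            return 'no decision', len(data)

        rng = np.random.default_rng(3)
        sample = rng.random(500) < 0.6   # true p = 0.6
        # Sum of error probabilities fixed at 0.05, split evenly (illustrative)
        print(sprt_bernoulli(sample, p0=0.5, p1=0.6, alpha=0.025, beta=0.025))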

  8. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities of the following elements: failure mechanisms (a flood defence system can fail due to different failure mechanisms) and time periods (failure probabilities are first computed for relatively small time scales). Besides the safety assessment of flood defence systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
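
    The combination step in its simplest form: a flood defence system is a series system, so under independence the system failure probability is 1 − ∏(1 − pᵢ); full dependence and mutual exclusivity give the usual bounds. The element probabilities below are hypothetical, and real applications must model the dependence the framework handles explicitly.

        import numpy as np

        def series_failure_prob(p_elements):
            # System fails if any element fails; elements assumed independent.
            p = np.asarray(p_elements, dtype=float)
            return 1.0 - np.prod(1.0 - p)

        p = [1e-4, 5e-4, 2e-3]   # hypothetical dike segments / structure
        print(series_failure_prob(p))       # independent combination
        print(max(p), min(1.0, sum(p)))     # bounds: full dependence .. disjoint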

  9. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  10. Is the classical law of the addition of probabilities violated in quantum interference?

    International Nuclear Information System (INIS)

    Arsenovic, Dusan; Bozic, Mirjana; Vuskovic, Lepsa

    2002-01-01

    We analyse and compare the positive and negative arguments on whether quantum interference violates the classical law of the addition of probabilities. The analysis takes into account the results of recent interference experiments in neutron, electron and atom optics. Nonclassical behaviour of atoms was found in atomic experiments where the measurements included their time of arrival and space distribution. We determine probabilities of elementary events associated with the nonclassical behaviour of particles in interferometers. We show that the emergence of the interference pattern in the process of accumulation of such elementary events is consistent with the classical law of the addition of probabilities
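
    The quantity at stake can be written down directly: with complex amplitudes, the two-path probability is |ψ₁ + ψ₂|² = P₁ + P₂ + 2√(P₁P₂)cosΔφ, which reduces to the classical addition law only when the cross term vanishes. A one-screen numerical check with toy amplitudes:

        import numpy as np

        phase = np.linspace(0.0, 2.0 * np.pi, 5)
        psi1 = 0.6                                 # toy amplitude, path 1
        psi2 = 0.8 * np.exp(1j * phase)            # toy amplitude, path 2
        p_quantum = np.abs(psi1 + psi2) ** 2
        p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2
        print(np.round(p_quantum - p_classical, 3))  # 2*0.6*0.8*cos(phase)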

  11. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  12. Classical probability model for Bell inequality

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2014-01-01

    We show that by taking into account the randomness of realization of experimental contexts it is possible to construct a common Kolmogorov space for data collected in these contexts, even though they may be incompatible. We call such a construction 'Kolmogorovization' of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θᵢ, θ′ⱼ) fixing orientations of polarization beam splitters. Contrary to common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only the randomness of the source (as Bell did), but also the randomness of context-realizations (in particular, realizations of pairs of angles (θᵢ, θ′ⱼ)). One may (but need not) say that randomness of 'free will' has to be accounted for.
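
    A numerical companion: for the polarization setup described above, the singlet-state correlations are the standard E(θᵢ, θ′ⱼ) = −cos 2(θᵢ − θ′ⱼ) (a textbook result, not derived in the abstract); at the usual angle choices they give |S| = 2√2, the statistics whose single-probability-space embedding the paper constructs.

        import numpy as np

        def E(a, b):
            # Singlet-state polarization correlation for analyzer angles a, b (rad)
            return -np.cos(2.0 * (a - b))

        a1, a2 = 0.0, np.pi / 4.0
        b1, b2 = np.pi / 8.0, 3.0 * np.pi / 8.0
        S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
        print(S, abs(S) > 2.0)   # |S| = 2*sqrt(2) exceeds the CHSH bound of 2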

  13. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  14. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
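
    A sketch of the modeling step with scikit-learn: a random forest on the two predictors named above, GDE probabilities from predict_proba, and a threshold-independent accuracy measure (AUC). The data are synthetic, not the Nevada set, and the ground-truth relation is invented for illustration.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(4)
        n = 2000
        wtd = rng.gamma(2.0, 10.0, n)            # water table depth (m), synthetic
        aridity = rng.uniform(0.05, 0.8, n)      # aridity index, synthetic
        # Synthetic ground truth linking shallow water tables to dependence
        p_gde = 1.0 / (1.0 + np.exp(0.15 * (wtd - 15.0) + 4.0 * (aridity - 0.3)))
        y = rng.random(n) < p_gde
        X = np.column_stack([wtd, aridity])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        proba = rf.predict_proba(X_te)[:, 1]     # P(groundwater dependent)
        print(roc_auc_score(y_te, proba))        # threshold-independent measure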

  15. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
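
    The construction can be sketched by Monte Carlo: simulate standardized normal samples, take pointwise quantile envelopes of the order statistics, and widen them until a fraction 1 − α of samples lies entirely inside. This is a simulation stand-in for the paper's intervals, not their exact method.

        import numpy as np

        def simultaneous_bands(n, alpha=0.05, reps=20000, seed=5):
            rng = np.random.default_rng(seed)
            sims = np.sort(rng.standard_normal((reps, n)), axis=1)
            sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(
                axis=1, keepdims=True)
            lo = np.quantile(sims, alpha / 2.0, axis=0)        # pointwise envelopes
            hi = np.quantile(sims, 1.0 - alpha / 2.0, axis=0)
            delta = 0.0
            # Widen until joint (simultaneous) coverage reaches 1 - alpha
            while (((sims >= lo - delta) & (sims <= hi + delta))
                   .all(axis=1).mean() < 1.0 - alpha):
                delta += 0.01
            return lo - delta, hi + delta

        lo, hi = simultaneous_bands(20)
        # Under normality, a sample's standardized order statistics fall inside
        # [lo, hi] simultaneously with probability ~0.95.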

  16. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  17. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language…

  18. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of special interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application…

  19. A Hierarchy of Compatibility and Comeasurability Levels in Quantum Logics with Unique Conditional Probabilities

    International Nuclear Information System (INIS)

    Niestegge, Gerd

    2010-01-01

    In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lueders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases. (general)

  20. Correlations between channel probabilities in collisional dissociation of D3+

    International Nuclear Information System (INIS)

    Abraham, S.; Nir, D.; Rosner, B.

    1984-01-01

    Measurements of the dissociation of D₃⁺ ions at 300-600 keV under single- and multiple-collision conditions in Ar- and H₂-gas targets have been performed. A complete separation of all dissociation channels was achieved, including the neutral channels, which were resolved using a fine-mesh technique. Data analysis in the multiple-collision regime confirms the validity of the rate equations governing the charge exchange processes. In the single-collision region the analysis yields constant relations between channel probabilities. Data rearrangement shows probability factorization and suggests that collisional dissociation is a two-stage process: a fast electron exchange followed by rearrangement and branching to the exit channels.

  1. A Short History of Probability Theory and Its Applications

    Science.gov (United States)

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  2. Fixation Probabilities of Evolutionary Graphs Based on the Positions of New Appearing Mutants

    Directory of Open Access Journals (Sweden)

    Pei-ai Zhang

    2014-01-01

    Full Text Available Evolutionary graph theory is a useful framework for implementing evolutionary dynamics on spatially structured populations. Calculating the fixation probability is usually treated as a Markov chain process, which is affected by the number of individuals, the fitness of the mutant, the game strategy, and the structure of the population. However, the position at which the new mutant appears is also important to its fixation probability, and that position is the emphasis here. A method is put forward to calculate the fixation probability of an evolutionary graph (EG) of a single level. Then, for a class of bilevel EGs, the fixation probabilities are calculated and some propositions are discussed. The conclusion obtained is that the bilevel EG is more stable than the corresponding one-rooted EG.
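
    To make the positional effect concrete, here is a minimal Monte Carlo sketch of a birth-death Moran process on a graph; the updating rule, fitness value, and star-graph example are illustrative assumptions, not taken from the paper.

    ```python
    # Minimal sketch (assumptions: birth-death Moran updating, mutant fitness r,
    # adjacency list supplied by the caller): Monte Carlo estimate of the
    # fixation probability of a mutant appearing at a given node.
    import random

    def fixation_probability(adj, start, r=1.5, trials=5000, seed=7):
        random.seed(seed)
        n = len(adj)
        nodes = list(adj)
        fixed = 0
        for _ in range(trials):
            mutant = {start}
            while 0 < len(mutant) < n:
                # Choose a reproducing node with probability proportional to fitness.
                weights = [r if v in mutant else 1.0 for v in nodes]
                parent = random.choices(nodes, weights=weights)[0]
                child = random.choice(adj[parent])  # offspring replaces a neighbour
                if parent in mutant:
                    mutant.add(child)
                else:
                    mutant.discard(child)
            fixed += len(mutant) == n
        return fixed / trials

    # On a star graph the centre and a leaf give different answers,
    # illustrating that the mutant's position matters.
    star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
    print(fixation_probability(star, start=0), fixation_probability(star, start=1))
    ```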

  3. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes…

  6. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  7. Application of geometric probability to the existence of faults in anisotropic media

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Donath, F.A.

    1980-01-01

    Three primary aspects of faults which relate to their potential for degradation of a repository site are: the possibility of an existing but undetected fault intersecting the repository site; the potential for a new fault occurring and propagating through the repository site; and the ability of any such fault to transmit groundwater. Given that a fault might be present in the region surrounding the site, the probability that it intersects the site depends primarily on its orientation and on the density of faulting in the area. Once these parameters are known, a model can be developed to determine the probability that an existing but undetected fault will intersect the repository site. Similar techniques can be used to estimate the potential for new faults occurring and intersecting the site, or intersection from propagation along existing faults. However, additional data including in situ stress measurements and records of seismic activity would be needed. One can estimate the stress level at which the strength of the surrounding media will be exceeded, and thus determine a time-dependent probability of movement along a pre-existing fault or of a new fault occurring, from a predicted rate of change in local stresses. In situ stress measurements taken at intervals of time could aid in determining the rate of stress change in the surrounding media, although measurable changes might not occur over the available period of observation. In situ stress measurements might also aid in assessing the ability of existing faults to transmit fluids.
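
    A minimal sketch of the geometric-probability idea, with entirely hypothetical geometry and an assumed anisotropic strike distribution; the intersection test is deliberately crude (it checks only the endpoints and midpoint of each trace).

    ```python
    # Minimal sketch (hypothetical geometry): Monte Carlo estimate of the
    # probability that a randomly located and oriented fault trace intersects
    # a square repository site centred in a larger region.
    import numpy as np

    rng = np.random.default_rng(0)
    region = 20.0      # km, side of the surrounding region (assumed)
    site = 2.0         # km, side of the repository site (assumed)
    fault_len = 5.0    # km, assumed fault trace length
    trials = 200_000

    # Midpoint uniform in the region; strike anisotropic (assumed NE trend).
    mid = rng.uniform(-region / 2, region / 2, size=(trials, 2))
    strike = rng.normal(np.pi / 4, np.pi / 12, size=trials)
    d = np.column_stack([np.cos(strike), np.sin(strike)]) * fault_len / 2

    def inside(p):
        return (np.abs(p[:, 0]) <= site / 2) & (np.abs(p[:, 1]) <= site / 2)

    # Crude test: either endpoint or the midpoint lies inside the site.
    hit = inside(mid - d) | inside(mid) | inside(mid + d)
    print("P(intersection | one fault present) ~", hit.mean())
    ```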

  8. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  9. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  10. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  11. Improved method for estimating particle scattering probabilities to finite detectors for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Mickael, M.; Gardner, R.P.; Verghese, K.

    1988-01-01

    An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude
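
    For orientation, the simplest special case (isotropic scattering in the Lab system toward an on-axis circular detector) reduces to a fractional solid angle; this small sketch assumes that geometry and is not the paper's transformed-pdf method.

    ```python
    # Minimal sketch (isotropic Lab-system scattering, on-axis circular
    # detector of radius R at distance d): the scattering probability is the
    # fractional solid angle Omega / (4*pi).
    import math

    def isotropic_hit_probability(R, d):
        cos_theta_max = d / math.hypot(R, d)
        return 0.5 * (1 - cos_theta_max)

    print(isotropic_hit_probability(R=2.0, d=10.0))  # ~0.0097
    ```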

  12. Has David Howden Vindicated Richard von Mises’s Definition of Probability?

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-11-01

    Full Text Available In my recent article on these pages (Crovelli 2009), I argued that members of the Austrian School of economics have adopted and defended a faulty definition of probability. I argued that the definition of probability necessarily depends upon the nature of the world in which we live. I claimed that if the nature of the world is such that every event and phenomenon which occurs has a cause of some sort, then probability must be defined subjectively; that is, “as a measure of our uncertainty about the likelihood of occurrence of some event or phenomenon, based upon evidence that need not derive solely from past frequencies of ‘collectives’ or ‘classes.’” I further claimed that the nature of the world is indeed such that all events and phenomena have prior causes, and that this fact compels us to adopt a subjective definition of probability. David Howden has recently published what he claims is a refutation of my argument in his article “Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations” (Howden 2009). Unfortunately, Mr. Howden appears not to have understood my argument, and his purported refutation of my subjective definition consequently amounts to nothing more than a concatenation of confused and fallacious ideas that are completely irrelevant to my argument. David Howden has thus failed in his attempt to vindicate Richard von Mises’s definition of probability.

  13. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  14. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable because of the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important consideration for operators. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When a DAG-based (directed acyclic graph) application is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new scheduling algorithm can guarantee the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
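
    A minimal sketch of the task-based failure calculation, assuming independent task failures and that the application fails if any task fails; this illustrates why backups lower the application failure probability, and is not the MDSA algorithm itself.

    ```python
    # Minimal sketch (independence assumed): failure probability of an
    # application in which every task must succeed, with an optional backup
    # copy for selected tasks.
    def app_failure_probability(task_fail, backed_up=()):
        p_ok = 1.0
        for task, p in task_fail.items():
            if task in backed_up:
                # Task fails only if primary and backup both fail.
                p_ok *= 1 - p * p
            else:
                p_ok *= 1 - p
        return 1 - p_ok

    tasks = {"t1": 0.02, "t2": 0.05, "t3": 0.01}  # hypothetical per-task failure rates
    print(app_failure_probability(tasks))                    # no backups
    print(app_failure_probability(tasks, backed_up={"t2"}))  # back up the worst task
    ```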

  15. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    Science.gov (United States)

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based,…

  16. On the SIMS Ionization Probability of Organic Molecules.

    Science.gov (United States)

    Popczun, Nicholas J; Breuer, Lars; Wucher, Andreas; Winograd, Nicholas

    2017-06-01

    The prospect of improved secondary ion yields for secondary ion mass spectrometry (SIMS) experiments drives innovation of new primary ion sources, instrumentation, and post-ionization techniques. The largest factor affecting secondary ion efficiency is believed to be the poor ionization probability (α⁺) of sputtered material, a value rarely measured directly, but estimated to be in some cases as low as 10⁻⁵. Our lab has developed a method for the direct determination of α⁺ in a SIMS experiment using laser post-ionization (LPI) to detect neutral molecular species in the sputtered plume for an organic compound. Here, we apply this method to coronene (C₂₄H₁₂), a polyaromatic hydrocarbon that exhibits strong molecular signal during gas-phase photoionization. A two-dimensional spatial distribution of sputtered neutral molecules is measured and presented. It is shown that the ionization probability of molecular coronene desorbed from a clean film under bombardment with 40 keV C₆₀ cluster projectiles is of the order of 10⁻³, with some remaining uncertainty arising from laser-induced fragmentation and possible differences in the emission velocity distributions of neutral and ionized molecules. In general, this work establishes a method to estimate the ionization efficiency of molecular species sputtered during a single bombardment event.

  17. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    Science.gov (United States)

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property, has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgments are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
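
    One normalization of this kind used in this literature is SLOR, which subtracts the unigram (lexical-frequency) term from the language-model log probability and divides by sentence length; a minimal sketch, with the log-probabilities assumed to come from some trained model rather than computed here.

    ```python
    # Minimal sketch of a SLOR-style acceptability measure: it removes the
    # length and lexical-frequency confounds from a raw LM log-probability.
    def slor(logp_sentence, unigram_logps):
        """(log P_LM(s) - sum of unigram log-probs) / sentence length."""
        n = len(unigram_logps)
        return (logp_sentence - sum(unigram_logps)) / n

    # Hypothetical numbers for a 5-word sentence:
    print(slor(-18.2, [-4.0, -3.5, -4.1, -3.8, -3.9]))  # 0.22
    ```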

  18. Delay or probability discounting in a model of impulsive behavior: effect of alcohol.

    Science.gov (United States)

    Richards, J B; Zhang, L; Mitchell, S H; de Wit, H

    1999-01-01

    Little is known about the acute effects of drugs of abuse on impulsivity and self-control. In this study, impulsivity was assessed in humans using a computer task that measured delay and probability discounting. Discounting describes how much the value of a reward (or punisher) is decreased when its occurrence is either delayed or uncertain. Twenty-four healthy adult volunteers ingested a moderate dose of ethanol (0.5 or 0.8 g/kg ethanol: n = 12 at each dose) or placebo before completing the discounting task. In the task the participants were given a series of choices between a small, immediate, certain amount of money and $10 that was either delayed (0, 2, 30, 180, or 365 days) or probabilistic (i.e., certainty of receipt was 1.0, .9, .75, .5, or .25). The point at which each individual was indifferent between the smaller immediate or certain reward and the $10 delayed or probabilistic reward was identified using an adjusting-amount procedure. The results indicated that (a) delay and probability discounting were well described by a hyperbolic function; (b) delay and probability discounting were positively correlated within subjects; (c) delay and probability discounting were moderately correlated with personality measures of impulsivity; and (d) alcohol had no effect on discounting. PMID:10220927
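
    A minimal sketch of fitting the hyperbolic discounting function V = A/(1 + kD) that the authors report describes the data well; the indifference points below are hypothetical, not the study's.

    ```python
    # Minimal sketch (synthetic indifference points): fit the hyperbolic
    # discounting function for a $10 delayed reward, as in the task.
    import numpy as np
    from scipy.optimize import curve_fit

    def hyperbolic(D, k):
        A = 10.0  # $10 delayed reward
        return A / (1 + k * D)

    delays = np.array([0, 2, 30, 180, 365])        # days, as in the task
    values = np.array([10.0, 9.6, 7.8, 4.9, 3.2])  # hypothetical indifference points
    (k_hat,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
    print("estimated discount rate k =", k_hat)
    # For probability discounting, D is replaced by the odds against receipt,
    # theta = (1 - p) / p, with the same functional form.
    ```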

  19. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, and to demonstrate their applicability…

  20. Applications of Algorithmic Probability to the Philosophy of Mind

    OpenAIRE

    Leuenberger, Gabriel

    2014-01-01

    This paper presents formulae that can solve various seemingly hopeless philosophical conundrums. We discuss the simulation argument, teleportation, mind-uploading, the rationality of utilitarianism, and the ethics of exploiting artificial general intelligence. Our approach arises from combining the essential ideas of formalisms such as algorithmic probability, the universal intelligence measure, space-time-embedded intelligence, and Hutter's observer localization. We argue that such universal...

  1. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
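
    A minimal sketch of the Bayesian single-case update argued for here, done on the odds scale; the prior (an actuarial base rate) and the likelihood ratio of case-specific evidence are hypothetical numbers.

    ```python
    # Minimal sketch: Bayes' rule on the odds scale, applying a likelihood
    # ratio for case-specific evidence to an actuarial base-rate prior.
    def posterior(prior, likelihood_ratio):
        odds = prior / (1 - prior)
        post_odds = odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    print(posterior(prior=0.15, likelihood_ratio=3.0))  # ~0.35
    ```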

  2. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  3. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M&O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M&O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M&O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be smaller than estimated.

  4. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  6. Does charge transfer correlate with ignition probability?

    International Nuclear Information System (INIS)

    Holdstock, Paul

    2008-01-01

    Flammable or explosive atmospheres exist in many industrial environments. The risk of ignition caused by electrostatic discharges is very real, and there has been extensive study of the incendiary nature of sparks and brush discharges. It is clear that in order to ignite a gas, an amount of energy needs to be delivered to a certain volume of gas within a comparatively short time. It is difficult to measure the energy released in an electrostatic discharge directly, but it is possible to approximate the energy in a spark generated from a well-defined electrical circuit. The spark energy required to ignite a gas, vapour or dust cloud can be determined by passing such sparks through them. There is a relationship between energy and charge in a capacitive circuit, and so it is possible to predict whether or not a spark discharge will cause an ignition by measuring the charge transferred in the spark. Brush discharges are in many ways less well defined than sparks. Nevertheless, some work has been done that has established a relationship between the charge transferred in brush discharges and the probability of igniting a flammable atmosphere. The question posed by this paper is whether such a relationship holds true in all circumstances and whether there is a universal correlation between charge transfer and ignition probability. Data are presented on discharges from textile materials that go some way toward answering this question.

  7. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    Science.gov (United States)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease of access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel cross-section. Because the velocity profile was non-standard and could not be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between the various methods. Regardless of the method employed, comparisons between methods revealed encouraging results depending on the flow conditions and the absence or presence of ice.
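
    The Probability Concept referred to here is commonly associated with Chiu's entropy-based velocity distribution; assuming that formulation (and using hypothetical numbers), a minimal sketch of how mean velocity and discharge follow from the maximum velocity u_max and an entropy parameter M.

    ```python
    # Minimal sketch (assumes Chiu's probabilistic velocity distribution is
    # the intended Probability Concept; all numbers are hypothetical).
    import math

    def mean_velocity(u_max, M):
        # phi(M) = E[u] / u_max for the entropy-based velocity distribution.
        phi = math.exp(M) / (math.exp(M) - 1) - 1 / M
        return u_max * phi

    u_max = 0.85   # m/s, from the measured vertical-velocity profile (assumed)
    M = 2.1        # entropy parameter calibrated for the section (assumed)
    area = 6.4     # m^2, ice-adjusted cross-sectional area (assumed)
    Q = mean_velocity(u_max, M) * area
    print(f"discharge ~ {Q:.2f} m^3/s")
    ```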

  8. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The fundamentals of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  9. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex, uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, together with the techniques to process these queries. The paper also presents methods for computing the probability distributions, which enable pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate multidimensional data analysis (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
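
    As a concrete instance of pre-aggregation, a SUM over independent uncertain values amounts to convolving their distributions; a minimal, exact dict-based sketch (the paper's own methods are approximate).

    ```python
    # Minimal sketch: pre-aggregating two discrete probability distributions
    # under SUM by convolution, so the result can itself be aggregated further.
    from collections import defaultdict

    def convolve(p, q):
        out = defaultdict(float)
        for x, px in p.items():
            for y, qy in q.items():
                out[x + y] += px * qy
        return dict(out)

    cell_a = {10: 0.2, 20: 0.8}   # hypothetical uncertain measure values
    cell_b = {5: 0.5, 15: 0.5}
    print(convolve(cell_a, cell_b))  # {15: 0.1, 25: 0.5, 35: 0.4}
    ```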

  10. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  11. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
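
    A minimal sketch of the display this structure supports: an empirical complementary cumulative distribution function (CCDF) of sampled consequence values, giving P(consequence > c) for comparison against a regulatory limit; the release values below are synthetic.

    ```python
    # Minimal sketch: empirical CCDF over sampled consequence values.
    import numpy as np

    def ccdf(samples):
        x = np.sort(samples)
        # P(X > x_i) estimated by the fraction of samples strictly above x_i.
        p = 1.0 - np.arange(1, len(x) + 1) / len(x)
        return x, p

    releases = np.random.default_rng(3).lognormal(mean=-2.0, sigma=1.0, size=1000)
    x, p = ccdf(releases)  # plot x vs p on log-log axes against the 40 CFR 191 limit
    ```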

  12. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  13. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  14. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before purchase and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  15. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logical organization and clear presentation; a comprehensive choice of topics within the area of probability; and ample homework problems organized into chapter sections.

  16. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  17. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration to exceed regulatory thresholds. However, these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order relations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables and do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.
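
    A minimal sketch of the compositional step only (the kriging itself is omitted): the centred log-ratio (clr) transform maps a discrete probability density to unconstrained space and back, which is one standard way to honour the (0,1) and sum-to-one constraints; whether the authors use clr or another log-ratio transform is not stated in the abstract.

    ```python
    # Minimal sketch: clr transform of a discrete probability density, so the
    # coordinates can be interpolated in unconstrained space and
    # back-transformed to valid probabilities at target locations.
    import numpy as np

    def clr(p, eps=1e-9):
        p = np.clip(p, eps, None)
        g = np.exp(np.mean(np.log(p)))  # geometric mean
        return np.log(p / g)

    def clr_inverse(z):
        e = np.exp(z)
        return e / e.sum()

    pdf = np.array([0.6, 0.3, 0.1])  # hypothetical class probabilities at a well
    z = clr(pdf)                      # cokrige these coordinates spatially ...
    print(clr_inverse(z))             # ... then back-transform at target locations
    ```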

  18. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  19. The Sticking Probability for Hydrogen on Ni, Pd, and Pt at a Hydrogen pressure of 1 bar

    DEFF Research Database (Denmark)

    Johansson, Martin; Lytken, Ole; Chorkendorff, Ib

    2007-01-01

    A technique for measurements of the sticking probability of hydrogen on metal surfaces at high (ambient) pressure is described. As an example, measurements for Ni, Pd and Pt at a hydrogen pressure of 1 bar and temperatures between 40 and 200 degrees C are presented. The sticking probabilities are ... Pt. The transition between beta- and alpha-phase in the H-Pd system has a significant effect on the activity for Pd.

  20. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  1. Lower bound on inconclusive probability of unambiguous discrimination

    International Nuclear Information System (INIS)

    Feng Yuan; Zhang Shengyu; Duan Runyao; Ying Mingsheng

    2002-01-01

    We derive a lower bound on the inconclusive probability of unambiguous discrimination among n linearly independent quantum states by using the constraint of no signaling. It improves the bound presented in the paper of Zhang, Feng, Sun, and Ying [Phys. Rev. A 64, 062103 (2001)], and when the optimal discrimination can be reached, these two bounds coincide with each other. An alternative method of constructing an appropriate measurement to prove the lower bound is also presented
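
    For orientation, the two-state, equal-prior special case of such bounds is the well-known IDP limit; a hedged LaTeX statement follows (this is the special case only, not the paper's general bound for n linearly independent states).

    ```latex
    % Two pure states with equal priors: any unambiguous discrimination
    % obeys the IDP bound on the inconclusive probability
    \[
      P_{\mathrm{inc}} \;\geq\; \lvert \langle \psi_1 \vert \psi_2 \rangle \rvert ,
    \]
    % with equality attained by the optimal measurement.
    ```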

  2. Gamma-ray emission probabilities of ¹⁹³Os

    CERN Document Server

    Marnada, N; Ueda, N; Ikeda, K; Hayashi, N

    2002-01-01

    Precise measurements of disintegration rates using a 4π β-γ coincidence apparatus have reduced the uncertainties of the principal γ-ray emission probabilities of ¹⁹³Os. Most of the uncertainties are less than 1%, whereas the uncertainties of the emission probabilities evaluated in the Nuclear Data Sheets (83 (1998) 921) are more than 6%. The precision is also improved for the β-ray branching ratio for the direct transition to the ground state, and the value is larger than the evaluated value by about 6%.

  3. Reaction probability of molecular deuterium with a disordered InSb (110) surface

    International Nuclear Information System (INIS)

    Wolf, B.; Zehe, A.

    1987-01-01

    A detailed experimental analysis of the interaction of molecular deuterium with sputter-damaged InSb surfaces with the aid of SIMS is given. The sticking probability of D₂ and its transformation to a chemisorbed state, resulting in InD⁺ signals in SIMS measurements, can be determined by adsorption experiments both with and without a hot tungsten filament. The calculated sticking probability of D₂ = 2 × 10⁻⁴ is at least three orders of magnitude higher than the known value for a cleavage plane of InSb.

  4. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including…

  5. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  6. Combined Cytolytic Effects of a Vaccinia Virus Encoding a Single Chain Trimer of MHC-I with a Tax-Epitope and Tax-Specific CTLs on HTLV-I-Infected Cells in a Rat Model

    Directory of Open Access Journals (Sweden)

    Takashi Ohashi

    2014-01-01

    Full Text Available Adult T cell leukemia (ATL) is a malignant lymphoproliferative disease caused by human T cell leukemia virus type I (HTLV-I). To develop an effective therapy against the disease, we have examined the oncolytic ability of an attenuated vaccinia virus (VV), LC16m8Δ (m8Δ), and an HTLV-I Tax-specific cytotoxic T lymphocyte (CTL) line, 4O1/C8, against an HTLV-I-infected rat T cell line, FPM1. Our results demonstrated that m8Δ was able to replicate in and lyse tumorigenic FPM1 cells but was incompetent to injure 4O1/C8 cells, suggesting preferential cytolytic activity toward tumor cells. To further enhance the cytolysis of HTLV-I-infected cells, we modified m8Δ and obtained m8Δ/RT1AlSCTax180L, which can express a single chain trimer (SCT) of rat major histocompatibility complex class I with a Tax epitope. Combined treatment with m8Δ/RT1AlSCTax180L and 4O1/C8 increased the cytolysis of FPM1V.EFGFP/8R cells, a CTL-resistant subclone of FPM1, compared with that using 4O1/C8 and m8Δ presenting an unrelated peptide, suggesting that the activation of 4O1/C8 by m8Δ/RT1AlSCTax180L further enhanced the killing of the tumorigenic HTLV-I-infected cells. Our results indicate that combined therapy of oncolytic VVs with SCTs and HTLV-I-specific CTLs may be effective for eradication of HTLV-I-infected cells, which evade CTL lysis and potentially develop into ATL.

  7. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM
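
    A minimal sketch of the Williamson and Downs style CDF bounds mentioned above (the p-box), not the proposed IRM: lower and upper CDFs for a normal distribution whose mean is only known to lie in an interval.

    ```python
    # Minimal sketch: p-box for a normal distribution with interval-valued mean.
    import numpy as np
    from scipy.stats import norm

    def p_box(x, mu_lo, mu_hi, sigma):
        # For fixed x, the normal CDF decreases as mu increases, so:
        lower_cdf = norm.cdf(x, loc=mu_hi, scale=sigma)
        upper_cdf = norm.cdf(x, loc=mu_lo, scale=sigma)
        return lower_cdf, upper_cdf

    xs = np.linspace(-4, 6, 5)
    lo, hi = p_box(xs, mu_lo=0.0, mu_hi=2.0, sigma=1.0)
    # [lo, hi] brackets the CDF pointwise; the belief/plausibility of an
    # event (-inf, x] is bracketed the same way.
    ```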

  8. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability used to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients; this model may, therefore, not be valid for assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3% among those rated as having a low clinical probability.
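
    A minimal sketch of score-based stratification using the thresholds quoted above; the items and weights are hypothetical placeholders, not the published rule.

    ```python
    # Minimal sketch: stratify suspected-PE patients into low / intermediate /
    # high clinical probability from a simple additive score.
    def stratify(points):
        if points <= 4:
            return "low"           # ~10% prevalence of PE in the cited study
        if points <= 8:
            return "intermediate"  # ~38%
        return "high"              # ~81%

    patient = {"tachycardia": 2, "recent surgery": 3, "haemoptysis": 0}  # assumed items
    print(stratify(sum(patient.values())))
    ```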

  9. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  10. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure-theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  11. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
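
    In the classical case the triangle-fill-in condition amounts to ordinary conditioning of a joint distribution in the Kleisli category of the distribution monad. A minimal sketch of that classical computation only (the categorical machinery is not reproduced):

```python
# Joint distribution over pairs (x, y), represented as a dict of probabilities.
joint = {("a", 0): 0.1, ("a", 1): 0.3, ("b", 0): 0.4, ("b", 1): 0.2}

def condition(joint, predicate):
    """Renormalize the joint distribution on the event defined by the predicate."""
    mass = sum(p for xy, p in joint.items() if predicate(xy))
    return {xy: p / mass for xy, p in joint.items() if predicate(xy)}

# P(x | y = 1): condition on the event y == 1, then marginalize out y.
post = condition(joint, lambda xy: xy[1] == 1)
marginal_x = {}
for (x, y), p in post.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p
print(marginal_x)  # {'a': 0.6, 'b': 0.4}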

  12. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, 𝒮_st, p_st) for stochastic uncertainty, a probability space (S_su, 𝒮_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, 𝒮_st, p_st) and (S_su, 𝒮_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
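
    The complementary cumulative distribution functions (CCDFs) used to summarize such performance assessments are, in practice, estimated from sampled consequence values. A minimal sketch, assuming equally weighted Monte Carlo samples:

```python
import numpy as np

def ccdf(samples):
    """Empirical complementary CDF: estimated prob{consequence > c} at each sample."""
    x = np.sort(np.asarray(samples))
    # Fraction of samples strictly greater than each sorted value.
    exceed = 1.0 - np.arange(1, x.size + 1) / x.size
    return x, exceed

vals = np.random.default_rng(0).lognormal(mean=0.0, sigma=1.0, size=1000)
x, p = ccdf(vals)
# p[i] is the estimated probability that the consequence exceeds x[i].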

  13. Construction of unitary matrices from observable transition probabilities

    International Nuclear Information System (INIS)

    Peres, A.

    1989-01-01

    An ideal measuring apparatus defines an orthonormal basis |u_m⟩ in Hilbert space. Another apparatus defines another basis |υ_μ⟩. Both apparatuses together allow one to measure the transition probabilities P_mμ = |⟨u_m|υ_μ⟩|². The problem is: given all the elements of a doubly stochastic matrix P_mμ, find a unitary matrix U_mμ such that P_mμ = |U_mμ|². The number of unknown nontrivial phases is equal to the number of independent equations to satisfy. The problem can therefore be solved provided that the values of the P_mμ satisfy some inequalities. (orig.)
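
    The forward direction of the statement is easy to verify numerically: the squared moduli of any unitary matrix form a doubly stochastic matrix. A small sketch (the inverse problem of recovering the phases, which the paper addresses, is the hard part):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random unitary U via QR decomposition of a complex Gaussian matrix.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

P = np.abs(U) ** 2  # transition probabilities P_mu = |U_mu|^2
# Rows and columns of P each sum to 1, i.e., P is doubly stochastic.
print(np.allclose(P.sum(axis=0), 1.0), np.allclose(P.sum(axis=1), 1.0))  # True True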

  14. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
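
    The sample-reuse idea can be illustrated on a toy fracture problem. A hedged sketch, not the authors' formulation: crack sizes are random, a part fails if its crack exceeds a critical size and was missed at inspection, the POD curve is lognormal with location parameter mu, and the derivative of the POF with respect to mu is obtained from the same samples by differentiating the POD inside the expectation:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 200_000
a = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # hypothetical crack sizes
a_crit = 2.0                                     # hypothetical critical crack size
mu, sigma = 0.3, 0.4                             # assumed POD curve parameters

z = (np.log(a) - mu) / sigma
pod = norm.cdf(z)                 # probability a crack of size a is detected
fail = a > a_crit                 # failure indicator without inspection

# POF: a part fails only if its critical crack was missed at inspection.
pof = np.mean(fail * (1.0 - pod))

# Sensitivity d(POF)/d(mu), reusing the SAME samples:
# d/dmu [1 - POD(a; mu, sigma)] = norm.pdf(z)/sigma, moved inside the mean.
dpof_dmu = np.mean(fail * norm.pdf(z) / sigma)

print(pof, dpof_dmu)  # compare dpof_dmu against a finite-difference estimate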

  15. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  16. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  17. Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation

    Directory of Open Access Journals (Sweden)

    Michal Halas

    2012-01-01

    Full Text Available This article deals with modelling the probability density function of IPTV traffic packet delay variation. Such a model is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of the network nodes. To separate these at least three types of delay variation, we need a way to measure each type separately. This work focuses on the delay variation caused by queueing systems, which has the main influence on the form of the probability density function.
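
    As a generic illustration of how queueing produces delay variation (an M/M/1 toy model, not the authors' IPTV measurements), the queueing-delay distribution can be simulated with Lindley's recursion and histogrammed to approximate its probability density:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu = 0.8, 1.0                    # arrival and service rates (assumed), utilization 0.8
n = 100_000
inter = rng.exponential(1 / lam, n)   # inter-arrival times
serv = rng.exponential(1 / mu, n)     # service times

# Lindley's recursion for the waiting time of successive packets in an M/M/1 queue.
w = np.zeros(n)
for i in range(1, n):
    w[i] = max(0.0, w[i - 1] + serv[i - 1] - inter[i])

# Empirical density of the queueing delay (the driver of de-jitter buffer sizing).
hist, edges = np.histogram(w, bins=100, density=True)
print(w.mean())  # M/M/1 theory: mean wait = rho/(mu - lam) = 0.8/0.2 = 4.0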

  18. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
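
    For reference, the fewest-switches (FS) prescription mentioned above computes, per trajectory and time step, a hopping probability from the electronic amplitudes. A hedged sketch of that standard formula, assuming an adiabatic basis with real nonadiabatic couplings (this is Tully's FS rule, not the CP or IP algorithms, which pool information across trajectories):

```python
import numpy as np

def fs_hop_probability(c, k, v_dot_d, dt):
    """Tully fewest-switches probabilities of hopping out of state k.

    c       : complex electronic amplitudes, shape (nstates,)
    k       : index of the currently occupied state
    v_dot_d : matrix of velocity-dotted nonadiabatic couplings v . d[j, k]
              (assumed real), shape (nstates, nstates)
    dt      : time step
    """
    a_kk = np.abs(c[k]) ** 2
    # Population flow k -> j: b[j] = -2 Re(c_j c_k^* (v . d[j, k])).
    b = -2.0 * np.real(c * np.conj(c[k]) * v_dot_d[:, k])
    g = dt * b / a_kk
    g[k] = 0.0
    return np.clip(g, 0.0, 1.0)  # negative values are set to zero, as in FS

# Example: two states, mostly in state 0, weak antisymmetric coupling.
c = np.array([0.9 + 0.1j, 0.4 - 0.1j])
d = np.array([[0.0, 0.05], [-0.05, 0.0]])
print(fs_hop_probability(c, 0, d, dt=0.1))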

  19. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  20. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and can thus be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted, which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions with a fuel balance and a lambda sensor, and differences below 1% were found.
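
    The probability-propagation idea can be sketched with a Livengood-Wu-style knock integral and an exogenous Gaussian disturbance on the temperature trace. All constants below are illustrative placeholders, not the calibrated values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 0.004, 400)        # time axis [s], illustrative
T_nom = 600.0 + 2.0e5 * t               # nominal end-gas temperature ramp [K]
A, Ea_R = 7.0e7, 15000.0                # assumed Arrhenius constants, Ea_R in [K]

def knock_integral(T):
    """Livengood-Wu integral of 1/tau with autoignition delay tau(T) = exp(Ea_R/T)/A."""
    return np.trapz(A * np.exp(-Ea_R / T), t)

# Propagate an exogenous temperature error: knock occurs if the integral reaches 1.
n_mc, sigma_T = 5000, 15.0              # assumed 1-sigma temperature error [K]
hits = sum(knock_integral(T_nom + rng.normal(0.0, sigma_T)) >= 1.0
           for _ in range(n_mc))
print("knock probability ~", hits / n_mc)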

  1. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  2. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  3. Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Misiūnė, Ieva

    2015-04-01

    Water repellency is a natural soil property with implications for infiltration, erosion and plant growth. It depends on soil texture, type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities such as agriculture can influence soil water repellency (SWR) due to tillage and the addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has a high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field by testing several geostatistical methods: Ordinary Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located near the Vilnius urban area (54°49' N, 25°22' E, 104 m a.s.l.) in Lithuania (Pereira and Oliva, 2013). An experimental plot of 21 m² (7 × 3 m) was designed. Inside this area SWR was measured every 50 cm using the water drop penetration time (WDPT) test (Wessel, 1998). A total of 105 points were measured. The probability of SWR was classified from 0 (no probability) to 1 (high probability). The accuracy of the methods was assessed with the cross-validation method; the best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE=0.436), followed by DK (RMSE=0.437), IK (RMSE=0.448), PK (RMSE=0.452) and OK (RMSE=0.537). Significant differences were identified among the probability tests (Kruskal-Wallis test = 199.7597, p < 0.05) depending on the tested technique. Simple Kriging, DK, IK and PK identified the high SWR probabilities in the northeastern and central parts of the plot, while OK placed them mainly in the south-western part of the plot. In conclusion, before predicting the spatial probability of SWR it is important to test several methods in order to identify the most accurate one. Acknowledgments COST action ES
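
    The leave-one-out cross-validation used to rank the methods can be sketched generically. A minimal RMSE comparison using inverse-distance weighting as a stand-in interpolator (the kriging variants themselves would come from a geostatistics package):

```python
import numpy as np

def idw(xy_train, z_train, xy_query, power=2.0):
    """Inverse-distance-weighted prediction at query points (stand-in interpolator)."""
    d = np.linalg.norm(xy_train[None, :, :] - xy_query[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return (w * z_train).sum(axis=1) / w.sum(axis=1)

def loo_rmse(xy, z, predictor):
    """Leave-one-out RMSE: predict each point from all the others."""
    errs = []
    for i in range(len(z)):
        mask = np.arange(len(z)) != i
        errs.append(predictor(xy[mask], z[mask], xy[i:i + 1])[0] - z[i])
    return float(np.sqrt(np.mean(np.square(errs))))

# 105 points on a 0.5 m grid of a 7 m x 3 m plot, matching the study design.
gx, gy = np.meshgrid(np.arange(0, 7.5, 0.5), np.arange(0, 3.5, 0.5))
xy = np.column_stack([gx.ravel(), gy.ravel()])
z = np.random.default_rng(4).uniform(0, 1, len(xy))  # placeholder SWR probabilities
print(loo_rmse(xy, z, idw))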

  4. Interpretations of Probability in Quantum Mechanics: A Case of "Experimental Metaphysics"

    Science.gov (United States)

    Hellman, Geoffrey

    After reviewing paradigmatic cases of "experimental metaphysics" basing inferences against local realism and determinism on experimental tests of Bell's theorem (and successors), we concentrate on clarifying the meaning and status of "objective probability" in quantum mechanics. The terms "objective" and "subjective" are found ambiguous and inadequate, masking crucial differences turning on the question of what the numerical values of probability functions measure vs. the question of the nature of the "events" on which such functions are defined. This leads naturally to a 2×2 matrix of types of interpretations, which are then illustrated with salient examples. (Of independent interest are the splitting of "Copenhagen interpretation" into "objective" and "subjective" varieties in one of the dimensions and the splitting of Bohmian hidden variables from (other) modal interpretations along that same dimension.) It is then explained why Everett interpretations are difficult to categorize in these terms. Finally, we argue that Bohmian mechanics does not seriously threaten the experimental-metaphysical case for ultimate randomness and purely physical probabilities.

  5. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  6. K-capture probabilities in the decay of ¹³³Ba

    Energy Technology Data Exchange (ETDEWEB)

    Singh, K; Sahota, H S

    1983-07-01

    The K-capture probabilities in the decay of ¹³³Ba to the 437, 383, 161 and 81 keV levels have been determined from the analysis of the K X-ray-gamma-ray sum peaks observed with an intrinsic Ge detector. The measurements on the 161 and 81 keV levels are reported for the first time in the literature.

  7. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  8. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
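
    A hedged sketch of the two quoted ingredients: the encounter probability of a T-year event within an L-year lifetime, and the expected maximum of N Rayleigh-distributed individual wave heights in a sea state of significant height Hs (a common first-order approximation, not the paper's full long-term analysis):

```python
import numpy as np

def encounter_probability(return_period, lifetime):
    """Probability that the T-year event is exceeded at least once in L years."""
    return 1.0 - (1.0 - 1.0 / return_period) ** lifetime

def expected_max_wave(hs, n_waves):
    """Expected maximum of n Rayleigh wave heights with P(H > h) = exp(-2 (h/Hs)^2).

    First-order Gumbel approximation: characteristic largest value plus
    an Euler-gamma correction term.
    """
    u = hs * np.sqrt(np.log(n_waves) / 2.0)
    return u + 0.5772 * hs ** 2 / (4.0 * u)

print(encounter_probability(100, 50))   # ~0.39 for a 100-year event, 50-year life
print(expected_max_wave(5.0, 1000))     # ~1.9 * Hs for about 1000 waves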

  9. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  10. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  11. Fat-plug myringoplasty of ear lobule vs abdominal donor sites.

    Science.gov (United States)

    Acar, Mustafa; Yazıcı, Demet; San, Turhan; Muluk, Nuray Bayar; Cingi, Cemal

    2015-04-01

    The purpose of this study is to compare the success rates of fat-graft myringoplasties harvesting adipose grafts from different donor sites (ear lobule vs abdomen). The clinical records of 61 patients (24 males and 37 females) who underwent fat-plug myringoplasty (FPM) were reviewed retrospectively. Fat from ear lobule (FEL) and abdominal fat were used as graft materials. The impact of age, gender, systemic diseases, topography of the perforation, utilization of fat graft materials of different origin on the tympanic membrane closure rate and the effect of FPM on hearing gain was analyzed. Our tympanic membrane (TM) closure rate was 82 %. No statistical significant difference was observed regarding age, gender, comorbidities (septal deviation, hypertension and diabetes mellitus) or habits (smoking). Posterior TM perforations had significantly lower healing rate. The change in TM closure rate considering different adipose tissue donor sites was not statistically significant. The hearing gain of the patients was mostly below 20 dB. Fat-plug myringoplasty (FPM) is a safe, cost-effective and easy operation for selected patients. Abdominal fat graft is as effective as ear lobe fat graft on tympanic membrane healing, has cosmetic advantages and should be taken into consideration when planning fat as the graft source.

  12. Molar Incisor Hypomineralization, Prevalence, and Etiology

    Directory of Open Access Journals (Sweden)

    Sulaiman Mohammed Allazzam

    2014-01-01

    Full Text Available Aim. To evaluate the prevalence and possible etiological factors associated with molar incisor hypomineralization (MIH) among a group of children in Jeddah, Saudi Arabia. Methods. A group of 8-12-year-old children were recruited (n=267) from the Pediatric Dental Clinics at the Faculty of Dentistry, King Abdulaziz University. Children had at least one first permanent molar (FPM), erupted or partially erupted. Demographic information, children's medical history, and pregnancy-related data were obtained. The crowns of the FPMs and permanent incisors were examined for demarcated opacities, posteruptive breakdown (PEB), atypical restorations, and extracted FPMs. Children were considered to have MIH if one or more FPMs with or without involvement of incisors met the diagnostic criteria. Results. MIH showed a prevalence of 8.6%. Demarcated opacities were the most common form. Maxillary central incisors were more affected than mandibular (P=0.01). The condition was more prevalent in children with a history of illnesses during the first four years of life, including tonsillitis (P=0.001), adenoiditis (P=0.001), asthma (P=0.001), fever (P=0.014), and antibiotics intake (P=0.001). Conclusions. The prevalence of MIH is significantly associated with childhood illnesses during the first four years of life including asthma, adenoid infections, tonsillitis, fever, and antibiotics intake.

  13. Comparison of bipolar vs. tripolar concentric ring electrode Laplacian estimates.

    Science.gov (United States)

    Besio, W; Aakula, R; Dai, W

    2004-01-01

    Potentials on the body surface generated by the heart are a function of both space and time. The 12-lead electrocardiogram (ECG) provides useful global temporal assessment, but it yields limited spatial information due to the smoothing effect caused by the volume conductor. The smoothing complicates identification of multiple simultaneous bioelectrical events. In an attempt to circumvent the smoothing problem, some researchers used a five-point method (FPM) to numerically estimate the analytical solution of the Laplacian with an array of monopolar electrodes. The FPM is generalized to develop a bipolar concentric ring electrode system. We have developed a new Laplacian ECG sensor, a tri-electrode sensor, based on a nine-point method (NPM) numerical approximation of the analytical Laplacian. For comparison, the NPM, FPM and compact NPM were calculated over a 400 x 400 mesh with 1/400 spacing. Tri- and bi-electrode sensors were also simulated and their Laplacian estimates were compared against the analytical Laplacian. We found that tri-electrode sensors have much-improved accuracy, with significantly smaller relative and maximum errors in estimating the Laplacian operator. Apart from the higher accuracy, our new electrode configuration will allow better localization of the electrical activity of the heart than bi-electrode configurations.
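
    The two stencils at issue are standard finite-difference approximations. A sketch of the five-point and nine-point Laplacian estimates on a uniform grid (generic numerical formulas; the concentric-ring electrode geometry itself is not modeled):

```python
import numpy as np

def laplacian_5pt(u, h):
    """Five-point stencil: (N + S + E + W - 4C) / h^2, on interior points."""
    return (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
            - 4.0 * u[1:-1, 1:-1]) / h ** 2

def laplacian_9pt(u, h):
    """Nine-point stencil: (4*(edge neighbors) + diagonals - 20C) / (6 h^2)."""
    edge = u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
    diag = u[:-2, :-2] + u[:-2, 2:] + u[2:, :-2] + u[2:, 2:]
    return (4.0 * edge + diag - 20.0 * u[1:-1, 1:-1]) / (6.0 * h ** 2)

# Test on u = x^2 + y^2, whose true Laplacian is 4 everywhere.
h = 1.0 / 400                      # matches the 400 x 400 mesh spacing above
x, y = np.meshgrid(np.arange(0, 1, h), np.arange(0, 1, h))
u = x ** 2 + y ** 2
print(laplacian_5pt(u, h)[0, 0], laplacian_9pt(u, h)[0, 0])  # both ~4.0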

  14. Dynamic Model and Vibration Characteristics of Planar 3-RRR Parallel Manipulator with Flexible Intermediate Links considering Exact Boundary Conditions

    Directory of Open Access Journals (Sweden)

    Lianchao Sheng

    2017-01-01

    Full Text Available Due to the complexity of the dynamic model of a planar 3-RRR flexible parallel manipulator (FPM), it is often difficult to implement an active vibration control algorithm based on the system dynamic model. To establish a simple and efficient dynamic model of the planar 3-RRR FPM, to study its dynamic characteristics, and to facilitate controller design, the modal function is first determined with the pinned-free boundary condition, considering the effect of rigid-flexible coupling and the moment of inertia at the end of the flexible intermediate link. Then, considering the main vibration modes of the system, a high-efficiency coupling dynamic model is established while guaranteeing model accuracy for control. According to the model, the modal characteristics of the flexible intermediate link are analyzed and compared with modal test results. The results show that the model effectively reflects the main vibration modes of the planar 3-RRR FPM; in addition, the model can be used to analyze the effects of inertial and coupling forces on the dynamics and on the drive torque of the motors. Because the model has fewer dynamic parameters, it is convenient for building a controller.
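
    For the pinned-free boundary condition mentioned above, the bending-frequency equation of a uniform Euler-Bernoulli beam is tan(beta*L) = tanh(beta*L), alongside a rigid-body rotation mode; the tip inertia considered in the paper modifies this equation. A minimal sketch of solving the unmodified equation numerically:

```python
import numpy as np
from scipy.optimize import brentq

def pinned_free_roots(n_modes):
    """Roots of tan(x) = tanh(x), x = beta*L, for a pinned-free uniform beam."""
    roots = []
    for n in range(1, n_modes + 1):
        # One root lies in each interval (n*pi, n*pi + pi/2).
        lo, hi = n * np.pi + 1e-6, n * np.pi + np.pi / 2 - 1e-6
        roots.append(brentq(lambda x: np.tan(x) - np.tanh(x), lo, hi))
    return np.array(roots)

bL = pinned_free_roots(3)
print(bL)  # ~ [3.9266, 7.0686, 10.2102]
# Natural frequencies follow as omega_n = (bL/L)^2 * sqrt(E*I/(rho*A_cross)).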

  15. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
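
    The violation of the law of total probability referred to above is easy to exhibit numerically: the two-slit density carries an interference term absent from the classical 50/50 mixture of single-slit densities. A minimal sketch with illustrative Gaussian slit amplitudes:

```python
import numpy as np

x = np.linspace(-10, 10, 2001)
# Screen amplitudes from each slit: Gaussians with opposite phase ramps (toy model).
psi1 = np.exp(-((x - 1.5) ** 2) / 4.0) * np.exp(1j * 2.0 * x)
psi2 = np.exp(-((x + 1.5) ** 2) / 4.0) * np.exp(-1j * 2.0 * x)

def normalize(p):
    return p / np.trapz(p, x)

p1 = normalize(np.abs(psi1) ** 2)            # slit 1 open alone
p2 = normalize(np.abs(psi2) ** 2)            # slit 2 open alone
p12 = normalize(np.abs(psi1 + psi2) ** 2)    # both slits open

# The law of total probability would give the 50/50 mixture; interference breaks it.
mixture = 0.5 * (p1 + p2)
print(np.max(np.abs(p12 - mixture)))  # noticeably nonzero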

  16. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  17. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
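
    The quadrature idea rests on the representation w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt for Im z > 0, which Gauss-Hermite nodes and weights approximate directly. A short sketch, checked against scipy's wofz:

```python
import numpy as np
from scipy.special import wofz

def faddeeva_gh(z, n=40):
    """Gauss-Hermite approximation of w(z) = (i/pi) int exp(-t^2)/(z-t) dt, Im z > 0."""
    t, w = np.polynomial.hermite.hermgauss(n)  # nodes and weights for exp(-t^2)
    return (1j / np.pi) * np.sum(w / (z - t))

z = 2.0 + 1.0j
print(faddeeva_gh(z), wofz(z))  # close for z not too near the real axis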

  18. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA; August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  19. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  20. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  1. Transition probabilities of Ce I obtained from Boltzmann analysis of visible and near-infrared emission spectra

    Science.gov (United States)

    Nitz, D. E.; Curry, J. J.; Buuck, M.; DeMann, A.; Mitchell, N.; Shull, W.

    2018-02-01

    We report radiative transition probabilities for 5029 emission lines of neutral cerium within the wavelength range 417-1110 nm. Transition probabilities for only 4% of these lines have been previously measured. These results are obtained from a Boltzmann analysis of two high resolution Fourier transform emission spectra used in previous studies of cerium, obtained from the digital archives of the National Solar Observatory at Kitt Peak. The set of transition probabilities used for the Boltzmann analysis are those published by Lawler et al (2010 J. Phys. B: At. Mol. Opt. Phys. 43 085701). Comparisons of branching ratios and transition probabilities for lines common to the two spectra provide important self-consistency checks and test for the presence of self-absorption effects. Estimated 1σ uncertainties for our transition probability results range from 10% to 18%.
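
    The underlying Boltzmann-plot relation is that, for optically thin emission in LTE, I*lambda is proportional to g*A*exp(-E_u/kT), so ln(I*lambda/(gA)) is linear in the upper-level energy. A hedged sketch with hypothetical line data: lines with known gA fix the temperature, and the fitted line then converts a measured intensity into a transition probability:

```python
import numpy as np

k_eV = 8.617e-5  # Boltzmann constant [eV/K]

# Hypothetical reference lines: measured intensity I, wavelength [nm],
# upper-level energy E_u [eV], and known gA products (illustrative values).
I_ref   = np.array([120.0, 80.0, 30.0, 9.0])
lam_ref = np.array([500.0, 520.0, 560.0, 600.0])
E_ref   = np.array([2.0, 2.3, 2.8, 3.4])
gA_ref  = np.array([3.0e7, 4.0e7, 5.0e7, 6.0e7])

# Boltzmann plot: ln(I*lam/(gA)) is linear in E_u with slope -1/(kT).
y = np.log(I_ref * lam_ref / gA_ref)
slope, intercept = np.polyfit(E_ref, y, 1)
T = -1.0 / (k_eV * slope)

# Transition probability of a new line from its measured intensity:
I_new, lam_new, E_new, g_new = 25.0, 540.0, 3.0, 5.0
gA_new = I_new * lam_new / np.exp(intercept + slope * E_new)
print(T, gA_new / g_new)  # plasma temperature [K] and the inferred A value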

  2. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10⁻⁷ accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory impact test conditions is approximately 10⁻⁹ per mile.

  3. 14 CFR 1214.504 - Screening requirements.

    Science.gov (United States)

    2010-01-01

    ... Critical Space System Personnel Reliability Program § 1214.504 Screening requirements. (a) Only those... using evaluation guidance and criteria contained in Federal Personnel Manual (FPM) chapter 731 and...

  4. Outage probability analysis of wireless sensor networks in the presence of channel fading and spatial correlation

    KAUST Repository

    Al-Murad, Tamim M.

    2011-07-01

    Evaluating the reliability of wireless sensor networks is becoming more important as these networks are being used in crucial applications. The outage probability, defined as the probability that the error in the system exceeds a maximum acceptable threshold, has recently been used as a measure of the reliability of such systems. In this work we find the outage probability of a wireless sensor network in different scenarios of distributed sensing where sensors' readings are affected by spatial correlation and in the presence of channel fading. © 2011 IEEE.

  5. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability assigned to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, one finds further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to the usual intuition. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
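
    The coin-tossing claim is the standard beta-binomial update: the predictive probability of heads is the prior mean of the "definitive number" (the long-run heads fraction), and it necessarily rises after a head is observed. A minimal sketch assuming a uniform Beta(1, 1) prior:

```python
from fractions import Fraction

# Beta(a, b) prior on the unknown long-run heads fraction.
a, b = Fraction(1), Fraction(1)            # uniform: not absolutely sure the coin is fair

prior_p_heads = a / (a + b)                # predictive P(heads) = prior mean = 1/2
a_post = a + 1                             # observe one head
posterior_p_heads = a_post / (a_post + b)  # = 2/3 > 1/2

print(prior_p_heads, posterior_p_heads)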

  6. Phonotactic probability of brand names: I'd buy that!

    Science.gov (United States)

    Vitevitch, Michael S; Donoso, Alexander J

    2012-11-01

    Psycholinguistic research shows that word-characteristics influence the speed and accuracy of various language-related processes. Analogous characteristics of brand names influence the retrieval of product information and the perception of risks associated with that product. In the present experiment we examined how phonotactic probability-the frequency with which phonological segments and sequences of segments appear in a word-might influence consumer behavior. Participants rated brand names that varied in phonotactic probability on the likelihood that they would buy the product. Participants indicated that they were more likely to purchase a product if the brand name was comprised of common segments and sequences of segments rather than less common segments and sequences of segments. This result suggests that word-characteristics may influence higher-level cognitive processes, in addition to language-related processes. Furthermore, the benefits of using objective measures of word characteristics in the design of brand names are discussed.

  7. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  8. Most probable mixing state of aerosols in Delhi NCR, northern India

    Science.gov (United States)

    Srivastava, Parul; Dey, Sagnik; Srivastava, Atul Kumar; Singh, Sachchidanand; Tiwari, Suresh

    2018-02-01

    Unknown mixing state is one of the major sources of uncertainty in estimating aerosol direct radiative forcing (DRF). Aerosol DRF in India is usually reported for external mixing, and any deviation from this would lead to high bias and error. Limited information on aerosol composition hinders resolving this issue in India. Here we use two years of aerosol chemical composition data measured at the megacity Delhi to examine the most probable aerosol mixing state by comparing the simulated clear-sky downward surface flux with the measured flux. We consider external, internal, and four combinations of core-shell (black carbon, BC over dust; water-soluble, WS over dust; WS over water-insoluble, WINS; and BC over WINS) mixing. Our analysis reveals that the choice of external mixing (usually considered in satellite retrievals and climate models) seems reasonable in Delhi only in the pre-monsoon (Mar-Jun) season. During the winter (Dec-Feb) and monsoon (Jul-Sep) seasons, 'WS coating over dust' externally mixed with BC and WINS appears to be the most probable mixing state; 'WS coating over WINS' externally mixed with BC and dust seems to be the most probable mixing state in the post-monsoon (Oct-Nov) season. Mean seasonal TOA (surface) aerosol DRF for the most probable mixing states are 4.4 ± 3.9 (-25.9 ± 3.9), -16.3 ± 5.7 (-42.4 ± 10.5), 13.6 ± 11.4 (-76.6 ± 16.6) and -5.4 ± 7.7 (-80.0 ± 7.2) W m⁻², respectively, in the pre-monsoon, monsoon, post-monsoon and winter seasons. Our results highlight the importance of realistic mixing state treatment in estimating aerosol DRF to aid policy making to combat climate change.

  9. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  10. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  11. Quantifying Detection Probabilities for Proliferation Activities in Undeclared Facilities

    International Nuclear Information System (INIS)

    Listner, C.; Canty, M.; Niemeyer, I.; Rezniczek, A.; Stein, G.

    2015-01-01

    International Safeguards is currently in an evolutionary process to increase the effectiveness and efficiency of the verification system. This is an obvious consequence of the inability to detect Iraq's clandestine nuclear weapons programme in the early 90s. Through the adoption of Programme 93+2, this has led to the development of Integrated Safeguards and the State-level concept. Moreover, the IAEA's focus was extended onto proliferation activities outside the State's declared facilities. The effectiveness of safeguards activities within declared facilities can be and has been quantified with respect to costs and detection probabilities. In contrast, when verifying the absence of undeclared facilities this quantification has been avoided in the past because it has been considered to be impossible. However, when balancing the allocation of budget between the declared and the undeclared field, explicit reasoning is needed as to why safeguards effort is distributed in a given way. Such reasoning can be given by a holistic, information and risk-driven approach to Acquisition Path Analysis comprising declared and undeclared facilities. Regarding the input, this approach relies on the quantification of several factors, i.e., costs, attractiveness values for specific proliferation activities, potential safeguards measures, and detection probabilities for these measures, also in the undeclared field. In order to overcome the lack of quantification for detection probabilities in undeclared facilities, the authors of this paper propose a general verification error model. Based on this model, four different approaches are explained and assessed with respect to their advantages and disadvantages: the analogy approach, the Bayes approach, the frequentist approach and the process approach. The paper concludes with a summary and an outlook on potential future research activities. (author)

  12. A technique to obtain a multiparameter radar rainfall algorithm using the probability matching procedure

    International Nuclear Information System (INIS)

    Gorgucci, E.; Scarchilli, G.

    1997-01-01

    The natural cumulative distributions of rainfall observed by a network of rain gauges and a multiparameter radar are matched to derive multiparameter radar algorithms for rainfall estimation. The use of multiparameter radar measurements in a statistical framework to estimate rainfall is presented in this paper. The techniques developed in this paper are applied to the radar and rain gauge measurements of rainfall observed in central Florida and central Italy. Conventional pointwise estimates of rainfall are also compared. The probability matching procedure, when applied to the radar and surface measurements, shows that multiparameter radar algorithms can match the probability distribution function better than the reflectivity-based algorithms. It is also shown that the multiparameter radar algorithm derived by matching the cumulative distribution function of rainfall provides more accurate estimates of rainfall on the ground in comparison to any conventional reflectivity-based algorithm.
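
    The probability matching step itself is simple to sketch: pair equal quantiles of the radar-observable distribution with the gauge rainfall distribution, R(z) = F_R^{-1}(F_Z(z)). A single-variable illustration with synthetic samples (the paper's multiparameter version matches distributions built from several radar observables):

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.gamma(2.0, 10.0, 5000)      # radar reflectivity samples (illustrative)
r = rng.lognormal(1.0, 0.8, 5000)   # gauge rain-rate samples (illustrative)

# Probability matching: map each reflectivity to the rain rate at the same
# cumulative probability, R(z) = F_R^{-1}(F_Z(z)), via paired empirical quantiles.
q = np.linspace(0.0, 1.0, 101)
z_q, r_q = np.quantile(z, q), np.quantile(r, q)

def rain_from_reflectivity(z_obs):
    return np.interp(z_obs, z_q, r_q)   # piecewise-linear matched transform

print(rain_from_reflectivity(np.median(z)), np.median(r))  # medians map to medians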

  13. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods for evaluating accident probability at nuclear power plants (NPPs) are proposed, because statistical evaluation of NPP safety is unreliable. The concept of subjective probability for quantitative analysis of safety and hazard is described. The interpretation of probability as the actual degree of belief of an expert is assumed as the basis of this concept. It is suggested that event uncertainty be studied in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating probability. These subjective expert evaluations affect to a certain extent the calculation of the usual mathematical event probability. This technique is advantageous for the consideration of a single experiment or random event.

  14. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  15. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
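
    The decay of the forecast with elapsed time is a Bayesian survival update: with initial event probability P0 and delay-time CDF F(t), the probability after t hours with no onset is P0*(1-F(t))/(1-P0*F(t)). A hedged sketch in which an exponential delay distribution stands in for the NOAA-derived one:

```python
import numpy as np

def dynamic_sep_probability(p0, t_hours, mean_delay=12.0):
    """Update the SEP event probability after t hours with no onset observed.

    Bayes: P(event | no onset by t) = p0*S(t) / (p0*S(t) + (1 - p0)),
    with survival S(t) = 1 - F(t); an exponential delay CDF is assumed here.
    """
    s = np.exp(-t_hours / mean_delay)
    return p0 * s / (p0 * s + (1.0 - p0))

for t in (0, 6, 12, 24, 48):
    print(t, round(dynamic_sep_probability(0.5, t), 3))
# The forecast decays toward 0 as the expected onset window passes without an event.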

  16. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the strongest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as of interest. This probability gives ground accelerations in the range of 5-20% g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  17. Probability of assertive behaviour, interpersonal anxiety and self-efficacy of South African registered dietitians.

    Science.gov (United States)

    Paterson, Marie; Green, J M; Basson, C J; Ross, F

    2002-02-01

    There is little information on the probability of assertive behaviour, interpersonal anxiety and self-efficacy in the literature regarding dietitians. The objective of this study was to establish baseline information of these attributes and the factors affecting them. Questionnaires collecting biographical information and self-assessment psychometric scales measuring levels of probability of assertiveness, interpersonal anxiety and self-efficacy were mailed to 350 subjects, who comprised a random sample of dietitians registered with the Health Professions Council of South Africa. Forty-one per cent (n=145) of the sample responded. Self-assessment inventory results were compared to test levels of probability of assertive behaviour, interpersonal anxiety and self-efficacy. The inventory results were compared with the biographical findings to establish statistical relationships between the variables. The hypotheses were formulated before data collection. Dietitians had acceptable levels of probability of assertive behaviour and interpersonal anxiety. The probability of assertive behaviour was significantly lower than the level noted in the literature and was negatively related to interpersonal anxiety and positively related to self-efficacy.

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probabilities of fire calculated by means of the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
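
    A minimal sketch of such a logistic-regression fire-probability model is given below; the building features, coefficients and data are invented placeholders, not the study's fitted model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000
        # Hypothetical predictors: floor area (m2), age (years), residential flag
        X = np.column_stack([rng.uniform(50, 5000, n),
                             rng.uniform(0, 120, n),
                             rng.integers(0, 2, n)])
        # Synthetic incident labels: 1 = a fire was recorded for the building
        logit = -6.0 + 0.0006 * X[:, 0] + 0.01 * X[:, 1] + 0.5 * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        model = LogisticRegression().fit(X, y)
        # Predicted probabilities for specific buildings feed the probability map
        print(model.predict_proba([[1200.0, 35.0, 1]])[:, 1])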

  19. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  20. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  1. Probability of islanding in utility networks due to grid connected photovoltaic power systems

    Energy Technology Data Exchange (ETDEWEB)

    Verhoeven, B.

    2002-09-15

    This report for the International Energy Agency (IEA) made by Task 5 of the Photovoltaic Power Systems (PVPS) programme takes a look at the probability of islanding in utility networks due to grid-connected photovoltaic power systems. The mission of the Photovoltaic Power Systems Programme is to enhance the international collaboration efforts which accelerate the development and deployment of photovoltaic solar energy. Task 5 deals with issues concerning grid-interconnection and distributed PV power systems. This report summarises the results on a study on the probability of islanding in power networks with a high penetration level of grid connected PV-systems. The results are based on measurements performed during one year in a Dutch utility network. The measurements of active and reactive power were taken every second for two years and stored in a computer for off-line analysis. The area examined and its characteristics are described, as are the test set-up and the equipment used. The ratios between load and PV-power are discussed. The general conclusion is that the probability of islanding is virtually zero for low, medium and high penetration levels of PV-systems.

  2. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  5. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  6. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  7. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  8. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  9. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  10. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first; these are useful for calculating the logic probability of a circuit with random independent inputs. The orthogonal algorithm is then described for computing the logic probability of a Boolean function realized by a combinational circuit. The algorithm makes the Boolean function "orthogonal", so that its logic probability can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
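
    To make the summation step concrete, the sketch below computes the signal probability of a Boolean function that has already been orthogonalized into mutually exclusive product terms; the term representation is our own simplification for illustration.

        def term_probability(term, p):
            """term: {input_name: required value 0 or 1}; p: {input_name: P(input = 1)}."""
            prob = 1.0
            for name, value in term.items():
                prob *= p[name] if value == 1 else (1.0 - p[name])
            return prob

        def signal_probability(orthogonal_terms, p):
            # Terms are mutually exclusive, so their probabilities simply add
            return sum(term_probability(t, p) for t in orthogonal_terms)

        # Example: f = a OR b, orthogonalized as f = a + a'b (disjoint terms)
        p = {"a": 0.5, "b": 0.5}
        print(signal_probability([{"a": 1}, {"a": 0, "b": 1}], p))  # 0.75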

  11. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  12. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made

  13. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices, by allowing such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors, but they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes; a detailed study of this parameter allows the behaviour of the code to be analysed when errors are injected into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes: encoding functions with low computational complexity and a low masking probability are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability, and that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions of greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the code by an attacker. As a result, with a complex encoding function the probability of successful algebraic manipulation is reduced. The paper also discusses an approach to measuring the error masking probability.
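
    As a toy illustration of the central quantity, the sketch below brute-forces the masking probability of each additive error for a small code, taking Q(e) to be the fraction of codewords x for which x XOR e is again a codeword; this definition and the example code are our assumptions for illustration.

        def masking_probability(code, e):
            code_set = set(code)
            return sum((x ^ e) in code_set for x in code) / len(code)

        # Toy example: 3-bit even-parity code (linear), errors e = 1..7
        code = [w for w in range(8) if bin(w).count("1") % 2 == 0]
        print({e: masking_probability(code, e) for e in range(1, 8)})
        # A linear code masks an error either always (Q=1) or never (Q=0);
        # robust nonlinear codes are designed to keep max Q(e) small instead.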

  14. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  15. Convergence of Transition Probability Matrix in CLV Markov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its behaviour far into the future, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n goes to infinity. Mathematically, finding this convergence means finding the limit of the transition matrix raised to the power n as n goes to infinity. The convergence form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra, namely diagonalization of the matrix. This method has a higher level of complexity, because the matrix has to be diagonalized, but it has the advantage of yielding a general form for the nth power of the transition probability matrix, which is useful for examining the transition matrix before stationarity. Example cases are taken from a CLV model using an MCM, called the CLV-Markov model, and several of its transition probability matrices are examined for their convergence forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
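
    A compact numerical sketch of the diagonalization route is shown below: writing P = V D V^-1 gives P^n = V D^n V^-1, so powering P only powers the eigenvalues, and the eigenvalues inside the unit circle vanish as n grows, leaving the stationary form. The matrix is a toy example, not one of the paper's CLV cases.

        import numpy as np

        P = np.array([[0.9, 0.1],
                      [0.4, 0.6]])  # toy transition matrix, rows sum to 1

        eigvals, V = np.linalg.eig(P)
        V_inv = np.linalg.inv(V)

        def P_power(n):
            """General form of P^n obtained from the diagonalization."""
            return (V @ np.diag(eigvals**n) @ V_inv).real

        print(P_power(5))     # closed form for P^n before stationarity
        print(P_power(1000))  # rows approach the stationary distribution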

  16. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but that it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.

  17. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters (damping and natural frequency) are derived such that the possibility of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
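
    The core probability computation can be sketched in a few lines: for a zero-mean Gaussian relative displacement with RMS value sigma, the chance of exceeding a criterion amplitude follows from the normal survival function. The numbers are illustrative placeholders, not the article's values (and actual VC criteria are specified in velocity terms).

        from scipy.stats import norm

        def prob_exceeding(sigma, criterion):
            """P(|displacement| > criterion) for a zero-mean Gaussian response."""
            return 2.0 * norm.sf(criterion / sigma)

        sigma = 0.05                    # assumed RMS response, arbitrary units
        for criterion in (0.10, 0.15):  # placeholder criterion amplitudes
            print(criterion, prob_exceeding(sigma, criterion))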

  18. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  19. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index that depends neither on the material of which the structure is constructed nor on the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should not depend on the type of variable action. A probability based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted ... the Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results. The increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger ... The characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. The importance of hidden safeties in judging the reliability is discussed for wind actions on low-rise structures.

  20. Estimating success probability of a rugby goal kick and developing a ...

    African Journals Online (AJOL)

    The objective of this study was firstly to derive a formula to estimate the success probability of a particular rugby goal kick and, secondly to derive a goal kicker rating measure that could be used to rank rugby union goal kickers. Various factors that could influence the success of a particular goal kick were considered.

  1. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    Science.gov (United States)

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  2. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws must be detected reliably using these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses the optimization of POD demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
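
    The binomial arithmetic behind the point estimate demonstration is easy to reproduce: under independent trials, the probability of passing a zero-miss, 29-flaw demonstration is POD^29, so a system whose true POD is 0.90 passes with probability of only about 0.047. The sketch below generalizes this to an allowed number of misses; it is our illustration of the principle, not NASA's procedure.

        from math import comb

        def probability_of_passing(pod, n_flaws=29, max_misses=0):
            """P(at most max_misses misses in n_flaws independent trials)."""
            q = 1.0 - pod
            return sum(comb(n_flaws, k) * q**k * pod**(n_flaws - k)
                       for k in range(max_misses + 1))

        for pod in (0.90, 0.95, 0.99):
            print(pod, probability_of_passing(pod))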

  3. Emission probability determination of {sup 133}Ba by the sum-peak method

    Energy Technology Data Exchange (ETDEWEB)

    Silva, R.L. da; Almeida, M.C.M. de; Delgado, J.U.; Poledna, R.; Araujo, M.T.F.; Trindade, O.L.; Veras, E.V. de; Santos, A.; Rangel, J.; Ferreira Filho, A.L., E-mail: ronaldo@ird.gov.br, E-mail: marcandida@yahoo.com.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2016-07-01

    The National Laboratory of Metrology Ionizing Radiation (LNMRI/IRD/CNEN) has several measurement methods in order to ensure low uncertainties about the results. Through gamma spectrometry analysis by sum-peak absolute method they were performed the standardization of {sup 133}Ba activity and your emission probability determination of different energies with reduced uncertainties. The advantages of radionuclides calibrations by absolute method are accuracy, low uncertainties and is not necessary the use of radionuclides reference standards. {sup 133}Ba is used in research laboratories on calibration detectors in different work areas. The uncertainties for the activity and for the emission probability results are lower than 1%. (author)

  4. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    Science.gov (United States)

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
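
    A minimal sketch of the partial estimator, assuming normal linear working models (the paper's semiparametric cumulative probability models are more general): compute the PSR r = 2*F_hat(obs) - 1 from fits of X on Z and of Y on Z, then correlate the two residual vectors.

        import numpy as np
        from scipy.stats import norm, pearsonr

        def psr_normal(obs, z):
            """Probability-scale residuals from an OLS fit of obs on z."""
            A = np.column_stack([np.ones(len(obs)), z])
            beta, *_ = np.linalg.lstsq(A, obs, rcond=None)
            resid = obs - A @ beta
            sigma = resid.std(ddof=A.shape[1])
            return 2.0 * norm.cdf(resid / sigma) - 1.0  # P(X* < x) - P(X* > x)

        rng = np.random.default_rng(1)
        z = rng.normal(size=500)
        x = z + rng.normal(size=500)
        y = z + rng.normal(size=500)  # X and Y related only through Z
        print(pearsonr(psr_normal(x, z), psr_normal(y, z))[0])  # near zero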

  5. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data, which is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
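
    The fit-and-rank workflow described above can be sketched with scipy: fit several candidate distributions to a sample and order them by the Kolmogorov-Smirnov distance (Anderson-Darling and chi-squared statistics can be computed analogously). The gamma sample is an invented stand-in for the article's data set.

        from scipy import stats

        data = stats.gamma(a=2.0, scale=3.0).rvs(size=1000, random_state=42)

        for name in ("gamma", "lognorm", "norm"):
            dist = getattr(stats, name)
            params = dist.fit(data)
            ks = stats.kstest(data, name, args=params).statistic
            print(f"{name:8s} KS distance = {ks:.4f}")  # smaller = better fit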

  6. From the attempt of certain classical reformulations of quantum mechanics to quasi-probability representations

    International Nuclear Information System (INIS)

    Stulpe, Werner

    2014-01-01

    The concept of an injective affine embedding of the quantum states into a set of classical states, i.e., into the set of the probability measures on some measurable space, as well as its relation to statistically complete observables is revisited, and its limitation in view of a classical reformulation of the statistical scheme of quantum mechanics is discussed. In particular, on the basis of a theorem concerning a non-denseness property of a set of coexistent effects, it is shown that an injective classical embedding of the quantum states cannot be supplemented by an at least approximate classical description of the quantum mechanical effects. As an alternative approach, the concept of quasi-probability representations of quantum mechanics is considered

  7. Effect of field-dependent mobility on the escape probability. I. Electrons photoinjected in neopentane

    International Nuclear Information System (INIS)

    Mozumder, A.; Carmichael, I.

    1978-01-01

    A general procedure is described for calculating the escape probability of an electron against neutralization in the presence of an external field, after it has been ejected into a dielectric liquid from a planar surface. The present paper utilizes the field-dependent electron mobility measurements in neopentane by Bakale and Schmidt. The calculated escape probability, upon averaging over the initial distribution, is compared with the current efficiency measurement of Holroyd et al. The median thermalization length, inferred from this comparison, depends in general upon the assumed form of the initial distribution. It is less than the value obtained when the field dependence of the mobility is ignored, but greater than that applicable to the high energy irradiation case. A plausible explanation is offered

  8. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  9. Absolute activity measurement and gamma-ray emission probability for decay of I-126; Medida absoluta da atividade e determinacao da taxa de emissao gama por decaimento do {sup 126} I

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Katia Aparecida

    1997-07-01

    The accurate knowledge of the gamma-ray emission probability per decay of radionuclides is important in several applications. In the case of {sup 126}I, its importance lies mainly in fast neutron dosimetry as well as in the production of {sup 125}I, where {sup 126}I appears as an impurity. In the present work the gamma-ray emission probabilities per decay for the 388 and 666-keV transitions of {sup 126}I have been measured. This radionuclide was obtained by means of the {sup 127}I(n, 2n){sup 126}I reaction in a fast neutron flux at the IPEN 2 MW research reactor. The methodology for the primary standardization of {sup 126}I is described. For this purpose, two different coincidence systems were used, owing to the complex decay scheme of this radionuclide. The {beta} branch measurement was carried out in a 4{pi}(PC){beta}-{gamma} coincidence system consisting of a proportional counter coupled to a pair of 3''x3'' NaI(Tl) crystals. The electron capture branch was measured in an X-{gamma} coincidence system using two NaI(Tl) crystals. The gamma-ray measurements were performed in an HPGe system, previously calibrated by means of standard sources supplied by the International Atomic Energy Agency. All the uncertainties involved were treated rigorously, by means of covariance analysis. (author)

  10. Absolute activity measurement and gamma-ray emission probability for decay of I-126; Medida absoluta da atividade e determinacao da taxa de emissao gama por decaimento do {sup 126} I

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Katia Aparecida

    1997-07-01

    The accurate knowledge of the gamma-ray emission probability per decay of radionuclides is important in several applications. In the case of {sup 126}I, its importance lies mainly in fast neutron dosimetry as well as in the production of {sup 125}I, where {sup 126}I appears as an impurity. In the present work the gamma-ray emission probabilities per decay for the 388 and 666-keV transitions of {sup 126}I have been measured. This radionuclide was obtained by means of the {sup 127}I(n, 2n){sup 126}I reaction in a fast neutron flux at the IPEN 2 MW research reactor. The methodology for the primary standardization of {sup 126}I is described. For this purpose, two different coincidence systems were used, owing to the complex decay scheme of this radionuclide. The {beta} branch measurement was carried out in a 4{pi}(PC){beta}-{gamma} coincidence system consisting of a proportional counter coupled to a pair of 3''x3'' NaI(Tl) crystals. The electron capture branch was measured in an X-{gamma} coincidence system using two NaI(Tl) crystals. The gamma-ray measurements were performed in an HPGe system, previously calibrated by means of standard sources supplied by the International Atomic Energy Agency. All the uncertainties involved were treated rigorously, by means of covariance analysis. (author)

  11. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students' understanding of probability concepts has been investigated from various different perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand.
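
    For reference, the dichotomous Rasch model underlying such an analysis gives the probability of a correct response as P = exp(theta - b) / (1 + exp(theta - b)) for person ability theta and item difficulty b; easy concepts sit low on the Rasch map and difficult ones high. The difficulties below are invented placeholders, not the study's estimates.

        import math

        def rasch_probability(theta, b):
            """P(correct) for ability theta and item difficulty b."""
            return 1.0 / (1.0 + math.exp(-(theta - b)))

        items = {"sample space": -2.0, "tree diagram": -1.0, "conditional events": 1.5}
        for concept, b in items.items():
            print(f"{concept:18s} P(correct | theta=0) = {rasch_probability(0.0, b):.2f}")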

  12. Using Rasch Analysis To Explore What Students Learn About Probability Concepts

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students' understanding of probability concepts has been investigated from various different perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand.

  13. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  14. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  15. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  16. Prediction suppression in monkey inferotemporal cortex depends on the conditional probability between images.

    Science.gov (United States)

    Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R

    2016-01-01

    When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.
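
    The contiguity/contingency distinction reduces to simple count arithmetic, as in the toy sketch below: the joint probability P(A,B) can be held fixed while the conditionals are diluted by presenting A (or B) with other images. Counts are invented for illustration.

        def pair_statistics(n_ab, n_a, n_b, n_trials):
            """Joint and conditional probabilities from pair counts."""
            return (n_ab / n_trials,  # P(A,B): co-occurrence (contiguity)
                    n_ab / n_a,       # P(B|A): contingency
                    n_ab / n_b)       # P(A|B): contingency

        # Same co-occurrence, but A also shown with many other trailing images:
        print(pair_statistics(n_ab=100, n_a=100, n_b=100, n_trials=1000))
        print(pair_statistics(n_ab=100, n_a=200, n_b=100, n_trials=1000))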

  17. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  18. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities

  19. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability
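
    For concreteness, a standard one-parameter probability weighting function of the kind used in such models is sketched below (the Tversky-Kahneman 1992 form; the paper's exact specification may differ). With gamma < 1 it overweights small probabilities and underweights large ones.

        def tk_weight(p, gamma=0.61):
            """Tversky-Kahneman (1992) probability weighting function."""
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"p = {p:.2f} -> w(p) = {tk_weight(p):.3f}")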

  20. Measurement of the t anti-t Production Cross Section in p anti-p collisions at s**(1/2) = 1.96-TeV using Lepton + Jets Events with Jet Probability b-tagging

    Energy Technology Data Exchange (ETDEWEB)

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, T.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst.

    2006-07-01

    The authors present a measurement of the t{bar t} production cross section using events with one charged lepton and jets from p{bar p} collisions at a center-of-mass energy of 1.96 TeV. A b-tagging algorithm based on the probability of displaced tracks coming from the event interaction vertex is applied to identify b quarks from top decay. Using 318 pb{sup -1} of data collected with the CDF II detector, they measure the t{bar t} production cross section in events with at least one restrictive (tight) b-tagged jet and obtain 8.9{sub -1.0}{sup +1.0}(stat.){sub -1.0}{sup +1.1}(syst.) pb. The cross section value assumes a fixed top quark mass of m{sub t}; the dependence of the result on m{sub t} is presented in the paper. This result is consistent with other CDF measurements of the t{bar t} cross section using different samples and analysis techniques, and has similar systematic uncertainties. They have also performed consistency checks by using the b-tagging probability function to vary the signal to background ratio and also using events that have at least two b-tagged jets.

  1. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  2. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  3. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Full Text Available Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals' health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and, alternatively, 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the financial burden imposed by out-of-pocket expenses, and the most recent upward trends in it, underscore the need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act's success in reducing Americans' exposure to large medical bills.

  4. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
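
    For readers unfamiliar with JUP, the update it prescribes on a finite partition {B_i} with new marginals q_i is P'(A) = sum_i P(A|B_i) q_i; the sketch below is a minimal illustration with invented numbers.

        def jeffrey_update(p_a_given_b, q):
            """p_a_given_b[i] = P(A | B_i); q[i] = new probability of B_i."""
            assert abs(sum(q) - 1.0) < 1e-9
            return sum(pa * qi for pa, qi in zip(p_a_given_b, q))

        # Three-cell partition, evidence shifts the partition probabilities
        print(jeffrey_update([0.9, 0.5, 0.1], [0.2, 0.3, 0.5]))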

  5. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  6. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  7. Transformation & uncertainty : some thoughts on quantum probability theory, quantum statistics, and natural bundles

    NARCIS (Netherlands)

    Janssens, B.

    2010-01-01

    This PhD thesis is concerned partly with uncertainty relations in quantum probability theory, partly with state estimation in quantum stochastics, and partly with natural bundles in differential geometry. The laws of quantum mechanics impose severe restrictions on the performance of measurement.

  8. The sticking probability for H-2 on some transition metals at a hydrogen pressure of 1 bar

    DEFF Research Database (Denmark)

    Johansson, Martin; Lytken, Ole; Chorkendorff, Ib

    2008-01-01

    The sticking probability for hydrogen on films of Co, Ni, Cu, Ru, Rh, Pd, Ir, and Pt supported on graphite has been measured at a hydrogen pressure of 1 bar in the temperature range 40–200 °C. The sticking probability is found to increase in the order Ni, Co, Ir, Pd, Pt, Rh, and Ru at temperature...

  9. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I. Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach. Random Phenomena, Variability, and Uncertainty: Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II. Probability: Fundamentals of Probability Theory: Building Blocks; Operations; Probability; Conditional Probability; Independence. Random Variables and Distributions: Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions. Multidimensional Random Variables: Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables. Random Variable Transformations: Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations. Application Case Studies I: Probability: Mendel and Heredity; World War II Warship Tactical Response Under Attack. III. Distributions: Ide...

  10. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called "law of nonconservation of parity" and "accelerating expansion of the universe", and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.

  11. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  12. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process {\eta} in C[0, 1] with identical marginal distribution function F hits x \in R with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of {\eta} are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x \in R twice, and we give a sufficient condition for a positive probability of this event.

  13. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
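
    A hedged sketch of the idea with a random forest as the probability machine is given below: the forest estimates P(Y=1|X) nonparametrically, and a counterfactual risk difference for a binary predictor is obtained by flipping that predictor for every subject and averaging the change in predicted probability. Data and settings are synthetic placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)
        n = 2000
        x1 = rng.integers(0, 2, n)  # binary exposure of interest
        x2 = rng.normal(size=n)     # continuous covariate
        p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.2 * x1 + 0.8 * x2)))
        y = rng.random(n) < p

        X = np.column_stack([x1, x2])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                                    random_state=0).fit(X, y)

        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0  # counterfactual exposure settings
        rd = (rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]).mean()
        print(f"estimated average risk difference for x1: {rd:.3f}")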

  14. Fixation Probability in a Haploid-Diploid Population.

    Science.gov (United States)

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.

  15. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
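
    As a concrete instance of the Simpson's paradox setting that the BK-Plot is designed to display, the following toy calculation uses the classic kidney-stone counts (a treated/control comparison stratified by a binary confounder Z); it is illustrative only and not taken from the record above:

    ```python
    # Toy Simpson's paradox: treatment looks better within each stratum of a
    # binary confounder Z, yet worse in the pooled table.
    strata = {
        # Z: (treated successes, treated n, control successes, control n)
        "Z=0": (81, 87, 234, 270),
        "Z=1": (192, 263, 55, 80),
    }
    t_s = t_n = c_s = c_n = 0
    for z, (ts, tn, cs, cn) in strata.items():
        print(f"{z}: treated {ts/tn:.2f} vs control {cs/cn:.2f}")
        t_s += ts; t_n += tn; c_s += cs; c_n += cn
    # Pooled comparison reverses the within-stratum ordering.
    print(f"pooled: treated {t_s/t_n:.2f} vs control {c_s/c_n:.2f}")
    ```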

  16. Chance, determinism and the classical theory of probability.

    Science.gov (United States)

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Frequency position modulation using multi-spectral projections

    Science.gov (United States)

    Goodman, Joel; Bertoncini, Crystal; Moore, Michael; Nousain, Bryan; Cowart, Gregory

    2012-10-01

    In this paper we present an approach that harnesses multi-spectral projections (MSPs) to carefully shape and locate tones in the spectrum, enabling a new and robust modulation in which a signal's discrete frequency support is used to represent symbols. This method, called Frequency Position Modulation (FPM), is an innovative extension of MT-FSK and OFDM and can be non-uniformly spread over many GHz of instantaneous bandwidth (IBW), resulting in a communications system that is difficult to intercept and jam. The FPM symbols are recovered using adaptive projections that in part employ an analog polynomial nonlinearity paired with an analog-to-digital converter (ADC) sampling at a rate that is only a fraction of the IBW of the signal. MSPs also facilitate using commercial off-the-shelf (COTS) ADCs with uniform sampling, standing in sharp contrast to random linear projections by random sampling, which require a full Nyquist-rate sample-and-hold. Our novel communication system concept provides an order of magnitude improvement in processing gain over conventional LPI/LPD communications (e.g., FH- or DS-CDMA) and facilitates the ability to operate in interference-laden environments where conventional compressed sensing receivers would fail. We quantitatively analyze the bit error rate (BER) and processing gain (PG) for a maximum-likelihood-based FPM demodulator and demonstrate its performance in interference-laden conditions.
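
    A toy numeric illustration of the core encoding idea (a symbol selects which discrete frequency bins carry tones); the sample rate, bin grid, and symbol-to-bin map are invented, and nothing below reproduces the paper's MSP front end or maximum-likelihood demodulator:

    ```python
    # Toy frequency-position modulation: a symbol selects which FFT bins carry
    # tones. All parameters are invented, at toy scale rather than multi-GHz.
    import numpy as np

    fs = 1_000_000                   # sample rate (Hz)
    n = 4096                         # samples per symbol
    bins = np.arange(50, 450, 25)    # candidate tone positions (FFT bins)

    def fpm_modulate(symbol, k=2):
        """Map an integer symbol to k tone positions and synthesize the burst."""
        rng = np.random.default_rng(symbol)   # deterministic symbol->bins map
        chosen = rng.choice(bins, size=k, replace=False)
        t = np.arange(n) / fs
        sig = sum(np.cos(2 * np.pi * (b * fs / n) * t) for b in chosen)
        return sig, sorted(int(b) for b in chosen)

    def fpm_demodulate(sig, k=2):
        """Recover the k strongest candidate bins from the magnitude spectrum."""
        spec = np.abs(np.fft.rfft(sig))
        strongest = bins[np.argsort(spec[bins])[-k:]]
        return sorted(int(b) for b in strongest)

    sig, sent = fpm_modulate(symbol=7)
    print(sent, fpm_demodulate(sig))   # the lists match in this noise-free toy
    ```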

  18. Design, fabrication, and testing of stellar coronagraphs for exoplanet imaging

    Science.gov (United States)

    Knight, Justin M.; Brewer, John; Hamilton, Ryan; Ward, Karen; Milster, Tom D.; Guyon, Olivier

    2017-09-01

    Complex-mask coronagraphs destructively interfere unwanted starlight with itself to enable direct imaging of exoplanets. This is accomplished using a focal plane mask (FPM); an FPM can be a simple occulter mask or, in the case of a complex mask, a multi-zoned device designed to phase-shift starlight over multiple wavelengths to create a deep achromatic null in the stellar point spread function. Creating these masks requires microfabrication techniques, yet many such methods remain largely unexplored in this context. We explore methods of fabrication of complex FPMs for a Phase-Induced Amplitude Apodization Complex-Mask Coronagraph (PIAACMC). Previous FPM fabrication efforts for PIAACMC have concentrated on mask manufacturability while modeling science yield, as well as assessing broadband wavelength operation. Moreover, current fabrication efforts are concentrated on assessing coronagraph performance given a single approach. We present FPMs fabricated using several process paths, including deep reactive ion etching and focused ion beam etching on a silicon substrate. The characteristic size of the mask features is 5 μm, with depths ranging over 1 μm. The masks are characterized for manufacturing quality using an optical interferometer and a scanning electron microscope. Initial testing is performed at the Subaru Extreme Adaptive Optics testbed, providing a baseline for future experiments to determine and improve coronagraph performance within fabrication tolerances.

  19. Validation of early GOES-16 ABI on-orbit geometrical calibration accuracy using SNO method

    Science.gov (United States)

    Yu, Fangfang; Shao, Xi; Wu, Xiangqian; Kondratovich, Vladimir; Li, Zhengping

    2017-09-01

    The Advanced Baseline Imager (ABI) onboard the GOES-16 satellite, which was launched on 19 November 2016, is the first next-generation geostationary weather instrument in the western hemisphere. It has 16 solar-reflective and emissive spectral bands located in three focal plane modules (FPM): one visible and near-infrared (VNIR) FPM, one midwave infrared (MWIR) FPM, and one longwave infrared (LWIR) FPM. All the ABI bands are geometrically calibrated with new techniques of Kalman filtering and Global Positioning System (GPS) data to determine the accurate spacecraft attitude and orbit configuration needed to meet the challenging image navigation and registration (INR) requirements of ABI data. This study validates the ABI navigation and band-to-band registration (BBR) accuracies using spectrally matched pixels of Suomi National Polar-orbiting Partnership (SNPP) Visible Infrared Imaging Radiometer Suite (VIIRS) M-band data and ABI images from Simultaneous Nadir Observation (SNO) collections. The preliminary results showed that during the ABI post-launch product test (PLPT) period, the ABI BBR errors in the y-direction (along the VIIRS track direction) are smaller than in the x-direction (along the VIIRS scan direction). Variations in the ABI BBR calibration residuals and in the navigation difference to VIIRS can be observed. Note that ABI is not yet operational and the data are experimental and still under testing. Effort is still ongoing to improve the ABI data quality.

  20. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  1. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  2. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  3. Uptake and Tissue Distribution of Pharmaceuticals and Personal Care Products in Wild Fish from Treated-Wastewater-Impacted Streams.

    Science.gov (United States)

    Tanoue, Rumi; Nomiyama, Kei; Nakamura, Haruna; Kim, Joon-Woo; Isobe, Tomohiko; Shinohara, Ryota; Kunisue, Tatsuya; Tanabe, Shinsuke

    2015-10-06

    A fish plasma model (FPM) has been proposed as a screening technique to prioritize potential hazardous pharmaceuticals to wild fish. However, this approach does not account for inter- or intraspecies variability of pharmacokinetic and pharmacodynamic parameters. The present study elucidated the uptake potency (from ambient water), tissue distribution, and biological risk of 20 pharmaceutical and personal care product (PPCP) residues in wild cyprinoid fish inhabiting treated-wastewater-impacted streams. In order to clarify the uncertainty of the FPM for PPCPs, we compared the plasma bioaccumulation factor in the field (BAFplasma = measured fish plasma/ambient water concentration ratio) with the predicted plasma bioconcentration factor (BCFplasma = fish plasma predicted by use of theoretical partition coefficients/ambient water concentration ratio) in the actual environment. As a result, the measured maximum BAFplasma of inflammatory agents was up to 17 times higher than theoretical BCFplasma values, leading to possible underestimation of toxicological risk on wild fish. When the tissue-blood partition coefficients (tissue/blood concentration ratios) of PPCPs were estimated, higher transportability into tissues, especially the brain, was found for psychotropic agents, but brain/plasma ratios widely varied among individual fish (up to 28-fold). In the present study, we provide a valuable data set on the intraspecies variability of PPCP pharmacokinetics, and our results emphasize the importance of determining PPCP concentrations in possible target organs as well as in the blood to assess the risk of PPCPs on wild fish.
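
    A sketch of the fish plasma model arithmetic being stress-tested here. It assumes the commonly cited blood:water partitioning regression log P_blood:water = 0.73*logKow - 0.88; that parameterization and all compound values below are assumptions for illustration, not taken from this record:

    ```python
    # Fish plasma model sketch: predicted plasma concentration from the ambient
    # water concentration via a theoretical blood:water partition coefficient,
    # compared against a measured plasma level. Example values are invented.
    def predicted_plasma(c_water_ng_l, log_kow):
        # One published FPM parameterization (an assumption here):
        # log P_blood:water = 0.73 * logKow - 0.88
        p_bw = 10 ** (0.73 * log_kow - 0.88)
        return c_water_ng_l * p_bw      # predicted fish plasma conc. (ng/L)

    c_water = 120.0            # ambient water concentration, ng/L (hypothetical)
    log_kow = 3.0              # octanol-water partition coefficient (hypothetical)
    measured_plasma = 9000.0   # measured fish plasma conc., ng/L (hypothetical)

    pred = predicted_plasma(c_water, log_kow)
    print(f"predicted plasma: {pred:.0f} ng/L")
    print(f"measured/predicted (BAF/BCF) ratio: {measured_plasma / pred:.1f}")
    ```

    A measured/predicted ratio well above 1, as reported above for some anti-inflammatory agents, is exactly the case in which the theoretical BCF-based screening would understate exposure.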

  4. 14 CFR 417.224 - Probability of failure analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure...

  5. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
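
    The medical-screening interpretation mentioned above is a single application of Bayes' theorem; a worked toy example with invented prevalence and test characteristics:

    ```python
    # Bayes' theorem for a screening test (all numbers invented for illustration).
    prevalence = 0.01         # P(disease)
    sensitivity = 0.95        # P(positive | disease)
    specificity = 0.90        # P(negative | no disease)

    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_pos    # P(disease | positive)
    print(f"P(positive) = {p_pos:.4f}, PPV = {ppv:.3f}")   # PPV ~= 0.088
    ```

    The counterintuitively low positive predictive value, despite an accurate test, is the standard illustration of why conditioning direction matters.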

  6. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but instead introduces as fundamental the concept of random numbers, directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  7. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    Roth, M.J.

    1985-04-01

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules are described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)

  8. Target-type probability combining algorithms for multisensor tracking

    Science.gov (United States)

    Wigren, Torbjorn

    2001-08-01

    Algorithms for the handling of target type information in an operational multi-sensor tracking system are presented. The paper discusses recursive target type estimation, computation of crosses from passive data (strobe track triangulation), as well as the computation of the quality of the crosses for deghosting purposes. The focus is on Bayesian algorithms that operate in the discrete target type probability space, and on the approximations introduced for computational complexity reduction. The centralized algorithms are able to fuse discrete data from a variety of sensors and information sources, including IFF equipment, ESM's, IRST's as well as flight envelopes estimated from track data. All algorithms are asynchronous and can be tuned to handle clutter, erroneous associations as well as missed and erroneous detections. A key to obtain this ability is the inclusion of data forgetting by a procedure for propagation of target type probability states between measurement time instances. Other important properties of the algorithms are their abilities to handle ambiguous data and scenarios. The above aspects are illustrated in a simulation study. The simulation setup includes 46 air targets of 6 different types that are tracked by 5 airborne sensor platforms using ESM's and IRST's as data sources.
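
    A compact sketch of the discrete Bayesian recursion with data forgetting described above; the target-type set, the likelihood values, and the exponential relaxation toward a uniform prior are illustrative assumptions, not the operational algorithm:

    ```python
    # Discrete target-type recursion: a Bayes update per measurement, plus a
    # forgetting step that relaxes type probabilities toward uniform between
    # measurements. Types, likelihoods, and rates are invented.
    import numpy as np

    types = ["fighter", "bomber", "airliner"]
    p = np.full(len(types), 1.0 / len(types))     # prior type probabilities

    def bayes_update(p, likelihood):
        """likelihood[i] = P(observation | type i)."""
        post = p * likelihood
        return post / post.sum()

    def forget(p, dt, tau=30.0):
        """Relax toward uniform with time constant tau (seconds); this lets
        the recursion recover from clutter and erroneous associations."""
        lam = np.exp(-dt / tau)
        return lam * p + (1 - lam) / len(p)

    # Example: an ESM report that strongly suggests a military emitter.
    p = bayes_update(p, np.array([0.6, 0.35, 0.05]))
    p = forget(p, dt=10.0)
    print({t: round(float(v), 3) for t, v in zip(types, p)})
    ```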

  9. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
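
    For reference, the Monte Carlo estimator used in such studies looks as follows in the simplest well-mixed (complete-graph) case, where the closed form (1 - 1/r)/(1 - 1/r^N) is available as a check; the clique-based topologies of the paper change only which individual gets replaced. A toy sketch, not the authors' code:

    ```python
    # Monte Carlo fixation probability of a single mutant with relative fitness
    # r in a Moran process on a complete graph (well-mixed population).
    import random

    def fixation_probability(n=50, r=1.1, trials=5000):
        fixed = 0
        for _ in range(trials):
            mutants = 1
            while 0 < mutants < n:
                # Choose a reproducer with probability proportional to fitness...
                w_mut = mutants * r
                born_mut = random.random() < w_mut / (w_mut + (n - mutants))
                # ...and a uniformly random individual to be replaced.
                dies_mut = random.random() < mutants / n
                mutants += born_mut - dies_mut
            fixed += mutants == n
        return fixed / trials

    # Closed form for the complete graph: (1 - 1/r) / (1 - 1/r**n) ~= 0.092
    print(fixation_probability())   # should land near 0.092 for n=50, r=1.1
    ```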

  10. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Critical review of the probability of causation method

    International Nuclear Information System (INIS)

    Cox, L.A. Jr.; Fiksel, J.R.

    1985-01-01

    In a more controversial report than the others in the study, the authors use one scientific discipline to review the work of another discipline. Their proposal recognizes the imprecision that develops in moving from group to individual interpretations of causal effects by substituting the term assigned share for probability of causation. The authors conclude that the use of a formula will not provide reliable measures of risk attribution in individual cases. The gap between scientific certainty and assigning shares of responsibility must be filled by subjective value judgments supplied by the scientists. 22 references, 2 figures, 4 tables

  13. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = {u(t, x)} that solves a non-linear system of stochastic wave equations in spatial dimensions k ∈ {1, 2, 3}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent β. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of ℝ^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap

  14. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter depending integrals, as well as the Radon-Nikodym theorem. The fi

  15. Impact of spectral smoothing on gamma radiation portal alarm probabilities

    International Nuclear Information System (INIS)

    Burr, T.; Hamada, M.; Hengartner, N.

    2011-01-01

    Gamma detector counts are included in radiation portal monitors (RPM) to screen for illicit nuclear material. Gamma counts are sometimes smoothed to reduce variance in the estimated underlying true mean count rate, which is the 'signal' in our context. Smoothing reduces total error variance in the estimated signal if the bias that smoothing introduces is more than offset by the variance reduction. An empirical RPM study for vehicle screening applications is presented for unsmoothed and smoothed gamma counts in low-resolution plastic scintillator detectors and in medium-resolution NaI detectors. Highlights: We evaluate options for smoothing counts from gamma detectors deployed for portal monitoring; a new multiplicative bias correction (MBC) is shown to reduce bias in peak and valley regions; performance is measured using mean squared error and detection probabilities for sources; smoothing with the MBC improves detection probabilities and the mean squared error.
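
    A small illustration of the bias-variance trade-off the study measures: moving-average smoothing of Poisson counts around a peaked 'signal', with an invented spectrum shape; the paper's multiplicative bias correction is not reproduced here:

    ```python
    # Bias-variance trade-off when smoothing Poisson gamma counts around a peak.
    # Signal shape and rates are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.arange(200)
    signal = 50 + 400 * np.exp(-0.5 * ((x - 100) / 5.0) ** 2)   # peaked mean rate

    def moving_average(y, w):
        return np.convolve(y, np.ones(w) / w, mode="same")

    reps = 2000
    err_raw = err_sm = 0.0
    for _ in range(reps):
        counts = rng.poisson(signal)
        err_raw += np.mean((counts - signal) ** 2)
        err_sm += np.mean((moving_average(counts, 7) - signal) ** 2)

    print(f"MSE unsmoothed: {err_raw / reps:.1f}")
    print(f"MSE smoothed  : {err_sm / reps:.1f}")  # variance drops off-peak,
                                                   # but smoothing biases the peak
    ```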

  16. Explaining regional disparities in traffic mortality by decomposing conditional probabilities.

    Science.gov (United States)

    Goldstein, Gregory P; Clark, David E; Travis, Lori L; Haskins, Amy E

    2011-04-01

    In the USA, the mortality rate from traffic injury is higher in rural and in southern regions, for reasons that are not well understood. For 1754 (56%) of the 3142 US counties, we obtained data allowing for separation of the deaths/population rate into deaths/injury, injuries/crash, crashes/exposure and exposure/population, with exposure measured as vehicle miles travelled. A 'decomposition method' proposed by Li and Baker was extended to study how the contributions of these components were affected by three measures of rural location, as well as southern location. The method of Li and Baker extended without difficulty to include non-binary effects and multiple exposures. Deaths/injury was by far the most important determinant in the county-to-county variation in deaths/population, and accounted for the greatest portion of the rural/urban disparity. After controlling for the rural effect, injuries/crash accounted for most of the southern/northern disparity. The increased mortality rate from traffic injury in rural areas can be attributed to the increased probability of death given that a person has been injured, possibly due to challenges faced by emergency medical response systems. In southern areas, there is an increased probability of injury given that a person has crashed, possibly due to differences in vehicle, road, or driving conditions.
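
    With exposure measured in vehicle miles travelled (VMT), the decomposition driving the analysis is the identity below; taking logarithms turns the product into a sum, which is what lets the contribution of each component be separated:

    ```latex
    \frac{\text{deaths}}{\text{population}}
      \;=\;
      \frac{\text{deaths}}{\text{injuries}} \times
      \frac{\text{injuries}}{\text{crashes}} \times
      \frac{\text{crashes}}{\text{VMT}} \times
      \frac{\text{VMT}}{\text{population}}
    ```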

  17. Application of damping mechanism model and stacking fault probability in Fe-Mn alloy

    International Nuclear Information System (INIS)

    Huang, S.K.; Wen, Y.H.; Li, N.; Teng, J.; Ding, S.; Xu, Y.G.

    2008-01-01

    In this paper, the damping mechanism model of Fe-Mn alloy was analyzed using dislocation theory. Moreover, as an important parameter in Fe-Mn based alloys, the effect of stacking fault probability on the damping capacity of Fe-19.35Mn alloy after deep-cooling or tensile deformation was also studied. The damping capacity was measured using a reversal torsion pendulum. The stacking fault probability of γ-austenite and ε-martensite was determined by means of X-ray diffraction (XRD) profile analysis. The microstructure was observed using a scanning electron microscope (SEM). The results indicated that when the strain amplitude increased above a critical value, the damping capacity of Fe-19.35Mn alloy increased rapidly, which could be explained using the breakaway model of Shockley partial dislocations. Deep-cooling and suitable tensile deformation could improve the damping capacity owing to the increase in the stacking fault probability of Fe-19.35Mn alloy

  18. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  19. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  20. A data fusion framework for floodplain analysis using GIS and remotely sensed data

    Science.gov (United States)

    Necsoiu, Dorel Marius

    Throughout history floods have been part of the human experience. They are recurring phenomena that form a necessary and enduring feature of all river basin and lowland coastal systems. In an average year, they benefit millions of people who depend on them. In the more developed countries, major floods can be the largest cause of economic losses from natural disasters, and are also a major cause of disaster-related deaths in the less developed countries. Flood disaster mitigation research was conducted to determine how remotely sensed data can effectively be used to produce accurate flood plain maps (FPMs), and to identify/quantify the sources of error associated with such data. Differences were analyzed between flood maps produced by an automated remote sensing analysis tailored to the available satellite remote sensing datasets (rFPM), the 100-year flooded areas "predicted" by the Flood Insurance Rate Maps, and FPMs based on DEM and hydrological data (aFPM). Landuse/landcover was also examined to determine its influence on rFPM errors. These errors were identified and the results were integrated in a GIS to minimize landuse/landcover effects. Two substantial flood events were analyzed. These events were selected because of their similar characteristics (i.e., the existence of FIRM or Q3 data; flood data which included flood peaks, rating curves, and flood profiles; and DEM and remote sensing imagery). Automatic feature extraction was determined to be an important component for successful flood analysis. A process network, in conjunction with domain specific information, was used to map raw remotely sensed data onto a representation that is more compatible with a GIS data model. From a practical point of view, rFPM provides a way to automatically match existing data models to the type of remote sensing data available for each event under investigation. Overall, results showed how remote sensing could contribute to the complex problem of flood management by

  1. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
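
    The exponential-integral form is straightforward to evaluate numerically. As a hedged sketch: for a photon emitted isotropically at depth x in a planar detector, the probability of escape through the near surface is commonly written (1/2)E2(mu*x), which SciPy evaluates directly; the attenuation coefficient and geometry below are invented:

    ```python
    # Escape probability of an isotropically emitted fluorescent photon from
    # depth x in a planar detector, via the generalized exponential integral E2.
    # P_front = 0.5 * E2(mu * x); illustrative values, not the paper's results.
    from scipy.special import expn

    mu = 30.0        # attenuation coefficient of the fluorescent x-ray (1/cm)
    x = 0.01         # emission depth below the entrance surface (cm)
    thickness = 0.04

    p_front = 0.5 * expn(2, mu * x)
    p_back = 0.5 * expn(2, mu * (thickness - x))
    print(f"escape front: {p_front:.3f}, back: {p_back:.3f}, "
          f"total: {p_front + p_back:.3f}")
    ```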

  2. Automatic Monitoring System Design and Failure Probability Analysis for River Dikes on Steep Channel

    Science.gov (United States)

    Chang, Yin-Lung; Lin, Yi-Jun; Tung, Yeou-Koung

    2017-04-01

    The purposes of this study include: (1) designing an automatic monitoring system for river dikes; and (2) developing a framework which enables the determination of dike failure probabilities for various failure modes during a rainstorm. The historical dike failure data collected in this study indicate that most dikes in Taiwan collapsed under the 20-year return-period discharge, which means the probability of dike failure is much higher than that of overtopping. We installed the dike monitoring system on the Chiu-She Dike, which is located on the middle reach of the Dajia River, Taiwan. The system includes: (1) vertically distributed pore water pressure sensors in front of and behind the dike; (2) Time Domain Reflectometry (TDR) to measure the displacement of the dike; (3) a wireless floating device to measure the scouring depth at the toe of the dike; and (4) a water level gauge. The monitoring system recorded the variation of pore pressure inside the Chiu-She Dike and the scouring depth during Typhoon Megi. The recorded data showed that the highest groundwater level inside the dike occurred 15 hours after the peak discharge. We developed a framework which accounts for the uncertainties in return period discharge, Manning's n, scouring depth, soil cohesion, and friction angle, and enables the determination of dike failure probabilities for various failure modes such as overtopping, surface erosion, mass failure, toe sliding, and overturning. The framework was applied to the Chiu-She, Feng-Chou, and Ke-Chuang Dikes on the Dajia River. The results indicate that toe sliding or overturning has a higher probability than the other failure modes. Furthermore, the overall failure probability (integrating the different failure modes) reaches 50% under a 10-year return period flood, which agrees with the historical failure data for the study reaches.

  3. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  4. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  5. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  6. Appearance probability of certain precipitation quantities and temperature sums > 10 °C during the growing period

    International Nuclear Information System (INIS)

    Dimitrovska, Blaga; Dimitrovski, Zoran; Ristevski, Pece

    2004-01-01

    In this paper, probabilities of occurrence of given precipitation amounts and of temperature sums >10 °C are given (in percentages) for fifteen measuring stations in the Republic of Macedonia, for the vegetative period of the year (1 April to 31 October) over the period 1951-2000. Using the precipitation amounts and the sums of temperatures >10 °C for the vegetative period, we calculated Seljaninov's hydrothermal coefficient (HTC) for each year of the period 1951-2000. From the HTC values we categorized the drought for each measuring station using the criteria of S. Otorepec. Calculating the probabilities of occurrence of given precipitation amounts and temperature sums >10 °C is very important for water balance calculation, irrigation norms, etc. (Author)
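
    Seljaninov's hydrothermal coefficient referred to above is, in its usual form (stated here from the general agroclimatology literature, not from the paper itself):

    ```latex
    \mathrm{HTC} \;=\; \frac{10 \sum P}{\sum T_{>10\,^{\circ}\mathrm{C}}}
    ```

    where the numerator sums precipitation over the period with daily mean temperatures above 10 °C and the denominator sums those temperatures; values well below 1 indicate increasing drought severity.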

  7. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...
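
    The risk-neutral benchmark that the calibration starts from is the standard propriety argument; for the quadratic (Brier) rule on a binary event with belief p and report r, in reward form S(r, y) = 1 - (r - y)^2:

    ```latex
    \mathbb{E}_p\,S(r) \;=\; p\bigl(1-(r-1)^2\bigr) + (1-p)\bigl(1-r^2\bigr),
    \qquad
    \frac{d}{dr}\,\mathbb{E}_p\,S(r) \;=\; 2(p-r) \;=\; 0
    \;\;\Longrightarrow\;\; r^{*} = p.
    ```

    A risk-averse agent maximizes the expected utility of the score rather than the expected score, which breaks this first-order condition and motivates the calibration discussed above.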

  8. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  9. The effect of incremental changes in phonotactic probability and neighborhood density on word learning by preschool children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose Phonotactic probability and neighborhood density have predominantly been defined using gross distinctions (i.e., low vs. high). The current studies examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method The full range of probability or density was examined by sampling five nonwords from each of four quartiles. Three- and 5-year-old children received training on nonword-nonobject pairs. Learning was measured in a picture-naming task immediately following training and 1 week after training. Results were analyzed using multi-level modeling. Results A linear spline model best captured nonlinearities in phonotactic probability. Specifically, word learning improved as probability increased in the lowest quartile, worsened as probability increased in the midlow quartile, and then remained stable and poor in the two highest quartiles. An ordinary linear model sufficiently described neighborhood density. Here, word learning improved as density increased across all quartiles. Conclusion Given these different patterns, phonotactic probability and neighborhood density appear to influence different word learning processes. Specifically, phonotactic probability may affect recognition that a sound sequence is an acceptable word in the language and is a novel word for the child, whereas neighborhood density may influence creation of a new representation in long-term memory. PMID:23882005

  10. Absolute Transition Probabilities from the 453.1 keV Level in 183W

    International Nuclear Information System (INIS)

    Malmskog, S.G.

    1966-10-01

    The half life of the 453.1 keV level in 183 W has been measured by the delayed coincidence method to 18.4 ± 0.5 nsec. This determines twelve absolute M1 and E2 transition probabilities, out of which nine are K-forbidden. All transition probabilities are compared with the single particle estimate. The three K-allowed E2, ΔK = 2 transition rates to the 1/2 - (510) rotational band are furthermore compared with the Nilsson model. An attempt to give a quantitative explanation of the observed transition rates has been made by including the effects from admixtures into the single particle wave functions
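
    The step from the measured half-life to absolute transition probabilities is the usual one; as a worked number:

    ```latex
    \lambda_{\mathrm{tot}} \;=\; \frac{\ln 2}{T_{1/2}}
      \;=\; \frac{0.693}{18.4 \times 10^{-9}\,\mathrm{s}}
      \;\approx\; 3.8 \times 10^{7}\,\mathrm{s}^{-1}
    ```

    with the twelve partial transition probabilities then obtained by apportioning this total rate according to the measured branching intensities (with internal-conversion corrections).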

  11. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • The probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability of starting proliferation activities is very low. • This fact allows the IS detection probability to be decreased by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies, etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) on the basis of fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and more efficient utilization of IAEA resources from the viewpoint of the whole IS framework

  12. Estimating the Probability of a Rare Event Over a Finite Time Horizon

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; L'Ecuyer, Pierre; Rubino, Gerardo; Tuffin, Bruno

    2007-01-01

    We study an approximation for the zero-variance change of measure to estimate the probability of a rare event in a continuous-time Markov chain. The rare event occurs when the chain reaches a given set of states before some fixed time limit. The jump rates of the chain are expressed as functions of
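
    The zero-variance change of measure is generally unavailable in closed form, which is why approximations are studied. The following toy example shows the same mechanism (tilting the sampling distribution toward the rare event, then reweighting by the likelihood ratio) on a one-dimensional Gaussian rather than a continuous-time Markov chain:

    ```python
    # Importance sampling with a change of measure: estimate P(Z > 4) for
    # Z ~ N(0, 1) by sampling from N(4, 1) and reweighting. A toy analogue of
    # the CTMC setting; the zero-variance measure would hit the event always.
    import numpy as np

    rng = np.random.default_rng(2)
    n, shift = 100_000, 4.0
    z = rng.normal(loc=shift, scale=1.0, size=n)
    # Likelihood ratio dP/dQ for a mean shift: exp(-shift*z + shift**2 / 2)
    w = np.exp(-shift * z + 0.5 * shift**2)
    est = np.mean(w * (z > 4.0))
    print(f"IS estimate: {est:.3e}  (exact ~ 3.167e-05)")
    ```

    Naive Monte Carlo with the same budget would see the event only about three times, so the tilted estimator's variance reduction is dramatic.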

  13. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Full Text Available Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events about which we have deficient knowledge on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be obtained through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions—a process far removed from any application of probability theory.

  14. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  15. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    textabstractFor the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  16. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
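
    The noise account has a simple quantitative core: if each stored instance of an event is read with error probability d, the expected frequency-based estimate of a true probability p is (1 - 2d)p + d, which compresses judgments toward 0.5 (conservatism). A quick simulation of that claim, with d and the item counts invented:

    ```python
    # Probability theory plus noise: reading each of n stored instances with
    # error probability d yields E[estimate] = (1 - 2d) * p + d.
    import numpy as np

    rng = np.random.default_rng(3)
    d, n_items, reps = 0.1, 200, 5000

    for p in (0.1, 0.5, 0.9):
        truth = rng.random((reps, n_items)) < p     # true event indicators
        flip = rng.random((reps, n_items)) < d      # random read errors
        estimate = (truth ^ flip).mean(axis=1)      # noisy frequency readout
        print(f"p={p:.1f}: mean estimate {estimate.mean():.3f}, "
              f"predicted {(1 - 2 * d) * p + d:.3f}")
    ```

    Low probabilities are overestimated and high probabilities underestimated, exactly the conservatism pattern the paper attributes to noise rather than to heuristics.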

  17. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
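
    Under the exponential (Poisson) model used above, the annual probability follows from the mean recurrence interval μ:

    ```latex
    P(\text{eruption within } \Delta t) \;=\; 1 - e^{-\Delta t/\mu}
      \;\approx\; \frac{\Delta t}{\mu} \qquad (\Delta t \ll \mu)
    ```

    so the quoted 1.4×10⁻⁴ per year for the Lassen Volcanic Center corresponds to a mean recurrence interval of roughly 7,000 years.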

  18. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  19. Risk Profile Indicators and Spanish Banks’ Probability of Default from a Regulatory Approach

    Directory of Open Access Journals (Sweden)

    Pilar Gómez-Fernández-Aguado

    2018-04-01

    Full Text Available This paper analyses the relationships between the traditional bank risk profile indicators and a new measure of banks’ probability of default that considers the Basel regulatory framework. First, based on the SYstemic Model of Bank Originated Losses (SYMBOL, we calculated the individual probabilities of default (PD of a representative sample of Spanish credit institutions during the period of 2008–2016. Then, panel data regressions were estimated to explore the influence of the risk indicators on the PD. Our findings on the Spanish banking system could be important to regulatory and supervisory authorities. First, the PD based on the SYMBOL model could be used to analyse bank risk from a regulatory approach. Second, the results might be useful for designing new regulations focused on the key factors that affect the banks’ probability of default. Third, our findings reveal that the emphasis on regulation and supervision should differ by type of entity.

  20. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Full Text Available Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^PT|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^PT), was applied with k = 0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k = 1, 2, …) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α = 1/2), (standard) two-qubit (α = 1), and two-quater[nionic]bit (α = 2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.