WorldWideScience

Sample records for system phased-mission probability

  1. An algorithm for reliability analysis of phased-mission systems

    International Nuclear Information System (INIS)

    Ma, Y.; Trivedi, K.S.

    1999-01-01

    The purpose of this paper is to describe an efficient Boolean algebraic algorithm that provides an exact solution for the unreliability of a multi-phase mission system where the configurations are described through fault trees. The algorithm extends and improves the Boolean method originally proposed by Somani and Trivedi. By using the Boolean algebraic method, we provide an efficient modeling approach which avoids the state space explosion and the mapping problems that are encountered by the Markov chain approach. To calculate the exact solution of the phased-mission system with deterministic phase durations, we introduce the sum of disjoint phase products (SDPP) formula, which is a phased-mission extension of the sum of disjoint products (SDP) formula. Computationally, the algorithm is quite efficient because it calls an SDP generation algorithm in the early stage of the SDPP computation. In this way, the phase products generated in the early stage of the SDPP formula are guaranteed to be disjoint. Consequently, the number of intermediate phase products is greatly reduced. In this paper, we also consider the transient analysis of the phased-mission system. Special care is needed to account for possible latent failures at the mission phase change times. If there are more stringent success criteria just after a mission phase change time, an unreliability jump occurs at that time. Finally, the algorithm has been implemented in the software package SHARPE. With SHARPE, the complexities of the phased-mission system are made transparent to potential users. The user can conveniently specify a phased-mission model at a high level (through fault trees) and analyze the system quantitatively.
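
The disjoint-product decomposition described above can be illustrated on a toy two-phase mission. The sketch below is not the SHARPE/SDPP implementation; the system structure, failure rates and phase durations are all assumed for illustration. For non-repairable exponential components, writing the success event as a sum of disjoint products lets the mission reliability be read off as a sum of probability products:

```python
import math

def surv(lam, t):
    """Survival probability of an exponential component at time t."""
    return math.exp(-lam * t)

# Assumed toy mission: phase 1 (duration t1) needs A AND B,
# phase 2 (duration t2) needs A OR B; components are non-repairable.
lam_a, lam_b = 1e-3, 2e-3     # assumed failure rates (per hour)
t1, t2 = 10.0, 5.0            # deterministic phase durations (hours)

a1, b1 = surv(lam_a, t1), surv(lam_b, t1)             # survive phase 1
a2, b2 = surv(lam_a, t1 + t2), surv(lam_b, t1 + t2)   # survive both phases

# Success = A1*B1*(A2 + B2), rewritten as DISJOINT products (the SDP idea):
#   A2*B1            (A lasts the whole mission, B lasts phase 1)
# + (A1 - A2)*B2     (A fails during phase 2, so B must last the mission)
reliability = a2 * b1 + (a1 - a2) * b2
print(f"mission unreliability = {1.0 - reliability:.6e}")
```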

  2. Using reliability analysis to support decision making in phased mission systems

    OpenAIRE

    Zhang, Yang; Prescott, Darren

    2017-01-01

    Due to the environments in which they will operate, future autonomous systems must be capable of reconfiguring quickly and safely following faults or environmental changes. Past research has shown how, by considering autonomous systems to perform phased missions, reliability analysis can support decision making by allowing comparison of the probability of success of different missions following reconfiguration. Binary Decision Diagrams (BDDs) offer fast, accurate reliability analysis that cou...

  3. Phased mission modelling of systems with maintenance-free operating periods using simulated Petri nets

    Energy Technology Data Exchange (ETDEWEB)

    Chew, S.P.; Dunnett, S.J. [Department of Aeronautical and Automotive Engineering, Loughborough University, Loughborough, Leics (United Kingdom); Andrews, J.D. [Department of Aeronautical and Automotive Engineering, Loughborough University, Loughborough, Leics (United Kingdom)], E-mail: j.d.andrews@lboro.ac.uk

    2008-07-15

    A common scenario in engineering is that of a system which operates throughout several sequential and distinct periods of time, during which the modes and consequences of failure differ from one another. This type of operation is known as a phased mission, and for the mission to be a success the system must successfully operate throughout all of the phases. Examples include a rocket launch and an aeroplane flight. Component or sub-system failures may occur at any time during the mission, yet not affect the system performance until the phase in which their condition is critical. This may mean that the transition from one phase to the next is a critical event that leads to phase and mission failure, with the root cause being a component failure in a previous phase. A series of phased missions with no maintenance may be considered as a maintenance-free operating period (MFOP). This paper describes the use of a Petri net (PN) to model the reliability of the MFOP and phased missions scenario. The model uses Monte-Carlo simulation to obtain its results, and due to the modelling power of PNs, can consider complexities such as component failure rate interdependencies and mission abandonment. The model operates three different types of PN which interact to provide the overall system reliability modelling. The model is demonstrated and validated by considering two simple examples that can be solved analytically.
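
Since the model described above obtains its results by Monte Carlo simulation, a bare-bones version of the sampling idea (without the Petri-net machinery, failure-rate interdependencies or abandonment logic) can be sketched as follows; the components, rates and phase criteria are assumed for illustration. Note how a component failure stays latent until a phase whose success criterion needs that component:

```python
import random

# Assumed toy mission: exponential lifetimes; phase 1 needs both A and B,
# phase 2 needs at least one of A, B, C. No repair (an MFOP-style run).
RATES = {"A": 1e-3, "B": 2e-3, "C": 1e-3}    # failures per hour (assumed)
PHASES = [  # (duration, success criterion on the set of working components)
    (10.0, lambda up: {"A", "B"} <= up),
    (5.0,  lambda up: len(up & {"A", "B", "C"}) >= 1),
]

def mission_succeeds(rng):
    fail_at = {c: rng.expovariate(r) for c, r in RATES.items()}
    clock = 0.0
    for duration, ok in PHASES:
        clock += duration
        up = {c for c, t in fail_at.items() if t > clock}
        if not ok(up):    # a latent failure bites in the phase that needs it
            return False
    return True

rng = random.Random(42)
n = 200_000
fails = sum(not mission_succeeds(rng) for _ in range(n))
print(f"estimated mission unreliability ~ {fails / n:.4e}")
```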

  4. Phased mission modelling of systems with maintenance-free operating periods using simulated Petri nets

    International Nuclear Information System (INIS)

    Chew, S.P.; Dunnett, S.J.; Andrews, J.D.

    2008-01-01

    A common scenario in engineering is that of a system which operates throughout several sequential and distinct periods of time, during which the modes and consequences of failure differ from one another. This type of operation is known as a phased mission, and for the mission to be a success the system must successfully operate throughout all of the phases. Examples include a rocket launch and an aeroplane flight. Component or sub-system failures may occur at any time during the mission, yet not affect the system performance until the phase in which their condition is critical. This may mean that the transition from one phase to the next is a critical event that leads to phase and mission failure, with the root cause being a component failure in a previous phase. A series of phased missions with no maintenance may be considered as a maintenance-free operating period (MFOP). This paper describes the use of a Petri net (PN) to model the reliability of the MFOP and phased missions scenario. The model uses Monte-Carlo simulation to obtain its results, and due to the modelling power of PNs, can consider complexities such as component failure rate interdependencies and mission abandonment. The model operates three different types of PN which interact to provide the overall system reliability modelling. The model is demonstrated and validated by considering two simple examples that can be solved analytically.

  5. Competing failure analysis in phased-mission systems with multiple functional dependence groups

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Peng, Rui; Pan, Zhusheng

    2017-01-01

    A phased-mission system (PMS) involves multiple, consecutive, non-overlapping phases of operation. The system structure function and component failure behavior in a PMS can change from phase to phase, posing significant challenges to system reliability analysis. Further complicating the problem is the functional dependence (FDEP) behavior, where the failure of certain component(s) causes other component(s) to become unusable, inaccessible or isolated. Previous studies have shown that FDEP can cause competitions between failure propagation and failure isolation in the time domain. While such competing failure effects have been well addressed in single-phase systems, little work has focused on PMSs, and that work makes the restrictive assumption that a single FDEP group exists in one phase of the mission. Many practical systems (e.g., computer systems and networks), however, may involve multiple FDEP groups during the mission. Moreover, different FDEP groups can be dependent due to sharing some common components; they may appear in a single phase or multiple phases. This paper makes new contributions by modeling and analyzing reliability of PMSs subject to multiple FDEP groups through a Markov chain-based methodology. Propagated failures with both global and selective effects are considered. Four case studies are presented to demonstrate application of the proposed method. - Highlights: • Reliability of phased-mission systems subject to competing failure propagation and isolation effects is modeled. • Multiple independent or dependent functional dependence groups are considered. • Propagated failures with global effects and selective effects are studied. • Four case studies demonstrate generality and application of the proposed Markov-based method.
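
For orientation, here is a minimal sketch of the Markov-chain machinery such methods build on (not the paper's FDEP/competing-failure model): per-phase generators are exponentiated to propagate the state distribution, and states violating a phase's success criterion are dropped at the phase boundary. All rates, durations and criteria below are assumed:

```python
import numpy as np
from scipy.linalg import expm

# States encode (A up, B up) as bits: 0b11 = both up ... 0b00 = both down.
def generator(lam_a, lam_b):
    Q = np.zeros((4, 4))
    for s in range(4):
        if s & 0b10: Q[s, s & 0b01] += lam_a    # A fails
        if s & 0b01: Q[s, s & 0b10] += lam_b    # B fails
        Q[s, s] = -Q[s].sum()
    return Q

# Assumed phases: (duration, per-phase failure rates, good states)
phases = [(10.0, (1e-3, 2e-3), {3}),         # phase 1 needs A AND B
          (5.0,  (2e-3, 2e-3), {1, 2, 3})]   # phase 2 needs A OR B

p = np.array([0.0, 0.0, 0.0, 1.0])           # start with both components up
for duration, (la, lb), good in phases:
    p = p @ expm(generator(la, lb) * duration)      # transient solution
    p = np.array([p[s] if s in good else 0.0 for s in range(4)])

print(f"mission reliability = {p.sum():.6f}")
```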

  6. Reliability of k-out-of-n systems with phased-mission requirements and imperfect fault coverage

    International Nuclear Information System (INIS)

    Xing Liudong; Amari, Suprasad V.; Wang Chaonan

    2012-01-01

    In this paper, an efficient method is proposed for the exact reliability evaluation of k-out-of-n systems with identical components subject to phased-mission requirements and imperfect fault coverage. The system involves multiple, consecutive, and non-overlapping phases of operation, where the k values and failure time distributions of system components can change from phase to phase. The proposed method considers statistical dependencies of component states across phases as well as dynamics in system configuration and success criteria. It also considers the time-varying and phase-dependent failure distributions and associated cumulative damage effects for the system components. The proposed method is based on the total probability law, conditional probabilities and an efficient recursive formula to compute the overall mission reliability with the consideration of imperfect fault coverage. The main advantages of this method are that both its computational time and memory requirements are linear in terms of the system size, and it has no limitation on the type of time-to-failure distributions for the system components. Three examples are presented to illustrate the application and advantages of the proposed method.
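
A hedged sketch of the recursive idea for a single phase: k-out-of-n:G reliability with i.i.d. components and single-point fault coverage c, where an uncovered failure is assumed to crash the whole system. This is a simplified illustration, not the paper's linear-time multi-phase formula:

```python
from functools import lru_cache

def kofn_reliability(n, k, p, c):
    """P(>= k of n i.i.d. components survive and no uncovered failure occurs).

    p -- component reliability over the period considered (assumed)
    c -- fault coverage: a failure is safely handled with probability c,
         otherwise (prob. 1 - c) it is assumed to crash the whole system.
    """
    q = 1.0 - p

    @lru_cache(maxsize=None)
    def r(m, j):
        if j > m:                       # not enough components left
            return 0.0
        if j == 0:                      # enough survivors already; only an
            return (p + q * c) ** m     # uncovered failure can still kill us
        return p * r(m - 1, j - 1) + q * c * r(m - 1, j)

    return r(n, k)

print(kofn_reliability(n=5, k=3, p=0.95, c=0.99))
```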

  7. Competing failure analysis in phased-mission systems with functional dependence in one of phases

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Levitin, Gregory

    2012-01-01

    This paper proposes an algorithm for the reliability analysis of non-repairable phased-mission systems (PMS) subject to competing failure propagation and isolation effects. A failure originating from a system component which causes extensive damage to other system components is a propagated failure. When the propagated failure affects all the system components, causing the entire system failure, a propagated failure with global effect (PFGE) is said to occur. However, the failure propagation can be isolated in systems subject to functional dependence (FDEP) behavior, where the failure of a component (referred to as trigger component) causes some other components (referred to as dependent components) to become inaccessible or unusable (isolated from the system), and thus further failures from these dependent components have no effect on the system failure behavior. On the other hand, if any PFGE from dependent components occurs before the trigger failure, the failure propagation effect takes place, causing the overall system failure. In summary, there are two distinct consequences of a PFGE due to the competition between the failure isolation and failure propagation effects in the time domain. Existing works on such competing failures focus only on single-phase systems. However, many real-world systems are phased-mission systems (PMS), which involve multiple, consecutive and non-overlapping phases of operations or tasks. Consideration of competing failures for PMS is a challenging and difficult task because PMS exhibit dynamics in the system configuration and component behavior as well as statistical dependencies across phases for a given component. This paper proposes a combinatorial method to address the competing failure effects in the reliability analysis of binary non-repairable PMS. The proposed method is verified using a Markov-based method through a numerical example. Different from the Markov-based approach that is limited to exponential distribution, the

  8. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

    Full Text Available We address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to reliability estimation for PMS using the condition monitoring information and degradation data of such systems under dynamic operating scenarios. In this sense, this paper differs from existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability of a population rather than of an individual. In the presented approach, to establish a linkage between the historical data and the real-time information of an individual PMS, we adopt a stochastic filtering model for the phase duration and obtain an updated estimate of the mission time by Bayes' law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time from the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
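
Only the final combination step lends itself to a compact sketch: mission reliability as P(lifetime > mission time), with a first-passage (inverse Gaussian) lifetime standing in for the Brownian degradation model and a lognormal standing in for the updated mission-time estimate. Both distributions and all parameters are assumed, not taken from the paper:

```python
import numpy as np
from scipy import stats

# Lifetime from Brownian degradation: first-passage time -> inverse Gaussian.
# Mission time: stand-in lognormal for the Bayesian-updated estimate.
life = stats.invgauss(mu=1.5, scale=100.0)   # assumed degradation parameters
mission = stats.lognorm(s=0.3, scale=60.0)   # assumed mission-time estimate

t = np.linspace(1e-3, 600.0, 20_000)
dt = t[1] - t[0]
# Mission reliability = P(lifetime > mission time), treated as independent:
reliability = float(np.sum(life.sf(t) * mission.pdf(t)) * dt)
print(f"estimated mission reliability ~ {reliability:.4f}")
```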

  9. BDD-based reliability evaluation of phased-mission systems with internal/external common-cause failures

    International Nuclear Information System (INIS)

    Xing, Liudong; Levitin, Gregory

    2013-01-01

    Phased-mission systems (PMS) are systems in which multiple non-overlapping phases of operations (or tasks) are accomplished in sequence for a successful mission. Examples of PMS abound in applications such as aerospace, nuclear power, and airborne weapon systems. Reliability analysis of a PMS must consider statistical dependence across different phases as well as dynamics in system configuration, failure criteria, and component behavior. This paper proposes a binary decision diagrams (BDD) based method for the reliability evaluation of non-repairable binary-state PMS with common-cause failures (CCF). CCF are simultaneous failures of multiple system elements, which can be caused by some external factors (e.g., lightning strikes, sudden changes in environment) or by propagated failures originating from some elements within the system. Both external and internal CCF are considered in this paper. The proposed method is combinatorial, exact, and is applicable to PMS with arbitrary system structures and component failure distributions. An example with different CCF scenarios is analyzed to illustrate the application and advantages of the proposed method. -- Highlights: ► Non-repairable phased-mission systems with common-cause failures are analyzed. ► Common-cause failures caused by internal or external factors are considered. ► A combinatorial algorithm based on binary decision diagrams is suggested

  10. An efficient phased mission reliability analysis for autonomous vehicles

    International Nuclear Information System (INIS)

    Remenyte-Prescott, R.; Andrews, J.D.; Chung, P.W.H.

    2010-01-01

    Autonomous systems are becoming more commonly used, especially in hazardous situations. Such systems are expected to make their own decisions about future actions when some capabilities degrade due to failures of their subsystems. Such decisions are made without human input, therefore they need to be well-informed in a short time when the situation is analysed and future consequences of the failure are estimated. The future planning of the mission should take account of the likelihood of mission failure. The reliability analysis for autonomous systems can be performed using the methodologies developed for phased mission analysis, where the causes of failure for each phase in the mission can be expressed by fault trees. Unmanned autonomous vehicles (UAVs) are of particular interest in the aeronautical industry, where it is a long term ambition to operate them routinely in civil airspace. Safety is the main requirement for the UAV operation and the calculation of failure probability of each phase and the overall mission is the topic of this paper. When components or subsystems fail or environmental conditions throughout the mission change, these changes can affect the future mission. The new proposed methodology takes into account the available diagnostics data and is used to predict future capabilities of the UAV in real time. Since this methodology is based on the efficient BDD method, the quickly provided advice can be used in making decisions. When failures occur, appropriate actions are required in order to preserve the safety of the autonomous vehicle. The overall decision making strategy for autonomous vehicles is explained in this paper. Some limitations of the methodology are discussed and further improvements are presented based on experimental results.

  11. System Geometries and Transit/Eclipse Probabilities

    Directory of Open Access Journals (Sweden)

    Howard A.

    2011-02-01

    Full Text Available Transiting exoplanets provide access to data to study the mass-radius relation and internal structure of extrasolar planets. Long-period transiting planets allow insight into planetary environments similar to the Solar System where, in contrast to hot Jupiters, planets are not constantly exposed to the intense radiation of their parent stars. Observations of secondary eclipses additionally permit studies of exoplanet temperatures and large-scale exo-atmospheric properties. We show how transit and eclipse probabilities are related to planet-star system geometries, particularly for long-period, eccentric orbits. The resulting target selection and observational strategies represent the principal ingredients of our photometric survey of known radial-velocity planets with the aim of detecting transit signatures (TERMS).
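
The geometric relation referred to above is the standard eccentricity-dependent transit probability; a small sketch (planet and star sizes assumed) shows how periastron orientation boosts the transit chance for eccentric orbits:

```python
import math

R_SUN_AU = 0.00465047   # solar radius in astronomical units

def transit_probability(a_au, e, omega_deg, r_star=1.0, r_planet=0.1):
    """Geometric transit probability, p = (R* + Rp)/a * (1 + e sin w)/(1 - e^2).

    Radii are in solar radii; replacing (1 + e sin w) with (1 - e sin w)
    gives the corresponding secondary-eclipse probability.
    """
    geom = (r_star + r_planet) * R_SUN_AU / a_au
    return geom * (1.0 + e * math.sin(math.radians(omega_deg))) / (1.0 - e**2)

# Assumed long-period planet at 1 AU around a Sun-like star:
print(transit_probability(1.0, 0.0, 90.0))   # circular orbit, ~0.5%
print(transit_probability(1.0, 0.3, 90.0))   # eccentric, periastron in view, ~0.7%
```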

  12. Stochastic response of nonlinear system in probability domain

    Indian Academy of Sciences (India)

    Keywords. Stochastic average procedure; nonlinear single-DOF system; probability density function. Abstract. A stochastic averaging procedure for obtaining the probability density function (PDF) of the response for a strongly nonlinear single-degree-of-freedom system, subjected to both multiplicative and additive random ...

  13. Stochastic response of nonlinear system in probability domain

    Indian Academy of Sciences (India)

    Stochastic average procedure; nonlinear single-DOF system; probability density function. 1. Introduction. Stochastic response analysis of nonlinear systems has been extensively studied in the frequency, time and probability domains. In the frequency domain, the stochastic linearization technique is generally used for ...

  14. The extinction probability in systems randomly varying in time

    Directory of Open Access Journals (Sweden)

    Imre Pázsit

    2017-09-01

    Full Text Available The extinction probability of a branching process (a neutron chain in a multiplying medium) is calculated for a system randomly varying in time. The evolution of the first two moments of such a process was calculated previously by the authors in a system randomly shifting between two states of different multiplication properties. The same model is used here for the investigation of the extinction probability. It is seen that the determination of the extinction probability is significantly more complicated than that of the moments, and it can only be achieved by pure numerical methods. The numerical results indicate that for systems fluctuating between two subcritical or two supercritical states, the extinction probability behaves as expected, but for systems fluctuating between a supercritical and a subcritical state, there is a crucial and unexpected deviation from the predicted behaviour. The results bear some significance not only for neutron chains in a multiplying medium, but also for the evolution of biological populations in a time-varying environment.
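
The fixed-point computation that underlies extinction-probability work can be sketched for the static (non-fluctuating) case; the randomly varying system treated in the record above requires genuinely numerical methods, so the offspring probabilities below are assumed toy numbers. The extinction probability is the smallest root of q = G(q), obtained by iterating from q = 0:

```python
def extinction_probability(pgf, tol=1e-12, max_iter=100_000):
    """Smallest fixed point of q = G(q), found by iterating from q = 0."""
    q = 0.0
    for _ in range(max_iter):
        q_new = pgf(q)
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q

# Assumed offspring distributions for a static fission chain (0, 1 or 2
# secondaries per reaction); mean <= 1 forces extinction probability 1.
sub = lambda q: 0.42 + 0.20 * q + 0.38 * q * q   # mean 0.96, subcritical
sup = lambda q: 0.38 + 0.20 * q + 0.42 * q * q   # mean 1.04, supercritical
print(extinction_probability(sub))   # -> 1.0
print(extinction_probability(sup))   # -> ~0.905
```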

  15. Efficient Simulation of the Outage Probability of Multihop Systems

    KAUST Repository

    Ben Issaid, Chaouki

    2017-10-23

    In this paper, we present an efficient importance sampling estimator for the evaluation of the outage probability of multihop systems with channel state-information-assisted amplify-and-forward relaying. The proposed estimator is endowed with the bounded relative error property. Simulation results show a significant reduction in terms of number of simulation runs compared to naive Monte Carlo.
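
A simplified illustration of the importance-sampling idea (Rayleigh hops, with outage approximated by the minimum per-hop SNR; the estimator in the paper covers more general settings): sampling from a faster exponential makes the rare outage event common, and the likelihood ratio is bounded on the outage event, which is the essence of the bounded relative error property mentioned above:

```python
import numpy as np

rng = np.random.default_rng(0)
means = np.array([100.0, 80.0, 120.0])   # assumed per-hop average SNRs
g_th = 1e-3                              # outage threshold (rare event)

# With Rayleigh fading the per-hop SNRs are exponential, so the minimum
# (a common bound for CSI-assisted AF) is exponential too -> closed form:
rate = np.sum(1.0 / means)
print("exact   :", 1.0 - np.exp(-g_th * rate))

n = 50_000
x = rng.exponential(1.0 / rate, size=n)          # naive Monte Carlo
print("naive MC:", np.mean(x < g_th))            # almost no hits

# Importance sampling from a faster exponential; the likelihood ratio is
# BOUNDED on {y < g_th}, which is what yields bounded relative error.
rate_is = 1.0 / g_th
y = rng.exponential(1.0 / rate_is, size=n)
w = (rate / rate_is) * np.exp((rate_is - rate) * y)
print("IS      :", np.mean(w * (y < g_th)))
```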

  16. Screening for retinitis in children with probable systemic ...

    African Journals Online (AJOL)

    Background. The incidence of immunocompromised children with probable systemic cytomegalovirus (CMV) infection is increasing. Currently, there is no protocol for screening children for CMV retinitis in South Africa. Screening for CMV retinitis may prevent permanent visual impairment. Objectives. To determine the ...

  17. A verification test method for system survival probability assessment models

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

    Subject to the limitations of funding and test conditions, large complex systems can often be tested with only a small number of sub-samples. Under such single-sample conditions, an accurate evaluation of performance is important for the reinforcement of complex systems. The technical maturity of an assessment model can be significantly improved if the model can be validated experimentally. This paper presents a test method for verifying a system survival probability assessment model: sample test results from the test system are used to verify the correctness of the assessment model and of the a priori information. (authors)

  18. Time dependent non-extinction probability for prompt critical systems

    International Nuclear Information System (INIS)

    Gregson, M. W.; Prinja, A. K.

    2009-01-01

    The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)

  19. Error Probability Analysis of Hardware Impaired Systems with Asymmetric Transmission

    KAUST Repository

    Javed, Sidrah

    2018-04-26

    Error probability analysis of hardware-impaired (HWI) systems depends strongly on the adopted model. Recent models have proved that the aggregate noise is equivalent to improper Gaussian signals. Therefore, considering the distinct noise nature and self-interfering (SI) signals, an optimal maximum likelihood (ML) receiver is derived. This renders the conventional minimum Euclidean distance (MED) receiver a sub-optimal receiver, because it is based on the assumptions of ideal hardware transceivers and proper Gaussian noise in communication systems. Next, the average error probability performance of the proposed optimal ML receiver is analyzed, and tight bounds and approximations are derived for various adopted systems, including transmitter and receiver I/Q imbalanced systems with or without transmitter distortions, as well as transmitter-only or receiver-only impaired systems. Motivated by recent studies that shed light on the benefit of improper Gaussian signaling in mitigating the HWIs, asymmetric quadrature amplitude modulation or phase shift keying is optimized and adapted for transmission. Finally, different numerical and simulation results are presented to support the superiority of the proposed ML receiver over the MED receiver, the tightness of the derived bounds, and the effectiveness of asymmetric transmission in dampening HWIs and improving overall system performance.

  20. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = {u(t,x)} that solves a non-linear system of stochastic wave equations in spatial dimensions k ∈ {1, 2, 3}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent β. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of ℝ^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap

  1. A Probability-Based Hybrid User Model for Recommendation System

    Directory of Open Access Journals (Sweden)

    Jia Hao

    2016-01-01

    Full Text Available With the rapid development of information communication technology, the available information or knowledge is exponentially increased, and this causes the well-known information overload phenomenon. This problem is more serious in product design corporations because over half of the valuable design time is consumed in knowledge acquisition, which greatly extends the design cycle and weakens competitiveness. Therefore, recommender systems become very important in the domain of product design. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. This hybrid model utilizes user ratings and item topics or classes, which are available in the domain of product design, to predict the knowledge requirement. The comprehensive analysis of the experimental results shows that the proposed method gains better performance in most of the parameter settings. This work contributes a probability-based method to the community for implementing recommender systems when only user ratings and item topics are available.

  2. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold.

  3. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    On the second iteration, the concept of optimal control function can be implemented to construct a Markov control which allows much better accuracy in the failure probability estimate than the ... Centre for Ships and Ocean Structures (CeSOS), Norwegian University of Science and Technology, NO-7491, Trondheim, Norway ...

  4. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    It is therefore desirable to calculate the approximation of the failure probability functional in order to design a suboptimal control function which allows us to achieve a low variance of the estimator (5). Thus an iterative two-step importance sampling method is presented. (Ivanova & Naess 2004). The procedure uses both ...

  5. Probable existence of a Gondwana transcontinental rift system in ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 126; Issue 6 ... The study indicates a rift system spanning from Arabian plate in the north and extending to southern part of Africa that passes through Indus basin, western part of India and Madagascar, and existed from Late Carboniferous to Early Jurassic.

  6. An array-based study of increased system lifetime probability

    DEFF Research Database (Denmark)

    Nesgaard, Carsten

    2003-01-01

    Society's increased dependence on electronic systems calls for highly reliable power supplies comprised of multiple converters working in parallel. This paper describes a redundancy control scheme, based on the array technology that increases the overall reliability quite considerably and thereby...

  7. An array-based study of increased system lifetime probability

    DEFF Research Database (Denmark)

    Nesgaard, Carsten

    2002-01-01

    Society's increased dependence on electronic systems calls for highly reliable power supplies comprised of multiple converters working in parallel. This paper describes a redundancy control scheme, based on the array technology that increases the overall reliability quite considerably and thereby...

  8. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  9. Probable existence of a Gondwana transcontinental rift system in ...

    Indian Academy of Sciences (India)

    S Mazumder

    2017-08-31

    Aug 31, 2017 ... Permian rifts in the East African countries of South Africa, Kenya, Tanzania and Mozambique are mostly referred to as the Karoo System. South Africa: In the main Karoo Basin in South Africa (figure 5), the Karoo sequence (figure 2) is subdivided into Dwyka Series (Late Carboniferous), Ecca Series (Early ...

  10. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation based on Markov chain Monte Carlo was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. And then the probability of functional failure was estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
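
The core of subset simulation can be sketched on a toy Gaussian problem (not the AP1000 thermal-hydraulic model): the rare failure probability is expressed as a product of conditional probabilities of intermediate events, with a modified Metropolis sampler generating the conditional samples. The limit state, level probability and proposal scale below are all assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

def perf(x):
    """Toy performance function: 'failure' when perf(x) >= target."""
    return x[..., 0] + x[..., 1]

def subset_simulation(n=1000, p0=0.1, target=6.0, dim=2, max_levels=20):
    x = rng.standard_normal((n, dim))           # unconditional samples
    g = perf(x)
    prob = 1.0
    for _ in range(max_levels):
        thresh = np.quantile(g, 1.0 - p0)       # intermediate failure level
        if thresh >= target:                    # target reached directly
            return prob * np.mean(g >= target)
        prob *= p0                              # P(next level | this level)
        seeds = x[g >= thresh]
        x_new, g_new = [], []
        for s in seeds:                         # Metropolis chains targeting
            cur = s.copy()                      # N(0, I) given perf >= thresh
            for _ in range(n // len(seeds)):
                cand = cur + 0.8 * rng.standard_normal(dim)
                ratio = np.exp(0.5 * (cur @ cur - cand @ cand))
                if rng.random() < min(1.0, ratio) and perf(cand) >= thresh:
                    cur = cand
                x_new.append(cur.copy()); g_new.append(perf(cur))
        x, g = np.array(x_new), np.array(g_new)
    return prob * np.mean(g >= target)

# Exact value for this toy case: P(N(0, 2) >= 6) ~ 1.1e-5
print(f"subset simulation estimate: {subset_simulation():.2e}")
```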

  11. Optimisation of Structural Systems by Appropriately Assigning Probabilities of Failure : Application to Rubble Mound Breakwaters

    NARCIS (Netherlands)

    Viet, N.D.; Verhagen, H.J.; Van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2008-01-01

    An appropriate assignment of probabilities of failure to subsystems and components in a structural system can bring a minimum of costs and risk. In this paper, a method for economic optimisation of rubble mound breakwaters using pre-assigned probabilities of failure is presented. Application to a

  12. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Ghana. Draft

    International Nuclear Information System (INIS)

    Guelpa, Jean-Paul; Vogel, Wolfram

    1982-12-01

    The Republic of Ghana has no claimed uranium resources in the categories Reasonably Assured and Estimated Additional. The only occurrences known are within pegmatites and are of no economic importance. The IUREP Orientation Phase Mission to Ghana estimates that the Speculative Resources of the country fall between 15,000 and 40,000 tonnes uranium. The IUREP Orientation Phase Mission to Ghana believes that the Panafrican Mobile Belt has the highest uranium potential of all geological units of the country. The Obosum beds are the priority number two target. A three years exploration programme is recommended for a total cost of US $5,000,000. The Ghana Atomic Energy Commission and the Ghana Geological Survey provide a basic infrastructure for uranium exploration. Any future uranium development in Ghana should be embedded in a well-defined national uranium policy. It is recommended that such a policy be drawn up by the Ghanaian authorities.

  13. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  14. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    Full Text Available In recent years, more and more wireless communications systems are required to provide also a positioning measurement. In code division multiple access (CDMA communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet, MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC, while all other users receive a more reliable data during these quiet periods. Previous research had shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system, in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous, CDMA system.

  15. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  16. Optimisation of Structural Systems by Appropriately Assigning Probabilities of Failure: Application to Rubble Mound Breakwaters

    OpenAIRE

    Viet, N.D.; Verhagen, H.J.; Van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2008-01-01

    An appropriate assignment of probabilities of failure to subsystems and components in a structural system can bring a minimum of costs and risk. In this paper, a method for economic optimisation of rubble mound breakwaters using pre-assigned probabilities of failure is presented. Application to a design case shows that the proposed method is useful in estimating the optimal design variables in a conceptual design.

  17. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Turkey

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) mission to Turkey. The IUREP Orientation Phase mission to Turkey estimates that the Speculative Resources of that country fall within the range of 21 000 to 55 000 tonnes of uranium. This potential is expected to lie in areas of Neogene and possibly other Tertiary sediments, in particular in the areas of the Menderes Massif and Central Anatolia. The mission describes a proposed exploration programme with expenditures over a five year period of between $80 million and $110 million, with nearly half of the amount being spent on drilling. (author)

  18. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods in considering this type of uncertainty are usually computationally intensive and not practical to solve the problem for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user defined structures. A case study illustrates an application in test case prioritization
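
The "equivalent failures and tests" idea admits a compact EM sketch for the simplest masked case: a two-component series system where one test category exercises component 1 alone and another exercises the whole system, recording only system pass/fail. All counts are invented, and the bipartite-graph generality of the paper is not reproduced:

```python
# Two-component series system; category A tests exercise component 1 alone,
# category B tests exercise the series system with only pass/fail recorded.
nA, fA = 200, 6        # assumed tests / failures, component 1 alone
nB, fB = 300, 14       # assumed tests / failures, full system (masked)

p1, p2 = 0.05, 0.05    # initial guesses for component failure probabilities
for _ in range(200):
    q_sys = 1.0 - (1.0 - p1) * (1.0 - p2)   # P(series system fails)
    # E-step: expected ("equivalent") component failures among failed B tests
    e1 = fB * p1 / q_sys
    e2 = fB * p2 / q_sys
    # M-step: binomial MLEs on the completed data
    p1 = (fA + e1) / (nA + nB)
    p2 = e2 / nB
print(f"p1 ~ {p1:.4f}, p2 ~ {p2:.4f}")
```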

  19. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information on the variables is extracted with some pre-sampling of points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. And then the probability of functional failure is estimated with the combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)

  20. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Portugal

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) mission to Portugal. The IUREP Orientation Phase mission to Portugal estimates that the Speculative Resources of that country fall within the range 20,000 to 80,000 tonnes uranium. The majority of this potential is expected to be located in intergranitic vein deposits and in pre-Ordovician schists, but other favourable geological environments include episyenites and Meso-Cainozoic continental sediments. The mission recommends that approximately US$25 million be spent on exploration in Portugal over the next 10 years. The majority of this ($18 million) would be spent on drilling, with a further $7 million on surface surveys and airborne radiometric surveys. It is the opinion of the IUREP Orientation Phase Mission that the considerable funding required for the outlined programme would most suitably be realized by inviting national or foreign commercial organisations to participate in the exploration effort under a partnership or shared production arrangements. (author)

  1. Transient stability probability evaluation of power system incorporating with wind farm and SMES

    DEFF Research Database (Denmark)

    Fang, Jiakun; Miao, Lu; Wen, Jinyu

    2013-01-01

    Large scale renewable power generation brings great challenges to the power system operation and stabilization. Energy storage is one of the most important technologies to face the challenges. This paper proposes a method for transient stability probability evaluation of power system with wind farm...... and SMES. Firstly, a modified 11-bus test system with both wind farm and SMES has been implemented. The wind farm is represented as a doubly fed induction generator (DFIG). Then a stochastic-based approach to evaluate the probabilistic transient stability index of the power system is presented. Uncertain...... the probability indices. With the proposed method based on Monte-Carlo simulation and bisection method, system stability is "measured". Quantitative relationship of penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve...

  2. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was implemented as a desktop solution and adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  3. The role of numeracy and approximate number system acuity in predicting value and probability distortion.

    Science.gov (United States)

    Patalano, Andrea L; Saltiel, Jason R; Machlin, Laura; Barth, Hilary

    2015-12-01

    It is well documented that individuals distort outcome values and probabilities when making choices from descriptions, and there is evidence of systematic individual differences in distortion. In the present study, we investigated the relationship between individual differences in such distortions and two measures of numerical competence, numeracy and approximate number system (ANS) acuity. Participants indicated certainty equivalents for a series of simple monetary gambles, and data were used to estimate individual-level value and probability distortion, using a cumulative prospect theory framework. We found moderately strong negative correlations between numeracy and value and probability distortion, but only weak and non-statistically reliable correlations between ANS acuity and distortions. We conclude that low numeracy contributes to number distortion in decision making, but that approximate number system acuity might not underlie this relationship.
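
A brief sketch of what "distortion" means operationally in the cumulative prospect theory framework mentioned above: with a power value function and the Tversky-Kahneman weighting function (parameter values assumed), a low-probability gain is over-weighted, inflating the certainty equivalent above the expected value:

```python
def w(p, gamma):
    """Tversky-Kahneman inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def certainty_equivalent(x, p, alpha, gamma):
    """CE of 'win x (> 0) with probability p, else 0' under CPT for gains."""
    return (w(p, gamma) * x**alpha) ** (1.0 / alpha)   # invert v(x) = x**alpha

# Undistorted agent (alpha = gamma = 1) prices the gamble at expected value 5;
# typical CPT parameters over-weight the small probability, so CE ~ 10.
print(certainty_equivalent(100.0, 0.05, alpha=1.00, gamma=1.00))
print(certainty_equivalent(100.0, 0.05, alpha=0.88, gamma=0.61))
```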

  4. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at the time t under the condition that it was down at time t' (t' <= t). The limitations for the applicability of the method are also discussed. It has been concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)

  5. Probability characteristics of nonlinear dynamical systems driven by δ-pulse noise

    Science.gov (United States)

    Dubkov, Alexander A.; Rudenko, Oleg V.; Gurbatov, Sergey N.

    2016-06-01

    For a nonlinear dynamical system described by the first-order differential equation with Poisson white noise having exponentially distributed amplitudes of δ pulses, some exact results for the stationary probability density function are derived from the Kolmogorov-Feller equation using the inverse differential operator. Specifically, we examine the "effect of normalization" of non-Gaussian noise by a linear system and the steady-state probability density function of particle velocity in the medium with Coulomb friction. Next, the general formulas for the probability distribution of the system perturbed by a non-Poisson δ-pulse train are derived using an analysis of system trajectories between stimuli. As an example, overdamped particle motion in the bistable quadratic-cubic potential under the action of the periodic δ-pulse train is analyzed in detail. The probability density function and the mean value of the particle position together with average characteristics of the first switching time from one stable state to another are found in the framework of the fast relaxation approximation.
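
The Poisson-pulse dynamics can also be sampled directly; the sketch below simulates the linear ("normalization") case with exponentially distributed pulse amplitudes and checks the stationary mean against ν⟨a⟩. All parameters are assumed, and Poisson arrivals are approximated by a Bernoulli draw per Euler step:

```python
import numpy as np

rng = np.random.default_rng(2)

# Overdamped linear system dx/dt = -x + Poisson delta-pulse train with
# exponentially distributed (positive) amplitudes; all parameters assumed.
nu, mean_amp, dt, steps = 2.0, 0.5, 1e-2, 500_000
x, samples = 0.0, []
for i in range(steps):
    x -= x * dt                              # relaxation between pulses
    if rng.random() < nu * dt:               # Bernoulli approx. of Poisson
        x += rng.exponential(mean_amp)       # instantaneous jump
    if i % 10 == 0:
        samples.append(x)

samples = np.asarray(samples)
# Stationary mean for the linear case is nu * <amplitude> / (relaxation rate):
print("sample mean:", samples.mean(), " theory:", nu * mean_amp)
```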

  6. On the average capacity and bit error probability of wireless communication systems

    KAUST Repository

    Yilmaz, Ferkan

    2011-12-01

    Analysis of the average binary error probabilities and average capacity of wireless communications systems over generalized fading channels have been considered separately in the past. This paper introduces a novel moment generating function-based unified expression for both the average binary error probabilities and the average capacity of single and multiple link communication with maximal ratio combining. It is worth noting that the generic unified expression offered in this paper can be easily calculated and is applicable to a wide variety of fading scenarios; the mathematical formalism is illustrated with the generalized Gamma fading distribution in order to validate the correctness of our newly derived results. © 2011 IEEE.
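
The MGF-based route can be checked numerically in the simplest case, BPSK over Rayleigh fading, where the exponential SNR gives M(s) = 1/(1 - s·γ̄) and a known closed form exists. This is an illustration of the general approach under assumed parameters, not the paper's generalized-Gamma derivation:

```python
import numpy as np

def avg_bpsk_ber_mgf(snr_avg, n=100_000):
    """(1/pi) * int_0^{pi/2} M(-1/sin^2 t) dt with M(s) = 1/(1 - s*snr_avg)."""
    t = (np.arange(n) + 0.5) * (np.pi / 2) / n        # midpoint rule
    return np.mean(1.0 / (1.0 + snr_avg / np.sin(t) ** 2)) / 2.0

def avg_bpsk_ber_closed(snr_avg):
    """Known Rayleigh-fading closed form, used as a cross-check."""
    return 0.5 * (1.0 - np.sqrt(snr_avg / (1.0 + snr_avg)))

for g in (1.0, 10.0, 100.0):   # assumed average SNRs
    print(g, avg_bpsk_ber_mgf(g), avg_bpsk_ber_closed(g))
```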

  7. Spontaneous emission and scattering in a two-atom system: Conservation of probability and energy

    International Nuclear Information System (INIS)

    Berman, P. R.

    2007-01-01

    An explicit calculation of conservation of probability and energy in a two-atom system is presented. One of the atoms is excited initially and undergoes spontaneous emission. The field radiated by this atom can be scattered by the second atom. It is seen that the Weisskopf-Wigner approximation must be applied using a specific prescription to guarantee conservation of probability and energy. Moreover, for consistency, it is necessary to take into account the rescattering by the source atom of radiation scattered by the second atom

  8. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

    An efficient method for estimating low first passage probabilities of high-dimensional nonlinear systems based on asymptotic estimation of low probabilities is presented. The method does not require any a priori knowledge of the system, i.e. it is a black-box method, and has very low requirements on the system memory. Consequently, high-dimensional problems can be handled, and nonlinearities in the model neither bring any difficulty in applying it nor lead to considerable reduction of its efficiency. These characteristics suggest that the method is a powerful candidate for complicated problems. The first passage probability of the wind turbine model is estimated down to very low values; this demonstrates the efficiency and power of the method on a realistic high-dimensional highly nonlinear system.

  9. An optimized Line Sampling method for the estimation of the failure probability of nuclear passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2010-01-01

    The quantitative reliability assessment of a thermal-hydraulic (T-H) passive safety system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. In this work, Line Sampling (LS) is adopted for efficient MC sampling. In the LS method, an 'important direction' pointing towards the failure domain of interest is determined and a number of conditional one-dimensional problems are solved along such direction; this allows for a significant reduction of the variance of the failure probability estimator, with respect, for example, to standard random sampling. Two issues are still open with respect to LS: first, the method relies on the determination of the 'important direction', which requires additional runs of the T-H code; second, although the method has been shown to improve the computational efficiency by reducing the variance of the failure probability estimator, no evidence has been given yet that accurate and precise failure probability estimates can be obtained with a number of samples reduced to below a few hundreds, which may be required in case of long-running models. The work presented in this paper addresses the first issue by (i) quantitatively comparing the efficiency of the methods proposed in the literature to determine the LS important direction; (ii) employing artificial neural network (ANN) regression models as fast-running surrogates of the original, long-running T-H code to reduce the computational cost associated to the

  10. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs, which are encountered in an actual use. The test inputs for the safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. Input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on the typical plant data. The proposed method in this study is expected to provide a simple but realistic mean to quantify the software failure probability based on input profile and system dynamics.

  11. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  12. Comparison of Cerebral Glucose Metabolism between Possible and Probable Multiple System Atrophy

    Directory of Open Access Journals (Sweden)

    Kyum-Yil Kwon

    2009-05-01

    Full Text Available Background: To investigate the relationship between presenting clinical manifestations and imaging features of multisystem neuronal dysfunction in MSA patients, using 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET). Methods: We studied 50 consecutive MSA patients with characteristic brain MRI findings of MSA, including 34 patients with early MSA-parkinsonian (MSA-P) and 16 with early MSA-cerebellar (MSA-C) type. The cerebral glucose metabolism of all MSA patients was evaluated in comparison with 25 age-matched controls. 18F-FDG PET results were assessed by Statistical Parametric Mapping (SPM) analysis and the regions of interest (ROI) method. Results: The mean time from disease onset to 18F-FDG PET was 25.9±13.0 months in 34 MSA-P patients and 20.1±11.1 months in 16 MSA-C patients. Glucose metabolism of the putamen showed a greater decrease in possible MSA-P than in probable MSA-P (p=0.031). Although the Unified Multiple System Atrophy Rating Scale (UMSARS) score did not differ between possible MSA-P and probable MSA-P, the subscores of rigidity (p=0.04) and bradykinesia (p=0.008) were significantly higher in possible MSA-P than in probable MSA-P. Possible MSA-C showed a greater decrease in glucose metabolism of the cerebellum than probable MSA-C (p=0.016). Conclusions: Our results may suggest that the early neuropathological pattern of possible MSA with a predilection for the striatonigral or olivopontocerebellar system differs from that of probable MSA, which has prominent involvement of the autonomic nervous system in addition to the striatonigral or olivopontocerebellar system.

  13. Automatic Monitoring System Design and Failure Probability Analysis for River Dikes on Steep Channel

    Science.gov (United States)

    Chang, Yin-Lung; Lin, Yi-Jun; Tung, Yeou-Koung

    2017-04-01

    The purposes of this study include: (1) designing an automatic monitoring system for river dikes; and (2) developing a framework which enables the determination of dike failure probabilities for various failure modes during a rainstorm. The historical dike failure data collected in this study indicate that most dikes in Taiwan collapsed under the 20-year return period discharge, which means the probability of dike failure is much higher than that of overtopping. We installed the dike monitoring system on the Chiu-She Dike, which is located on the middle reach of the Dajia River, Taiwan. The system includes: (1) vertically distributed pore water pressure sensors in front of and behind the dike; (2) Time Domain Reflectometry (TDR) to measure the displacement of the dike; (3) a wireless floating device to measure the scouring depth at the dike toe; and (4) a water level gauge. The monitoring system recorded the variation of pore pressure inside the Chiu-She Dike and the scouring depth during Typhoon Megi. The recorded data showed that the highest groundwater level inside the dike occurred 15 hours after the peak discharge. We developed a framework which accounts for the uncertainties in return period discharge, Manning's n, scouring depth, soil cohesion, and friction angle, and enables the determination of dike failure probabilities for failure modes such as overtopping, surface erosion, mass failure, toe sliding and overturning. The framework was applied to the Chiu-She, Feng-Chou, and Ke-Chuang Dikes on the Dajia River. The results indicate that toe sliding or overturning has a higher probability than the other failure modes. Furthermore, the overall failure probability (integrating the different failure modes) reaches 50% under the 10-year return period flood, which agrees with the historical failure data for the study reaches.
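
    A minimal sketch of such a multi-mode framework: uncertain inputs are sampled, each failure mode is expressed as a limit-state function g (failure when g < 0), and the overall failure probability integrates over the modes. The distributions and limit states below are illustrative placeholders, not the study's calibrated models for the Dajia River dikes.

```python
import numpy as np

# Monte Carlo integration of several dike failure modes; all parameter
# distributions and limit states are hypothetical stand-ins.
rng = np.random.default_rng(1)
n = 100_000

discharge = rng.lognormal(mean=6.0, sigma=0.4, size=n)     # flood discharge (m3/s)
scour = rng.gamma(shape=2.0, scale=1.0, size=n)            # toe scour depth (m)
cohesion = rng.normal(20.0, 4.0, size=n)                   # soil cohesion (kPa)

g_overtop = 900.0 - discharge                # fails if discharge exceeds capacity
g_toe     = 4.0 - scour                      # fails if scour exceeds foundation depth
g_mass    = cohesion - 0.02 * discharge      # fails if driving load exceeds strength

fail_any = (g_overtop < 0) | (g_toe < 0) | (g_mass < 0)
for name, g in [("overtopping", g_overtop), ("toe scour", g_toe), ("mass failure", g_mass)]:
    print(f"P({name}) ~ {np.mean(g < 0):.4f}")
print(f"P(overall failure) ~ {fail_any.mean():.4f}")
```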

  14. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Thailand

    International Nuclear Information System (INIS)

    1985-01-01

    The IUREP Orientation Phase Mission assesses the Speculative Uranium Resources of Thailand to be within the range of 1 500 to 38 500 tonnes U. Geological environments considered by the Mission to be favourable for uranium occurrences include the following: sandstones of Jurassic to Triassic age; Tertiary sedimentary basins (northern Thailand); Tertiary sedimentary basins (southern Thailand); environments associated with fluorite deposits; granitic rocks; black shales and graphitic slates of the Palaeozoic; environments associated with sedimentary phosphate deposits; and environments associated with monazite sands. Physical conditions in Thailand, including a wet tropical climate, dense forest growth, rugged terrain in some areas and relative inaccessibility, make exploration difficult and costly. Detailed topographic and geological maps and other basic data are not readily accessible, and this lack of availability is a severe constraint on systematic exploration. The lack of skilled personnel experienced in uranium studies and the low level of technical support are a serious hindrance to exploration in Thailand. (author)

  15. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Turkey. September to November 1980

    International Nuclear Information System (INIS)

    Ziehr, H.; Komura, A.

    1985-02-01

    The IUREP Orientation Phase Mission to Turkey estimates the Speculative Resources of the country to lie between 21 000 and 55 000 tonnes uranium. Past exploration in Turkey, dating from 1953, has indicated a very large number of uranium occurrences and radioactive anomalies, but ore deposits of significant size and grade have not been found. Present reserves amount to 4 600 tonnes uranium, which can be allocated to approximately 15 sandstone-type deposits in Neogene continental sediments. Several hundred other occurrences and radioactive anomalies exist where ore reserves have not been delineated. The uranium occurrences and radioactive anomalies can be divided according to host rock into (a) crystalline massifs and (b) Tertiary continental sediments. The greatest geological potential for further resources is estimated to exist in these two geological terrains, the most favourable being the Neogene continental sedimentary basins near the crystalline massifs. Because surface exploration in the known favourable areas such as the Koepruebasi Basin has been so systematic, extensive, and successful, it is improbable that additional surface work will have much effect in increasing the number of new radioactive anomalies or uranium occurrences detected at the surface in these areas. Surface survey work in these areas should be mainly designed to assist the understanding of structures at depth. Surface reconnaissance survey work is, however, required in other parts of the two geological terrains mentioned above. Before starting such a reconnaissance survey in new areas, the Mission suggests that a careful and extensive library study be conducted in close co-operation with sedimentologists, petrologists, and remote sensing specialists. The Mission suggests that in the medium term, 8 to 10 years, some 85 - 110 million U.S. Dollars be spent on airborne and ground surveys, including geological, radiometric, geochemical, and

  16. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Venezuela

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) mission to Venezuela. The IUREP Orientation Phase mission to Venezuela estimates that the Speculative Resources of that country fall within the range of 2,000 to 42,000 tonnes uranium. The majority of this potential is expected to be located in the Precambrian crystalline and sedimentary rocks of the Guayana Shield. Other potentially favorable geologic environments include Cretaceous phosphorite beds, continental sandstones and granitic rocks. The mission recommends that approximately US $18 million be spent on exploration in Venezuela over the next five years. The majority of this expenditure would be for surface surveys utilizing geologic studies, radiometric and geochemical surveys and some drilling for geologic information. Additional drilling would be required later to substantiate preliminary findings. (author)

  17. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Uganda

    International Nuclear Information System (INIS)

    1985-01-01

    A full report has been compiled describing the findings of the International Uranium Resources Evaluation Project (IUREP) Orientation Phase Mission to Uganda. The Mission suggests that the speculative uranium resources of the country could be within the very wide range of 0 to 105 000 tonnes of uranium metal. The Mission finds that most of these speculative resources are related to Proterozoic unconformities and to Cenozoic sandstones of the Western Rift Valley. Some potential is also associated with post-tectonic granites. The Mission recommends rehabilitating the Geological Survey of Uganda in order to enable it to conduct and support a uranium exploration programme for unconformity-related and sandstone-hosted uranium deposits. Recommended exploration methods encompass geological mapping and compilation, an airborne gamma-ray spectrometer survey north of 1 deg. North latitude, stream sediment sampling, and ground scintillometric surveys in favourable areas. Follow-up work should include VLF-EM surveys, emanometry and drilling. (author)

  18. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Colombia

    International Nuclear Information System (INIS)

    1984-01-01

    A full report has been released describing the findings of the International Uranium Resources Evaluation Project (IUREP) Orientation Phase Mission to Colombia. The Mission suggests that the speculative uranium resources of the country could be within the very wide range of 20 000 to 220 000 tonnes of uranium metal. The Mission finds that the area with the highest potential is the Llanos Orientales (Interior Zone), which has the potential of hosting quartz-pebble conglomerate deposits, Proterozoic unconformity-related deposits and sandstone deposits. The Mission recommends that approximately US$80 million be expended in a phased ten-year exploration programme. It is likely that the majority of the funds will be needed for drilling, followed by ground surveys and airborne radiometry. It is the opinion of the Mission that the considerable funds required for the proposed programme could most suitably be raised by inviting national or foreign commercial organizations to participate under a shared production agreement. (author)

  19. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Burundi

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Burundi. The IUREP Orientation Phase Mission to Burundi estimates that the Speculative Resources of that country fall within the range of 300 to more than 4 100 tonnes of uranium. The potential is rather evenly distributed throughout the Proterozoic of Burundi in various geological environments (unconformity, hydrothermal, fault-controlled, etc.). The mission recommends that over a period of five years U.S. $3 to 4.5 million be spent on exploration in Burundi, with spending spread evenly across the various exploration techniques, e.g. prospecting, drilling, trenching, geophysical surveys and analyses. (author)

  20. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Bolivia

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) mission to Bolivia. The IUREP Orientation Phase mission to Bolivia estimates that the Speculative Uranium Resources of that country fall within the range of 100 to 107 500 tonnes uranium. The majority of this potential is expected to be located in the Precambrian crystalline and sedimentary rocks of the southwestern part of the Central Brazilian Shield. Other potentially favourable geologic environments include Palaeozoic two mica granites and their metasedimentary hosts, Mesozoic granites and granodiorites as well as the intruded formations and finally Tertiary acid to intermediate volcanics. The mission recommends that approximately US$ 13 million be spent on exploration in Bolivia over a five-year period. The majority of this expenditure would be for airborne and surface exploration utilising geologic, magnetometric, radiometric, and geochemical methods and some pitting, trenching, tunneling and drilling to further evaluate the discovered occurrences. (author)

  1. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Cameroon

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Cameroon. The IUREP Orientation Phase Mission to Cameroon estimates the Speculative Resources of that country to be in the order of 10 000 tonnes uranium for syenite-associated uranium deposits in southern Cameroon, and in the order of 5 000 tonnes uranium for uranium deposits associated with albitized and desilicified late-tectonic Panafrican granites (episyenite) and Paleozoic volcanics in northern Cameroon. No specific tonnage is given for the Francevillian equivalents (DJA-Series) or for the Mesozoic and Cenozoic sedimentary basins, which are thought to hold limited potential for sandstone-hosted uranium. However, the Douala basin, consisting of mixed marine and continental sequences, merits some attention. No specific budget and programme for uranium exploration are proposed for Cameroon. Instead, specific recommendations concerning particular potential environments and general recommendations concerning exploration methodology are made. (author)

  2. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Madagascar

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been made public which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Madagascar. The IUREP Orientation Phase Mission to Madagascar estimates the Speculative Resources of that country to be within the wide range of 4 000 to 38 000 tonnes uranium. Such resources could lie in areas with known occurrences (uranothorianite at Ft. Dauphin, up to 5 000 t U in 'pegmatoids'; uranocircite at Antsirabe, up to 3 000 t U in Neogene sediments; carnotite-autunite in the Karoo area, up to 30 000 t U in sandstones) and in areas with as yet untested environments (e.g. related to unconformities and calcretes). Modifications to existing uranium exploration programmes are suggested and policy alternatives reviewed. No specific budget is proposed. (author)

  3. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Somalia

    International Nuclear Information System (INIS)

    Levich, Robert A.; Muller-Kahle, Eberhard

    1983-04-01

    The IUREP Orientation Phase Mission to Somalia suggests that in addition to the reasonably assured resources (RAR) of 5 000 t uranium and estimated additional resources (EAR) of 11 000 t uranium in calcrete deposits, the speculative resources (SR) could be within the wide range of 0 - 150 000 t uranium. The majority of these speculative resources are related to sandstone and calcrete deposits. The potential for magmatic hydrothermal deposits is relatively small. The Mission recommends an exploration programme of about US $22 000 000 to test the uranium potential of the country, which is thought to be excellent. The Mission also suggests a reorganization of the Somalia Geological Survey in order to improve its efficiency. Recommended methods include geological mapping, Landsat imagery interpretation, airborne and ground scintillometer surveys, and geochemistry. Follow-up radiometric surveys, exploration geophysics, mineralogical studies, trenching and drilling are proposed in favourable areas.

  4. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Rwanda

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Rwanda. The IUREP Orientation Phase Mission to Rwanda estimates that the Speculative Resources of that country fall within the range of 500 to 5 000 tonnes of uranium. The majority of this potential is expected to be located in the Precambrian Ruzizian, especially in conjunction with tectonized pegmatoidal remobilizations of metamorphic sediments of western Rwanda. Other favourable geological environments include lamprophyric dikes and post-tectonic granites of central Rwanda. The Mission recommends that over a period of five years approximately US$4.2 million be spent on exploration in Rwanda. The majority of this would be spent on airborne and ground geophysical surveys ($1.5 million) and exploration drilling ($1 million). Prospecting, trenching, tunneling and analytical work would require the remainder of the $4.2 million ($1.7 million). (author)

  5. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Peru

    International Nuclear Information System (INIS)

    1984-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Peru. The IUREP Orientation Phase Mission to Peru estimates that the Speculative Resources of that country fall within the range of 6 000 to 11 000 tonnes uranium. The majority of this potential is expected to be located in Late Tertiary ignimbrites and associated sediments in the high Andes of southern Peru. Other favourable geological environments include calcretes, developed from Tertiary volcanogenic sources over the Precambrian in the Pacific coastal desert of southern Peru, and Hercynian subvolcanic granites in the eastern Cordillera of southern Peru. The Mission recommends that over a period of five years approximately U.S. $10 million be spent on exploration in Peru. The majority of this would be spent on drilling ($5 million) and tunnelling ($2 million), with an additional $3 million on surface and airborne radiometric surveys. (author)

  6. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Ghana

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) Mission to Ghana. The IUREP Orientation Phase Mission to Ghana estimates that the Speculative Resources of that country fall within the range of 15 000 to 40 000 tonnes of uranium. The majority of this potential is expected to be located in the Proterozoic Panafrican Mobile Belt (up to 17 000 tonnes uranium) and the Paleozoic Obosum Beds of the Voltaian basin (up to 15 000 tonnes uranium), the remainder being associated with various other geological environments. The mission recommends that over a period of three years approximately US $5 million be spent on exploration in Ghana. A major part of this (US $2 million) would be spent on an airborne spectrometer survey over the Voltaian basin (Obosum Beds), much of the remainder being spent on ground surveys, trenching and percussion drilling. (author)

  7. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Directed design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on observed occurrences (hits and misses). The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. Applications of DOEPOD for supporting inspector qualification are also included.
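
    The 90/95 criterion has a simple binomial reading that can be checked directly from hit/miss counts at a given flaw size. The sketch below (an illustration of the criterion, not the DOEPOD software itself) uses a one-sided Clopper-Pearson lower bound; it reproduces the classic result that 29 hits in 29 trials just demonstrates 90/95 POD.

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided Clopper-Pearson lower confidence bound on POD
    from hit/miss data at a given flaw size."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# Classic benchmark: 29 hits in 29 trials just clears the 90/95 criterion.
print(pod_lower_bound(29, 29))   # ~0.902 > 0.90
print(pod_lower_bound(28, 29))   # one miss drops the bound below 0.90
```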

  8. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  9. Probable Griseofulvin-Induced Drug Reaction with Eosinophilia and Systemic Symptoms in a Child.

    Science.gov (United States)

    Smith, Robert J; Boos, Markus D; McMahon, Patrick

    2016-09-01

    A 9-year-old boy presented with fever, rash, anterior cervical lymphadenopathy, elevated liver enzymes, atypical lymphocytosis, and eosinophilia (drug reaction with eosinophilia and systemic symptoms [DRESS]). His history was notable for having taken griseofulvin for 3 weeks prior to the onset of these findings. He improved after treatment with oral prednisone. We present a rare case of probable DRESS secondary to griseofulvin. © 2016 Wiley Periodicals, Inc.

  10. Deconvolution Filtering for Nonlinear Stochastic Systems with Randomly Occurring Sensor Delays via Probability-Dependent Method

    Directory of Open Access Journals (Sweden)

    Yuqiang Luo

    2013-01-01

    This paper deals with a robust H∞ deconvolution filtering problem for discrete-time nonlinear stochastic systems with randomly occurring sensor delays. The delayed measurements are assumed to occur in a random way characterized by a random variable sequence following the Bernoulli distribution with time-varying probability. The purpose is to design an H∞ deconvolution filter such that, for all admissible randomly occurring sensor delays, nonlinear disturbances, and external noises, the input signal distorted by the transmission channel can be recovered to a specified extent. By utilizing a Lyapunov functional constructed to depend on the time-varying probability parameters, the desired sufficient criteria are derived. The proposed H∞ deconvolution filter parameters include not only the fixed gains obtained by solving a convex optimization problem but also the online measurable time-varying probability. When the time-varying sensor delays occur randomly with a time-varying probability sequence, the proposed gain-scheduled filtering algorithm is very effective. The obtained design algorithm is finally verified by simulation examples.
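
    For intuition, the measurement model usually assumed in this line of work can be simulated in a few lines: at each step the sensor delivers, with time-varying probability p_k, the previous sample instead of the current one. The signal and noise models below are illustrative, not the paper's.

```python
import numpy as np

# Simulate randomly occurring sensor delays: y_k is the delayed sample x_{k-1}
# with Bernoulli probability p_k, otherwise the current sample x_k.
rng = np.random.default_rng(0)
T = 200
x = np.cumsum(rng.normal(0, 0.1, T))                      # latent signal (toy model)
p = 0.2 + 0.1 * np.sin(2 * np.pi * np.arange(T) / 50)     # time-varying delay probability

delta = rng.random(T) < p                                 # Bernoulli delay indicators
y = np.where(delta, np.roll(x, 1), x) + rng.normal(0, 0.05, T)
y[0] = x[0]                                               # no earlier sample to delay from

print(f"empirical delay rate: {delta[1:].mean():.3f}")
```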

  11. Personnel reliability impact on petrochemical facilities monitoring system's failure skipping probability

    Science.gov (United States)

    Kostyukov, V. N.; Naumenko, A. P.

    2017-08-01

    The paper addresses the pressing issue of evaluating the impact of operator actions on the safe operation of complex technological systems, considering the application of condition monitoring systems to elements and sub-systems of petrochemical production facilities. The main task of the research is to identify factors and criteria describing monitoring system properties that allow the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for petrochemical machinery to be evaluated, and to find objective criteria for monitoring system class that take the human factor into account. On the basis of the real-time condition monitoring concepts of sudden failure skipping risk and static and dynamic error, one may evaluate the impact of personnel qualification on monitoring system operation in terms of errors in operators' actions while receiving information from monitoring systems and operating a technological system. The operator is considered part of the technological system. Personnel behaviour is usually characterized as a combination of the following stages: input signal (information perception), reaction (decision making), and response (decision implementation). Based on several studies of nuclear power station operator behaviour in the USA, Italy and other countries, as well as on research by Russian scientists, data on operator reliability were selected for the analysis of operator behaviour with diagnostics and monitoring systems at technological facilities. The calculations revealed that for the monitoring system selected as an example, the failure skipping risk for the set values of static (less than 0.01) and dynamic (less than 0.001) errors, considering all related factors of data on reliability of information perception, decision-making, and reaction, is 0.037, in case when all the facilities and error probability are under

  12. Efficient Geometric Probabilities of Multi-Transiting Exoplanetary Systems from CORBITS

    Science.gov (United States)

    Brakensiek, Joshua; Ragozzine, Darin

    2016-04-01

    NASA's Kepler Space Telescope has successfully discovered thousands of exoplanet candidates using the transit method, including hundreds of stars with multiple transiting planets. In order to estimate the frequency of these valuable systems, it is essential to account for the unique geometric probabilities of detecting multiple transiting extrasolar planets around the same parent star. In order to improve on previous studies that used numerical methods, we have constructed an efficient, semi-analytical algorithm called the Computed Occurrence of Revolving Bodies for the Investigation of Transiting Systems (CORBITS), which, given a collection of conjectured exoplanets orbiting a star, computes the probability that any particular group of exoplanets can be observed to transit. The algorithm applies theorems of elementary differential geometry to compute the areas bounded by circular curves on the surface of a sphere. The implemented algorithm is more accurate and orders of magnitude faster than previous algorithms, based on comparisons with Monte Carlo simulations. We use CORBITS to show that the present solar system would only show a maximum of three transiting planets, but that this varies over time due to dynamical evolution. We also use CORBITS to geometrically debias the period ratio and mutual Hill sphere distributions of Kepler's multi-transiting planet candidates, which shifts these distributions toward slightly larger values. In an appendix, we present additional semi-analytical methods for determining the frequency of exoplanet mutual events, i.e., the geometric probability that two planets will transit each other (planet-planet occultation, relevant to transiting circumbinary planets) and the probability that this transit occurs simultaneously as they transit their star. The CORBITS algorithms and several worked examples are publicly available.
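
    The brute-force alternative that CORBITS replaces can be written down directly: sample isotropic observer directions and test, for each planet, whether the line of sight lies close enough to the orbital plane. For a circular orbit this reduces to |u · n| < R*/a, whose single-planet probability is the familiar R*/a. The sketch below (assumed circular orbits, hypothetical parameters) estimates single and joint transit probabilities by Monte Carlo.

```python
import numpy as np

# Geometric transit probabilities by Monte Carlo over isotropic lines of
# sight; a planet on an orbit with unit normal n transits when |u . n| < R_star/a.
rng = np.random.default_rng(42)
n_obs = 1_000_000
u = rng.normal(size=(n_obs, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)        # isotropic observer directions

R_star = 0.005                                        # stellar radius in AU (~Sun)

def orbit_normal(inclination_deg):
    i = np.radians(inclination_deg)
    return np.array([0.0, np.sin(i), np.cos(i)])

a1, n1 = 0.05, orbit_normal(0.0)                      # inner planet
a2, n2 = 0.10, orbit_normal(2.0)                      # outer planet, 2 deg mutual inclination

t1 = np.abs(u @ n1) < R_star / a1
t2 = np.abs(u @ n2) < R_star / a2
print(f"P(inner transits) ~ {t1.mean():.4f}  (analytic {R_star/a1:.4f})")
print(f"P(outer transits) ~ {t2.mean():.4f}  (analytic {R_star/a2:.4f})")
print(f"P(both transit)   ~ {(t1 & t2).mean():.4f}")
```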

  13. Development of a Nonlinear Probability of Collision Tool for the Earth Observing System

    Science.gov (United States)

    McKinley, David P.

    2006-01-01

    The Earth Observing System (EOS) spacecraft Terra, Aqua, and Aura fly in constellation with several other spacecraft in 705-kilometer mean altitude sun-synchronous orbits. All three spacecraft are operated by the Earth Science Mission Operations (ESMO) Project at Goddard Space Flight Center (GSFC). In 2004, the ESMO project began assessing the probability of collision of the EOS spacecraft with other space objects. In addition to conjunctions with high relative velocities, the collision assessment method for the EOS spacecraft must address conjunctions with low relative velocities during potential collisions between constellation members. Probability of collision algorithms that are based on assumptions of high relative velocities and linear relative trajectories are not suitable for these situations; therefore an algorithm for handling nonlinear relative trajectories was developed. This paper describes this algorithm and presents results from its validation for operational use. The probability of collision is typically calculated by integrating a Gaussian probability distribution over the volume swept out by a sphere representing the size of the space objects involved in the conjunction. The radius of this sphere is defined as the Hard Body Radius. With the assumption of linear relative trajectories, this volume is a cylinder, which translates into simple limits of integration for the probability calculation. For the case of nonlinear relative trajectories, the volume becomes a complex geometry. However, with an appropriate choice of coordinate systems, the new algorithm breaks down the complex geometry into a series of simple cylinders that have simple limits of integration. This nonlinear algorithm is discussed in detail in the paper. The nonlinear probability of collision algorithm was first verified by showing that, when used in high relative velocity cases, it yields similar answers to existing high relative velocity linear relative trajectory algorithms. The
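
    The integral described here is easy to approximate by Monte Carlo in the linear (high relative velocity) case, where the swept volume projects to a circle of one Hard Body Radius in the encounter plane; the nonlinear algorithm generalizes the geometry, not this underlying idea. The miss distance and covariance below are illustrative.

```python
import numpy as np

# Linear Pc: probability mass of the Gaussian relative-position density
# falling inside the hard-body circle in the encounter plane.
def linear_pc(miss, cov, hard_body_radius, n=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean=miss, cov=cov, size=n)
    return np.mean(np.linalg.norm(samples, axis=1) < hard_body_radius)

miss = np.array([120.0, 40.0])                        # projected miss distance (m)
cov = np.array([[200.0**2, 0.0], [0.0, 80.0**2]])     # encounter-plane covariance (m^2)
print(f"Pc ~ {linear_pc(miss, cov, hard_body_radius=20.0):.2e}")
```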

  14. Outage Probability and Ergodic Capacity of Spectrum-Sharing Systems with MRC Diversity

    Science.gov (United States)

    Jarrouj, Jiana; Blagojevic, Vesna; Ivanis, Predrag

    2016-03-01

    The spectrum-sharing system employing maximum ratio combining (MRC) is analyzed in a Nakagami fading environment, for the case when interference from the primary user is present at the input of the secondary user receiver. Closed-form expressions for the probability density function of the signal-to-interference-and-noise ratio, the outage probability and the ergodic capacity of the secondary user link are derived under both peak interference and maximum transmit power constraints. Asymptotic expressions are provided for the important region where the peak interference power constraint dominates and for the case when the interference from the primary user is dominant compared to the noise at the secondary user's receiver. The obtained expressions are presented for both cases of outdated and mean-value-based power allocation and verified using the Monte Carlo simulation method.
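
    A Monte Carlo cross-check of this kind of result is straightforward, since Nakagami-m power gains are Gamma distributed and MRC simply sums the branch gains. The sketch below (illustrative parameters, unit-mean channels) estimates the secondary-user outage probability under both the peak-interference and maximum-transmit-power constraints.

```python
import numpy as np

# Outage probability of a spectrum-sharing secondary link with L-branch MRC
# in Nakagami-m fading; all parameter values are illustrative.
rng = np.random.default_rng(7)
n = 1_000_000
m, L = 2.0, 3                                   # Nakagami parameter, MRC branches

def nakagami_power(size):                        # channel power gain, unit mean
    return rng.gamma(shape=m, scale=1.0 / m, size=size)

Q, Pmax, Pp, N0, gamma_th = 1.0, 5.0, 1.0, 0.1, 1.0

g_sp = nakagami_power(n)                         # SU-Tx -> PU-Rx link
g_d = nakagami_power((n, L)).sum(axis=1)         # MRC-combined SU link
h_p = nakagami_power(n)                          # PU-Tx -> SU-Rx interference link

Pt = np.minimum(Pmax, Q / g_sp)                  # power under both constraints
sinr = Pt * g_d / (Pp * h_p + N0)
print(f"outage probability ~ {(sinr < gamma_th).mean():.4f}")
```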

  15. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Sudan. February-March 1981

    International Nuclear Information System (INIS)

    Kneupper, G.; Scivetti, N.

    1981-01-01

    The IUREP Orientation Phase Mission to the Democratic Republic of the Sudan believes that the Speculative Resources of the country might fall between 20,000 and 40,000 tonnes uranium, or more. This indicates that the Speculative Resources of the Sudan could be significantly higher than previously estimated (7,500 tonnes uranium) by the NEA/IAEA Steering Group on Uranium Resources - IUREP Phase I. The Government is willing to consider valid exploration programmes presented by prospective partners as long as they serve the interests of both parties. Within the general six-year (1977/78-1982/83) plan for development of the country's mineral resources, the Ministry of Energy and Mining has set up certain priorities which it would like to see expeditiously implemented: uranium exploration and production stands high on the list of priorities. On the basis of the very limited information on regional geology and on previous exploration which was available to the Mission, it is estimated that the greatest potential for Speculative Resources of possible economic significance will prove to occur in the following geological environments of the Sudan (the Red Sea Hills area is not included): the Precambrian basement complex, the Palaeozoic-Mesozoic-Tertiary sedimentary basins and the Tertiary to Recent calcretes. The IUREP Orientation Phase Mission believes that some 20 million US$ (a very rough estimate) will be needed to (1) check the validity of the basic geological concepts formulated on the uranium potential of the selected areas, (2) accumulate diagnostic geological, geophysical and geochemical data indicative of a true uranium potential there, (3) study the basement complex rocks and the sedimentary formations at least on a broad structural-stratigraphic reconnaissance basis (a tremendous amount of valuable water-drilling data has accumulated over recent years for some of the selected sedimentary basins) and (4) determine the most appropriate investigation techniques to be utilized

  16. Probability of loss of assured safety in systems with multiple time-dependent failure modes.

    Energy Technology Data Exchange (ETDEWEB)

Helton, Jon Craig; Pilch, Martin; Sallaberry, Cedric Jean-Marie

    2012-09-01

    Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as probability of loss of assured safety (PLOAS). Representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent are derived and numerically evaluated for a variety of WL/SL configurations, including PLOAS defined by (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS are considered.
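
    The four PLOAS definitions are easy to estimate by sampling link failure times, which makes a useful sanity check on the analytical representations. In the sketch below, Weibull failure-time models stand in for the time-dependent link properties; all parameters are illustrative.

```python
import numpy as np

# Monte Carlo evaluation of the four PLOAS definitions for a WL/SL system.
# WLs are designed to fail earlier (smaller scale) than SLs under accident loads.
rng = np.random.default_rng(3)
n, nWL, nSL = 1_000_000, 2, 2

wl = rng.weibull(2.0, size=(n, nWL)) * 1.0       # weak-link failure times
sl = rng.weibull(2.0, size=(n, nSL)) * 3.0       # strong-link failure times

first_wl, last_wl = wl.min(axis=1), wl.max(axis=1)
first_sl, last_sl = sl.min(axis=1), sl.max(axis=1)

print(f"(i)   all SLs before any WL : {(last_sl  < first_wl).mean():.2e}")
print(f"(ii)  any SL before any WL  : {(first_sl < first_wl).mean():.2e}")
print(f"(iii) all SLs before all WLs: {(last_sl  < last_wl ).mean():.2e}")
print(f"(iv)  any SL before all WLs : {(first_sl < last_wl ).mean():.2e}")
```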

  17. Probability of Loss of Crew Achievability Studies for NASA's Exploration Systems Development

    Science.gov (United States)

    Boyer, Roger L.; Bigler, Mark; Rogers, James H.

    2015-01-01

    Over the last few years, NASA has been evaluating various vehicle designs for multiple proposed design reference missions (DRMs) beyond low Earth orbit in support of its Exploration Systems Development (ESD) programs. This paper addresses several of the proposed missions and the analysis techniques used to assess the key risk metric, probability of loss of crew (LOC). Probability of LOC is a metric used to assess safety risk and also serves as a design requirement. These risk assessments typically cover the concept phase of a DRM, i.e., when little more than a general idea of the mission is known, and are used to help establish "best estimates" for proposed program and agency level risk requirements. These assessments or studies were categorized as LOC achievability studies to help inform NASA management as to what "ball park" estimates of probability of LOC could be achieved for each DRM, and were eventually used to establish the corresponding LOC requirements. Given that details of the vehicles and missions are not well known at this time, the ground rules, assumptions, and consistency across the programs become the important basis of the assessments, and must be well understood by the decision makers.

  18. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Morocco

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published on the findings of the mission to Morocco under the International Uranium Resources Evaluation Project (IUREP) Orientation Phase. The IUREP Orientation Phase Mission estimates that the speculative resources of Morocco range from 70 000 to 180 000 tonnes of uranium, half of which could be expected to occur in the Northern Provinces, which are relatively well explored, and the other half in the little explored Southern Provinces. In the north, speculative resources are fairly evenly distributed among the various types of deposit, in particular vein deposits (intragranitic and contact) linked with Hercynian and Precambrian blocks, the sandstone-type deposits linked with Mesozoic strata and the volcanogenic deposits, especially of Precambrian age. The potential for large high-grade deposits, especially for those linked with unconformities and linear albitites, has been little investigated in Morocco and is chiefly thought to lie in the Precambrian in the Anti-Atlas and Southern Provinces. Here, the presence of acid volcanic rock reinforces the uranium potential, and there is also some potential for calcrete-related deposits. Phosphate-related uranium, to be recovered shortly, constitutes by far the largest reserves in Morocco, estimated at about 7 million tonnes of recoverable uranium. Recommendations have been made for further study of known occurrences and identification of new ones, such as unconformity- and albitite-related deposits. (author)

  19. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Zambia

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published which describes the findings of the International Uranium Resources Evaluation Project (IUREP) mission to Zambia. The IUREP Orientation Phase mission to Zambia estimates that the Speculative Resources of that country fall within the range of 33 000 to 100 000 tonnes uranium. The majority of these resources are believed to exist in the Karoo sediments. Other potentially favourable geological environments are the Precambrian Katanga sediments, as well as intrusive rocks of different chemical compositions and surficial duricrusts. Previous unofficial estimates of Zambia's Reasonably Assured Resources (RAR) and Estimated Additional Resources (EAR) are considered to be still valid: the total RAR amounts to 6 000 tonnes uranium, located in Karoo (4 000 tonnes) and Katanga (2 000 tonnes) sediments, while the EAR are believed to total 4 000 tonnes, found only in Karoo sediments. The mission recommends that approximately US$ 40 million be spent on uranium exploration in Zambia over 10 years. The largest part of this expenditure would be for drilling, while the remainder should be spent on airborne and ground surveys, as well as on interpretative work on previous airborne data, Landsat imagery, etc. (author)

  20. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Somalia

    International Nuclear Information System (INIS)

    1985-01-01

    A full report has been compiled describing the findings of the International Uranium Resources Evaluation Project (IUREP) Orientation Phase Mission to Somalia. The Mission suggests that in addition to the reasonably assured resources (RAR) of 5 000 t uranium and estimated additional resources (EAR) of 11 000 t uranium in calcrete deposits, the speculative resources (SR) could be within the wide range of 0 - 150 000 t uranium. The majority of these speculative resources are related to sandstone and calcrete deposits. The potential for magmatic hydrothermal deposits is relatively small. The Mission recommends an exploration programme of about US$ 22 000 000 to test the uranium potential of the country which is thought to be excellent. The Mission also suggests a reorganization of the Somalia Geological Survey in order to improve its efficiency. Recommended methods include geological mapping, Landsat imagery interpretation, airborne and ground scintillometer surveys, and geochemistry. Follow-up radiometric surveys, exploration geophysics, mineralogical studies, trenching and drilling are proposed in favourable areas. (author)

  1. International Uranium Resources Evaluation Project (IUREP) orientation phase mission summary report: Morocco

    International Nuclear Information System (INIS)

    1985-01-01

    A report has recently been published on the findings of the mission to Morocco under the International Uranium Resources Evaluation Project (IUREP) Orientation Phase. The IUREP Orientation Phase Mission estimates that the speculative resources of Morocco range from 70 000 to 180 000 tonnes of uranium, half of which could be expected to occur in the Northern Provinces, which are relatively well explored, and the other half in the little explored Southern Provinces. In the north, speculative resources are fairly evenly distributed among the various types of deposit, in particular vein deposits (intragranitic and contact) linked with Hercynian and Precambrian blocks, the sandstone type deposits linked with Mesozoic strata and the volcanogenic deposits, especially of Precambrian age. The potential for large high-grade deposits, especially for those linked with unconformities and linear albitites, has been little investigated in Morocco and is chiefly thought to lie in the Precambrian in the Anti-Atlas and Southern Provinces. Here, the presence of acid volcanic rock reinforces the uranium potential, and there is also some potential for calcrete-related deposits. Phosphate-related uranium, to be recovered shortly, constitutes by far the largest reserves in Morocco, estimated at about 7 million tonnes of recoverable uranium. Recommendations have been made for further study of known occurrences and identification of new ones, such as unconformity and albitite-related deposits. (author)

  2. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.

  3. Outage probability of dual-hop FSO fixed gain relay transmission systems

    KAUST Repository

    Zedini, Emna

    2016-12-24

    In this paper, we analyze the end-to-end performance of dual-hop free-space optical (FSO) fixed-gain relaying systems in the presence of atmospheric turbulence as well as pointing errors. More specifically, an exact closed-form expression for the outage probability is presented in terms of the bivariate Fox's H function that accounts for both heterodyne detection and intensity modulation with direct detection. In the high signal-to-noise ratio (SNR) regime, we provide a very tight asymptotic result for this performance metric in terms of simple elementary functions. By using dual-hop FSO relaying, we demonstrate better system performance compared to the single FSO link. Numerical and Monte Carlo simulation results are provided to verify the accuracy of the newly proposed results, and perfect agreement is observed.
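
    For a numerical sanity check of the fixed-gain relay model, note that the end-to-end SNR is g = g1*g2/(g2 + C) and that a Gamma-Gamma irradiance can be simulated as the product of two independent unit-mean Gamma variates. The sketch below (IM/DD assumed, pointing errors omitted, illustrative parameters) estimates the outage probability by Monte Carlo.

```python
import numpy as np

# Dual-hop FSO link with a fixed-gain relay: outage of g = g1*g2/(g2 + C).
rng = np.random.default_rng(5)
n = 1_000_000
alpha, beta = 4.2, 1.4                       # Gamma-Gamma turbulence parameters
avg_snr1, avg_snr2, C, gamma_th = 100.0, 100.0, 1.0, 5.0

def gamma_gamma_irradiance(size):
    x = rng.gamma(alpha, 1.0 / alpha, size)
    y = rng.gamma(beta, 1.0 / beta, size)
    return x * y                             # unit-mean irradiance

I1, I2 = gamma_gamma_irradiance(n), gamma_gamma_irradiance(n)
g1 = avg_snr1 * I1**2 / np.mean(I1**2)       # IM/DD electrical SNR per hop
g2 = avg_snr2 * I2**2 / np.mean(I2**2)
g_end = g1 * g2 / (g2 + C)

print(f"outage probability ~ {(g_end < gamma_th).mean():.4f}")
```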

  4. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Republic of Burundi. Draft

    International Nuclear Information System (INIS)

    Gehrisch, W.; Chaigne, M.

    1983-06-01

    The basic objective of the International Uranium Resources Evaluation Project (IUREP) is to 'review the present body of knowledge pertinent to the existence of uranium resources, to review and evaluate the potential for the discovery of additional uranium resources and to suggest new exploration efforts which might be carried out in promising areas in collaboration with the countries concerned'. The scope of the IUREP Orientation Phase Mission to Burundi was therefore to review all data on past exploration in Burundi, to develop a better understanding of the uranium potential of the country, to make an estimate of the speculative resources of the country, to make recommendations as appropriate on the best methods or techniques for evaluating the resources in the favourable areas and for estimating possible costs, and to compile a report which could be made immediately available to the Burundian authorities. The mission report gives a general introduction, a geological review of Burundi, information on non-uranium mining in Burundi, the history of uranium exploration, occurrences of uranium, the IUREP mission field reconnaissance, favourable areas for speculative potential, the uranium resources position and recommendations for future exploration. The conclusions are the following. The IUREP Orientation Phase Mission to Burundi believes that the Speculative Resources of that country fall between 300 and 4 100 tons uranium oxide, but a less speculative appraisal is more likely between 0 and 1 000 tons. There has been no uranium production and there are no official estimates of uranium resources in Burundi. Past exploration, mainly dating from 1969 onwards and led by the UNDP Mineral project, has indicated a limited number of uranium occurrences and anomalies. The speculative uranium resources are thought to be possibly associated with potential unconformity-related vein-like deposits of the Lower Burundian. Other speculative uranium resources could be associated with granitic or peribatholitic

  5. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified in order to address the various aspects of reliability and sustainability of the system. The selection of efficient support methods for rock tunnelling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces an approach by which decision-makers are able to determine the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field in order to investigate the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating tunnel support performance is therefore the main idea of this research. Decomposition approaches are used to produce the system block diagram and to determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Using the idea of a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine the individual safety factors, finite element modeling is employed for the different structural subsystems and the results of numerical analyses are obtained in
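
    At its simplest, the block-diagram decomposition reduces to rolling component reliabilities up through series and parallel blocks. The sketch below shows that roll-up on a hypothetical support topology (redundant rock bolts in series with shotcrete and steel sets); the numbers are illustrative only.

```python
# Series/parallel roll-up of component reliabilities into an overall system
# reliability, in the spirit of the block-diagram decomposition described above.

def series(rels):
    p = 1.0
    for r in rels:
        p *= r                  # all series elements must survive
    return p

def parallel(rels):
    q = 1.0
    for r in rels:
        q *= (1.0 - r)          # all redundant elements must fail
    return 1.0 - q

r_bolts = parallel([0.95, 0.95, 0.95])   # redundant rock bolts
r_shotcrete = 0.98
r_steel_sets = 0.97

r_system = series([r_bolts, r_shotcrete, r_steel_sets])
print(f"overall support reliability ~ {r_system:.5f}")
print(f"failure probability ~ {1 - r_system:.2e}")
```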

  6. A Comparative Study of Probability Collectives Based Multi-agent Systems and Genetic Algorithms

    Science.gov (United States)

    Huang, Chien-Feng; Wolpert, David H.; Bieniawski, Stefan; Strauss, Charles E. M.

    2005-01-01

    We compare Genetic Algorithms (GA's) with Probability Collectives (PC), a new framework for distributed optimization and control. In contrast to GA's, PC-based methods do not update populations of solutions. Instead they update an explicitly parameterized probability distribution p over the space of solutions. That updating of p arises as the optimization of a functional of p. The functional is chosen so that any p that optimizes it should be peaked about good solutions. The PC approach works in both continuous and discrete problems. It does not suffer from the resolution limitation of the finite bit-length encoding of parameters into GA alleles. It also has deep connections with both game theory and statistical physics. We review the PC approach using its motivation as the information-theoretic formulation of bounded rationality for multi-agent systems. It is then compared with GA's on a diverse set of problems. To handle high-dimensional surfaces, in the PC method investigated here p is restricted to a product distribution. Each distribution in that product is controlled by a separate agent. The test functions were selected for their difficulty using either traditional gradient descent or genetic algorithms. On those functions the PC-based approach significantly outperforms traditional GA's in rate of descent, avoidance of trapping in false minima, and long-term optimization.
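
    A schematic of the product-distribution idea (not the paper's exact functional) fits in a few lines: each agent owns an independent categorical distribution over its variable, joint samples are drawn from the product, and each marginal is re-weighted toward low-cost samples with Boltzmann-style weights.

```python
import numpy as np

# Minimal product-distribution optimizer in the spirit of Probability
# Collectives; the cost function, grid, and damping are illustrative.
rng = np.random.default_rng(11)

def cost(x):                                  # separable test cost, minimum at 0
    return np.sum(x ** 2, axis=-1)

grid = np.linspace(-2, 2, 21)                 # discretized values per variable
dim, n_samples, T = 4, 500, 0.5
p = np.full((dim, grid.size), 1.0 / grid.size)   # product distribution

for it in range(50):
    idx = np.stack([rng.choice(grid.size, n_samples, p=p[d]) for d in range(dim)], axis=1)
    x = grid[idx]
    w = np.exp(-cost(x) / T)                  # Boltzmann weights on joint samples
    w /= w.sum()
    for d in range(dim):                      # update each agent's marginal
        new = np.bincount(idx[:, d], weights=w, minlength=grid.size)
        p[d] = 0.7 * p[d] + 0.3 * new         # damped update

best = grid[np.argmax(p, axis=1)]
print("modal solution per agent:", best, " cost:", cost(best))
```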

  7. Predicting the Probability for Faculty Adopting an Audience Response System in Higher Education

    Directory of Open Access Journals (Sweden)

    Tan Fung Ivan Chan

    2016-08-01

    Instructional technologies can be effective tools to foster student engagement, but university faculty may be reluctant to integrate innovative and evidence-based modern learning technologies into instruction. Based on Rogers' diffusion of innovation theory, this quantitative, nonexperimental, one-shot cross-sectional survey determined which attributes of innovation (relative advantage, compatibility, complexity, trialability, and observability) predict the probability of faculty adopting the audience response system (ARS) into instruction. The sample of the study consisted of 201 faculty at a university in the southeastern United States. Binary logistic regression analysis was used to determine the attributes of innovation that predict the probability of faculty adopting the ARS into instruction. Out of the five attributes, compatibility and trialability made significant contributions to the model. The implication of the findings is that, in order to maximize adoption, faculty need to be given the opportunity to pre-test the ARS prior to implementation, and they need to know how the technology will assist them in achieving their pedagogical goals. Recommendations were made to leverage these attributes to foster faculty adoption of the ARS into instruction.
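
    A schematic version of this analysis, on synthetic stand-in data rather than the study's survey responses, is shown below: fit a binary logistic model of adoption on the five attribute scores and read off odds ratios and predicted adoption probabilities.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the survey: 201 respondents, five attribute scores.
rng = np.random.default_rng(2024)
n = 201
attrs = ["rel_advantage", "compatibility", "complexity", "trialability", "observability"]
X = rng.normal(0, 1, size=(n, 5))             # standardized attribute scores

# Data generator: compatibility and trialability drive adoption, mirroring
# the reported significant predictors.
logit = -0.2 + 1.2 * X[:, 1] + 0.9 * X[:, 3]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
for name, b in zip(attrs, model.coef_[0]):
    print(f"{name:15s} odds ratio ~ {np.exp(b):.2f}")
print(f"P(adopt | +1 SD trialability, others at mean) ~ "
      f"{model.predict_proba([[0, 0, 0, 1, 0]])[0, 1]:.2f}")
```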

  8. Reliability Assessment of Wind Farm Electrical System Based on a Probability Transfer Technique

    Directory of Open Access Journals (Sweden)

    Hejun Yang

    2018-03-01

    The electrical system of a wind farm has a significant influence on the wind farm's reliability and electrical energy yield. The disconnect switches installed in an electrical system can not only improve operating flexibility, but also enhance reliability for a wind farm. This paper therefore develops a probability transfer technique for integrating the electrical topology structure, the isolation operation of disconnect switches, and stochastic failure of electrical equipment into the reliability assessment of the wind farm electrical system. Firstly, since the traditional two-state reliability model of electrical equipment cannot represent the isolation operation, the paper develops a three-state reliability model to replace the two-state model and incorporate the isolation operation; in addition, a proportion apportionment technique is presented to evaluate the state probabilities. Secondly, the paper develops a probability transfer technique based on the idea of transferring the unreliability of the electrical system to the energy transmission interruption of the wind turbine generators (WTGs). Finally, some novel indices for describing the reliability of the wind farm electrical system are designed, and the variance coefficient of the designed indices is used as a convergence criterion to determine the termination of the assessment process. The proposed technique is applied to the reliability assessment of a wind farm with different topologies. The simulation results show that the proposed techniques are effective in practical applications.
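
    The three-state idea can be illustrated with a small continuous-time Markov model in which a component moves from working to isolated (by the disconnect switch) to under repair and back; the steady-state probabilities then feed the reliability indices. The rates below are illustrative, and the chain is a simplified stand-in for the paper's apportionment scheme.

```python
import numpy as np

# Three-state component model: working -> isolated -> under repair -> working.
# Rates are illustrative (per year).
lam = 0.5          # failure rate: working -> isolated
sigma = 365.0      # switching completion rate: isolated -> under repair
mu = 12.0          # repair rate: under repair -> working

Q = np.array([
    [-lam,   lam,    0.0],
    [ 0.0, -sigma, sigma],
    [  mu,    0.0,   -mu],
])
# Solve pi Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
for name, prob in zip(["working", "isolated", "under repair"], pi):
    print(f"P({name}) ~ {prob:.4f}")
```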

  9. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

    Human Reliability Analysis (HRA) currently comprises at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each of these methods has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology or improper application of techniques can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  10. Probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP)

    International Nuclear Information System (INIS)

    Greenfield, M.A.; Sargent, T.J.; Stanford Univ., CA

    1998-01-01

    In its most recent report on the annual probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP), the US Department of Energy (DOE) calculates the annual failure rate to be 1.3E(-7)(1/yr), rounded off from 1.32E(-7). A calculation by the Environmental Evaluation Group (EEG) produces a result that is about 4% higher, namely 1.37E(-7)(1/yr). The difference is due to a minor error in the DOE calculations in the Westinghouse 1996 report. WIPP's hoist safety relies on a braking system consisting of a number of components, including two crucial valves. The failure rate of the system needs to be recalculated periodically to accommodate new information on component failure, changes in maintenance and inspection schedules, occasional incidents such as a hoist traveling out of control, either up or down, and changes in the design of the brake system. This report examines the DOE's last two reports on the redesigned waste hoist system. In its calculations, the DOE has accepted one EEG recommendation and is using more current information about component failure rates, the Nonelectronic Parts Reliability Data (NPRD). However, the DOE calculations fail to include the data uncertainties which are described in detail in the NPRD reports. The US Nuclear Regulatory Commission recommended that a system evaluation include mean estimates of component failure rates and take into account the potential uncertainties that exist, so that an estimate can be made of the confidence level to be ascribed to the quantitative results. EEG has made this suggestion previously and the DOE has indicated why it does not accept the NRC recommendation. Hence, this EEG report illustrates the importance of including data uncertainty using a simple statistical example.
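
    The point at issue, that ignoring data uncertainty understates the mean failure rate and says nothing about confidence levels, can be shown with a toy propagation. The sketch below samples lognormal component failure rates (an NPRD-style error-factor parameterization) through a hypothetical two-valve brake model; none of the numbers are the WIPP values.

```python
import numpy as np

# Propagate lognormal uncertainty on component failure rates through a toy
# brake-system model (two redundant valves in series with a mechanical element).
rng = np.random.default_rng(9)
n = 200_000

def lognormal_rate(median, error_factor, size):
    sigma = np.log(error_factor) / 1.645      # EF defined at the 95th percentile
    return rng.lognormal(np.log(median), sigma, size)

lam_valve = lognormal_rate(1.0e-3, 5.0, n)    # per-demand valve failure
lam_mech = lognormal_rate(1.0e-7, 10.0, n)    # per-year mechanical failure

annual = lam_valve**2 * 100 + lam_mech        # 100 demands/yr; both valves must fail

point = (1.0e-3)**2 * 100 + 1.0e-7            # point estimate from the medians
print(f"point estimate : {point:.2e} /yr")
print(f"MC mean        : {annual.mean():.2e} /yr")
print(f"MC 95th pct    : {np.quantile(annual, 0.95):.2e} /yr")
```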

  11. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Thailand. February-March 1981

    International Nuclear Information System (INIS)

    Inazumi, Satoru; Meyer, John H.

    1981-01-01

    The IUREP Orientation Phase Mission assesses the Speculative Uranium Resources in Thailand to be within the range of 1,500 to 38,500 tonnes U. This range is higher than the previous assessment in Phase I because the Mission recognizes additional favourable geological environments. At the same time, the untested and therefore unknown degree of mineralization in some of these environments is acknowledged. Past exploration, dating from 1977, has been mainly confined to ground surveys of a small mineralized area and to airborne gamma-ray surveys of two small test areas. Ground reconnaissance work and prospecting have identified some mineralization in several different host rocks and environments. Geological environments considered by the Mission to be favourable for uranium occurrences include sandstones of Jurassic to Triassic age, Tertiary sedimentary basins (northern Thailand), Tertiary sedimentary basins (southern Thailand), environments associated with fluorite deposits, granitic rocks, black shales and graphitic slates of the Paleozoic, environments associated with sedimentary phosphate deposits, and environments associated with monazite sands. It is recommended that exploration for uranium resources in Thailand should continue. Planners of future exploration programmes should take the following activities into consideration. Carborne surveys should be rapidly extended to cover, without excessive overburdening, all areas having sufficient road density. Airborne gamma-ray surveys should be carried out in certain selected areas. In the selection of such areas, the considerably higher cost factor attendant on this method of surveying dictates that airborne surveys should only be carried out where carborne surveys prove ineffective (lack of an adequate road network) and where the topography is sufficiently even to assure a low but safe clearance and meaningful results. In certain areas, including the Khorat Plateau and the Tertiary basins in northern and southern Thailand, there is a need for widely spaced

  12. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Bolivia. Draft

    International Nuclear Information System (INIS)

    Leroy, Jacques; Mueller-Kahle, Eberhard

    1982-08-01

    The uranium exploration done so far in Bolivia has been carried out by COBOEN, partly with IAEA support, and AGIP S.p.A. of Italy, which between 1974 and 1978 explored four areas in various parts of Bolivia under a production sharing contract with COBOEN. The basic objective of the International Uranium Resources Evaluation Project (IUREP) is to 'review the present body of knowledge pertinent to the existence of uranium resources, to review and evaluate the potential for discovery of additional uranium resources, and to suggest new exploration efforts which might be carried out in promising areas in collaboration with the country concerned'. Following the initial bibliographic study which formed Phase I of IUREP, it was envisaged that a further assessment in cooperation with, and within, the country concerned would provide a better delineation of areas of high potential and a more reliable estimate as to the degree of favourability for the discovery of additional uranium resources. It was planned that such work would be accomplished through field missions to the country concerned and that these field missions and the resulting reports would be known as the Orientation Phase of IUREP. The purpose of the Orientation Phase mission to Bolivia was a) to develop a better understanding of the uranium potential of the country, b) to make an estimate of the Speculative Resources of the country, c) to delineate areas favourable for the discovery of these uranium resources, d) to make recommendations as appropriate on the best methods for evaluating the favourable areas, operating procedures and estimated possible costs, e) to develop the logistical data required to carry out any possible further work, and f) to compile a report which would be immediately available to the Bolivian authorities. The mission report contains information about a general introduction, non-uranium exploration and mining in Bolivia, manpower in exploration, geological review of Bolivia, past uranium

  13. Use of fault tree technique to determine the failure probability of electrical systems of IE class in nuclear installations

    International Nuclear Information System (INIS)

    Cruz S, W.D.

    1988-01-01

    This paper refers to the emergency safety systems of the Angra nuclear power plant (Brazil, 1,626 MW(e)), such as containment, heat removal, the emergency removal system, removal of radioactive elements from the containment environment, borated water injection, etc. For the electrical systems associated with these systems, the failure probability of Class IE busbars is calculated; Class IE is the safety classification for electrical equipment essential to the systems mentioned above.
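
    As a toy illustration of the fault tree technique the paper applies, the following sketch evaluates a top-event probability exactly by enumerating basic-event combinations. The gate structure and per-demand probabilities are hypothetical, not the Angra Class IE model, and basic events are assumed independent.

```python
from itertools import product

# Hypothetical tree: top event = loss of a safety bus, modelled as
# OR( AND(offsite_power_loss, diesel_A_fails, diesel_B_fails), bus_fault ).
basic_events = {
    "offsite_power_loss": 1e-1,
    "diesel_A_fails": 3e-2,
    "diesel_B_fails": 3e-2,
    "bus_fault": 1e-5,
}

def top_event(state):
    no_ac = (state["offsite_power_loss"]
             and state["diesel_A_fails"]
             and state["diesel_B_fails"])
    return no_ac or state["bus_fault"]

names = list(basic_events)
p_top = 0.0
for outcome in product([True, False], repeat=len(names)):
    state = dict(zip(names, outcome))
    if top_event(state):
        p = 1.0
        for name, occurred in state.items():
            p *= basic_events[name] if occurred else 1.0 - basic_events[name]
        p_top += p

print(f"P(top event) = {p_top:.3e}")
```

    Real PSA codes avoid the exponential enumeration by working with minimal cut sets, but the toy version makes the probability bookkeeping of the technique explicit.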

  14. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods for analyzing human error are in use, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), and new methods for human reliability analysis (HRA) are currently under development. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probability, and it can be applied to any kind of operator action, including the severe accident management strategy.
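
    The core of a time-reliability HEP calculation of this kind is the probability that the required time exceeds the available time. A minimal sketch, with hypothetical lognormal/normal stand-ins for the MAAP-derived distributions and SciPy's Latin hypercube sampler:

```python
import numpy as np
from scipy.stats import lognorm, norm, qmc

n = 10_000

# Latin hypercube sample in [0, 1]^2, mapped through the two marginals.
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n)

# Hypothetical distributions (minutes); stand-ins for the MAAP/LHS curves.
t_required = lognorm(s=0.4, scale=30.0).ppf(u[:, 0])
t_available = norm(loc=60.0, scale=10.0).ppf(u[:, 1])

# The operator fails if the action takes longer than the time available.
hep = np.mean(t_required > t_available)
print(f"Estimated HEP = {hep:.3e}")
```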

  15. On the evolution of the density probability density function in strongly self-gravitating systems

    International Nuclear Information System (INIS)

    Girichidis, Philipp; Konstandin, Lukas; Klessen, Ralf S.; Whitworth, Anthony P.

    2014-01-01

    The time evolution of the probability density function (PDF) of the mass density is formulated and solved for systems in free-fall using a simple approximate function for the collapse of a sphere. We demonstrate that a pressure-free collapse results in a power-law tail on the high-density side of the PDF. The slope quickly asymptotes to the functional form $P_V(\rho) \propto \rho^{-1.54}$ for the (volume-weighted) PDF and $P_M(\rho) \propto \rho^{-0.54}$ for the corresponding mass-weighted distribution. From the simple approximation of the PDF we derive analytic descriptions for mass accretion, finding that dynamically quiet systems with narrow density PDFs lead to retarded star formation and low star formation rates (SFRs). Conversely, strong turbulent motions that broaden the PDF accelerate the collapse, causing a bursting mode of star formation. Finally, we compare our theoretical work with observations. The measured SFRs are consistent with our model during the early phases of the collapse. Comparison of observed column density PDFs with those derived from our model suggests that observed star-forming cores are roughly in free-fall.
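
    The two exponents quoted above are mutually consistent: the mass-weighted PDF is the density-weighted version of the volume-weighted one, which shifts a power-law slope by exactly one:

```latex
P_M(\rho) = \frac{\rho\,P_V(\rho)}{\langle\rho\rangle_V},
\qquad
P_V(\rho) \propto \rho^{-1.54}
\;\Longrightarrow\;
P_M(\rho) \propto \rho^{\,1-1.54} = \rho^{-0.54}.
```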

  16. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Uganda. Draft. November 1982 - January 1983

    International Nuclear Information System (INIS)

    Trey, Michel de; Levich, Robert A.

    1983-02-01

    At present, there are no reasonably assured resources of uranium in Uganda in any price category. Speculative resources are restricted to 2,400 metric tons of uranium in an apatite deposit, which in the past has been actively mined for phosphate. The possible recovery of this uranium is dependent upon a number of economic and technological conditions which have never been thoroughly studied. Although the geology of Uganda holds some interesting possibilities for hosting uranium deposits, the studies conducted between 1949 and 1979 were limited to known radioactive occurrences and anomalies in limited areas which had little economic significance. Vast areas, less known and less accessible, were completely ignored. Uranium exploration must therefore be started again in a systematic manner using modern methods. The current economic situation in Uganda is so critical that international technical and financial assistance is vitally needed to help rehabilitate the Geological Survey and Mines Department. Uganda currently can offer only very restricted services. The transportation system is quite deficient: the railway does not presently cross the frontier with Kenya, and all equipment and goods must be transported from Mombasa by road. Housing is in very short supply, and many basic commodities are often unobtainable. Any organization or private company which begins an exploration program in Uganda must plan to import essentially all the equipment and supplies it will require. It will also have to construct offices and staff housing, and import and stockpile fuel and staple goods, so as not to be at the mercy of the (at times) inadequate local supplies. It will most probably also have to provide basic local and imported food to its Ugandan staff and should plan to pay much higher local salaries than is customary. Lastly, it will have to provide its own fleet of trucks and organize its own transport system. (author)

  17. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parameterized probabilities?”

  18. THREE-MOMENT BASED APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2014-03-01

    The paper deals with the problem of approximating the probability distributions of random variables defined on the positive real half-line with a coefficient of variation different from unity. While using queueing systems as models for computer networks, calculation of characteristics is usually performed at the level of expectation and variance. At the same time, one of the main characteristics of multimedia data transmission quality in computer networks is delay jitter. For jitter calculation, the distribution function of packet time delay should be known. It is shown that changing the third moment of the packet delay distribution changes the calculated jitter by tens or hundreds of percent, even with the same values of the first two moments – the expectation and the delay variation coefficient. This means that the delay distribution approximation used for jitter calculation should match the third moment of the delay distribution. For random variables with coefficients of variation greater than unity, an iterative approximation algorithm with a hyper-exponential two-phase distribution based on three moments of the approximated distribution is offered. It is shown that for random variables with coefficients of variation less than unity, the impact of the third moment of the distribution becomes negligible, and for approximation of such distributions an Erlang distribution with the first two moments should be used. This approach gives the possibility to obtain upper bounds for relevant characteristics, particularly the upper bound of delay jitter.
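
    A minimal numerical sketch of such a three-moment fit, assuming a two-phase hyperexponential (H2) target and solving the moment-matching equations with SciPy (the paper's own iterative algorithm may differ in detail):

```python
import math
from scipy.optimize import fsolve

# Raw moments of an H2 density p*Exp(l1) + (1-p)*Exp(l2):
#   E[X^k] = p * k! / l1**k + (1 - p) * k! / l2**k
def h2_moment(k, p, l1, l2):
    return p * math.factorial(k) / l1**k + (1 - p) * math.factorial(k) / l2**k

# Hypothetical first three raw moments of a packet delay
# (coefficient of variation > 1 here: CV^2 = m2/m1^2 - 1 = 2).
m1, m2, m3 = 1.0, 3.0, 16.5

def equations(x):
    p, l1, l2 = x
    return [h2_moment(k, p, l1, l2) - m for k, m in ((1, m1), (2, m2), (3, m3))]

p, l1, l2 = fsolve(equations, x0=[0.5, 1.5, 0.7])
print(f"p = {p:.4f}, lambda1 = {l1:.4f}, lambda2 = {l2:.4f}")

# Sanity check: the fitted H2 reproduces all three target moments.
for k, m in ((1, m1), (2, m2), (3, m3)):
    assert abs(h2_moment(k, p, l1, l2) - m) < 1e-6
```

    The fitted H2 (or, for coefficients of variation below unity, a two-moment Erlang fit) then supplies the full delay distribution from which the jitter can be computed.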

  19. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Peru. August - October 1981

    International Nuclear Information System (INIS)

    Hetland, Donald L.; Michie, Uisdean McL.

    1981-01-01

    The IUREP Orientation Phase Mission to Peru believes that the Speculative Resources of that country fall between 6,000 and 11,000 tonnes uranium. There has been no uranium production in Peru and there are no official estimates of uranium resources. Past exploration in Peru (dating from about 1952) has indicated a paucity of valid uranium occurrences and radioactive anomalies. Only recently (1980) have anomalous areas been identified (Macusani-Picotani). The identified Speculative Resources are mainly in Late Tertiary ignimbrites and associated sediments in the high Andes of southern Peru. Geologically, there are direct parallels between these resources and deposits of the Los Frailes areas of neighbouring Bolivia. Other minor Speculative Resources may be present in calcretes developed from Tertiary volcanogenic sources over the Precambrian in the Pacific coastal desert of southern Peru, but no positive indications have been recognised. Hercynian sub-volcanic granites in the eastern cordillera of southern Peru may have some associated Speculative Resources, both intra- and extra-granitic. No Speculative Potential could be identified in Permo-Triassic or Tertiary post-tectonic continental sediments anywhere in Peru. Such potential may exist, but further reconnaissance of the continental late Tertiary basins, with positive indications, would be required before inclusion of potential in this category. Recent discoveries in the volcanogenic environment of southern Peru have been made by carborne, helicopter-borne and on-foot reconnaissance of isolated areas. It is recommended that there be a more systematic, integrated study of the entire volcanic district assisted by volcanic petrographic examination. Assessment of the known occurrences requires immediate subsurface study by drilling and exploration audits to assess their continuity, grade variation and thickness. This phase will be significantly more expensive than previous exploration. Non-core drilling should supplement

  20. On Bit Error Probability and Power Optimization in Multihop Millimeter Wave Relay Systems

    KAUST Repository

    Chelli, Ali

    2018-01-15

    5G networks are expected to provide gigabit data rates to users via millimeter-wave (mmWave) communication technology. One of the major problems faced by mmWaves is that they cannot penetrate buildings. In this paper, we utilize multihop relaying to overcome the signal blockage problem in the mmWave band. The multihop relay network comprises a source device, several relay devices and a destination device, and uses device-to-device communication. Relay devices redirect the source signal to avoid the obstacles existing in the propagation environment. Each device amplifies and forwards the signal to the next device, such that a multihop link ensures the connectivity between the source device and the destination device. We consider that the relay devices and the destination device are affected by external interference and investigate the bit error probability (BEP) of this multihop mmWave system. Note that the study of the BEP allows quantifying the quality of communication and identifying the impact of different parameters on the system reliability. In this way, the system parameters, such as the powers allocated to different devices, can be tuned to maximize the link reliability. We derive exact expressions for the BEP of M-ary quadrature amplitude modulation (M-QAM) and M-ary phase-shift keying (M-PSK) in terms of the multivariate Meijer’s G-function. Due to the complicated expression of the exact BEP, a tight lower-bound expression for the BEP is derived using a novel Mellin approach. Moreover, an asymptotic expression for the BEP in the high-SIR regime is derived and used to determine the diversity and the coding gain of the system. Additionally, we optimize the power allocation at different devices subject to a sum power constraint such that the BEP is minimized. Our analysis reveals that optimal power allocation allows achieving more than 3 dB gain compared to equal power allocation. This research work can serve as a framework for designing and optimizing mmWave multihop

  1. Stochastic Stability for Time-Delay Markovian Jump Systems with Sector-Bounded Nonlinearities and More General Transition Probabilities

    Directory of Open Access Journals (Sweden)

    Dan Ye

    2013-01-01

    This paper is concerned with delay-dependent stochastic stability for time-delay Markovian jump systems (MJSs) with sector-bounded nonlinearities and more general transition probabilities. Different from previous results where the transition probability matrix is completely known, a more general transition probability matrix is considered, which includes completely known elements, elements known only within bounds, and completely unknown ones. In order to obtain a less conservative criterion, the state and transition probability information is used as much as possible to construct the Lyapunov-Krasovskii functional and carry out the stability analysis. Delay-dependent sufficient conditions are derived in terms of linear matrix inequalities to guarantee the stability of the systems. Finally, numerical examples are exploited to demonstrate the effectiveness of the proposed method.
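
    For concreteness, a made-up example of such a "generally known" transition probability matrix, mixing exactly known entries, an entry known only within bounds, and completely unknown entries (marked ?), with each row still summing to one:

```latex
\Pi =
\begin{pmatrix}
0.6 & ? & ? \\
?   & 0.5 & ? \\
\pi_{31}\in[0.1,\,0.3] & ? & 0.4
\end{pmatrix},
\qquad
\sum_{j}\pi_{ij} = 1 \quad \text{for each row } i.
```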

  2. ON PROBABILITY FUNCTION OF TRIP ROUTE CHOICE IN PASSENGER TRANSPORT SYSTEM OF CITIES

    Directory of Open Access Journals (Sweden)

    N. Nefedof

    2014-02-01

    The results of statistical processing of experimental research data collected in Kharkiv, aimed at determining the relation between the passenger trip route choice probability and the actual vehicle waiting times at bus terminals, are presented.

  3. Some considerations on the definition of risk based on concepts of systems theory and probability.

    Science.gov (United States)

    Andretta, Massimo

    2014-07-01

    The concept of risk has been applied in many modern science and technology fields. Despite its successes in many fields of application, there is still no well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk field suffers from a lack of clarity about its scientific basis, which should define, in a unique theoretical framework, the general concepts used in the different areas of application. The aim of this article is to make suggestions for another perspective on the definition of risk that could be applied to, and in a certain sense generalize, some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applied settings (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of systems theory and of probability. In this way, I try to frame, in a single, broad, and general theoretical context, some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines. © 2013 Society for Risk Analysis.

  4. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Venezuela. Draft

    International Nuclear Information System (INIS)

    Hetland, Donald L.; Obellianne, Jean-marie

    1981-04-01

    The IUREP Orientation Phase Mission to Venezuela believes that the Speculative Uranium Resources of that country fall between 2,000 and 42,000 tonnes. This assumes that a part of the Speculative Resources would be extracted as by-product uranium from wet-process phosphoric acid production. Past exploration in Venezuela has resulted in the discovery of very few uranium occurrences and radioactive anomalies except for the many airborne anomalies recorded on the Guayana Shield. To date no economic deposits or significant uranium occurrences have been found in Venezuela except for the uraniferous phosphorites in the Cretaceous Navey Formation, which are very low grade. The uranium occurrences and radioactive anomalies can be divided according to host rock into: (1) Precambrian crystalline and sedimentary rocks, (2) Cretaceous phosphorite beds, (3) continental sandstone, and (4) granitic rocks. The greatest geological potential for further uranium resources is believed to exist in the crystalline and sedimentary Precambrian rocks of the Guayana Shield, but favorable geological potential also exists in younger continental sandstones. Since the Guayana Shield is the most promising for the discovery of economic uranium deposits, most of the proposed exploration effort is directed toward that area. Considerable time, effort and capital will be required, however, because of the severe logistical problems of exploration in this vast, rugged and inaccessible area. Meager exploration work done to date has been relatively negative, suggesting the area is more of a thorium than a uranium province. However, because of the possibility of several types of uranium deposits and because so little exploration work has been done, the Mission assigned a relatively small speculative potential to the area, i.e. 0 to 25,000 tonnes uranium. A small speculative potential (0 to 2,000 tonnes) was assigned to the El Baul area in Cojedes State, in the Llanos Province. This potential is postulated

  5. The Role of Probability-Based Inference in an Intelligent Tutoring System.

    Science.gov (United States)

    Mislevy, Robert J.; Gitomer, Drew H.

    Probability-based inference in complex networks of interdependent variables is an active topic in statistical research, spurred by such diverse applications as forecasting, pedigree analysis, troubleshooting, and medical diagnosis. This paper concerns the role of Bayesian inference networks for updating student models in intelligent tutoring…
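
    The smallest possible instance of such probability-based student modelling is a single skill node observed through noisy responses, updated by Bayes' rule; a hedged sketch with invented slip and guess parameters:

```python
# Hypothetical parameters: slip and guess probabilities for one skill node.
p_correct_given_skill = 0.9     # 1 - P(slip)
p_correct_given_no_skill = 0.2  # P(guess)

def update(p_skill, correct):
    """Posterior P(skill | observed response) by Bayes' rule."""
    like_s = p_correct_given_skill if correct else 1 - p_correct_given_skill
    like_n = p_correct_given_no_skill if correct else 1 - p_correct_given_no_skill
    num = like_s * p_skill
    return num / (num + like_n * (1 - p_skill))

p_skill = 0.5  # prior belief that the student has the skill
for obs in (True, True, False):
    p_skill = update(p_skill, obs)
    print(f"observed {'correct' if obs else 'incorrect'} -> P(skill) = {p_skill:.3f}")
```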

  6. The probability distribution of maintenance cost of a system affected by the gamma process of degradation: Finite time solution

    International Nuclear Information System (INIS)

    Cheng, Tianjin; Pandey, Mahesh D.; Weide, J.A.M. van der

    2012-01-01

    The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known; in this context, the asymptotic cost rate has limited utility. This paper presents the derivation of the probability distribution of maintenance cost when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, and the discrete Fourier transform of the characteristic function then leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
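
    A Monte Carlo counterpart of the finite-time result makes the distinction between the asymptotic cost rate and a prediction interval concrete. The sketch below simulates a periodically inspected gamma degradation process under an illustrative condition-based replacement rule; every parameter is invented:

```python
import numpy as np

rng = np.random.default_rng(1)

shape_rate, scale = 0.5, 2.0          # gamma increment parameters per unit time
inspect_dt = 1.0                      # inspection interval
pm_threshold, fail_level = 6.0, 10.0  # preventive / failure degradation levels
c_inspect, c_pm, c_cm = 0.1, 1.0, 5.0
horizon, n_paths = 50.0, 20_000

costs = np.zeros(n_paths)
for i in range(n_paths):
    x, t, cost = 0.0, 0.0, 0.0
    while t < horizon:
        t += inspect_dt
        x += rng.gamma(shape_rate * inspect_dt, scale)  # stationary increments
        cost += c_inspect
        if x >= fail_level:       # failure found at inspection: corrective action
            cost += c_cm
            x = 0.0
        elif x >= pm_threshold:   # degraded but working: preventive action
            cost += c_pm
            x = 0.0
    costs[i] = cost

print(f"mean cost over horizon = {costs.mean():.2f}")
print(f"90% prediction interval = "
      f"[{np.percentile(costs, 5):.2f}, {np.percentile(costs, 95):.2f}]")
```

    Dividing the mean cost by the horizon approximates the asymptotic cost rate; the percentiles supply the prediction interval that the rate alone cannot.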

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....
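
    As a concrete instance of the classical theory the book treats: in the compound Poisson model with exponential claims of mean μ, claim arrival rate λ and premium rate c = (1+θ)λμ, the ruin probability from initial reserve u has the well-known closed form

```latex
\psi(u) = \frac{1}{1+\theta}\,
\exp\!\left(-\frac{\theta\,u}{(1+\theta)\,\mu}\right),
\qquad u \ge 0,
```

    which displays Lundberg-type exponential decay and the Cramér–Lundberg constant in a single formula.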

  8. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
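
    A stripped-down sketch of the PDA recipe described above: drive a physics-based limit state with random inputs, estimate the failure probability by Monte Carlo, and rank input influence. The capacity/load limit state and all distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical physics-based response: failure when peak load exceeds capacity.
capacity = rng.normal(100.0, 8.0, n)           # e.g. structural capacity
load = rng.lognormal(np.log(70.0), 0.15, n)    # e.g. peak load in the scenario

failure = load > capacity
print(f"P(failure) ~ {failure.mean():.2e}")

# Crude sensitivity indicator: correlation of each input with the failure flag.
for name, x in (("capacity", capacity), ("load", load)):
    print(f"corr({name}, failure) = {np.corrcoef(x, failure)[0, 1]:+.2f}")
```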

  9. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
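
    For orientation, the classical P-P plot that these generalize pairs the hypothesized CDF with the empirical CDF at the order statistics; a minimal sketch for the one-sample problem:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
x = np.sort(rng.normal(size=200))               # sample under the null

empirical = np.arange(1, len(x) + 1) / len(x)   # F_n at the order statistics
theoretical = norm.cdf(x)                       # hypothesized F

# Under H0 the points (theoretical, empirical) hug the diagonal;
# their maximum gap is a simple goodness-of-fit summary.
print(f"max |F_n - F| = {np.max(np.abs(theoretical - empirical)):.3f}")
```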

  10. Theoretical determination of gamma spectrometry systems efficiency based on probability functions. Application to self-attenuation correction factors

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Manuel, E-mail: manuel.barrera@uca.es [Escuela Superior de Ingeniería, University of Cadiz, Avda, Universidad de Cadiz 10, 11519 Puerto Real, Cadiz (Spain); Suarez-Llorens, Alfonso [Facultad de Ciencias, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cadiz (Spain); Casas-Ruiz, Melquiades; Alonso, José J.; Vidal, Juan [CEIMAR, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cádiz (Spain)

    2017-05-11

    A generic theoretical methodology for the calculation of the efficiency of gamma spectrometry systems is introduced in this work. The procedure is valid for any type of source and detector and can be applied to determine the full energy peak and the total efficiency of any source-detector system. The methodology is based on the idea of an underlying probability of detection, which describes the physical model for the detection of the gamma radiation in the particular situation studied. This probability depends explicitly on the direction of the gamma radiation, and this dependence allows the development of more realistic and complex models than the traditional models based on point source integration. The probability function employed in practice must reproduce the relevant characteristics of the detection process occurring in the particular situation studied. Once the probability is defined, the efficiency calculations can be performed in general by using numerical methods. The Monte Carlo integration procedure is especially useful for performing the calculations when complex probability functions are used. The methodology can be used for the direct determination of the efficiency and also for the calculation of corrections that require this determination, as is the case for coincidence summing, geometric or self-attenuation corrections. In particular, we have applied the procedure to obtain some of the classical self-attenuation correction factors usually employed to correct for the sample attenuation of cylindrical geometry sources. The methodology clarifies the theoretical basis and approximations associated with each factor, by making explicit the probability which is generally hidden and implicit in each model. It has been shown that most of these self-attenuation correction factors can be derived by using a common underlying probability, with this probability having a growing level of complexity as it reproduces more precisely
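
    The paper's central idea, efficiency as the expectation of a direction-dependent detection probability, reduces in the simplest case to a Monte Carlo integral over emission directions. A toy sketch with an invented cone-shaped detection probability, checked against its analytic value:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

# Toy model: a photon is detected with probability p0 only if emitted within
# a cone of half-angle theta0 (a stand-in for a realistic physical model).
theta0, p0 = np.deg2rad(30.0), 0.8

# Isotropic emission: cos(theta) uniform on [-1, 1].
cos_theta = rng.uniform(-1.0, 1.0, n)
detect_prob = np.where(cos_theta > np.cos(theta0), p0, 0.0)

efficiency_mc = detect_prob.mean()
efficiency_exact = p0 * (1 - np.cos(theta0)) / 2   # cone solid-angle fraction
print(f"MC = {efficiency_mc:.5f}, exact = {efficiency_exact:.5f}")
```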

  11. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  12. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.

  13. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think....... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur....

  14. Optimal Power Allocation Strategy in a Joint Bistatic Radar and Communication System Based on Low Probability of Intercept.

    Science.gov (United States)

    Shi, Chenguang; Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-11-25

    In this paper, we investigate a low probability of intercept (LPI)-based optimal power allocation strategy for a joint bistatic radar and communication system, which is composed of a dedicated transmitter, a radar receiver, and a communication receiver. The joint system is capable of fulfilling the requirements of both radar and communications simultaneously. First, assuming that the signal-to-noise ratio (SNR) corresponding to the target surveillance path is much weaker than that corresponding to the line-of-sight path at the radar receiver, the analytical closed-form expression for the probability of false alarm is calculated, whereas the closed-form expression for the probability of detection is not analytically tractable and is approximated, due to the fact that the received signals are not zero-mean Gaussian under the target-presence hypothesis. Then, an LPI-based optimal power allocation strategy is presented to minimize the total transmission power for the information signal and the radar waveform, constrained by a specified information rate for the communication receiver and the desired probabilities of detection and false alarm for the radar receiver. The well-known bisection search method is employed to solve the resulting constrained optimization problem. Finally, numerical simulations are provided to reveal the effects of several system parameters on the power allocation results. It is also demonstrated that the LPI performance of the joint bistatic radar and communication system can be markedly improved by utilizing the proposed scheme.
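
    The bisection step works because the detection probability is monotone in the transmit power. A minimal sketch, using the textbook Pd = Pfa^(1/(1+SNR)) relation for a Rayleigh-fluctuating target as a stand-in for the paper's approximated expression:

```python
def pd(power, noise=1.0, gain=0.5, pfa=1e-6):
    """Toy detection probability, monotone increasing in transmit power."""
    snr = gain * power / noise
    return pfa ** (1.0 / (1.0 + snr))

def min_power_bisection(pd_target, lo=0.0, hi=1e3, tol=1e-6):
    """Smallest power with pd(power) >= pd_target, found by bisection."""
    assert pd(hi) >= pd_target, "upper bracket too small"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pd(mid) >= pd_target:
            hi = mid
        else:
            lo = mid
    return hi

print(f"minimum power for Pd >= 0.9: {min_power_bisection(0.9):.4f}")
```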

  15. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
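
    The binomial reasoning behind such a design fits in a few lines: if tracers are present at concentration c per kernel and a sample contains n kernels, the probability the sample captures at least one tracer is 1 - (1 - c)^n. Illustrative numbers:

```python
c = 1e-5  # hypothetical: one tracer per 100,000 kernels
for n in (10_000, 50_000, 100_000, 300_000):
    p_at_least_one = 1 - (1 - c) ** n
    print(f"sample of {n:>7} kernels -> P(>= 1 tracer) = {p_at_least_one:.3f}")
```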

  16. An intelligent system based on fuzzy probabilities for medical diagnosis – a study in aphasia diagnosis

    Directory of Open Access Journals (Sweden)

    Majid Moshtagh Khorasani

    2009-04-01

    • BACKGROUND: Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease.
    • METHODS: Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features.
    • RESULTS: Considering the high sensitivity of performance measures to different distributions of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as the author’s earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis.
    • CONCLUSIONS: The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer aphasia features.
    • KEYWORDS: Aphasia, fuzzy probability, fuzzy logic, medical diagnosis, fuzzy rules.

  17. Simulation of n-qubit quantum systems. IV. Parametrizations of quantum states, matrices and probability distributions

    Science.gov (United States)

    Radtke, T.; Fritzsche, S.

    2008-11-01

    … quantum information science has contributed to our understanding of quantum mechanics and has also provided new and efficient protocols, based on the use of entangled quantum states. To determine the behavior and entanglement of n-qubit quantum registers, symbolic and numerical simulations need to be applied in order to analyze how these quantum information protocols work and which role the entanglement plays hereby. Solution method: Using the computer algebra system Maple, we have developed a set of procedures that support the definition, manipulation and analysis of n-qubit quantum registers. These procedures also help to deal with (unitary) logic gates and (nonunitary) quantum operations that act upon the quantum registers. With the parameterization of various frequently-applied objects that are implemented in the present version, the program now facilitates a wider range of symbolic and numerical studies. All commands can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems, both in ideal and noisy quantum circuits. Reasons for new version: In the first version of the FEYNMAN program [1], we implemented the data structures and tools that are necessary to create, manipulate and analyze the state of quantum registers. Later [2,3], support was added to deal with quantum operations (noisy channels) as an ingredient which is essential for studying the effects of decoherence. With the present extension, we add a number of parametrizations of objects frequently utilized in decoherence and entanglement studies, such as Hermitian and unitary matrices, probability distributions, or various kinds of quantum states. This extension therefore provides the basis, for example, for the optimization of a given function over the set of pure states or the simple generation of random objects. Running time: Most commands that act upon quantum registers with five or fewer qubits take ⩽10 seconds of processor time on a Pentium 4 processor

  18. An intelligent system based on fuzzy probabilities for medical diagnosis- a study in aphasia diagnosis.

    Science.gov (United States)

    Moshtagh-Khorasani, Majid; Akbarzadeh-T, Mohammad-R; Jahangiri, Nader; Khoobdel, Mehdi

    2009-03-01

    Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease. Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features. Considering the high sensitivity of performance measures to different distributions of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as the author's earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis. The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer aphasia features.

  19. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to the traditional vibration...... method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method...... of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting early deterioration condition and in tracking their further deterioration for the critical components....

  20. Detection probability of least tern and piping plover chicks in a large river system

    Science.gov (United States)

    Roche, Erin A.; Shaffer, Terry L.; Anteau, Michael J.; Sherfy, Mark H.; Stucker, Jennifer H.; Wiltermuth, Mark T.; Dovichin, Colin M.

    2014-01-01

    Monitoring the abundance and stability of populations of conservation concern is often complicated by an inability to perfectly detect all members of the population. Mark-recapture offers a flexible framework in which one may identify factors contributing to imperfect detection, while at the same time estimating demographic parameters such as abundance or survival. We individually color-marked, recaptured, and re-sighted 1,635 federally listed interior least tern (Sternula antillarum; endangered) chicks and 1,318 piping plover (Charadrius melodus; threatened) chicks from 2006 to 2009 at 4 study areas along the Missouri River and investigated effects of observer-, subject-, and site-level covariates suspected of influencing detection. Increasing the time spent searching and crew size increased the probability of detecting both species regardless of study area and detection methods were not associated with decreased survival. However, associations between detection probability and the investigated covariates were highly variable by study area and species combinations, indicating that a universal mark-recapture design may not be appropriate.

  1. Probability theory

    CERN Document Server

Varadhan, S. R. S.

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  2. An H-infinity Fault Detection and Diagnosis Scheme for Discrete Nonlinear Systems Using Output Probability Density Estimation

    International Nuclear Information System (INIS)

    Zhang Yumin; Lum, Kai-Yew; Wang Qingguo

    2009-01-01

    In this paper, an H-infinity fault detection and diagnosis (FDD) scheme for a class of discrete nonlinear systems, based on output probability density estimation, is presented. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process and its square root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model including nonlinearities and uncertainties. A weighted mean value is given as an integral of the square root PDF along the space direction, which leads to a function of time only that can be used to construct the residual signal. Thus, the classical nonlinear filter approach can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is further investigated to estimate the fault. A simulation example is given to demonstrate the effectiveness of the proposed approaches.

  3. Robust Guaranteed Cost Observer Design for Singular Markovian Jump Time-Delay Systems with Generally Incomplete Transition Probability

    Directory of Open Access Journals (Sweden)

    Yanbo Li

    2014-01-01

    This paper is devoted to the design of a robust guaranteed cost observer for a class of linear singular Markovian jump time-delay systems with generally incomplete transition probabilities. In this singular model, each transition rate can be completely unknown, or only its estimated value may be known. Based on the stability theory of stochastic differential equations and the linear matrix inequality (LMI) technique, we design an observer to ensure that, for all uncertainties, the resulting augmented system is regular, impulse free, and robustly stochastically stable with the proposed guaranteed cost performance. Finally, a convex optimization problem with LMI constraints is formulated to design the suboptimal guaranteed cost filters for linear singular Markovian jump time-delay systems with generally incomplete transition probabilities.

  4. Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Windus, Alastair; Jensen, Henrik J [The Institute for Mathematical Sciences, 53 Prince's Gate, South Kensington, London SW7 2PG (United Kingdom)], E-mail: h.jensen@imperial.ac.uk

    2008-11-15

    We consider a reaction-diffusion model incorporating the reactions A → ∅, A → 2A and 2A → 3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or a first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship between the survival probability and initial cluster size at the tricritical point only.

  5. Classical analogues of a quantum system in spatial and temporal domains: A probability amplitude approach

    Directory of Open Access Journals (Sweden)

    Pradipta Panchadhyayee

    2016-12-01

    We have simulated analogous features of well-known classical phenomena in the quantum domain under the formalism of the probability amplitude method. The identical pattern of interference fringes of a Fabry–Perot interferometer (especially in reflection mode) is obtained through the power-broadened spectral line shape of the population distribution in the excited state, with careful delineation of a coherently driven two-level atomic model. In a unit wavelength domain, such a pattern can be substantially modified by controlling the typical spatial field arrangement in one and two dimensions, which is found to be complementary to the findings of recent research on atom localization in the sub-wavelength domain. The spatial dependence of the temporal dynamics has also been studied at a particular condition, which is equivalent to what could be obtained under Raman–Nath diffraction controlled by a spatial phase.

  6. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the reactor; the protection system automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor in the, though low-probability, ATWS scenario. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for the instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed.

  7. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. The work consists of a statistical trend analysis of valve failure probability in a failure-to-open/close mode, as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. Both time-dependent and time-independent terms were considered in the failure probability. The linear aging model was modified and applied to the time-dependent part: in this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand, respectively. Because a sufficient statistical population was available, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. From these data, the model parameters were statistically estimated to quantify the valve failure probability in a failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)
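
    A hedged reading of that failure model in formula form (the paper's exact parametrization may differ): with t the time since installation and τ the time since the last open/close demand,

```latex
p_{\text{fail}}(t,\tau) \;\approx\; \lambda_0 \;+\; \alpha\, t \;+\; \beta\, \tau ,
```

    where λ0 is the time-independent term and α, β are the two aging coefficients estimated, with uncertainty, from the CORDS field data.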

  8. Screening for retinitis in children with probable systemic cytomegalovirus infection at Tygerberg Hospital, Cape Town, South Africa

    Directory of Open Access Journals (Sweden)

    Johan Engelbrecht

    2017-07-01

    Background. The incidence of immunocompromised children with probable systemic cytomegalovirus (CMV) infection is increasing. Currently, there is no protocol for screening children for CMV retinitis in South Africa. Screening for CMV retinitis may prevent permanent visual impairment. Objectives. To determine the prevalence of retinitis in children with probable systemic CMV infection, and to assess the value of clinical and laboratory data in identifying risk factors for the development of CMV retinitis in children. Methods. A retrospective, cross-sectional study design was used. All children (≤12 years) with probable systemic CMV infection who underwent ophthalmic screening over a 5-year period were included. Presumed CMV retinitis was diagnosed by dilated fundoscopy. All cases were evaluated to identify possible risk factors for the development of CMV retinitis. Results. A total of 164 children were screened. Presumed CMV retinitis was diagnosed in 4.9% of participants. Causes of immunosuppression were HIV infection (n=7) and chemotherapy (n=1). HIV infection showed a definite trend towards association with the development of CMV retinitis in our study population (p=0.064). Conclusion. The prevalence of CMV retinitis was 4.9% in our sample. Other than HIV, we were not able to identify additional risk factors for CMV retinitis. Our results show that CD4 levels are possibly not a reliable indicator for predicting CMV retinitis.

  9. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Cameroon. Draft

    International Nuclear Information System (INIS)

    Trey, Michel de; Leney, George W.

    1983-05-01

    Geology Branch with mapping and exploration services. Most of CAMEROON is underlain by Precambrian granites, anatexites, migmatites, gneisses, and schists. The NTEM complex in the south is of Archean age. Most other areas are probably Lower Proterozoic, with some Middle Proterozoic inliers of the LOM, POLI, and MBALMAYO-BENGBIS series, and the DJA series in the southeast. The DJA tillite is probably Upper Proterozoic. There are small remnant basins of Paleozoic volcanics, and larger basins and grabens of Middle-Upper Cretaceous age. Tertiary rocks are present in the coastal basins and the southeast. The Lake Chad depression is covered with Quaternary sands. All of the older rocks except the NTEM complex and the DJA series are within the Panafrican Mobile Belt. The border faults of the Cretaceous BENOUE graben are throughgoing crustal features. Tertiary intrusives and plateau basalts are found along the 'LINE OF CAMEROON', and volcanism has continued to the present. Uranium exploration from 1950 to the present has produced mostly negative results. Only one important prospect has been found, at GOBLE-KITONGO, and it has been the subject of repeated exploration from 1958 to the present. An enriched zone in syenite at LOLODORF is also under investigation. Anomalies on an airborne survey of the DJA series are being checked for mineralization similar to the FRANCEVILLIAN of GABON. Total expenditures that may be assigned to uranium are on the order of $10 million (U.S.). There are no uranium ore reserves or reasonably assured resources in CAMEROON. Speculative resources of 10,000 tonnes have been assigned to the LOLODORF syenite, and 5,000 tonnes to the GOBLE-KITONGO prospect and Paleozoic volcanics, at $130/kg (U.S.). Favorable areas are the DJA series, syenites of the Panafrican granite and 'granites ultimes', Mesozoic-Cenozoic basins (especially DOUALA), the LOM and MBALMAYO-BENGBIS series, POLI-MAROUA, and Lower Proterozoic/Archean gneiss and granite. Recommendations have been made for topical studies of

  10. A comparison of error bounds for a nonlinear tracking system with detection probability Pd < 1.

    Science.gov (United States)

    Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin

    2012-12-14

    Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filters when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and the enumeration method (ENUM) PCRLB, are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, while it is tighter than the IRF PCRLB, when the target exists from the beginning to the end. When the disappearance of existing targets and the appearance of new targets are considered, the RFS bound is tighter than both the IRF PCRLB and the ENUM PCRLB over time, because it incorporates the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds.

  11. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  12. Cooperative AF Relaying in Spectrum-Sharing Systems: Outage Probability Analysis under Co-Channel Interferences and Relay Selection

    KAUST Repository

    Xia, Minghua

    2012-11-01

    For cooperative amplify-and-forward (AF) relaying in spectrum-sharing wireless systems, secondary users share spectrum resources originally licensed to primary users to communicate with each other and, thus, the transmit power of secondary transmitters is strictly limited by the tolerable interference powers at primary receivers. Furthermore, the received signals at a relay and at a secondary receiver are inevitably interfered by the signals from primary transmitters. These co-channel interferences (CCIs) from concurrent primary transmission can significantly degrade the performance of secondary transmission. This paper studies the effect of CCIs on outage probability of the secondary link in a spectrum-sharing environment. In particular, in order to compensate the performance loss due to CCIs, the transmit powers of a secondary transmitter and its relaying node are respectively optimized with respect to both the tolerable interference powers at the primary receivers and the CCIs from the primary transmitters. Moreover, when multiple relays are available, the technique of opportunistic relay selection is exploited to further improve system performance with low implementation complexity. By analyzing lower and upper bounds on the outage probability of the secondary system, this study reveals that it is the tolerable interference powers at primary receivers that dominate the system performance, rather than the CCIs from primary transmitters. System designers will benefit from this result in planning and designing next-generation broadband spectrum-sharing systems.

  13. Mathematical theory of nonequilibrium steady states on the frontier of probability and dynamical systems

    CERN Document Server

    Jiang, Da-Quan; Qian, Min-Ping

    2004-01-01

    This volume provides a systematic mathematical exposition of the conceptual problems of nonequilibrium statistical physics, such as entropy production, irreversibility, and ordered phenomena. Markov chains, diffusion processes, and hyperbolic dynamical systems are used as mathematical models of physical systems. A measure-theoretic definition of entropy production rate and its formulae in various cases are given. It vanishes if and only if the stationary system is reversible and in equilibrium. Moreover, in the cases of Markov chains and diffusion processes on manifolds, it can be expressed in terms of circulations on directed cycles. Regarding entropy production fluctuations, the Gallavotti-Cohen fluctuation theorem is rigorously proved.

  14. Robust ℋ∞ filtering of Markovian jump stochastic systems with uncertain transition probabilities

    Science.gov (United States)

    Yao, Xiuming; Wu, Ligang; Zheng, Wei Xing; Wang, Changhong

    2011-07-01

    This article investigates the problem of robust ℋ∞ filtering for a class of uncertain Markovian stochastic systems. The system under consideration not only contains Itô-type stochastic disturbances and time-varying delays, but also involves uncertainties both in the system matrices and in the mode transition rate matrix. Our aim is to design an ℋ∞ filter such that, for all admissible parameter uncertainties and time-delays, the filtering error system can be guaranteed to be robustly stochastically stable, and achieve a prescribed ℋ∞ disturbance rejection attenuation level. By constructing a proper stochastic Lyapunov-Krasovskii functional and employing the free-weighting matrix technique, sufficient conditions for the existence of the desired filters are established in terms of linear matrix inequalities, which can be readily solved by standard numerical software. Finally, a numerical example is provided to show the utility of the developed approaches.

  15. A general theorem on the transition probabilities of a quantum mechanical system with spatial degeneracy

    NARCIS (Netherlands)

    Tolhoek, H.A.; Groot, S.R. de

    1949-01-01

    In the general case of a quantum mechanical system with a Hamiltonian that is invariant under rotations, spatial degeneracy will exist. The initial state must therefore be characterized not only by the energy but also by, e.g., the magnetic quantum number. Both for emission of light and electrons plus neutrinos

  16. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    The author argues that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. For this reason, the paper underlines the necessity of developing a methodology for computer-based learning of probability theory and stochastic processes for pre-service engineers. The results of the experimental study analysing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expanding range of purposes for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of the pre-service engineers is determined at these stages. For this purpose, the methodology for testing students' learning motivation in the chosen specialty is presented. An increase in the intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  17. International Uranium Resources Evaluation Project (IUREP) orientation phase mission report: Madagascar. September-October 1981

    International Nuclear Information System (INIS)

    Meyer, John H.; Brinck, Johan W.

    1981-01-01

    This study, resulting from the IUREP Orientation Mission to Madagascar, includes the reported information on infrastructure, mining regulations and conditions made available to the Mission. Within the structure of the centrally planned economic system, uranium exploration and mining is considered the exclusive activity of OMNIS, an organization founded by the State for that purpose (Office Militaire National pour les Industries Strategiques). Madagascar has a long history of prospection and small-scale exploitation of uranium (thorium and radium). Some of this activity dates back to 1909, culminating in significant production of both uranium and thorium (in excess of 5900 tonnes of uranothorianite) by the CEA and private contractors in the Fort Dauphin area from 1955 to 1968. Past exploration and development work in a number of areas, notably by the CEA, OMNIS and the IAEA/UNDP, is reviewed and the uranium resources and mineral indications reported. The areas rated at present as the more important and which continue to be investigated (by OMNIS, in conjunction with IAEA/UNDP projects) in the order of priority are: the Fort Dauphin area, the Karroo formation and the Neogene lacustrine basin at Antsirabe. The Mission estimates that Madagascar has a moderate potential for undiscovered resources; it is estimated that such speculative resources could lie within the range of 4000 - 38000 tonnes U. In addition there are areas with as yet untested environments and with no known occurrences which may be favourable but which will require prospection. Modifications to existing programmes and new programmes are suggested. Policy alternatives are reviewed

  18. Study on system modeling of anti-EMP survival probability assessment on the basis of sample space sorting method

    International Nuclear Information System (INIS)

    Wu Qiang; Cao Leituan; Fu Jiwei; Wang Jingshu; Chen Xi

    2014-01-01

    Subject to the limitations of funding and test conditions, the number of sub-samples in tests of complex large systems is often small; sometimes only one sub-sample is available. Making an accurate evaluation of performance and reinforcement from such limited test conditions is important for complex systems. This paper presents a single-sample performance assessment model for complex large systems. The inputs of the model are the test results obtained under strict test-magnitude conditions with a total of N pulses, together with statistical characteristics of typical components that can be obtained by experiment. The output of the model is an ordered lower confidence bound on the survival probability under single-pulse conditions. (authors)
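
    The output described above is essentially a one-sided lower confidence bound on a binomial survival probability. As a minimal stand-in for the paper's sample-space sorting model, the sketch below computes the exact Clopper-Pearson lower bound for k survivals in n pulses.

        from scipy.stats import beta

        # One-sided lower confidence bound at level 1 - alpha: the alpha
        # quantile of Beta(k, n - k + 1); it is 0 when no pulse survives.
        def survival_lower_bound(k, n, alpha=0.10):
            if k == 0:
                return 0.0
            return beta.ppf(alpha, k, n - k + 1)

        # e.g. all N = 10 pulses survive: 90% lower bound on survival probability
        print(survival_lower_bound(10, 10))   # ~0.794, i.e. alpha**(1/n)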

  19. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer from drastic reductions of maximum allowable stress values during the design phase, as well as from costly recurrent inspections during the life cycle, which do not permit their structural and economic potential to be fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate eventual flaws (an SHM system) can be considered, once its effectiveness and reliability have been statistically demonstrated via a rigorous definition and evaluation of the Probability Of Detection (POD) function. This paper presents an experimental approach with a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system aimed at delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as the detection capability of the system. Numerical simulation cannot substitute for the part of the experimental POD tests in which the noise in the system response is crucial. Results of the experiments are presented and analyzed in the paper.
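
    As a rough companion to the statistical procedure described above, the sketch below fits a hit/miss POD curve by logistic regression on simulated flaw-size/detection data and extracts a90, the flaw size detected with 90% probability. The data and model form are illustrative assumptions, not the paper's measurements.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        a = rng.uniform(1, 30, 200)                   # delamination size, mm
        p_true = 1 / (1 + np.exp(-(np.log(a) - 2.0) / 0.3))
        hits = rng.random(200) < p_true               # detection outcomes

        # POD(a) = logistic(b0 + b1 * log a); solve POD(a90) = 0.9
        lr = LogisticRegression(C=1e6).fit(np.log(a).reshape(-1, 1), hits)
        b1, b0 = lr.coef_[0, 0], lr.intercept_[0]
        a90 = np.exp((np.log(9.0) - b0) / b1)         # logit(0.9) = log 9
        print(f"a90 ~= {a90:.1f} mm")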

  20. Active control of probability amplitudes in a mesoscale system via feedback-induced suppression of dissipation and noise

    Science.gov (United States)

    Gupta, Chaitanya; Peña Perez, Aldo; Fischer, Sean R.; Weinreich, Stephen B.; Murmann, Boris; Howe, Roger T.

    2016-12-01

    We demonstrate that a three-terminal potentiostat circuit reduces the coupling between an electronic excitation transfer (EET) system and its environment by applying a low-noise voltage to its electrical terminals. Inter-state interference is preserved in the EET system by attenuating the dissipation in the quantum system arising from coupling to the surrounding thermodynamic bath. A classical equivalent circuit is introduced to model the environment-coupled excitation transfer for a simplified, two-state system. This model provides qualitative insight into how the electronic feedback affects the transition probabilities and selectively reduces dissipative coupling for one of the participating energy levels of the EET system. Furthermore, we show that the negative feedback also constrains r.m.s. fluctuations of the energy of environmental vibrational states, resulting in persistent spectral coherence between the decoupled state and vibronic levels of the complementary state. The decoupled vibronic channel can therefore serve as a probe for characterizing the vibronic structure of the complementary channel of the EET system.

  1. Three-dimensional radiotherapy planning system for esophageal tumors: comparison of treatment techniques and analysis of probability of complications

    International Nuclear Information System (INIS)

    Justino, Pitagoras Baskara; Carvalho, Heloisa de Andrade; Ferauche, Debora; Ros, Renato

    2003-01-01

    Radiotherapy techniques for esophageal cancer were compared using a three-dimensional planning system. We studied the following treatment techniques for a patient with squamous cell carcinoma of the middle third of the esophagus: two antero-posterior and two latero-lateral parallel opposed fields, three fields ('Y' and 'T'), and four fields ('X'). Dose-volume histograms were obtained considering the spinal cord and lungs as organs at risk. Doses in these organs were compared in terms of the Normal Tissue Complication Probability (NTCP) and Tumor Control Probability (TCP). When only the lungs were considered, the best technique was two antero-posterior parallel opposed fields. The spinal cord was best protected using latero-lateral fields. We suggest the combination of at least two treatment techniques: antero-posterior fields with the 'Y' or 'T' technique, or latero-lateral fields, in order to balance the doses in the lungs and the spinal cord. Another option may be the use of any of the three-field techniques during the whole treatment. (author)
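
    For readers unfamiliar with NTCP, the sketch below evaluates the standard Lyman-Kutcher-Burman (LKB) model on a toy dose-volume histogram; the parameter values are illustrative and are not those used in the study.

        import numpy as np
        from scipy.stats import norm

        def ntcp_lkb(doses, volumes, n, m, td50):
            """doses in Gy; volumes are fractions of the organ summing to 1."""
            geud = np.sum(volumes * doses ** (1.0 / n)) ** n   # generalized EUD
            return norm.cdf((geud - td50) / (m * td50))

        # toy lung DVH: 30% of volume at 20 Gy, 20% at 10 Gy, 50% spared
        print(ntcp_lkb(np.array([20.0, 10.0, 0.0]),
                       np.array([0.3, 0.2, 0.5]),
                       n=1.0, m=0.35, td50=24.5))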

  2. A computer-aided diagnosis system for prediction of the probability of malignancy of breast masses on ultrasound images

    Science.gov (United States)

    Cui, Jing; Sahiner, Berkman; Chan, Heang-ping; Shi, Jiazheng; Nees, Alexis; Paramagul, Chintana; Hadjiiski, Lubomir M.

    2009-02-01

    A computer-aided diagnosis (CADx) system with the ability to predict the probability of malignancy (PM) of a mass can potentially assist radiologists in making correct diagnostic decisions. In this study, we designed a CADx system using logistic regression (LR) as the feature classifier which could estimate the PM of a mass. Our data set included 488 ultrasound (US) images from 250 biopsy-proven breast masses (100 malignant and 150 benign). The data set was divided into two subsets T1 and T2. Two experienced radiologists, R1 and R2, independently provided Breast Imaging Reporting and Data System (BI-RADS) assessments and PM ratings for data subsets T2 and T1, respectively. An LR classifier was designed to estimate the PM of a mass using two-fold cross validation, in which the data subsets T1 and T2 served once as the training and once as the test set. To evaluate the performance of the system, we compared the PM estimated by the CADx system with radiologists' PM ratings (12-point scale) and BI-RADS assessments (6-point scale). The correlation coefficients between the PM ratings estimated by the radiologists and by the CADx system were 0.71 and 0.72 for data subsets T1 and T2, respectively. For the BI-RADS assessments provided by the radiologists and estimated by the CADx system, the correlation coefficients were 0.60 and 0.67 for data subsets T1 and T2, respectively. Our results indicate that the CADx system may be able to provide not only a malignancy score, but also a more quantitative estimate for the PM of a breast mass.

  3. A novel unified expression for the capacity and bit error probability of wireless communication systems over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-07-01

    The average bit error probability (ABEP) and the average capacity (AC) of wireless communication systems over generalized fading channels have in past years been analyzed separately. This paper introduces a novel moment generating function (MGF)-based unified expression for the ABEP and AC of single and multiple link communications with maximal ratio combining. In addition, this paper proposes the hyper-Fox's H fading model as a unified fading distribution covering a majority of the well-known generalized fading environments. As such, the authors offer a generic unified performance expression that can be easily calculated and that is applicable to a wide variety of fading scenarios. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of the authors' newly derived results. © 1972-2012 IEEE.
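
    The sketch below illustrates the MGF approach in its simplest special case, BPSK over a single Rayleigh branch, where the numerical integral can be checked against the known closed form; the hyper-Fox's H model proposed in the paper is far more general.

        import numpy as np
        from scipy.integrate import quad

        # ABEP = (1/pi) * integral over (0, pi/2) of M(-1/sin^2 theta) dtheta,
        # with M(s) = 1/(1 - s*gbar) the MGF of Rayleigh-fading SNR.
        gbar = 10.0                                # average SNR (linear)
        mgf = lambda s: 1.0 / (1.0 - s * gbar)
        abep, _ = quad(lambda th: mgf(-1.0 / np.sin(th) ** 2) / np.pi,
                       1e-9, np.pi / 2)
        closed = 0.5 * (1 - np.sqrt(gbar / (1 + gbar)))
        print(abep, closed)                        # both ~2.33e-2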

  4. Physical Constructivism and Quantum Probability

    Science.gov (United States)

    Ozhigov, Yu. I.

    2009-03-01

    I describe the main ideas of constructive physics and its role for the probability interpretation of quantum theory. It is shown how the explicit probability space for quantum systems gives the formal representation of entanglement and decoherence.

  5. Quantitative assessment of probability of failing safely for the safety instrumented system using reliability block diagram method

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Zhao, Shoutang; Hu, Bin

    2015-01-01

    Highlights: • Models of PFS for SIS were established using reliability block diagrams. • A more accurate calculation of PFS for SIS can be acquired by using the SL. • Degraded operation of a complex SIS does not affect the availability of the SIS. • The safe undetected failure is the largest contributor to the PFS of SIS. - Abstract: Spurious trips of a safety instrumented system (SIS) bring great economic losses to production, so ensuring that the SIS is both reliable and available has been put on the agenda. But the existing models of spurious trip rate (STR) or probability of failing safely (PFS) are oversimplified and inaccurate; in-depth studies of availability are required to obtain a more accurate PFS for the SIS. Based on an analysis of the factors that influence the PFS for the SIS, a quantitative study of the PFS for the SIS is carried out using the reliability block diagram (RBD) method, and some application examples are given. The results show that common cause failure increases the PFS; degraded operation does not affect the availability of the SIS; if the equipment is tested and repaired one by one, the unavailability of the SIS can be ignored; the occurrence time of an independent safe undetected failure should be the system lifecycle (SL) rather than the proof test interval; and the independent safe undetected failure is the largest contributor to the PFS for the SIS
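
    A minimal illustration of an RBD-style PFS calculation is sketched below; the 1oo2 architecture, rate and lifecycle are assumptions for the example, not values from the paper. For a de-energize-to-trip 1oo2 group, a safe failure of either channel trips the system, so the channels act like series blocks with respect to spurious tripping.

        # per-channel probability of a safe undetected failure over the
        # system lifecycle (SL), using the observation above that SL is
        # the relevant exposure time (illustrative numbers)
        lambda_s = 1e-6            # safe undetected failure rate per hour
        sl_hours = 10 * 8760       # 10-year system lifecycle
        p = lambda_s * sl_hours

        def p_any(ps):             # probability at least one block fails safely
            prod = 1.0
            for q in ps:
                prod *= 1.0 - q
            return 1.0 - prod

        print(f"PFS of 1oo2 group over lifecycle: {p_any([p, p]):.3f}")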

  6. Drug reaction with eosinophilia and systemic symptoms syndrome probably induced by a lamotrigine-ginseng drug interaction.

    Science.gov (United States)

    Myers, Amy P; Watson, Troy A; Strock, Steven B

    2015-03-01

    The likelihood of a drug reaction with lamotrigine is increased by dose escalation that is too rapid or by drug interactions that increase the concentration of lamotrigine. There is a well-documented interaction between valproic acid and lamotrigine in which lamotrigine levels are increased, subsequently increasing the risk of drug reaction with eosinophilia and systemic symptoms (DRESS) syndrome. This syndrome is characterized by fever, lymphadenopathy, diffuse maculopapular rash, multivisceral involvement, eosinophilia, and atypical lymphocytes, and has a mortality rate of 10-40%. We describe the first case, to our knowledge, of DRESS syndrome probably induced by a drug interaction between lamotrigine and ginseng. A 44-year-old white man presented to the emergency department after experiencing a possible seizure. His medical history included two other events concerning for seizures, at ages 14 and 29 years. After referral to the neurology clinic, he was diagnosed with generalized tonic-clonic seizure disorder, and lamotrigine was started with up-titration according to the drug's package insert to a goal dosage of 150 mg twice/day. The patient had also been taking deer antler velvet and ginseng, which he continued during his lamotrigine therapy. On day 43 of therapy, the patient presented to the emergency department with a pruritic rash that had started on his extremities and spread to his torso. He was thought to have experienced a drug reaction to lamotrigine, and the drug was discontinued. Thirteen days later, the patient was admitted from the acute care clinic for inpatient observation due to laboratory abnormalities in the setting of continued rash, headache, and myalgias. His admission laboratory results on that day were remarkable for leukocytosis, with a white blood cell count up to 17.6 × 10³/mm³ and a prominent eosinophilia of 3.04 × 10³/mm³; his liver enzyme levels were also elevated, with an aspartate

  7. Comparative analysis of three drug-drug interaction screening systems against probable clinically relevant drug-drug interactions: a prospective cohort study.

    Science.gov (United States)

    Muhič, Neža; Mrhar, Ales; Brvar, Miran

    2017-07-01

    Drug-drug interaction (DDI) screening systems report potential DDIs. This study aimed to find the prevalence of probable DDI-related adverse drug reactions (ADRs) and to compare the clinical usefulness of different DDI screening systems in preventing or warning against these ADRs. A prospective cohort study was conducted in patients urgently admitted to medical departments. Potential DDIs were checked using Complete Drug Interaction®, Lexicomp® Online™, and Drug Interaction Checker®. The study team identified the patients with probable clinically relevant DDI-related ADRs on admission, the causality of which was assessed using the Drug Interaction Probability Scale (DIPS). The sensitivity, specificity, and positive and negative predictive values of the screening systems in preventing or warning against probable DDI-related ADRs were evaluated. Overall, 50 probable clinically relevant DDI-related ADRs were found in 37 out of 795 included patients taking at least two drugs; the most common of them were bleeding, hyperkalemia, digitalis toxicity, and hypotension. Complete Drug Interaction showed the best sensitivity (0.76) for actual DDI-related ADRs, followed by Lexicomp Online (0.50) and Drug Interaction Checker (0.40). Complete Drug Interaction and Drug Interaction Checker had positive predictive values of 0.07; Lexicomp Online had 0.04. We found no difference in specificity or negative predictive values among these systems. DDI screening systems differ significantly in their ability to detect probable clinically relevant DDI-related ADRs in terms of sensitivity and positive predictive value.

  8. An analysis of the annual probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP)

    Energy Technology Data Exchange (ETDEWEB)

    Greenfield, M.A. [Univ. of California, Los Angeles, CA (United States); Sargent, T.J.

    1995-11-01

    The Environmental Evaluation Group (EEG) previously analyzed the probability of a catastrophic accident in the waste hoist of the Waste Isolation Pilot Plant (WIPP) and published the results in Greenfield (1990; EEG-44) and Greenfield and Sargent (1993; EEG-53). The most significant safety element in the waste hoist is the hydraulic brake system, whose possible failure was identified in these studies as the most important contributor in accident scenarios. Westinghouse Electric Corporation, Waste Isolation Division has calculated the probability of an accident involving the brake system based on studies utilizing extensive fault tree analyses. This analysis conducted for the U.S. Department of Energy (DOE) used point estimates to describe the probability of failure and includes failure rates for the various components comprising the brake system. An additional controlling factor in the DOE calculations is the mode of operation of the brake system. This factor enters for the following reason. The basic failure rate per annum of any individual element is called the Event Probability (EP), and is expressed as the probability of failure per annum. The EP in turn is the product of two factors. One is the "reported" failure rate, usually expressed as the probability of failure per hour, and the other is the expected number of hours that the element is in use, called the "mission time". In many instances the "mission time" will be the number of operating hours of the brake system per annum. However, since the operation of the waste hoist system includes regular "reoperational check" tests, the "mission time" for standby components is reduced in accordance with the specifics of the operational time table.
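
    A minimal numeric sketch of the EP calculation just described, with assumed illustrative rates and exposure hours:

        # EP = (failure rate per hour) x (mission time in hours per annum);
        # standby components exercised only during reoperational check tests
        # have a correspondingly reduced mission time.
        rate_per_hour = 2.0e-6
        operating_hours = 2000.0   # brake system operating hours per annum
        standby_hours = 50.0       # exposure of a standby component

        print(rate_per_hour * operating_hours)   # EP, operating: 4.0e-3
        print(rate_per_hour * standby_hours)     # EP, standby:   1.0e-4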

  9. An analysis of the annual probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP)

    International Nuclear Information System (INIS)

    Greenfield, M.A.; Sargent, T.J.

    1995-11-01

    The Environmental Evaluation Group (EEG) previously analyzed the probability of a catastrophic accident in the waste hoist of the Waste Isolation Pilot Plant (WIPP) and published the results in Greenfield (1990; EEG-44) and Greenfield and Sargent (1993; EEG-53). The most significant safety element in the waste hoist is the hydraulic brake system, whose possible failure was identified in these studies as the most important contributor in accident scenarios. Westinghouse Electric Corporation, Waste Isolation Division has calculated the probability of an accident involving the brake system based on studies utilizing extensive fault tree analyses. This analysis conducted for the U.S. Department of Energy (DOE) used point estimates to describe the probability of failure and includes failure rates for the various components comprising the brake system. An additional controlling factor in the DOE calculations is the mode of operation of the brake system. This factor enters for the following reason. The basic failure rate per annum of any individual element is called the Event Probability (EP), and is expressed as the probability of failure per annum. The EP in turn is the product of two factors. One is the "reported" failure rate, usually expressed as the probability of failure per hour, and the other is the expected number of hours that the element is in use, called the "mission time". In many instances the "mission time" will be the number of operating hours of the brake system per annum. However, since the operation of the waste hoist system includes regular "reoperational check" tests, the "mission time" for standby components is reduced in accordance with the specifics of the operational time table.

  10. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  11. Uncommon evolution of probable central nervous system histoplasmosis: from leptomeningitis to posterior fossa granuloma. A case report with magnetic resonance images

    International Nuclear Information System (INIS)

    Carrilho, Paulo Eduardo Mestrinelli; Alves, Orival

    2006-01-01

    We report a case of a young immunocompetent patient with probable central nervous system histoplasmosis with peculiar evolutive findings seen on magnetic resonance imaging. Leptomeningeal thickening was initially observed, which subsequently became a posterior fossa granuloma. The diagnosis of fungal infection was only reached by histopathological study, and the treatment was based on long-term therapy with fluconazole with good initial response. (author)

  12. Probability of Detection (POD) Analysis for the Advanced Retirement for Cause (RFC)/Engine Structural Integrity Program (ENSIP) Nondestructive Evaluation (NDE) System-Volume 3: Material Correlation Study

    National Research Council Canada - National Science Library

    Berens, Alan

    2000-01-01

    .... Volume 1 presents a description of changes made to the probability of detection (POD) analysis program of Mil-STD-1823 and the statistical evaluation of modifications that were made to version 3 of the Eddy Current Inspection System (ECIS v3...

  13. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
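
    The classic instance of such a superposition is the Student-t distribution, obtained by smearing the Gaussian variance with an inverse-gamma density; the sketch below checks this numerically (the choice of smearing distribution is ours, for illustration).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        nu = 3.0
        # v ~ InvGamma(nu/2, scale nu/2)  =>  sqrt(v) * N(0,1) ~ Student-t(nu)
        v = stats.invgamma.rvs(nu / 2, scale=nu / 2, size=100_000,
                               random_state=rng)
        x = np.sqrt(v) * rng.standard_normal(100_000)

        # compare a tail probability with the Student-t prediction
        print(np.mean(x > 2.0), stats.t.sf(2.0, df=nu))   # both ~0.070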

  14. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  15. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  16. Effect of delayed link failure on probability of loss of assured safety in temperature-dependent systems with multiple weak and strong links.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (ProStat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ)

    2007-05-01

    Weak link (WL)/strong link (SL) systems constitute important parts of the overall operational design of high consequence systems, with the SL system designed to permit operation of the system only under intended conditions and the WL system designed to prevent the unintended operation of the system under accident conditions. Degradation of the system under accident conditions into a state in which the WLs have not deactivated the system and the SLs have failed in the sense that they are in a configuration that could permit operation of the system is referred to as loss of assured safety. The probability of such degradation conditional on a specific set of accident conditions is referred to as probability of loss of assured safety (PLOAS). Previous work has developed computational procedures for the calculation of PLOAS under fire conditions for a system involving multiple WLs and SLs and with the assumption that a link fails instantly when it reaches its failure temperature. Extensions of these procedures are obtained for systems in which there is a temperature-dependent delay between the time at which a link reaches its failure temperature and the time at which that link actually fails.
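
    A minimal Monte Carlo sketch of a PLOAS calculation with delayed link failure follows. The linear heating curve, failure-temperature distributions, delay model and the particular definition of loss of assured safety used (all SLs fail before any WL fails) are assumptions made for illustration, not the procedures derived in the report.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 50_000
        # sampled failure temperatures (K) for two WLs and two SLs
        wl_T = rng.normal(650.0, 30.0, size=(n, 2))
        sl_T = rng.normal(700.0, 30.0, size=(n, 2))

        # fire heating T(t) = 300 + 2t; a link actually fails a
        # temperature-dependent delay after reaching its failure temperature
        wl_t = (wl_T - 300.0) / 2.0 + 5000.0 / wl_T
        sl_t = (sl_T - 300.0) / 2.0 + 2000.0 / sl_T

        ploas = np.mean(sl_t.max(axis=1) < wl_t.min(axis=1))
        print(f"PLOAS ~= {ploas:.4f}")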

  17. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  18. Sensitivity analysis on the effect of software-induced common cause failure probability in the computer-based reactor trip system unavailability

    International Nuclear Information System (INIS)

    Kamyab, Shahabeddin; Nematollahi, Mohammadreza; Shafiee, Golnoush

    2013-01-01

    Highlights: ► Importance and sensitivity analyses have been performed for a digitized reactor trip system. ► The results show acceptable trip unavailability for software failure probabilities below 1E-4. ► However, the value of Fussell-Vesely indicates that software common cause failure is still risk significant. ► Diversity and effective testing are found beneficial in reducing the software contribution. - Abstract: The reactor trip system has been digitized in advanced nuclear power plants, since the programmable nature of computer-based systems has a number of advantages over non-programmable systems. However, software is still vulnerable to common cause failure (CCF). Residual software faults represent a CCF concern, which threatens the achieved improvements. This study attempts to assess the effectiveness of so-called defensive strategies against software CCF with respect to reliability. A sensitivity analysis has been performed by re-quantifying the models upon changing the software failure probability. Importance measures have then been estimated in order to reveal the specific contribution of software CCF to the trip failure probability. The results reveal the importance and effectiveness of signal and software diversity as applicable strategies to ameliorate inefficiencies due to software CCF in the reactor trip system (RTS). No significant change has been observed in the RTS failure probability for basic software CCF probabilities greater than 1 × 10⁻⁴; however, the related Fussell-Vesely measure has been greater than 0.005 for the lower values. The study concludes that the risk associated with software-based systems is a multi-variable function, which requires compromising among the variables in more precise and comprehensive studies

  19. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  20. Retrieval system for emplaced spent unreprocessed fuel (SURF) in salt bed depository. Baseline concept criteria specifications and mechanical failure probabilities

    International Nuclear Information System (INIS)

    Hudson, E.E.; McCleery, J.E.

    1979-05-01

    One of the integral elements of the Nuclear Waste Management Program is the material handling task of retrieving Canisters containing spent unreprocessed fuel from their emplacement in a deep geologic salt bed Depository. A study of the retrieval concept data base predicated this report. In this report, alternative concepts for the tasks are illustrated and critiqued, a baseline concept in scenario form is derived and basic retrieval subsystem specifications are presented with cyclic failure probabilities predicted. The report is based on the following assumptions: (a) during retrieval, a temporary radiation seal is placed over each Canister emplacement; (b) a sleeve, surrounding the Canister, was initially installed during the original emplacement; (c) the emplacement room's physical and environmental conditions established in this report are maintained while the task is performed

  1. Retrieval system for emplaced spent unreprocessed fuel (SURF) in salt bed depository. Baseline concept criteria specifications and mechanical failure probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hudson, E.E.; McCleery, J.E.

    1979-05-01

    One of the integral elements of the Nuclear Waste Management Program is the material handling task of retrieving Canisters containing spent unreprocessed fuel from their emplacement in a deep geologic salt bed Depository. A study of the retrieval concept data base predicated this report. In this report, alternative concepts for the tasks are illustrated and critiqued, a baseline concept in scenario form is derived and basic retrieval subsystem specifications are presented with cyclic failure probabilities predicted. The report is based on the following assumptions: (a) during retrieval, a temporary radiation seal is placed over each Canister emplacement; (b) a sleeve, surrounding the Canister, was initially installed during the original emplacement; (c) the emplacement room's physical and environmental conditions established in this report are maintained while the task is performed.

  2. An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters using the Total Probability Theorem

    Science.gov (United States)

    2015-05-12

    where ρ is the correlation coefficient and E[Y(t)Y(t+τ)] is the corresponding entry of the covariance matrix expansion, whose coefficients are independent standard normal random variables. The time interval of interest is [0, T], accounting for the correlation among up-crossings in [0, T]. We define the failure event as F = {∃ t ∈ [0, T] : g(X, Y(t), t) ≤ 0}, and the failure probability as P_f = P(F).
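
    A minimal sketch of the total-probability decomposition behind the method, P_f = E_theta[P(F | theta)], with a hypothetical conditional failure probability standing in for the up-crossing-based term:

        import numpy as np

        rng = np.random.default_rng(7)

        def p_fail_given(theta, T=10.0):
            # assumed conditional term: a constant up-crossing rate nu(theta)
            # gives P(F | theta) = 1 - exp(-nu * T)
            nu = 0.01 * np.exp(theta)
            return 1.0 - np.exp(-nu * T)

        theta = rng.standard_normal(100_000)   # random system parameter
        print(f"P_f ~= {p_fail_given(theta).mean():.4f}")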

  3. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  4. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  5. Efficient estimator of probabilities of large power spills in a stand-alone system with wind generation and storage

    NARCIS (Netherlands)

    D. Bhaumik (Debarati); D.T. Crommelin (Daan); A.P. Zwart (Bert)

    2016-01-01

    The challenges of integrating unpredictable wind energy into a power system can be alleviated using energy storage devices. We assessed a single domestic energy system with a wind turbine and a battery. We investigated the best operation mode of the battery such that the occurrence of

  6. Retrieval system for emplaced spent unreprocessed fuel (SURF) in salt bed depository: accident event analysis and mechanical failure probabilities. Final report

    International Nuclear Information System (INIS)

    Bhaskaran, G.; McCleery, J.E.

    1979-10-01

    This report provides support in developing an accident prediction event tree diagram, with an analysis of the baseline design concept for the retrieval of emplaced spent unreprocessed fuel (SURF) contained in a degraded Canister. The report contains an evaluation check list, accident logic diagrams, accident event tables, fault trees/event trees and discussions of failure probabilities for the following subsystems as potential contributors to a failure: (a) Canister extraction, including the core and ram units; (b) Canister transfer at the hoist area; and (c) Canister hoisting. This report is the second volume of a series. It continues and expands upon the report Retrieval System for Emplaced Spent Unreprocessed Fuel (SURF) in Salt Bed Depository: Baseline Concept Criteria Specifications and Mechanical Failure Probabilities. This report draws upon the baseline conceptual specifications contained in the first report

  7. Retrieval system for emplaced spent unreprocessed fuel (SURF) in salt bed depository: accident event analysis and mechanical failure probabilities. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bhaskaran, G.; McCleery, J.E.

    1979-10-01

    This report provides support in developing an accident prediction event tree diagram, with an analysis of the baseline design concept for the retrieval of emplaced spent unreprocessed fuel (SURF) contained in a degraded Canister. The report contains an evaluation check list, accident logic diagrams, accident event tables, fault trees/event trees and discussions of failure probabilities for the following subsystems as potential contributors to a failure: (a) Canister extraction, including the core and ram units; (b) Canister transfer at the hoist area; and (c) Canister hoisting. This report is the second volume of a series. It continues and expands upon the report Retrieval System for Emplaced Spent Unreprocessed Fuel (SURF) in Salt Bed Depository: Baseline Concept Criteria Specifications and Mechanical Failure Probabilities. This report draws upon the baseline conceptual specifications contained in the first report.

  8. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  9. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  10. Deriving a frequentist conservative confidence bound for probability of failure per demand for systems with different operational and test profiles

    International Nuclear Information System (INIS)

    Bishop, Peter; Povyakalo, Andrey

    2017-01-01

    Reliability testing is typically used in demand-based systems (such as protection systems) to derive a confidence bound for a specific operational profile. To be realistic, the number of tests for each class of demand should be proportional to the demand frequency of the class. In practice, however, the actual operational profile may differ from that used during testing. This paper provides a means for estimating the confidence bound when the test profile differs from the profile used in actual operation. Based on this analysis the paper examines what bound can be claimed for different types of profile uncertainty and options for dealing with this uncertainty. We also show that the same conservative bound estimation equations can be applied to cases where different measures of software test coverage and operational profile are used. - Highlights: • Calculation of a new confidence bound when the operational profile changes. • The bound formula is an analytic approximation that is always conservative. • Formula can be used to optimise testing to allow for profile uncertainty. • Formulated for demand based systems with different demand classes. • But formulae can be generalised (e.g. to continuous time systems).
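
    The sketch below shows only the mechanics of profile weighting, using a cruder composition than the paper's estimator: each demand class gets a standard failure-free-testing upper bound, and the class bounds are weighted by the operational profile. The numbers are illustrative, and summing per-class bounds this way does not give a jointly conservative confidence level; the paper derives a properly conservative analytic approximation.

        import numpy as np

        alpha = 0.05                              # 95% confidence
        n_tests = np.array([2000, 500, 100])      # failure-free tests per class
        q_op = np.array([0.5, 0.3, 0.2])          # operational demand profile

        p_upper = 1.0 - alpha ** (1.0 / n_tests)  # per-class pfd bounds
        print(f"profile-weighted pfd bound: {np.sum(q_op * p_upper):.2e}")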

  11. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    NARCIS (Netherlands)

    Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.

    2014-01-01

    Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate

  12. Copula-based probability of concurrent hydrological drought in the Poyang lake-catchment-river system (China) from 1960 to 2013

    Science.gov (United States)

    Zhang, Dan; Chen, Peng; Zhang, Qi; Li, Xianghu

    2017-10-01

    Investigation of concurrent hydrological drought events is helpful for understanding the inherent mechanism of hydrological extremes and designing corresponding adaptation strategy. This study investigates concurrent hydrological drought in the Poyang lake-catchment-river system from 1960 to 2013 based on copula functions. The standard water level index (SWI) and the standard runoff index (SRI) are employed to identify hydrological drought in the lake-catchment-river system. The appropriate marginal distributions and copulas are selected by the corrected Akaike Information Criterion and Bayesian copulas selection method. The probability of hydrological drought in Poyang Lake in any given year is 16.6% (return period of 6 years), and droughts occurred six times from 2003 to 2013. Additionally, the joint probability of concurrent drought events between the lake and catchment is 10.1% (return period of 9.9 years). Since 2003, concurrent drought has intensified in spring due to frequent hydrological drought in the catchment. The joint probability of concurrent drought between the lake and the Yangtze River is 11.5% (return period of 8.7 years). This simultaneous occurrence intensified in spring, summer and autumn from 2003 to 2013 due to the weakened blocking effect of the Yangtze River. Notably, although the lake drought intensified in winter during the past decade, hydrological drought in the catchment and the Yangtze River did not intensify simultaneously. Thus, this winter intensification might be caused by human activities in the lake region. The results of this study demonstrate that the Poyang lake-catchment-river system has been drying since 2003 based on a statistical approach. An adaptation strategy should be urgently established to mitigate the worsening situation in the Poyang lake-catchment-river system.
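
    A minimal illustration of the copula step is sketched below with a Clayton copula chosen for simplicity (the study selects marginals and copulas via the corrected Akaike Information Criterion and a Bayesian method); it turns two marginal drought probabilities into a joint concurrence probability and return period.

        # C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0
        def clayton(u, v, theta):
            return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

        u = v = 0.166                 # marginal annual drought probabilities
        joint = clayton(u, v, theta=2.0)
        print(joint, 1.0 / joint)     # joint probability and return period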

  13. No Magic Bullet: A Theory-Based Meta-Analysis of Markov Transition Probabilities in Studies of Service Systems for Persons With Mental Disabilities.

    Science.gov (United States)

    Leff, Hugh Stephen; Chow, Clifton M; Graves, Stephen C

    2017-03-01

    A random-effects meta-analysis of studies that used Markov transition probabilities (TPs) to describe outcomes for mental health service systems of differing quality for persons with serious mental illness was implemented to improve the scientific understanding of systems performance, for use in planning simulations to project service system costs and outcomes over time, and to test a theory of how outcomes differ for systems varying in quality. Nineteen systems described in 12 studies were coded as basic (B), maintenance (M), and recovery oriented (R) on the basis of descriptions of services provided. TPs for the studies were aligned with a common functional-level framework, converted to a one-month time period, synthesized, and compared with theory-based expectations. Meta-regression was employed to explore associations between TPs and characteristics of service recipients and studies. R systems performed better than M and B systems. However, M systems did not perform better than B systems. All systems showed negative as well as positive TPs. For approximately one-third of the synthesized TPs, substantial interstudy heterogeneity was noted. Associations were found between TPs and service recipient and study variables. Conclusions: Conceptualizing systems as B, M, and R has potential for improving scientific understanding and systems planning. R systems appear more effective than B and M systems, although there is no "magic bullet" system for all service recipients. Interstudy heterogeneity indicates a need for common approaches to reporting service recipient states, time periods for TPs, service recipient attributes, and service system characteristics. The TPs found should be used in Markov simulations to project system effectiveness and costs over time.
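
    A minimal sketch of the kind of Markov projection such synthesized TPs support is given below; the transition matrix and functional levels are hypothetical.

        import numpy as np

        P = np.array([[0.90, 0.08, 0.02],    # from low functioning
                      [0.05, 0.90, 0.05],    # from medium
                      [0.01, 0.07, 0.92]])   # from high
        x = np.array([0.5, 0.3, 0.2])        # initial distribution
        for month in range(24):
            x = x @ P                        # one-month transition
        print(np.round(x, 3))                # distribution after two years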

  14. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07

    International Nuclear Information System (INIS)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E.; Garcia de la C, F. M.

    2014-10-01

    Crack growth and propagation by fatigue are a typical degradation mechanism, present in the nuclear industry as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component, even one of high ductility. For this reason, programmed maintenance activities have been established in industry, using visual and/or ultrasonic inspection techniques with an established periodicity that allows these growths to be followed up and their undesirable effects controlled. However, these activities increase operating costs and, in the particular case of the nuclear industry, they increase the radiation exposure of the participating personnel. The use of mathematical methods that integrate concepts of uncertainty, material properties and the probability associated with inspection results has become a powerful tool for evaluating component reliability, reducing costs and exposure levels. This work presents the evaluation of the failure probability due to fatigue growth of preexisting cracks in pipes of the Reactor Core Isolation Cooling (Rcic) system of a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events) was used, which is based on probabilistic fracture mechanics principles. The obtained failure probability values evidenced good behavior of the analyzed pipes, with a maximum on the order of 1.0E-6; it is therefore concluded that the performance of these pipe lines is reliable even when the calculations are extrapolated to 10, 20, 30 and 40 years of service. (Author)
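
    A minimal Monte Carlo sketch in the spirit of probabilistic fracture mechanics codes such as PRAISE follows; the Paris-law constants, loading and initial crack-size distribution are illustrative, not the Rcic analysis.

        import numpy as np

        rng = np.random.default_rng(3)
        C, m = 1e-11, 3.0         # Paris constants (a in m, K in MPa*sqrt(m))
        dsigma, Y = 120.0, 1.12   # stress range (MPa), geometry factor
        a_crit = 0.02             # critical crack depth (m)

        n = 10_000
        a = rng.lognormal(np.log(1e-3), 0.7, size=n)   # initial crack depths
        failed = np.zeros(n, dtype=bool)
        for _ in range(300):                 # 300 blocks of 1000 cycles
            dK = Y * dsigma * np.sqrt(np.pi * a)
            a = a + C * dK ** m * 1000.0     # da/dN = C * dK^m
            failed |= a >= a_crit
        print(f"fraction reaching critical depth: {failed.mean():.2e}")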

  15. Grading system to categorize breast MRI using BI-RADS 5th edition: a statistical study of non-mass enhancement descriptors in terms of probability of malignancy.

    Science.gov (United States)

    Asada, Tatsunori; Yamada, Takayuki; Kanemaki, Yoshihide; Fujiwara, Keishi; Okamoto, Satoko; Nakajima, Yasuo

    2018-03-01

    To analyze the association of breast non-mass enhancement descriptors in the BI-RADS 5th edition with malignancy, and to establish a grading system and categorization of descriptors. This study was approved by our institutional review board. A total of 213 patients were enrolled. Breast MRI was performed with a 1.5-T MRI scanner using a 16-channel breast radiofrequency coil. Two radiologists determined internal enhancement and distribution of non-mass enhancement by consensus. Corresponding pathologic diagnoses were obtained by either biopsy or surgery. The probability of malignancy by descriptor was analyzed using Fisher's exact test and multivariate logistic regression analysis. The probability of malignancy by category was analyzed using Fisher's exact and multi-group comparison tests. One hundred seventy-eight lesions were malignant. Multivariate model analysis showed that internal enhancement (homogeneous vs others, p < 0.001, heterogeneous and clumped vs clustered ring, p = 0.003) and distribution (focal and linear vs segmental, p < 0.001) were the significant explanatory variables. The descriptors were classified into three grades of suspicion, and the categorization (3, 4A, 4B, 4C, and 5) by sum-up grades showed an incremental increase in the probability of malignancy (p < 0.0001). The three-grade criteria and categorization by sum-up grades of descriptors appear valid for non-mass enhancement.

  16. Application of tests of goodness of fit in determining the probability density function for spacing of steel sets in tunnel support system

    Directory of Open Access Journals (Sweden)

    Farnoosh Basaligheh

    2015-12-01

    One of the conventional methods for temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel and the collected data are used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated, and three common goodness-of-fit tests were used to evaluate each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for the spacing between the steel sets. It is also noted that, although the probability distribution function is the same for the two different tunnel sections, the parameters of the PDF differ between the individual sections.
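
    A minimal sketch of the model-selection step with scipy follows (toy spacing data; scipy does not ship the Wakeby distribution, so only a few common candidates are compared here via the Kolmogorov-Smirnov test).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        spacing = rng.gamma(shape=9.0, scale=0.1, size=120)   # metres, toy data

        for dist in (stats.norm, stats.lognorm, stats.gamma, stats.weibull_min):
            params = dist.fit(spacing)
            ks = stats.kstest(spacing, dist.name, args=params)
            print(f"{dist.name:12s} KS = {ks.statistic:.3f}  p = {ks.pvalue:.3f}")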

  17. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
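
    The article's central claim can be reproduced qualitatively with a small simulation: set the threshold at the nominal quantile of a log-normal fitted to limited data, then average the true exceedance probability over many replications. The code below is a toy illustration, not the paper's analytical derivation.

      # Parameter uncertainty inflates the expected failure frequency (toy check).
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      mu, sigma = 0.0, 1.0              # true (unknown) parameters of log(risk factor)
      eps = 0.01                        # nominal failure probability
      n_data, n_rep = 30, 20_000
      z = norm.ppf(1 - eps)

      realized = np.empty(n_rep)
      for i in range(n_rep):
          x = rng.normal(mu, sigma, n_data)             # log of the observed sample
          m, s = x.mean(), x.std(ddof=1)                # estimated parameters
          u = m + z * s                                 # log-threshold set from the estimates
          realized[i] = 1 - norm.cdf((u - mu) / sigma)  # true exceedance probability

      print(f"nominal = {eps}, expected realized frequency = {realized.mean():.4f}")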

  18. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  19. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  20. Effects of Exposure to the Communities That Care Prevention System on Youth Problem Behaviors in a Community-Randomized Trial: Employing an Inverse Probability Weighting Approach.

    Science.gov (United States)

    Rhew, Isaac C; Oesterle, Sabrina; Coffman, Donna; Hawkins, J David

    2018-01-01

    Earlier intention-to-treat (ITT) findings from a community-randomized trial demonstrated effects of the Communities That Care (CTC) prevention system on reducing problem behaviors among youth. In ITT analyses, youth were analyzed according to their original study community's randomized condition even if they moved away from the community over the course of follow-up and received little to no exposure to intervention activities. Using inverse probability weights (IPWs), this study estimated effects of CTC in the same randomized trial among youth who remained in their original study communities throughout follow-up. Data were from the Community Youth Development Study, a community-randomized trial of 24 small towns in the United States. A cohort of 4,407 youth was followed from fifth grade (prior to CTC implementation) to eighth grade. IPWs for one's own moving status were calculated using fifth- and sixth-grade covariates. Results from inverse probability weighted multilevel models indicated larger effects for youth who remained in their study community for the first 2 years of CTC intervention implementation compared to ITT estimates. These effects included reduced likelihood of alcohol use, binge drinking, smokeless tobacco use, and delinquent behavior. These findings strengthen support for CTC as an efficacious system for preventing youth problem behaviors.
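
    The weighting step can be sketched in a few lines: model the probability of remaining in the study community from baseline covariates, then weight the stayers by the inverse of that probability, stabilized by the marginal staying rate. Data and covariates below are simulated stand-ins, not the Community Youth Development Study data.

      # Stabilized inverse probability weights for remaining in the community (toy data).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 4407
      X = rng.normal(size=(n, 5))                # stand-in grade 5/6 covariates
      stayed = rng.binomial(1, 0.85, size=n)     # 1 = never moved away (simulated)

      model = LogisticRegression().fit(X, stayed)
      p_stay = model.predict_proba(X)[:, 1]      # P(stay | covariates)

      w = stayed.mean() / p_stay                 # stabilized weights
      w_stayers = w[stayed == 1]                 # used later in the weighted outcome model
      print(f"weights among stayers: {w_stayers.min():.2f} to {w_stayers.max():.2f}")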

  1. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  2. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  3. Teachers' Understandings of Probability

    Science.gov (United States)

    Liu, Yan; Thompson, Patrick

    2007-01-01

    Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…

  4. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  5. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  6. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  7. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered one of the most important contributions to the analysis of probability interpretation in the last 10-15 years.

  8. Low probability of intercept-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems

    Science.gov (United States)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-12-01

    In this paper, we investigate the problem of low probability of intercept (LPI)-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems, where the radar system optimizes the transmitted waveform such that the interference caused to the cellular communication systems is strictly controlled. Assuming that precise knowledge of the target spectra, the power spectral densities (PSDs) of the signal-dependent clutter, the propagation losses of the corresponding channels and the communication signals is available to the radar, three different LPI-based criteria for radar waveform optimization are proposed to minimize the total transmitted power of the radar system by optimizing the multicarrier radar waveform under a predefined signal-to-interference-plus-noise ratio (SINR) constraint and a minimum required capacity for the cellular communication systems. These criteria differ in the way the communication signals scattered off the target are considered in the radar waveform design: (1) as useful energy, (2) as interference, or (3) ignored altogether. The resulting problems are solved analytically and their solutions represent the optimum power allocation for each subcarrier in the multicarrier radar waveform. We show with numerical results that the LPI performance of the radar system can be significantly improved by exploiting the echoes scattered off the target due to cellular communication signals received at the radar receiver.
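
    A heavily simplified version of such a power-minimizing allocation can be written down directly. Below, the total SINR is a sum of per-subcarrier ratios with signal-dependent clutter in the denominator, and the KKT conditions give a closed form per subcarrier with one Lagrange multiplier found by bisection; all gains and the SINR target are hypothetical, and the paper's three criteria are not distinguished here.

      # Minimize total power s.t. sum_k p*g/(n0 + p*c) >= gamma (simplified sketch).
      import numpy as np

      rng = np.random.default_rng(4)
      K = 16
      g = rng.uniform(0.5, 2.0, K)      # target-channel gains (hypothetical)
      c = rng.uniform(0.05, 0.2, K)     # signal-dependent clutter gains (hypothetical)
      n0 = rng.uniform(0.8, 1.2, K)     # noise-plus-interference power (hypothetical)
      gamma = 8.0                       # required total SINR

      def sinr(p):
          return np.sum(p * g / (n0 + p * c))

      def alloc(lam):
          # Stationarity: 1 = lam * g*n0/(n0 + p*c)^2  =>  closed form per subcarrier.
          return np.maximum((np.sqrt(lam * g * n0) - n0) / c, 0.0)

      lo, hi = 1e-6, 1e6
      for _ in range(200):              # bisect the multiplier until the SINR target is met
          lam = np.sqrt(lo * hi)
          lo, hi = (lam, hi) if sinr(alloc(lam)) < gamma else (lo, lam)
      p = alloc(hi)
      print(f"total power = {p.sum():.3f}, achieved SINR = {sinr(p):.3f}")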

  9. Initial experience of evaluation of coronary artery with 320-slice row CT system in high pre-test probability population without heart rate (rhythm) control

    International Nuclear Information System (INIS)

    Sun Gang; Li Guoying; Li Min; Ding Juan; Li Shenghui; Li Li; Zhu Shifang; Lin Changling; Zou Xiaofeng

    2009-01-01

    Objective: To investigate the accuracy of a 320-detector row CT system for the detection of coronary artery disease (CAD) in a high pre-test probability population without heart rate/rhythm control. Methods: Thirty patients with a high pre-test probability of CAD underwent 320-detector row CT without preceding heart rate/rhythm control. Invasive coronary angiography (ICA) served as the reference standard. Data sets were evaluated by 2 observers in consensus with respect to stenoses of ≥50% diameter reduction. The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and Youden index were analyzed; the impact of heart rate and calcification on image quality as well as diagnostic accuracy was also analyzed by Chi-square test. Results: Mean heart rate during scanning was 73.7±15.4 beats per min (bpm), and the median (interquartile range) Agatston score per segment was 45.6 (181). On a per-segment analysis, overall sensitivity was 96.1% (74/77, 95% CI: 89.03%-99.19%), specificity was 98.3% (337/343, 95% CI: 96.23%-99.36%), PPV was 92.5% (74/80, 95% CI: 84.39%-97.20%), NPV was 99.1% (337/340, 95% CI: 97.44%-99.82%) and the Youden index was 0.94. In both heart-rate subgroups (242 segments with heart rate <70 bpm, 169 with heart rate ≥70 bpm), diagnostic accuracy for the assessment of coronary artery stenosis was similar (P>0.05). The accuracy and the quality score of the subgroup with Agatston score ≥100 were lower than those of the subgroup with Agatston score <100; however, the difference between the 320-detector row CT and ICA results was not significant (P>0.05). Conclusion: 320-detector row CT can reliably detect coronary artery stenoses in a high pre-test probability population without heart rate/rhythm control. (authors)
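
    The per-segment figures can be re-derived directly from the reported counts; the short check below uses only the numbers in the abstract.

      # Diagnostic metrics from the reported per-segment counts.
      tp, fn = 74, 3      # stenoses detected / missed (74/77)
      tn, fp = 337, 6     # segments correctly cleared (337/343)

      sens = tp / (tp + fn)
      spec = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)
      print(f"sens={sens:.3f} spec={spec:.3f} ppv={ppv:.3f} npv={npv:.3f} "
            f"Youden J={sens + spec - 1:.2f}")   # -> 0.961, 0.983, 0.925, 0.991, J=0.94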

  10. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intra-subject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework with a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined by several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)
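
    The shape-feature step can be illustrated with a generic Fourier-descriptor computation on a closed 2-D contour: drop the DC term for translation invariance, take magnitudes for rotation/start-point invariance, and normalize by the first harmonic for scale invariance. The contour below is synthetic, and the paper's exact normalization may differ.

      # Generic Fourier descriptors for a closed contour (illustrative only).
      import numpy as np

      def fourier_descriptors(contour_xy, n_desc=10):
          z = contour_xy[:, 0] + 1j * contour_xy[:, 1]  # contour as a complex signal
          F = np.fft.fft(z - z.mean())                  # remove centroid: translation invariance
          mags = np.abs(F)                              # magnitudes: rotation/start-point invariance
          return mags[1:n_desc + 1] / mags[1]           # normalize by |F[1]|: scale invariance

      t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
      blob = np.c_[np.cos(t) + 0.3 * np.cos(2 * t), 0.8 * np.sin(t)]  # toy organ-like shape
      print(fourier_descriptors(blob, n_desc=5))        # feature vector fed to the SVM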

  11. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  13. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  14. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  15. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  16. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure-theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post-graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic function presentation. Additionally, considerable emphasis is placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  17. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  18. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  19. Verification test problems for the calculation of probability of loss of assured safety in temperature-dependent systems with multiple weak and strong links.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ)

    2006-06-01

    Four verification test problems are presented for checking the conceptual development and computational implementation of calculations to determine the probability of loss of assured safety (PLOAS) in temperature-dependent systems with multiple weak links (WLs) and strong links (SLs). The problems are designed to test results obtained with the following definitions of loss of assured safety: (1) Failure of all SLs before failure of any WL, (2) Failure of any SL before failure of any WL, (3) Failure of all SLs before failure of all WLs, and (4) Failure of any SL before failure of all WLs. The test problems are based on assuming the same failure properties for all links, which results in problems that have the desirable properties of fully exercising the numerical integration procedures required in the evaluation of PLOAS and also possessing simple algebraic representations for PLOAS that can be used for verification of the analysis.
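
    Because all links share the same failure-time distribution, the four PLOAS definitions have simple exact values that a Monte Carlo cross-check must reproduce; for two WLs and two SLs with continuous i.i.d. failure times they are 1/6, 1/2, 1/2 and 5/6. The sketch below mirrors that verification idea (the Weibull choice is arbitrary).

      # Monte Carlo estimates of the four PLOAS definitions (identical link laws).
      import numpy as np

      rng = np.random.default_rng(5)
      n_sl, n_wl, n = 2, 2, 1_000_000
      sl = rng.weibull(2.0, size=(n, n_sl))   # strong-link failure times
      wl = rng.weibull(2.0, size=(n, n_wl))   # weak-link failure times, same law

      d1 = np.mean(sl.max(axis=1) < wl.min(axis=1))  # all SLs before any WL  (-> 1/6)
      d2 = np.mean(sl.min(axis=1) < wl.min(axis=1))  # any SL before any WL   (-> 1/2)
      d3 = np.mean(sl.max(axis=1) < wl.max(axis=1))  # all SLs before all WLs (-> 1/2)
      d4 = np.mean(sl.min(axis=1) < wl.max(axis=1))  # any SL before all WLs  (-> 5/6)
      print(d1, d2, d3, d4)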

  20. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  1. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  2. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    set of methods, many of which have their origin in probability in Banach spaces, that arise across a broad range of contemporary problems in different... salesman problem, ... • Probability in Banach spaces: probabilistic limit theorems for Banach-valued random variables, empirical processes, local theory of Banach spaces, geometric functional analysis, convex geometry. • Mixing times and other phenomena in high-dimensional Markov chains. At

  3. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  4. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  5. Grading System to Categorize Breast MRI in BI-RADS 5th Edition: A Multivariate Study of Breast Mass Descriptors in Terms of Probability of Malignancy.

    Science.gov (United States)

    Fujiwara, Keishi; Yamada, Takayuki; Kanemaki, Yoshihide; Okamoto, Satoko; Kojima, Yasuyuki; Tsugawa, Koichiro; Nakajima, Yasuo

    2018-03-01

    The purpose of this study is to analyze the association between the probability of malignancy and breast mass descriptors in the BI-RADS 5th edition and to devise criteria for grading mass lesions, including subcategorization of category 4 lesions with or without apparent diffusion coefficient (ADC) values. A total of 519 breast masses in 499 patients were selected. Breast MRI was performed with a 1.5-T MRI scanner using a 16-channel dedicated breast radiofrequency coil. Two radiologists determined the morphologic and kinetic features of the breast masses. Mean ADC values were measured on ADC maps by placing round ROIs that encircled the largest possible solid mass portions. An optimal ADC threshold was chosen to maximize the Youden index. Corresponding pathologic diagnoses were obtained by either biopsy or surgery. A total of 472 masses were malignant. Multivariate model analysis showed that shape (irregular) and other descriptors were significant with respect to malignancy (p < 0.01). The inclusion of ADC values improved the positive predictive values for categories 3, 4A, and 4B. The 3-point scoring system for findings suspicious for malignancy and the proposed classification system for breast mass descriptors would be valid as a categorization system. ADC values may be used to downgrade benign lesions in categories 3, 4A, and 4B.
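
    The threshold-selection step generalizes readily: compute an ROC curve over candidate ADC cut-offs and keep the one maximizing the Youden index. The ADC values below are synthetic (malignant lesions drawn with lower ADC), so the resulting threshold is illustrative only.

      # Youden-optimal ADC cut-off from an ROC curve (synthetic ADC values).
      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(6)
      adc_malignant = rng.normal(0.95, 0.15, 300)   # x10^-3 mm^2/s (synthetic)
      adc_benign = rng.normal(1.45, 0.25, 100)      # (synthetic)

      y = np.r_[np.ones(300), np.zeros(100)]
      score = -np.r_[adc_malignant, adc_benign]     # lower ADC = more suspicious
      fpr, tpr, thr = roc_curve(y, score)

      j = tpr - fpr                                 # Youden index per cut-off
      best = j.argmax()
      print(f"optimal ADC threshold ~ {-thr[best]:.2f}, J = {j[best]:.2f}")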

  6. Analysis and probability

    CERN Document Server

    Spataru, Aurel

    2013-01-01

    Probability theory is a rapidly expanding field and is used in many areas of science and technology. Beginning from a basis of abstract analysis, this mathematics book develops the knowledge needed for advanced students to develop a complex understanding of probability. The first part of the book systematically presents concepts and results from analysis before embarking on the study of probability theory. The initial section will also be useful for those interested in topology, measure theory, real analysis and functional analysis. The second part of the book presents the concepts, methodology and fundamental results of probability theory. Exercises are included throughout the text, not just at the end, to teach each concept fully as it is explained, including presentations of interesting extensions of the theory. The complete and detailed nature of the book makes it ideal as a reference book or for self-study in probability and related fields. It covers a wide range of subjects including f-expansions, Fuk-N...

  7. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more—these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  8. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status, with the membership of water quality in the various output fuzzy sets or classes provided as percentiles and histograms, which allows the real water condition to be classified better. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. Copyright © 2012 Elsevier Ltd. All rights reserved.
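
    The stochastic input step (kernel density fit plus Monte Carlo resampling) is easy to sketch for a single variable; the dissolved-oxygen data below are synthetic stand-ins for the Cauca River monitoring series.

      # Kernel-density fit and Monte Carlo resampling of one input variable.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(7)
      do_mg_l = np.clip(rng.normal(5.5, 1.2, 52), 0.0, None)  # weekly DO, mg/L (synthetic)

      kde = gaussian_kde(do_mg_l)             # non-parametric density estimate
      samples = kde.resample(10_000).ravel()  # inputs for the fuzzy inference system

      print(f"P(DO < 4 mg/L) ~ {np.mean(samples < 4.0):.3f}")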

  9. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems.

    Science.gov (United States)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-28

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  10. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Generating Functions; Branching Processes; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  11. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  12. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  13. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  14. Estimating the Probability of Human Error by Incorporating Component Failure Data from User-Induced Defects in the Development of Complex Electrical Systems.

    Science.gov (United States)

    Majewicz, Peter J; Blessner, Paul; Olson, Bill; Blackburn, Timothy

    2017-04-05

    This article proposes a methodology for incorporating electrical component failure data into the human error assessment and reduction technique (HEART) for estimating human error probabilities (HEPs). The existing HEART method contains factors known as error-producing conditions (EPCs) that adjust a generic HEP to a more specific situation being assessed. The selection and proportioning of these EPCs are at the discretion of an assessor, and are therefore subject to the assessor's experience and potential bias. This dependence on expert opinion is prevalent in similar HEP assessment techniques used in numerous industrial areas. The proposed method incorporates factors based on observed trends in electrical component failures to produce a revised HEP that can trigger risk mitigation actions more effectively based on the presence of component categories or other hazardous conditions that have a history of failure due to human error. The data used for the additional factors are a result of an analysis of failures of electronic components experienced during system integration and testing at NASA Goddard Space Flight Center. The analysis includes the determination of root failure mechanisms and trend analysis. The major causes of these defects were attributed to electrostatic damage, electrical overstress, mechanical overstress, or thermal overstress. These factors representing user-induced defects are quantified and incorporated into specific hardware factors based on the system's electrical parts list. This proposed methodology is demonstrated with an example comparing the original HEART method and the proposed modified technique. © 2017 Society for Risk Analysis.
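
    For reference, the baseline HEART arithmetic that the article modifies multiplies a generic task HEP by one factor per error-producing condition, each factor interpolated by the assessor's proportion of affect. The values below are illustrative, not taken from the article.

      # Baseline HEART calculation (illustrative EPC multipliers and proportions).
      def heart_hep(generic_hep, epcs):
          """epcs: list of (max_multiplier, assessed_proportion) pairs."""
          hep = generic_hep
          for multiplier, proportion in epcs:
              hep *= (multiplier - 1.0) * proportion + 1.0
          return min(hep, 1.0)

      # Generic HEP 0.003 with two EPCs, e.g. unfamiliarity (x17) at 40% affect:
      print(heart_hep(0.003, [(17, 0.4), (4, 0.25)]))   # -> ~0.039

    The article's extension would, in effect, append further multiplicative factors derived from the hardware parts list; those component-based factors are specific to the NASA failure data and are not reproduced here.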

  15. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    “quite probably, also the end of Angola's existence as an independent country”. It went on: “The victory at Cuito Cuanavale for the liberation forces and their Cuban compatriots was therefore decisive in consolidating Angola's independence and achieving that of Namibia.” Therefore, when reflecting on the events, “it is not ...

  16. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    He spends several months in India visiting schools, colleges and universities. He enjoys teaching mathematics and statistics at all levels. He loves Indian classical and folk music. This issue of Resonance features Joseph Leonard Doob, who played a critical role in the development of probability theory in the world from.

  17. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for the prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew

  18. The Theory of Probability

    Indian Academy of Sciences (India)

    The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics, Volume 3, Issue 4, April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  19. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Probability Theory Without Tears! S Ramasubramanian. Book Review, Volume 1, Issue 2, February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  20. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject with regard to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  2. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  3. the theory of probability

    Indian Academy of Sciences (India)

    The simplest laws of natural science are those that state the conditions under which some event of interest to us will either certainly occur or certainly not occur; i.e., these conditions may be expressed in one of the following two forms: 1. If a complex (i.e., a set or collection) of conditions S is realized, then.

  4. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  5. Stochastic Modeling of Climatic Probabilities.

    Science.gov (United States)

    1979-11-01

    Students who contributed in a major way to the success of the project are Sarah Autrey, Jeff Emerson, Karl Grammel, Tom Licknor and Debbie Waite. ... sophistication and cost of weapons systems and the recognition that the environment degrades or offers opportunities has led to the requirement for... First, make a histogram of the data, and then "smooth" the histogram to obtain a frequency distribution (probability density function).

  6. Probability Bracket Notation: the Unified Expressions of Conditional Expectation and Conditional Probability in Quantum Modeling

    OpenAIRE

    Wang, Xing M.

    2009-01-01

    After a brief introduction to Probability Bracket Notation (PBN), indicator operator and conditional density operator (CDO), we investigate probability spaces associated with various quantum systems: system with one observable (discrete or continuous), system with two commutative observables (independent or dependent) and a system of indistinguishable non-interacting many-particles. In each case, we derive unified expressions of conditional expectation (CE), conditional probability (CP), and ...

  7. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C. [Arizona State Univ., Tempe, AZ (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sallaberry, Cedric Jean-Marie. [Engineering Mechanics Corp. of Columbus, OH (United States)

    2018-02-01

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.

  8. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  9. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    Contents include: probability models; inference and confidence; two central limit theorems; the Monte Carlo method and process simulation; the randomization of systematic errors; The Working Group of 1980; From classical repetition to practica...

  12. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  13. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  14. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
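
    The post-processing step described here is a one-liner once the realizations are stacked: the probability map is the per-cell mean of an indicator over the equally likely simulations. The realizations below are random toy fields, not the Fernald geochemical simulations.

      # Exceedance-probability map from a stack of geostatistical realizations (toy).
      import numpy as np

      rng = np.random.default_rng(9)
      n_real, ny, nx = 500, 40, 60
      realizations = rng.lognormal(3.0, 0.8, size=(n_real, ny, nx))  # toy concentrations

      threshold = 35.0                                      # e.g., a clean-up level
      p_exceed = (realizations > threshold).mean(axis=0)    # per-cell probability
      print("cells with P(exceed) > 0.5:", int((p_exceed > 0.5).sum()))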

  15. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...... discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features:kth order Bahadur-Lazarsfeld expansions andkth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...

  16. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
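
    A common form of such a one-parameter generalization (the sign and scaling conventions here are an assumption; the paper's own parameterization may differ) reduces to the ordinary log/exp pair as the parameter goes to zero:

    ```python
    import numpy as np

    def gen_log(x, q):
        """Generalized logarithm: (x**q - 1)/q, recovering ln(x) as q -> 0."""
        x = np.asarray(x, dtype=float)
        return np.log(x) if q == 0 else (x**q - 1.0) / q

    def gen_exp(y, q):
        """Inverse of gen_log: (1 + q*y)**(1/q), recovering exp(y) as q -> 0."""
        y = np.asarray(y, dtype=float)
        return np.exp(y) if q == 0 else np.maximum(1.0 + q * y, 0.0) ** (1.0 / q)

    # sanity check: the pair is mutually inverse for q != 0
    assert np.allclose(gen_exp(gen_log(2.5, 0.3), 0.3), 2.5)
    ```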

  17. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  18. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  19. Identification of probabilities.

    Science.gov (United States)

    Vitányi, Paul M B; Chater, Nick

    2017-02-01

    Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, to infer a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is the problem of inferring a probabilistic model from a sample possible even in principle? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.
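
    The strong-law mechanism behind the first result can be illustrated in a few lines (a toy demonstration only; the paper's algorithm works over a computability-restricted class of distributions, which this sketch does not attempt):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_p = np.array([0.5, 0.3, 0.2])      # hypothetical distribution on {0,1,2}

    for n in (10, 100, 10_000, 1_000_000):
        sample = rng.choice(3, size=n, p=true_p)
        empirical = np.bincount(sample, minlength=3) / n
        # by the strong law of large numbers this error -> 0 almost surely
        print(n, np.abs(empirical - true_p).max())
    ```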

  20. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  1. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  2. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  3. A conservative bound for the probability of failure of a 1-out-of-2 protection system with one hardware-only and one software-based protection train

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Littlewood, Bev; Popov, Peter; Povyakalo, Andrey; Strigini, Lorenzo

    2014-01-01

    Redundancy and diversity have long been used as means to obtain high reliability in critical systems. While it is easy to show that, say, a 1-out-of-2 diverse system will be more reliable than each of its two individual “trains”, assessing the actual reliability of such systems can be difficult because the trains cannot be assumed to fail independently. If we cannot claim independence of train failures, the computation of system reliability is difficult, because we would need to know the probability of failure on demand (pfd) for every possible demand. These are unlikely to be known in the case of software. Claims for software often concern its marginal pfd, i.e. its average across all possible demands. In this paper we consider the case of a 1-out-of-2 safety protection system in which one train contains software (and hardware), and the other train contains only hardware equipment. We show that a useful upper (i.e. conservative) bound can be obtained for the system pfd using only the unconditional pfd for software together with information about the variation of hardware failure probability across demands, which is likely to be known or estimable. The worst-case result is obtained by “allocating” software failure probability among demand “classes” so as to maximize the system pfd.
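
    The "allocation" idea in the last sentence can be sketched as a fractional-knapsack computation (a simplified reading of the abstract, not the authors' full derivation; the demand-class weights w, per-class hardware pfd h and marginal software pfd p_soft are hypothetical inputs):

    ```python
    import numpy as np

    def worst_case_system_pfd(w, h, p_soft):
        """Conservative bound on the 1-out-of-2 system pfd.

        Maximize sum(w*h*s) over per-class software pfds s, subject to the
        marginal constraint sum(w*s) == p_soft and 0 <= s <= 1: greedily load
        the demand classes where the hardware train is weakest.
        """
        w, h = np.asarray(w, float), np.asarray(h, float)
        s = np.zeros_like(h)
        budget = p_soft
        for i in np.argsort(h)[::-1]:       # classes by decreasing hardware pfd
            s[i] = min(1.0, budget / w[i])
            budget -= w[i] * s[i]
            if budget <= 0:
                break
        return float(np.sum(w * h * s))

    # hypothetical three demand classes
    print(worst_case_system_pfd(w=[0.7, 0.2, 0.1], h=[1e-4, 1e-3, 1e-2],
                                p_soft=1e-3))
    ```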

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  5. From Random Motion of Hamiltonian Systems to Boltzmann’s H Theorem and Second Law of Thermodynamics: a Pathway by Path Probability

    Directory of Open Access Journals (Sweden)

    Qiuping A. Wang

    2014-02-01

    A numerical experiment of ideal stochastic motion of a particle subject to conservative forces and Gaussian noise reveals that the path probability depends exponentially on action. This distribution implies a fundamental principle generalizing the least action principle of Hamiltonian/Lagrangian mechanics and yields an extended formalism of mechanics for random dynamics. Within this theory, Liouville’s theorem of conservation of the phase density distribution must be modified to allow time evolution of the phase density, which in turn yields the Boltzmann H theorem. We argue that the gap between the regular Newtonian dynamics and the random dynamics was not considered in the criticisms of the H theorem.

  6. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution in statistical analysis.

  7. Failure probability of regional flood defences

    NARCIS (Netherlands)

    Lendering, K.T.; lang, M.; Klijn, F.; Samuels, P.

    2016-01-01

    Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This

  8. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  9. Coverage Probability of Random Intervals

    OpenAIRE

    Chen, Xinjia

    2007-01-01

    In this paper, we develop a general theory on the coverage probability of random intervals defined in terms of discrete random variables with continuous parameter spaces. The theory shows that the minimum coverage probabilities of random intervals with respect to corresponding parameters are achieved at discrete finite sets and that the coverage probabilities are continuous and unimodal when parameters are varying in between interval endpoints. The theory applies to common important discrete ...

  10. Optimal design of a hybrid solar-wind-battery system using the minimization of the annualized cost system and the minimization of the loss of power supply probability (LPSP)

    Energy Technology Data Exchange (ETDEWEB)

    Ould Bilal, B.; Sambou, V.; Ndiaye, P.A.; Kebe, C.M.F. [Centre International de Formation et de Recherche en Energie Solaire (C.I.F.R.E.S), ESP BP: 5085 Dakar Fann (Senegal); Ndongo, M. [Centre de Recherche Appliquee aux Energies Renouvelables de l' Eau et du Froid (CRAER)/FST/Universite de Nouakchott (Mauritania)

    2010-10-15

    Potou is an isolated site located on the northern coast of Senegal. The populations living in this area have no easy access to electricity supply. The use of renewable energies can contribute to the improvement of the living conditions of these populations. The methodology used in this paper consists in sizing a hybrid solar-wind-battery system for this site, optimized through a multi-objective genetic algorithm, and in studying the influence of the load profiles on the optimal configuration. The two principal aims are the minimization of the annualized system cost and the minimization of the loss of power supply probability (LPSP). To study the load profile influence, three load profiles with the same energy (94 kW h/day) have been used. The achieved results show that the cost of the optimal configuration strongly depends on the load profile. For example, the cost of the optimal configuration decreases by 7% when going from profile 1 to profile 2 and by 5% when going from profile 1 to profile 3. (author)
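
    One common definition of LPSP (an energy-ratio form; the paper may use a time-fraction variant) is straightforward to compute from hourly demand and supply series:

    ```python
    import numpy as np

    def lpsp(demand_kwh, supply_kwh):
        """Loss of power supply probability: unserved energy / total demand."""
        demand = np.asarray(demand_kwh, float)
        deficit = np.maximum(demand - np.asarray(supply_kwh, float), 0.0)
        return deficit.sum() / demand.sum()

    # hypothetical day at the 94 kWh/day load level
    demand = np.full(24, 94 / 24)
    supply = np.clip(np.sin(np.linspace(0, np.pi, 24)) * 8, 0, None)  # stand-in
    print(f"LPSP = {lpsp(demand, supply):.2%}")
    ```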

  11. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  12. Probability and complex quantum trajectories

    International Nuclear Information System (INIS)

    John, Moncy V.

    2009-01-01

    It is shown that in the complex trajectory representation of quantum mechanics, Born's Ψ*Ψ probability density can be obtained from the imaginary part of the velocity field of particles on the real axis. Extending this probability axiom to the complex plane, we first attempt to find a probability density by solving an appropriate conservation equation. The characteristic curves of this conservation equation are found to be the same as the complex paths of particles in the new representation. The boundary condition in this case is that the extended probability density should agree with the quantum probability rule along the real line. For the simple, time-independent, one-dimensional problems worked out here, we find that a conserved probability density can be derived from the velocity field of particles, except in regions where the trajectories were previously suspected to be nonviable. An alternative method to find this probability density in terms of a trajectory integral, which is easier to implement on a computer and useful for single particle solutions, is also presented. Most importantly, we show, by using the complex extension of the Schrödinger equation, that the desired conservation equation can be derived from this definition of probability density

  13. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  14. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  15. Familiarity and preference for pitch probability profiles.

    Science.gov (United States)

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants to identify either the more familiar or the preferred tone sequence in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to with a tone sequence generated from another pitch probability profile, at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge.

  16. Matrix-Specific Method Validation of an Automated Most-Probable-Number System for Use in Measuring Bacteriological Quality of Grade "A" Milk Products.

    Science.gov (United States)

    Lindemann, Samantha; Kmet, Matthew; Reddy, Ravinder; Uhlig, Steffen

    2016-11-01

    The U.S. Food and Drug Administration (FDA) oversees a long-standing cooperative federal and state milk sanitation program that uses the grade "A" Pasteurized Milk Ordinance standards to maintain the safety of grade "A" milk sold in the United States. The Pasteurized Milk Ordinance requires that grade "A" milk samples be tested using validated total aerobic bacterial and coliform count methods. The objective of this project was to conduct an interlaboratory method validation study to compare performance of a film plate method with an automated most-probable-number method for total aerobic bacterial and coliform counts, using statistical approaches from international data standards. The matrix-specific validation study was administered concurrently with the FDA's annual milk proficiency test to compare method performance in five milk types. Eighteen analysts from nine laboratories analyzed test portions from 12 samples in triplicate. Statistics, including mean bias and matrix standard deviation, were calculated. Sample-specific bias of the alternative method for total aerobic count suggests that there are no large deviations within the population of samples considered. Based on analysis of 648 data points, mean bias of the alternative method across milk samples for total aerobic count was 0.013 log CFU/ml and the confidence interval for mean deviation was -0.066 to 0.009 log CFU/ml. These results indicate that the mean difference between the selected methods is small and not statistically significant. Matrix standard deviation was 0.077 log CFU/ml, showing that there is a low risk for large sample-specific bias based on milk matrix. Mean bias of the alternative method was -0.160 log CFU/ml for coliform count data. The 95% confidence interval was -0.210 to -0.100 log CFU/ml, indicating that mean deviation is significantly different from zero. The standard deviation of the sample-specific bias for coliform data was 0.033 log CFU/ml, indicating no significant effect of
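
    The bias statistics reported above follow the usual paired-difference recipe; a minimal sketch (with hypothetical paired log10 counts standing in for the study data):

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical paired log10 CFU/ml results: alternative (MPN) vs reference
    ref = np.array([4.1, 3.8, 5.0, 2.9, 4.4, 3.5])
    alt = np.array([4.0, 3.9, 4.9, 2.8, 4.3, 3.3])

    d = alt - ref                      # per-sample deviation of the alternative
    bias = d.mean()                    # mean bias, log CFU/ml
    lo, hi = stats.t.interval(0.95, len(d) - 1, loc=bias, scale=stats.sem(d))
    print(f"mean bias {bias:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
    ```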

  17. Probable existence of a Gondwana transcontinental rift system in western India: Implications in hydrocarbon exploration in Kutch and Saurashtra offshore: A GIS-based approach

    Science.gov (United States)

    Mazumder, S.; Tep, Blecy; Pangtey, K. K. S.; Das, K. K.; Mitra, D. S.

    2017-08-01

    The Gondwanaland assembly rifted dominantly during the Late Carboniferous-Early Permian, forming several intracratonic rift basins. These rifts were subsequently filled with a thick sequence of continental clastic sediments, with minor marine intercalations in the early phase. In the western part of India, these sediments are recorded in enclaves of the Bikaner-Nagaur and Jaisalmer basins in Rajasthan. Facies correlatives of these sediments are observed in a number of basins that were earlier thought to be associated with the western part of India. The present work is a GIS-based approach to reconnect those basins to their positions during rifting and to reconstruct the tectono-sedimentary environment of that time range. The study indicates a rift system spanning from the Arabian plate in the north to the southern part of Africa that passes through the Indus basin, the western part of India and Madagascar, and existed from the Late Carboniferous to the Early Jurassic. Extensions related to the opening of the Neo-Tethys led to the formation of a number of cross trends in the rift system that acted as barriers to marine transgressions from the north and disrupted the earlier continuous longitudinal drainage systems. The axis of this rift system is envisaged to pass through present-day offshore Kutch and Saurashtra and implies a thick deposit of Late Carboniferous to Early Jurassic sediments in these areas. Based on analogy with other basins associated with this rift system, these sediments may be targeted for hydrocarbon exploration.

  18. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  19. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
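
    The papers provide R code; an analogous Python sketch (using a stand-in scikit-learn data set rather than the appendicitis or Pima data) treats the 0/1 response as a nonparametric regression target, so the forest's predictions are the individual probability estimates:

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)   # stand-in binary outcome data
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    # a regression forest on the 0/1 labels acts as a "probability machine":
    # its prediction at x estimates P(y = 1 | x) rather than a hard class
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    rf.fit(Xtr, ytr)
    p_hat = rf.predict(Xte)                      # individual probabilities
    print(p_hat[:5])
    ```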

  20. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  1. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  2. Logic with a Probability Semantics

    CERN Document Server

    Hailperin, Theodore

    2010-01-01

    The present study is an extension of the topic introduced in Dr. Hailperin's Sentential Probability Logic, where the usual true-false semantics for logic is replaced with one based more on probability, and where values ranging from 0 to 1 are subject to probability axioms. Moreover, as the word "sentential" in the title of that work indicates, the language there under consideration was limited to sentences constructed from atomic sentences (with no inner logical components), by use of sentential connectives ("not," "and," "or," etc.) but not including quantifiers ("for all," "there is"). An initial

  3. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  4. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  5. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  6. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  7. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  8. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
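
    Under the Merton framework the abstract starts from, an individual default probability is the chance that firm value ends below the debt level (a standard textbook sketch; parameter values hypothetical):

    ```python
    from math import log, sqrt
    from scipy.stats import norm

    def merton_pd(V, D, mu, sigma, T):
        """P(firm value < debt D at horizon T) for a firm whose value follows a
        geometric Brownian motion with drift mu and volatility sigma."""
        d = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        return norm.cdf(-d)         # the 'distance to default' mapped to a PD

    print(merton_pd(V=120.0, D=100.0, mu=0.05, sigma=0.25, T=1.0))
    ```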

  9. Return probability: Exponential versus Gaussian decay

    Energy Technology Data Exchange (ETDEWEB)

    Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)

    2006-02-13

    We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes: one characterized by Gaussian decay of the return probability, the other by the well-known exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.
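
    A minimal numerical illustration of the return probability itself (using a generic random-matrix Hamiltonian as a stand-in, not the interacting-particle models of the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 400
    A = rng.normal(size=(n, n))
    H = (A + A.T) / np.sqrt(2 * n)         # GOE-like random Hamiltonian
    evals, evecs = np.linalg.eigh(H)

    psi0 = np.zeros(n); psi0[0] = 1.0      # initial basis state
    c2 = np.abs(evecs.T @ psi0) ** 2       # weights in the eigenbasis

    for t in (0.5, 2.0, 8.0, 32.0):
        amp = np.sum(c2 * np.exp(-1j * evals * t))   # survival amplitude
        print(t, np.abs(amp) ** 2)                   # return probability P(t)
    ```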

  10. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    Probabilistic risk analysis, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysis of the consequences of this release and assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of probabilistic risk analysis are listed, and the composition of the work team for coping with the task is recommended. (J.C.)

  11. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  12. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  13. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Michael C. Wittmann

    2006-08-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math-phobic, with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  14. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
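
    A Monte Carlo calibration of such simultaneous bands (a sketch of the idea only; the paper derives its intervals analytically rather than by simulation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, n_sim, alpha = 30, 20_000, 0.05
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)  # sorted null samples

    def coverage(k):
        """Simultaneous coverage of per-point [k/2, 1-k/2] quantile bands."""
        lo = np.quantile(sims, k / 2, axis=0)
        hi = np.quantile(sims, 1 - k / 2, axis=0)
        return np.all((sims >= lo) & (sims <= hi), axis=1).mean()

    # bisect the pointwise level so that all n points are covered w.p. 1 - alpha
    k_lo, k_hi = alpha / (2 * n), alpha
    for _ in range(30):
        k = 0.5 * (k_lo + k_hi)
        k_lo, k_hi = (k, k_hi) if coverage(k) >= 1 - alpha else (k_lo, k)

    band_lo = np.quantile(sims, k_lo / 2, axis=0)   # envelope for sorted data
    band_hi = np.quantile(sims, 1 - k_lo / 2, axis=0)
    ```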

  15. Response of a phagocyte cell system to products of macrophage breakdown as a probable mechanism of alveolar phagocytosis adaptation to deposition of particles of different cytotoxicity.

    Science.gov (United States)

    Privalova, L I; Katsnelson, B A; Osipenko, A B; Yushkov, B N; Babushkina, L G

    1980-04-01

    The adaptation of the alveolar phagocytosis response to the quantitative and qualitative features of dust deposited during inhalation consists not only in enhanced recruitment of alveolar macrophages (AM), but also in adding a more or less pronounced neutrophil leukocyte (NL) recruitment as an auxiliary participant in particle clearance. The NL contribution to clearance is especially typical for the response to cytotoxic particles (quartz, in particular). An important feature of the adaptation considered is the limitation of the number of AM and NL recruited when an efficient clearance can be achieved by a smaller number of cells due to increased AM resistance to the damaging action of phagocytized particles. The main mechanism providing the adequacy of the alveolar phagocytosis response is its self-regulation through the products of macrophage breakdown (PMB). In a series of experiments with intraperitoneal and intratracheal injections of syngeneic PMB into rats and mice, it was shown that these products stimulate respiration and migration of phagocytic cells, their dose-dependent attraction to the site of PMB formation with a predominant NL contribution increasing with the amount of PMB, the recruitment of AM and NL precursor cells from reserve pools, and the replenishment of these reserves in the process of hemopoiesis. At least some of the above effects are connected with the action of the lipid components of PMB. The action of specialized regulative systems of the organism can modify the response to PMB, judging by the results obtained by hydrocortisone injection. Autocontrol of alveolar phagocytosis requires great care in attempts at artificial stimulation of this process, as excessive cell recruitment may promote the retention of particles in the lungs.

  16. Exploring non-signalling polytopes with negative probability

    Science.gov (United States)

    Oas, G.; Acacio de Barros, J.; Carvalhaes, C.

    2014-12-01

    Bipartite and tripartite EPR-Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory.
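
    The minimized probability mass can be computed by linear programming; the sketch below (assuming the mass is the L1 norm of a quasi-distribution over local deterministic strategies, one common formalization) evaluates it for PR-box statistics:

    ```python
    import numpy as np
    from itertools import product
    from scipy.optimize import linprog

    # 16 local deterministic strategies lam = (a0, a1, b0, b1)
    strategies = list(product((0, 1), repeat=4))

    rows, rhs = [], []
    for x, y, a, b in product((0, 1), repeat=4):
        rows.append([1.0 if (s[x] == a and s[2 + y] == b) else 0.0
                     for s in strategies])
        # PR-box statistics: P(ab|xy) = 1/2 when a XOR b == x*y, else 0
        rhs.append(0.5 if (a ^ b) == x * y else 0.0)

    A = np.array(rows)
    # write q = qp - qm with qp, qm >= 0 and minimize the mass sum(qp + qm);
    # normalization sum(q) = 1 is already implied by the marginal constraints
    res = linprog(c=np.ones(32), A_eq=np.hstack([A, -A]), b_eq=np.array(rhs),
                  bounds=[(0, None)] * 32)
    print("minimal probability mass:", res.fun)  # mass 1 would mean classical
    ```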

  17. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  18. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying a flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to subtract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, irrespective of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
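
    The index mentioned in the second objective can be sketched as a weighted overlay of rescaled PCD rasters (factor names, weights and random stand-in rasters here are hypothetical):

    ```python
    import numpy as np

    def flood_probability_index(factors, weights):
        """Weighted sum of min-max rescaled catchment descriptor rasters."""
        idx = np.zeros_like(next(iter(factors.values())), dtype=float)
        for name, w in weights.items():
            f = factors[name].astype(float)
            f = (f - f.min()) / (f.max() - f.min())  # rescale to [0, 1]
            idx += w * f
        return idx / sum(weights.values())           # index in [0, 1] per cell

    rng = np.random.default_rng(0)
    factors = {"slope": rng.random((40, 40)), "soil": rng.random((40, 40)),
               "land_use": rng.random((40, 40))}
    index = flood_probability_index(factors,
                                    {"slope": 3, "soil": 2, "land_use": 1})
    ```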

  19. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  20. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  1. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  2. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  3. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation, and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
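
    To show the quadrature idea only (CALENDF builds moment-preserving tables; this equal-probability-band sketch with random stand-in cross-sections does not attempt that):

    ```python
    import numpy as np

    def probability_table(sigma, n_bands):
        """Condense sampled cross-sections into (weight, value) quadrature pairs."""
        edges = np.quantile(sigma, np.linspace(0, 1, n_bands + 1))
        band = np.clip(np.searchsorted(edges, sigma, side="right") - 1,
                       0, n_bands - 1)
        w = np.bincount(band, minlength=n_bands) / sigma.size
        v = np.bincount(band, weights=sigma, minlength=n_bands) / (w * sigma.size)
        return w, v

    rng = np.random.default_rng(0)
    sigma = rng.lognormal(1.0, 1.2, size=100_000)   # stand-in cross-sections
    w, v = probability_table(sigma, 8)

    s0 = 10.0   # hypothetical background cross-section
    print((w * v / (v + s0)).sum(),                 # table quadrature
          np.mean(sigma / (sigma + s0)))            # direct average
    ```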

  4. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes. This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  5. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
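
    The two probability spaces translate directly into a double-loop Monte Carlo: an outer (subjective) loop over epistemic parameters and an inner (stochastic) loop that yields one CCDF per outer sample (all distributions here are hypothetical stand-ins, not WIPP models):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    levels = np.logspace(-2, 2, 50)                # consequence levels c

    ccdfs = []
    for theta in rng.uniform(0.5, 2.0, size=20):   # subjective uncertainty S_su
        release = rng.lognormal(0.0, theta, 5000)  # stochastic uncertainty S_st
        ccdfs.append([(release > c).mean() for c in levels])

    ccdfs = np.array(ccdfs)   # a family of CCDFs, one per epistemic sample;
                              # their spread displays the subjective uncertainty
    ```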

  6. Space Shuttle Program (SSP) Orbiter Main Propulsion System (MPS) Gaseous Hydrogen (GH2) Flow Control Valve (FCV) Poppet Eddy Current (EC) Inspection Probability of Detection (POD) Study. Volume 2; Appendices

    Science.gov (United States)

    Piascik, Robert S.; Prosser, William H.

    2011-01-01

    The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the Appendices to the main report.

  7. Space Shuttle Program (SSP) Orbiter Main Propulsion System (MPS) Gaseous Hydrogen (GH2) Flow Control Valve (FCV) Poppet Eddy Current (EC) Inspection Probability of Detection (POD) Study. Volume 1

    Science.gov (United States)

    Piascik, Robert S.; Prosser, William H.

    2011-01-01

    The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the findings and recommendations from the NESC assessment.

  8. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. The 1904 earthquake 100 km south of Oslo, the largest known earthquake of the area, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  9. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
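
    For a first-order model small enough to enumerate, the average number of guesses can be computed exactly and compared with an entropy-based figure (letter probabilities drawn at random here as a stand-in for language statistics):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    p = rng.dirichlet(np.ones(26))        # hypothetical letter probabilities
    word_len = 4

    w = p.copy()
    for _ in range(word_len - 1):         # first-order (independent letters)
        w = np.outer(w, p).ravel()        # word prob = product of letter probs

    w = np.sort(w)[::-1]                  # guess in decreasing probability
    avg_guesses = np.sum(np.arange(1, w.size + 1) * w)
    entropy_guess = 2 ** (-np.sum(w * np.log2(w)))   # 2**H, for comparison
    print(avg_guesses, entropy_guess)
    ```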

  10. Kolmogorov complexity and probability measures

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 729-745 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * probability measure Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  11. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm providing a dynamic SEP event forecast, Pd, for both of these post-flare situations is derived.
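
    The abstract does not spell out the Pd algorithm itself; as a hedged sketch of the general idea, the code below shows one standard Bayesian way to let an event-probability forecast decay as no event is observed, using a hypothetical empirical distribution of flare-to-onset delay times (the delay values are invented).

    ```python
    import numpy as np

    # Hypothetical flare-to-SEP-onset delay times in hours (invented data).
    delays = np.array([2.0, 3.5, 4.0, 5.5, 7.0, 9.0, 12.0, 18.0, 24.0])

    def dynamic_probability(p0, t):
        """Posterior probability of an SEP event given none observed by time t."""
        f = np.mean(delays <= t)          # empirical CDF of onset delays
        still_coming = p0 * (1.0 - f)     # event predicted but not yet arrived
        return still_coming / (still_coming + (1.0 - p0))

    for t in (0.0, 6.0, 12.0, 24.0):
        print(f"t = {t:4.1f} h: P = {dynamic_probability(0.4, t):.3f}")
    ```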

  12. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  13. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha...

  14. Measurement Invariance, Entropy, and Probability

    Directory of Open Access Journals (Sweden)

    D. Eric Smith

    2010-02-01

    Full Text Available We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes grading into logarithmic at large magnitudes, leading to observations that often follow Student’s probability distribution which has a Gaussian shape for small fluctuations from the mean and a power law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information and probability.

  15. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
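
    CUMBIN itself is written in C; as a sketch of the kind of computation it performs, the snippet below evaluates a cumulative binomial tail and applies it to k-out-of-n system reliability (the example numbers are illustrative).

    ```python
    from math import comb

    def binom_tail(n, k, p):
        """P(at least k successes in n trials, each succeeding with probability p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Example: a 2-out-of-3 system whose components each work with p = 0.95.
    print(f"2-out-of-3 reliability: {binom_tail(3, 2, 0.95):.6f}")
    ```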

  16. Collision probabilities and response matrices: an overview

    International Nuclear Information System (INIS)

    Leonard, A.

    1975-01-01

    Generally the term collision probability method is applied to a technique that employs a discretization of the integral form of the transport equation. Relative to the discrete ordinates method, the collision probability technique has the advantages of dealing with a smaller number of variables (no angular coordinates) and generally faster convergence. Significant disadvantages include dense coupling of the variables, expensive precalculation of collision probabilities, and difficulties in treating anisotropic scattering. Various techniques for circumventing these weaknesses are described. In the response matrix method the assembly or system to be analyzed is decomposed into a number of simple subunits. The approximate Green's functions or response matrices of each type of subunit are then precalculated. To the desired accuracy, these response matrices yield the outgoing neutron currents for any given input. Thus the unknowns are the interface currents, and the coefficient matrix contains all the response matrices. A wide variety of techniques can be and have been used to generate response matrices: diffusion theory, S_n methods, Monte Carlo, collision probabilities, and even response matrices themselves. Again the precalculations are expensive. On the other hand, once a response matrix has been computed, it may be stored and used again. Thus response matrix methods appear to be particularly advantageous for burnup, optimization, and possibly many kinetics problems where the properties of many subunits do not change. (43 references) (U.S.)
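
    As a minimal illustration of the response matrix idea, the sketch below treats the interface currents j as the unknowns of the linear system j = Rj + s, where R collects the precalculated response matrices and s the source contributions; the two-interface numbers are invented for the example.

    ```python
    import numpy as np

    # Invented coupling between two subunit interfaces (response matrix R)
    # and source contributions s to the outgoing currents.
    R = np.array([[0.0, 0.3],
                  [0.4, 0.0]])
    s = np.array([1.0, 0.5])

    # Interface currents satisfy j = R j + s, i.e. (I - R) j = s.
    j = np.linalg.solve(np.eye(2) - R, s)
    print("interface currents:", j)
    ```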

  17. Three-dimensional radiotherapy planning system for esophageal tumors: comparison of treatment techniques and analysis of probability of complications; Planejamento tridimensional para radioterapia de tumores de esofago: comparacao de tecnicas de tratamento e analise de probabilidade de complicacoes

    Energy Technology Data Exchange (ETDEWEB)

    Justino, Pitagoras Baskara; Carvalho, Heloisa de Andrade; Ferauche, Debora; Ros, Renato [Sao Paulo Uni., SP (Brazil). Hospital das Clinicas. Instituto de Radioterapia (InRad)]. E-mail: pitagorasb@hotmail.com

    2003-06-01

    Radiotherapy techniques for esophageal cancer were compared using a three-dimensional planning system. We studied the following treatment techniques for a patient with squamous cell carcinoma of the middle third of the esophagus: two antero-posterior and two latero-lateral parallel opposed fields, three fields ('Y' and 'T'), and four fields ('X'). Dose-volume histograms were obtained considering the spinal cord and lungs as organs at risk. The analysis compared doses in these organs using the Normal Tissue Complication Probability (NTCP) and Tumor Control Probability (TCP) models. When only the lungs were considered, the best technique was two antero-posterior parallel opposed fields. The spinal cord was best protected using latero-lateral fields. We suggest the combination of at least two treatment techniques: antero-posterior fields with 'Y' or 'T' techniques, or latero-lateral fields, in order to balance the doses in the lungs and the spinal cord. Another option may be the use of any of the three-field techniques during the whole treatment. (author)

  18. Pediatric Chest Pain-Low-Probability Referral: A Multi-Institutional Analysis From Standardized Clinical Assessment and Management Plans (SCAMPs®), the Pediatric Health Information Systems Database, and the National Ambulatory Medical Care Survey.

    Science.gov (United States)

    Harahsheh, Ashraf S; O'Byrne, Michael L; Pastor, Bill; Graham, Dionne A; Fulton, David R

    2017-11-01

    We conducted a study to assess the test characteristics of red-flag criteria for identifying cardiac disease causing chest pain, and the technical charges of low-probability referrals. The accuracy of the red-flag criteria was ascertained through study of chest pain Standardized Clinical Assessment and Management Plans (SCAMPs®) data. Patients were divided into 2 groups: Group 1 (with concerning clinical elements) and Group 2 (without). We compared the incidence of cardiac disease causing chest pain between these 2 groups. Technical charges for Group 2 were analyzed using the Pediatric Health Information System database. Potential savings for the US population were estimated using National Ambulatory Medical Care Survey data. Fifty-two percent of subjects formed Group 1. Cardiac disease causing chest pain was identified in 8/1656 (0.48%). No heart disease was identified in patients in Group 2 (P = .03). Applying red flags in determining the need for referral identified patients with cardiac disease causing chest pain with 100% sensitivity. Median technical charges for Group 2 over a 4-year period were US$775,559 (2014 dollars). Eliminating cardiac testing of low-probability referrals would save US$3,775,182 (2014 dollars) in technical charges annually. Red-flag criteria were an effective screen for children with chest pain. Eliminating cardiac testing in children without red flags for referral has significant technical charge savings.

  19. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas

  20. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own sake but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principle areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  1. Probability densities in strong turbulence

    Science.gov (United States)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin’s transform combined with the Gaussian large-scale boundary condition, we calculate the probability densities (PDFs) of velocity increments P(δu,r), of velocity derivatives, and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from log-normality. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  2. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... ...yields an analog magnitude monotonically related to the proportion of possibilities in the mental model in which Obama is re-elected. We refer to this... ...internal representation that corresponds to a simple line within two boundaries: |------| The left vertical represents impossibility, the right...

  3. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  4. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failures in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovia aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical error probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions need continual reassessment, and the analysis predictions need to be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates
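
    A small worked example of the kind of historical bound discussed above is the classical zero-failure ("rule of three") argument: surviving T years with no observed event bounds the annual frequency at roughly 3/T with 95% confidence. The 2000-year figure below is illustrative, not taken from the paper.

    ```python
    import math

    def upper_bound_annual_rate(years_survived, confidence=0.95):
        """Classical upper confidence bound on the annual event rate
        after zero events in the given number of years."""
        return -math.log(1.0 - confidence) / years_survived

    # e.g. a structure standing ~2000 years with no earthquake damage:
    print(f"95% upper bound: {upper_bound_annual_rate(2000):.1e} per year")
    ```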

  5. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
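
    A minimal sketch of the "Wissel plot" estimation, assuming the linear late-time part of –ln(1 – P0(t)) is fitted by least squares; the P0(t) values below are synthetic, generated from known c1 and ω1 so that the recovery can be checked.

    ```python
    import numpy as np

    # Synthetic extinction probabilities from P0(t) = 1 - c1 * exp(-w1 * t)
    # with c1 = 0.8 and w1 = 0.02 (values chosen for illustration).
    t = np.arange(10, 60)
    p0 = 1.0 - 0.8 * np.exp(-0.02 * t)

    # "Wissel plot": -ln(1 - P0(t)) = -ln(c1) + w1 * t, fitted linearly.
    y = -np.log(1.0 - p0)
    w1, intercept = np.polyfit(t, y, 1)
    c1 = np.exp(-intercept)
    print(f"c1 = {c1:.3f}, omega1 = {w1:.4f}")
    ```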

  6. Half-life and delayed neutron emission probability measurements of 44S and 45-47Cl at the LISE spectrometer: their implication for understanding the 48Ca/46Ca ratio in the solar system

    International Nuclear Information System (INIS)

    Sorlin, O.

    1991-06-01

    After a detailed description of the simulation program LISE, experimental measurements of the radioactive decay (half-lives and delayed neutron emission probabilities) of the neutron-rich nuclei 44S, 45Cl and 47Cl are presented. The comparison between these results and the theoretical predictions clearly shows that mass and deformation measurements of these nuclei are very important for determining precisely the radioactive decay parameters needed in astrophysics. These parameters are used in stellar evolution models to understand the elemental abundance curve of the solar system. Even if the shape of this curve is well reproduced, some anomalies persist. The experimental results seem to show that, with the constraints they bring, a double r-process in low-mass type II supernovae can explain both the isotopic anomalies and the shape of the elemental abundance curve

  7. The instrumentalist aspects of quantum mechanics stem from probability theory

    Science.gov (United States)

    Vervoort, Louis

    2012-03-01

    The aim of the article is to argue that the interpretations of quantum mechanics and of probability are much closer than usually thought. Indeed, a detailed analysis of the concept of probability (within the standard frequency interpretation of R. von Mises) reveals that this notion always refers to an observing system. Therefore the instrumentalist aspects of quantum mechanics, and in particular the enigmatic role of the observer in the Copenhagen interpretation, derive from a precise understanding of probability.

  8. Reaction probability for sequential separatrix crossings

    International Nuclear Information System (INIS)

    Cary, J.R.; Skodje, R.T.

    1988-01-01

    The change of the crossing parameter (essentially the phase) between sequential slow separatrix crossings is calculated for Hamiltonian systems with one degree of freedom. Combined with the previous separatrix crossing analysis, these results reduce the dynamics of adiabatic systems with separatrices to a map. This map determines whether a trajectory leaving a given separatrix lobe is ultimately captured by the other lobe. Averaging these results over initial phase yields the reaction probability, which does not asymptote to the fully phase-mixed result even for arbitrarily long times between separatrix crossings

  9. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  10. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim follows up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  11. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  12. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values

  13. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  14. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  16. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  17. Probability, statistics and queueing theory, with computer science applications

    CERN Document Server

    Allen, Arnold O

    1978-01-01

    Probability, Statistics, and Queueing Theory: With Computer Science Applications focuses on the use of statistics and queueing theory for the design and analysis of data communication systems, emphasizing how the theorems and theory can be used to solve practical computer science problems. This book is divided into three parts. The first part discusses the basic concept of probability, probability distributions commonly used in applied probability, and important concept of a stochastic process. Part II covers the discipline of queueing theory, while Part III deals with statistical inference. T

  18. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    framework is an important way to focus research in the most critical areas as well as providing an integrated approach to a range of complex processes. Uncertainty in both event probability and consequences can formally be accounted for within a decision framework and therefore is explicitly communicated to decision makers. Such an approach also tends to open new questions about volcanic systems and their interactions with humans and infrastructure, thereby driving new basic research

  19. Sampling probability distributions of lesions in mammograms

    Science.gov (United States)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four-dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four-dimensional probability distribution function was validated by comparing it to the two-dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four-dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling and the mapping, and returns the results in a JavaScript Object Notation (JSON) format.
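
    As a hedged sketch of how one might sample from an empirical multidimensional probability distribution function like the one described above, the snippet below normalizes a histogram and draws bin indices with np.random.choice; the histogram contents and bin layout are random stand-ins, not the measured lesion data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-in for a measured 4-D histogram of lesion locations
    # (two coordinates per radiographic projection; bin counts invented).
    hist = rng.random((8, 8, 8, 8))
    pdf = hist / hist.sum()               # normalize to a probability mass

    # Draw 5 samples by flattening, sampling indices, and unraveling.
    flat_idx = rng.choice(pdf.size, size=5, p=pdf.ravel())
    samples = np.unravel_index(flat_idx, pdf.shape)
    print(np.stack(samples, axis=1))      # each row: one sampled 4-D bin
    ```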

  20. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  1. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  2. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions when investigating the causes of events.

  3. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
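
    As a small sketch of the approach, the code below propagates uncertain basic-event probabilities through a toy fault tree (top event = A AND (B OR C)), computes the first two moments of the top event probability, and bounds its upper tail with the one-sided Tchebycheff (Cantelli) inequality. The lognormal inputs and tree structure are illustrative, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Uncertain basic-event probabilities (illustrative lognormals).
    a = rng.lognormal(np.log(1e-3), 0.5, n)
    b = rng.lognormal(np.log(2e-2), 0.5, n)
    c = rng.lognormal(np.log(1e-2), 0.5, n)

    # Top event probability for A AND (B OR C), events independent.
    q_top = a * (b + c - b * c)

    mu, sigma = q_top.mean(), q_top.std()
    k = 3.0
    print(f"mean = {mu:.3e}, sd = {sigma:.3e}")
    print(f"P(Q >= {mu + k * sigma:.3e}) <= {1.0 / (1.0 + k**2):.3f}  (Cantelli)")
    ```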

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. First hitting probabilities for semi markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain. In the latter case, a nonparametric estimator for the first hitting probability is proposed and the asymptotic properties of strong consistency and asymptotic normality are proven. Finally, a numerical application on a five-state system is presented to illustrate the performance of this estimator.
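
    For the simpler embedded Markov case, first hitting probabilities solve a linear system; the sketch below uses an invented five-state chain with two absorbing states, which is a simplification of the semi-Markov setting treated in the paper.

    ```python
    import numpy as np

    # Invented 5-state transition matrix; states 3 and 4 are absorbing.
    P = np.array([[0.0, 0.5, 0.3, 0.2, 0.0],
                  [0.2, 0.0, 0.3, 0.0, 0.5],
                  [0.1, 0.2, 0.0, 0.3, 0.4],
                  [0.0, 0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 0.0, 1.0]])

    transient = [0, 1, 2]
    Q = P[np.ix_(transient, transient)]   # transient-to-transient block
    r = P[transient, 3]                   # one-step probability into state 3

    # Hitting probabilities h of state 3 satisfy h = Q h + r.
    h = np.linalg.solve(np.eye(len(transient)) - Q, r)
    print("P(hit state 3) from states 0, 1, 2:", h)
    ```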

  6. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

    Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10^12/L, leukocytes 2.9×10^9/L, platelets 59×10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  7. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A break-down of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  8. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
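
    A minimal sketch of the post-processing step described above: given a stack of equally likely geostatistical realizations, the map of the probability of exceeding a specified contamination level is simply the cell-wise fraction of realizations above the threshold. The realizations, grid and units below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for 200 equally likely realizations on a 20x20 grid.
    n_real, ny, nx = 200, 20, 20
    realizations = rng.lognormal(mean=2.0, sigma=0.8, size=(n_real, ny, nx))

    threshold = 15.0                      # illustrative contamination level
    prob_exceed = (realizations > threshold).mean(axis=0)
    print("max exceedance probability:", prob_exceed.max())
    ```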

  9. Constraints on probability distributions of grammatical forms

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandar

    2007-01-01

    Full Text Available In this study we investigate the constraints on probability distribution of grammatical forms within morphological paradigms of Serbian language, where paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm could be extended to other criteria as well, hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of homogeneity of probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian and for each paradigm the relative entropy has been calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75 - 0.9. Nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
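
    Assuming that "relative entropy" here denotes Shannon entropy normalized by its maximum for the given paradigm size, the sketch below computes the homogeneity measure; the form frequencies are invented, not taken from the Serbian corpus.

    ```python
    import numpy as np

    def relative_entropy(counts):
        """Shannon entropy of a frequency distribution, normalized by log2(n)."""
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()
        p = p[p > 0]                      # drop empty cells
        h = -np.sum(p * np.log2(p))
        return h / np.log2(len(counts))

    # Invented frequencies of six grammatical forms within one paradigm.
    print(f"{relative_entropy([120, 90, 60, 40, 25, 15]):.3f}")
    ```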

  10. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  11. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring...

  12. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  13. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  14. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
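
    A toy version of the contrast being tested: estimate transition probabilities from the within-triplet pitch steps of the standard (H-L-H) and score each deviant by its least probable transition. The reversal deviant is then the only one containing no zero-frequency transition, which mirrors the abstract's argument; the stimulus counts are invented.

    ```python
    from collections import Counter

    def steps(seq):
        """Adjacent pitch transitions within a triplet."""
        return list(zip(seq, seq[1:]))

    standards = ["HLH"] * 100             # invented stream of standard triplets
    counts = Counter(s for tri in standards for s in steps(tri))
    total = sum(counts.values())

    def min_transition_prob(triplet):
        """Probability of the rarest transition contained in the triplet."""
        return min(counts[s] / total for s in steps(triplet))

    for deviant in ("HHH", "LHL", "LLH"):  # proximity, reversal, first-tone
        print(deviant, min_transition_prob(deviant))
    ```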

  15. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  16. Multiple decomposability of probabilities on contractible locally ...

    Indian Academy of Sciences (India)


  17. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  18. Editorial: The interpretation of probability in probabilistic safety assessments

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    Probability used to be thought of as an objective quantity that is relative-frequency based. This view was challenged about twenty years ago, when the need to quantify the risks from large technological systems was recognized and resources were expended to produce numerical results. Uncertainty is an integral part of the concept of risk, and probability is the numerical measure of this uncertainty. The quantification of risks should simply be a straightforward application of the Theory of Probability. However, the quantification of risks requires the frequencies of rare events, e.g. major accidents, for which data are not available. Engineering judgment was therefore used to produce probabilities and frequencies, as it became evident that the important problems of Probabilistic Safety Assessment (PSA) could not be handled with the methods of traditional statistics. The Reactor Safety Study was the first comprehensive investigation of nuclear power reactor risks. It developed distributions for the failure rates of equipment and combined these subjective distributions using the rules of the Theory of Probability. The two approaches are then compared: the statistical one, based on relative frequencies, and the Bayesian one, in which subjective distributions for equipment failure rates are combined using the rules of the Theory of Probability. Questions on the validity of the two methods are posed. The remaining chapters are answers to these questions on probability in safety assessment. (author)

  19. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    reports that the judgments of only a minority of well-educated individuals corroborated it, and only for some sorts of conditional [83]. Reasoners rely... and its application to Boolean systems. J. Cogn. Psychol. 25, 365-389. Beth, E.W. and Piaget, J. (1966) Mathematical Epistemology and Psychology

  20. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)
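
    A minimal sketch of the kind of Bayesian hypothesis-testing calculation the report evaluates (a generic textbook formulation under our own assumptions, not the report's code): with a Beta(a, b) prior on a component's failure probability p, n failure-free tests yield a Beta(a, b + n) posterior, and the confidence that p <= p0 is that posterior's CDF at p0. A more confident prior (larger b) reduces the number of tests required.

      # Failure-free-testing confidence bounds under a Beta prior.
      from scipy.stats import beta

      def confidence_after_tests(a, b, n, p0):
          """Posterior P(p <= p0) after n consecutive failure-free tests."""
          return beta.cdf(p0, a, b + n)

      def tests_needed(a, b, p0, target):
          """Smallest n whose posterior confidence reaches the target."""
          n = 0
          while confidence_after_tests(a, b, n, p0) < target:
              n += 1
          return n

      # Uniform prior; demand 99% confidence that p <= 1e-3.
      print(tests_needed(1, 1, 1e-3, 0.99))   # 4602 failure-free tests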

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.

  2. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics.

  3. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  4. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  5. Posteriori Probabilities and Likelihoods Combination for Speech and Speaker Recognition

    OpenAIRE

    BenZeghiba, Mohamed Faouzi; Bourlard, Hervé

    2004-01-01

    This paper investigates a new approach to perform simultaneous speech and speaker recognition. The likelihood estimated by a speaker identification system is combined with the posterior probability estimated by the speech recognizer. So, the joint posterior probability of the pronounced word and the speaker identity is maximized. A comparison study with other standard techniques is carried out in three different applications, (1) closed set speech and speaker identification, (2) open set spee...

  6. Adolescents' misinterpretation of health risk probability expressions.

    Science.gov (United States)

    Cohn, L D; Schydlower, M; Foley, J; Copeland, R L

    1995-05-01

    To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Cross-sectional. Adolescent medicine and pediatric orthopedic outpatient units. 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).

  7. Dependence in Probability and Statistics

    CERN Document Server

    Doukhan, Paul; Surgailis, Donatas; Teyssiere, Gilles

    2010-01-01

    This volume collects recent works on weakly dependent, long-memory and multifractal processes and introduces new dependence measures for studying complex stochastic systems. Other topics include the statistical theory for bootstrap and permutation statistics for infinite variance processes, the dependence structure of max-stable processes, and the statistical properties of spectral estimators of the long memory parameter. The asymptotic behavior of Fejer graph integrals and their use for proving central limit theorems for tapered estimators are investigated. New multifractal processes are introduced.

  8. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. Highlights: reliability allocation under epistemic uncertainty about probabilities; comparison of alternatives using dominance; computational methods for generating the non-dominated alternatives; deriving decision recommendations that are robust with respect to epistemic uncertainty.
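
    A toy illustration (ours, far simpler than the paper's exact method) of the underlying quantity: with interval-valued component failure probabilities, a coherent system's reliability is only bounded, and a portfolio of improvements shifts those bounds. A series system makes the bounds trivial to compute; the intervals below are hypothetical.

      # Reliability bounds of a series system under interval-valued
      # component failure probabilities; an improvement shrinks an interval.
      from math import prod

      def reliability_bounds(intervals):
          """intervals: (low, high) failure-probability bounds per component."""
          worst = prod(1 - high for _, high in intervals)
          best = prod(1 - low for low, _ in intervals)
          return worst, best

      base = [(0.01, 0.05), (0.02, 0.10), (0.01, 0.08)]
      improved = [(0.01, 0.05), (0.005, 0.02), (0.01, 0.08)]  # improve component 2

      print(reliability_bounds(base))      # about (0.787, 0.960)
      print(reliability_bounds(improved))  # both bounds rise: a dominating move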

  9. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2017-06-05

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive drivers or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  11. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  12. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Good and solid introduction to probability theory and stochastic processes; logically organized, with writing presented in a clear manner; comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  13. On The Left Tail-End Probabilities and the Probability Generating ...

    African Journals Online (AJOL)

    On The Left Tail-End Probabilities and the Probability Generating Function. ... Journal of the Nigerian Association of Mathematical Physics ... In this paper, another tail-end probability function is proposed using the left tail-end probabilities, p(X ≤ i) = π_i. The resulting function, πx(t), is continuous and converges uniformly ...

  14. Can an ensemble give anything more than Gaussian probabilities?

    Directory of Open Access Journals (Sweden)

    J. C. W. Denholm-Price

    2003-01-01

    Full Text Available Can a relatively small numerical weather prediction ensemble produce any more forecast information than can be reproduced by a Gaussian probability density function (PDF)? This question is examined using site-specific probability forecasts from the UK Met Office. These forecasts are based on the 51-member Ensemble Prediction System of the European Centre for Medium-range Weather Forecasts. Verification using Brier skill scores suggests that there can be statistically-significant skill in the ensemble forecast PDF compared with a Gaussian fit to the ensemble. The most significant increases in skill were achieved from bias-corrected, calibrated forecasts and for probability forecasts of thresholds that are located well inside the climatological limits at the examined sites. Forecast probabilities for more climatologically-extreme thresholds, where the verification more often lies within the tails or outside of the PDF, showed little difference in skill between the forecast PDF and the Gaussian forecast.

  15. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  16. Selection of minimum earthquake intensity in calculating pipe failure probabilities

    International Nuclear Information System (INIS)

    Lo, T.Y.

    1985-01-01

    In a piping reliability analysis, it is sometimes necessary to specify a minimum ground motion intensity, usually the peak acceleration, below which the ground motions are not considered as earthquakes and, hence, are neglected. The calculated probability of failure of a piping system depends on the minimum earthquake intensity chosen for the analysis. A study was conducted to determine the effects of the minimum earthquake intensity on the probability of pipe failure. The results indicated that the probability of failure of the piping system is not very sensitive to variations in the selected minimum peak ground acceleration. However, the choice does have significant effects on the various scenarios that make up the system failure.

  17. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
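
    A compact sketch (ours, not the paper's code) of what fitting a pairwise model means in practice: adjust fields h and couplings J until the model's means and pairwise correlations match those of the binary data. Brute-force enumeration of states keeps the sketch short but limits it to small systems.

      # Moment-matching fit of a pairwise (Ising-like) model to binary data.
      import itertools
      import numpy as np

      def fit_pairwise(data, lr=0.1, steps=2000):
          n = data.shape[1]
          states = np.array(list(itertools.product([0, 1], repeat=n)), float)
          h, J = np.zeros(n), np.zeros((n, n))
          m_data = data.mean(axis=0)              # <s_i> in the data
          c_data = data.T @ data / len(data)      # <s_i s_j> in the data
          for _ in range(steps):
              energy = states @ h + np.einsum(
                  'ki,ij,kj->k', states, np.triu(J, 1), states)
              p = np.exp(energy - energy.max())
              p /= p.sum()
              m_model = p @ states                        # model means
              c_model = states.T @ (states * p[:, None])  # model correlations
              h += lr * (m_data - m_model)
              J += lr * np.triu(c_data - c_model, 1)
          return h, J

      data = (np.random.rand(1000, 4) < 0.3).astype(float)  # toy binary "spikes"
      h, J = fit_pairwise(data)   # J stays near zero for independent toy data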

  18. Uncommon evolution of probable central nervous system histoplasmosis: from leptomeningitis to posterior fossa granuloma. A case report with magnetic resonance images; Evolucao incomum de provavel histoplasmose de sistema nervoso central: de leptomeningite para granuloma da fossa posterior. Relato de caso com imagens por ressonancia magnetica

    Energy Technology Data Exchange (ETDEWEB)

    Carrilho, Paulo Eduardo Mestrinelli; Alves, Orival [Universidade Estadual do Oeste do Parana - UNIOESTE, Cascavel, PR (Brazil). Curso de Medicina. Disciplina de Neurologia e Neurocirurgia]. E-mail: carrilho@certto.com.br; Budant, Manfredo [UNITOM - Unidade de Diagnostico por Imagem, Cascavel, PR (Brazil). Centro de Tomografia; Bozo, Marlon K.; Meirelles, Noel [Universidade Estadual do Oeste do Parana - UNIOESTE, Cascavel, PR (Brazil). Curso de Medicina; Bueno, Alexandre Galvao [ANATOM - Instituto de Anatomia Patologica de Cascavel, PR (Brazil)

    2006-01-15

    We report a case of a young immunocompetent patient with probable central nervous system histoplasmosis with peculiar evolutive findings seen on magnetic resonance imaging. Leptomeningeal thickening was initially observed, which subsequently became a posterior fossa granuloma. The diagnosis of fungal infection was only reached by histopathological study, and the treatment was based on long term therapy with fluconazole with good initial response. (author)

  19. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite wastewater treatment) system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  20. The probability and the management of human error

    Energy Technology Data Exchange (ETDEWEB)

    Dufey, R.B. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, ON (Canada); Saull, J.W. [International Federation of Airworthiness, Sussex (United Kingdom)

    2004-07-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε), and stochastic occurrences with a finite minimum rate: λ = 5×10⁻⁵ + (1/ε − 5×10⁻⁵)·exp(−3ε). The future failure rate is entirely determined by the experience: thus the past defines the future.
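
    The closed-form learning curve quoted above is easy to evaluate; a direct transcription (our reading of the formula as printed):

      # Human error rate as a function of accumulated experience eps.
      import math

      def error_rate(eps):
          """lambda(eps) = 5e-5 + (1/eps - 5e-5) * exp(-3*eps), for eps > 0."""
          return 5e-5 + (1.0 / eps - 5e-5) * math.exp(-3.0 * eps)

      for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
          print(eps, error_rate(eps))   # falls with experience toward the 5e-5 floor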

  1. Analytical Study of Thermonuclear Reaction Probability Integrals

    OpenAIRE

    Chaudhry, M. A.; Haubold, H. J.; Mathai, A. M.

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.

  2. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given as well.

  3. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  4. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  5. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  6. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a teaching sequence for probability, grounded in a socio-constructivist perspective.

  7. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  8. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities.

  9. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing.

  10. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  11. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  12. Fixed setpoints introduce error in licensing probability

    Energy Technology Data Exchange (ETDEWEB)

    Laratta, F., E-mail: flaratta@cogeco.ca [Oakville, ON (Canada)

    2015-07-01

    Although we license fixed (constrained) trip setpoints to a target probability, there is no provision for error in probability calculations or how error can be minimized. Instead, we apply reverse-compliance preconditions on the accident scenario such as a uniform and slow LOR to make probability seem error-free. But how can it be? Probability is calculated from simulated pre-LOR detector readings plus uncertainties before the LOR progression is even knowable. We can conserve probability without preconditions by continuously updating field setpoint equations with on-line detector data. Programmable Digital Controllers (PDC's) in CANDU 6 plants already have variable setpoints for Steam Generator and Pressurizer Low Level. Even so, these setpoints are constrained as a ramp or step in other CANDU plants and don't exhibit unconstrained variability. Fixed setpoints penalize safety and operation margins and cause spurious trips. We nevertheless continue to design suboptimal trip setpoint comparators for all trip parameters. (author)

  13. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how to process the proposed queries over them. The approach is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  14. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    The quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. The quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to a complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. Results for some concrete problems are given: a free particle, a harmonic oscillator, and an electron in the Coulomb field. These results suggest that experimental verification of the quantization based on a probability operator may be possible.

  15. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  16. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small while noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape will take exponentially large time as noise approaches zero. The majority of the time is wasted on the uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and by applying the reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise while may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.

  17. Excitation probability and effective temperature in the stationary regime of conductivity for Coulomb Glasses

    Directory of Open Access Journals (Sweden)

    Caravaca Garratón Manuel

    2017-07-01

    Full Text Available In this paper, we illustrate the numerical calculation of the effective temperature in Coulomb glasses by means of the excitation probability, provided that the system has been placed in a stationary state after applying a strong electric field. The excitation probability is a better alternative than the occupation probability, which has classically been employed to calculate the effective temperature and characterize the thermodynamics of Coulomb glasses out of equilibrium, because the excitation probability shows better statistics. In addition, our simulations show that the excitation probability does not depend on the choice of the chemical potential, which critically affects the occupation probability. Our results allow us to propose the excitation probability as a standard procedure to determine the effective temperature in Coulomb glasses as well as in other complex systems such as spin glasses.

  18. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  19. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: “This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.” - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, hypothesis testing ...

  20. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
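
    A minimal sketch (assumptions ours) of the quadrature idea the document discusses: for Im(z) > 0 the complex probability (Faddeeva) function has the integral representation w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt, so an n-point Gauss-Hermite rule reduces it to a weighted sum over the Hermite roots. Accuracy degrades as z approaches the real axis, which is one of the shortcomings such methods face.

      # Gauss-Hermite approximation of the complex probability function w(z).
      import numpy as np
      from scipy.special import wofz   # reference Faddeeva implementation

      def w_gauss_hermite(z, n=64):
          """Approximate w(z), Im(z) > 0, with n-point Gauss-Hermite quadrature."""
          x, wts = np.polynomial.hermite.hermgauss(n)
          return 1j / np.pi * np.sum(wts / (z - x))

      z = 1.5 + 0.8j
      print(w_gauss_hermite(z))   # quadrature approximation
      print(wofz(z))              # reference value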

  1. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have... reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes providing one is willing to assume Subjective Expected Utility.

  2. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  3. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  4. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  5. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  6. Optimal design of unit hydrographs using probability distribution and ...

    Indian Academy of Sciences (India)


    Keywords. Unit hydrograph; rainfall-runoff; hydrology; genetic algorithms; optimization; probability distribution. 1. Introduction. One of the most common interests of hydrologists is the estimation of direct runoff from a watershed for specified distribution of rainfall. This can be achieved either by a system or a physical approach ...

  7. Probability density estimation in stochastic environmental models using reverse representations

    NARCIS (Netherlands)

    Van den Berg, E.; Heemink, A.W.; Lin, H.X.; Schoenmakers, J.G.M.

    2003-01-01

    The estimation of probability densities of variables described by systems of stochastic differential equations has long been done using forward time estimators, which rely on the generation of realizations of the model, forward in time. Recently, an estimator based on the combination of forward and reverse representations has been proposed.

  8. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to

  9. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to provide

  10. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  11. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  12. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  13. On Convergent Probability of a Random Walk

    Science.gov (United States)

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk with different initial positions.

  14. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  15. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
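
    A generic sketch of such a loop (ours, not the paper's implementation): fit a Gaussian process to the sampled values of an expensive log-probability and take the extreme of an acquisition function, here an upper confidence bound, as the next sample point.

      # Bayesian-optimization search for the maximizer of a log-probability.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def target_logp(x):
          """Stand-in for a computationally extensive log-probability."""
          return -0.5 * (x - 2.0) ** 2 / 0.3

      grid = np.linspace(-5.0, 5.0, 400)[:, None]
      xs = list(np.random.uniform(-5.0, 5.0, 3))      # random initial samples
      ys = [target_logp(x) for x in xs]

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
      for _ in range(15):
          gp.fit(np.array(xs)[:, None], np.array(ys))
          mu, sigma = gp.predict(grid, return_std=True)
          x_next = float(grid[np.argmax(mu + 2.0 * sigma)])  # UCB acquisition
          xs.append(x_next)
          ys.append(target_logp(x_next))

      print(xs[int(np.argmax(ys))])   # best maximizer found; near 2.0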

  16. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  17. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
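
    A schematic version of the modelling step (attributes, coefficients, and data below are entirely hypothetical): logistic regression mapping building parameters to a fire probability that can then be visualized on a map.

      # Logistic regression from building attributes to fire probability.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      # Hypothetical attributes: floor area (m2), age (years), heating-type flag.
      X = np.column_stack([rng.uniform(50, 2000, 500),
                           rng.uniform(0, 120, 500),
                           rng.integers(0, 2, 500)])
      # Synthetic incident labels, only to make the sketch runnable.
      logit = -6 + 0.001 * X[:, 0] + 0.02 * X[:, 1] + 0.8 * X[:, 2]
      y = rng.random(500) < 1 / (1 + np.exp(-logit))

      model = LogisticRegression(max_iter=1000).fit(X, y)
      print(model.predict_proba(X[:3])[:, 1])   # per-building fire probabilities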

  18. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  19. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile

  20. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  1. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range up to 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both ...
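
    The 3-parameter logistic form mentioned above is straightforward to evaluate; a sketch with hypothetical parameters (not the study's fitted values):

      # 3-parameter logistic liquefaction probability curve.
      import numpy as np

      def liquefaction_probability(pga, p_max, a, b):
          """Upper bound p_max, location a, and slope b, as functions of PGA."""
          return p_max / (1.0 + np.exp(-(pga - a) / b))

      pga = np.linspace(0.05, 0.60, 6)   # peak ground acceleration (g)
      print(liquefaction_probability(pga, p_max=0.5, a=0.25, b=0.07))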

  2. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship-ship collisions, ship-platform collisions, and ship groundings. The main benefit of the method is that it allows comparisons of various navigation routes.

  3. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results." Special emphasis on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  4. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  5. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between... Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  6. Theoretical analysis on the probability of initiating persistent fission chain

    International Nuclear Information System (INIS)

    Liu Jianjun; Wang Zhe; Zhang Ben'ai

    2005-01-01

    For the finite multiplying system of fissile material in the presence of a weak neutron source, the authors analyse the probability of initiating a persistent fission chain by means of the stochastic theory of neutron multiplication. In the theoretical treatment, the conventional point reactor conception model is developed into an improved form with position x and velocity v dependence. The estimated results, including an approximate value of the probability mentioned above and its distribution, are given by means of the diffusion approximation and compared with those of the previous point reactor conception model. They are basically consistent; however, the present model can provide details on the distribution. (authors)
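
    In the point-model picture, the probability that a single source neutron initiates a persistent chain is one minus the extinction probability of the underlying branching process, obtainable as the least fixed point of the generating function. A minimal sketch, with an illustrative (not physical) neutron multiplicity distribution:

        def persistent_chain_probability(p, iterations=200):
            # Given p[k] = probability of k secondary neutrons per chain event,
            # the extinction probability q solves the Galton-Watson fixed point
            # q = sum_k p[k] * q**k; iterating from q = 0 converges to the
            # smallest root in [0, 1]. One source neutron starts a persistent
            # chain with probability 1 - q.
            q = 0.0
            for _ in range(iterations):
                q = sum(pk * q**k for k, pk in enumerate(p))
            return 1.0 - q

        # Hypothetical multiplicity distribution, mean 1.2 (supercritical)
        p = [0.35, 0.25, 0.25, 0.15]
        print(persistent_chain_probability(p))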

  7. Evaluation of DNA match probability in criminal case.

    Science.gov (United States)

    Lee, J W; Lee, H S; Park, M; Hwang, J J

    2001-02-15

    The new emphasis on quantification of evidence has led to perplexing courtroom decisions, and it has been difficult for forensic scientists to pursue logical arguments. In particular, when evaluating DNA evidence, both the genetic relationship between the two compared persons and the examined locus system should be considered, yet this has not drawn much attention. In this paper, we suggest calculating the match probability using the coancestry coefficient when the family relationship is considered, and the performances of the identification values depending on the calculation of match probability are compared under various situations.
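
    For reference, the coancestry-corrected single-locus match probabilities commonly used in this setting (Balding-Nichols / NRC II style formulas) can be computed as below; the allele frequencies and theta values are placeholders, and the paper's exact formulation may differ.

        def match_probability(p_i, p_j=None, theta=0.01):
            # Conditional genotype probability with coancestry coefficient
            # theta; pass p_j=None for a homozygote A_i A_i.
            denom = (1 + theta) * (1 + 2 * theta)
            if p_j is None:    # homozygote A_i A_i
                return ((2 * theta + (1 - theta) * p_i)
                        * (3 * theta + (1 - theta) * p_i) / denom)
            # heterozygote A_i A_j
            return (2 * (theta + (1 - theta) * p_i)
                    * (theta + (1 - theta) * p_j) / denom)

        print(match_probability(0.1, 0.2, theta=0.03))  # heterozygous locus
        print(match_probability(0.1, theta=0.03))       # homozygous locus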

  8. The mathematics of games an introduction to probability

    CERN Document Server

    Taylor, David G

    2014-01-01

    Contents: Dice, Coins, and Candy (Introduction; Probability; Candy (Yum)!); Wheels and More Dice (Roulette; Craps; Counting the Pokers); Cards and Counting (Seven Card Pokers; Texas Hold'Em; Bluffing); Windmills and Black Jacks? (Blackjack; Blackjack Variants); More Fun Dice! (Liar's Dice; Yahtzee; Zombie Dice); Board Games, Not "Bored" Games (Board Game Movement; Pay Day (The Board Game); Monopoly; Spread, Revisited); Can You Bet and Win? (Betting Systems; Gambler's Ruin); There Are More Games! (The Lottery; Bingo; Baccarat; Farkle; Backgammon; Memory); Appendices (A: Probabilities with Infinity; B: St. Petersburg Paradox; C: Prisoner's Dilemma and More Game Theory).

  9. Flux continuity and probability conservation in complexified Bohmian mechanics

    International Nuclear Information System (INIS)

    Poirier, Bill

    2008-01-01

    Recent years have seen increased interest in complexified Bohmian mechanical trajectory calculations for quantum systems as both a pedagogical and computational tool. In the latter context, it is essential that trajectories satisfy probability conservation to ensure they are always guided to where they are most needed. We consider probability conservation for complexified Bohmian trajectories. The analysis relies on time-reversal symmetry considerations, leading to a generalized expression for the conjugation of wave functions of complexified variables. This in turn enables meaningful discussion of complexified flux continuity, which turns out not to be satisfied in general, though a related property is found to be true. The main conclusion, though, is that even under a weak interpretation, probability is not conserved along complex Bohmian trajectories

  10. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities across the elements of the system, across the different failure mechanisms by which a flood defence can fail, and across time periods, for which failure probabilities are first computed at relatively small time scales and then aggregated. Besides the safety assessment of flood defence systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
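
    The "system fails if any element fails" statement corresponds to a series system. As a sketch of the combination step only (Hydra-Ring's actual computation handles correlation between elements, which this does not), the textbook calculation is:

        import numpy as np

        def series_system_failure(p):
            # The system fails if any element fails. With fully dependent
            # elements the failure probability is max(p); with independent
            # elements it is 1 - prod(1 - p). The true value for positively
            # correlated elements lies between the two.
            p = np.asarray(p, dtype=float)
            return p.max(), 1.0 - np.prod(1.0 - p)

        # Hypothetical annual failure probabilities of dike segments and a sluice
        print(series_system_failure([1e-3, 5e-4, 2e-3, 1e-4]))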

  11. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  12. Electricity price forecasting using Enhanced Probability Neural Network

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2010-01-01

    This paper proposes a price forecasting system for electric market participants to reduce the risk of price volatility. Combining the Probability Neural Network (PNN) and Orthogonal Experimental Design (OED), an Enhanced Probability Neural Network (EPNN) is proposed in the solving process. In this paper, the Locational Marginal Price (LMP), system load and temperature of the PJM system were collected and the data clusters were embedded in the Excel Database according to the year, season, workday, and weekend. With the OED used to smooth parameters in the EPNN, the forecasting error can be improved during the training process to promote accuracy and reliability, so that even the 'spikes' can be tracked closely. Simulation results show the effectiveness of the proposed EPNN to provide quality information in a price volatile environment. (author)
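
    A plain PNN is essentially a Parzen-window (Gaussian kernel) classifier; the EPNN's contribution is the OED-based smoothing-parameter tuning, which is omitted in this minimal sketch with made-up data:

        import numpy as np

        def pnn_predict(X_train, y_train, x, sigma=0.1):
            # Class score = mean Gaussian kernel between x and the class's
            # training patterns; predict the class with the highest score.
            scores = {}
            for cls in np.unique(y_train):
                P = X_train[y_train == cls]
                d2 = ((P - x) ** 2).sum(axis=1)
                scores[cls] = np.exp(-d2 / (2 * sigma**2)).mean()
            return max(scores, key=scores.get), scores

        # Toy (load, temperature) -> price-class data, entirely hypothetical
        X = np.array([[0.2, 0.3], [0.25, 0.35], [0.8, 0.7], [0.85, 0.75]])
        y = np.array([0, 0, 1, 1])     # 0 = normal price, 1 = price spike
        print(pnn_predict(X, y, np.array([0.78, 0.72])))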

  13. Probability, arrow of time and decoherence

    Science.gov (United States)

    Bacciagaluppi, Guido

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  14. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the 'probability of ignition' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
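
    The 'probability of ignition' idea reduces to sampling the uncertain physics inputs and counting the fraction of samples that ignite. The sketch below uses a stand-in gain model and assumed distributions, not the CIT power balance code:

        import numpy as np

        def ignition_fraction(n_samples=100_000, seed=1):
            rng = np.random.default_rng(seed)
            # Sample uncertain inputs from assumed distributions
            h_factor = rng.lognormal(mean=0.0, sigma=0.25, size=n_samples)
            peaking = rng.uniform(1.0, 2.0, size=n_samples)
            # Stand-in gain model: Q rises steeply with confinement and peaking
            q = 5.0 * h_factor**2 * peaking
            return np.mean(q >= 20.0)   # treat Q >= 20 as effectively ignited

        print(ignition_fraction())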

  15. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  16. Quantum Probability and Spectral Analysis of Graphs

    CERN Document Server

    Hora, Akihito

    2007-01-01

    This is the first book to comprehensively cover the quantum probabilistic approach to spectral analysis of graphs. This approach has been developed by the authors and has become an interesting research area in applied mathematics and physics. The book can be used as a concise introduction to quantum probability from an algebraic aspect. Here readers will learn several powerful methods and techniques of wide applicability, which have been recently developed under the name of quantum probability. The exercises at the end of each chapter help to deepen understanding. Among the topics discussed along the way are: quantum probability and orthogonal polynomials; asymptotic spectral theory (quantum central limit theorems) for adjacency matrices; the method of quantum decomposition; notions of independence and structure of graphs; and asymptotic representation theory of the symmetric groups.

  17. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  18. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  19. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts development along two parallel ways: on the one hand, the theory of geometric probability was formed with minor attention paid to applications other than those concerning spatial chance games. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed both by mathematicians and practitioners.

  20. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): "If the n events E1, E2, …, En are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B." Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.

  1. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  2. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  3. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
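
    For orientation, the complete graph (the baseline such clique-based structures are compared against) has a closed-form fixation probability, which a Monte Carlo run of the Moran process reproduces. A sketch, not the paper's code:

        import numpy as np

        def fixation_complete_graph(r, N):
            # Closed form for a single mutant of fitness r on the complete graph
            if r == 1.0:
                return 1.0 / N
            return (1.0 - 1.0 / r) / (1.0 - r ** (-N))

        def fixation_monte_carlo(r, N, trials=5000, seed=0):
            # Moran process: birth chosen proportional to fitness, death uniform
            rng = np.random.default_rng(seed)
            fixed = 0
            for _ in range(trials):
                m = 1                       # current number of mutants
                while 0 < m < N:
                    birth_mutant = rng.random() < m * r / (m * r + (N - m))
                    death_mutant = rng.random() < m / N
                    m += int(birth_mutant) - int(death_mutant)
                fixed += (m == N)
            return fixed / trials

        print(fixation_complete_graph(1.1, 20), fixation_monte_carlo(1.1, 20))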

  4. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  5. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  7. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method...

  8. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability of ship-ship collisions, ship-platform collisions, and ship groundings. The main benefit of the method is that it allows comparisons of various navigation routes and procedures by assessing the relative frequencies of collisions and groundings.

  9. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the workings of probability.

  10. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  11. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  12. Probability groups as orbits of groups

    International Nuclear Information System (INIS)

    Bhattarai, H.N.

    2003-11-01

    The set of double cosets of a group with respect to a subgroup and the set of orbits of a group with respect to a group of automorphisms have structures which can be studied as multigroups, hypergroups or Pasch geometries. When the subgroup or the group of automorphisms are finite, the multivalued products can be provided with some weightages forming so-called Probability Groups. It is shown in this paper that some abstract probability groups can be realized as orbit spaces of groups. (author)

  13. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to challenge the reader.
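
    The resolution of "The Unfair Subway" is that the two trains need not divide the timetable evenly: if the downtown train is scheduled 9 minutes after each uptown train on a 10-minute cycle (an assumed schedule consistent with Marvin's 2-in-20 record), a random arrival is caught by the downtown train 90% of the time. A quick simulation:

        import random

        def unfair_subway(trials=100_000, seed=42):
            # Uptown trains at minutes 0, 10, 20, ...; downtown trains 9
            # minutes after each (assumed schedule). Marvin arrives at a
            # uniformly random time and boards whichever train comes first.
            rng = random.Random(seed)
            with_mother = 0
            for _ in range(trials):
                t = rng.uniform(0.0, 10.0)   # arrival within a 10-minute cycle
                with_mother += (t >= 9.0)    # only the last minute catches uptown
            return with_mother / trials

        print(unfair_subway())   # ~0.1, i.e. dinner with mother about 2 days in 20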

  14. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt: factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
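
    The update is conjugate if the prior on the core-melt rate is approximated by a gamma distribution and the 'experiment' (zero core melts in T reactor-years) is Poisson. The numbers below are hypothetical, chosen only to reproduce a decrease factor in the reported 2-4 range:

        # Poisson-gamma sketch of the zero-failure Bayesian update.
        alpha, beta = 0.05, 1.0e3   # gamma prior: mean alpha/beta = 5e-5 per reactor-year
        T = 2.0e3                   # reactor-years observed with no core melt
        alpha_post, beta_post = alpha + 0, beta + T   # conjugate update

        print("prior mean     :", alpha / beta)
        print("posterior mean :", alpha_post / beta_post)
        print("decrease factor:", (alpha / beta) / (alpha_post / beta_post))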

  15. Nonstationary envelope process and first excursion probability.

    Science.gov (United States)

    Yang, J.-N.

    1972-01-01

    The definition of the stationary random envelope proposed by Cramer and Leadbetter is extended to the envelope of nonstationary random processes possessing evolutionary power spectral densities. The density function, the joint density function, the moment function, and the crossing rate of a level of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.

  16. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  17. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  18. Path probabilities of continuous time random walks

    Science.gov (United States)

    Eule, Stephan; Friedrich, Rudolf

    2014-12-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman-Kac formulae.

  19. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
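
    The core of the approach (an infinite-slope factor of safety evaluated under Monte Carlo sampling of uncertain soil, vegetation and wetness parameters) can be sketched as follows; the distributions are illustrative stand-ins rather than the paper's calibrated inputs, and the real component runs on Landlab grids:

        import numpy as np

        def landslide_probability(n=50_000, seed=42):
            rng = np.random.default_rng(seed)
            slope = np.deg2rad(rng.uniform(25, 40, n))   # slope angle
            phi = np.deg2rad(rng.uniform(30, 38, n))     # friction angle
            cohesion = rng.uniform(2e3, 8e3, n)          # soil+root cohesion, Pa
            depth = rng.uniform(0.5, 2.0, n)             # soil depth, m
            rel_wet = rng.uniform(0.1, 1.0, n)           # relative wetness, 0..1
            gamma, gamma_w = 18e3, 9.81e3                # unit weights, N/m^3
            # Infinite-slope factor of safety
            fs = (cohesion / (gamma * depth * np.sin(slope) * np.cos(slope))
                  + (1 - rel_wet * gamma_w / gamma) * np.tan(phi) / np.tan(slope))
            return np.mean(fs < 1.0)   # fraction of samples that fail

        print(landslide_probability())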

  20. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  1. Method and system for dynamic probabilistic risk assessment

    Science.gov (United States)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  2. Outage Probability Analysis of FSO Links over Foggy Channel

    KAUST Repository

    Esmail, Maged Abdullah

    2017-02-22

    Outdoor free space optic (FSO) communication systems are sensitive to atmospheric impairments such as turbulence and fog, in addition to being subject to pointing errors. Fog is particularly severe because it induces an attenuation that may vary from a few dB up to a few hundred dB per kilometer. Pointing errors also distort the link alignment and cause signal fading. In this paper, we investigate and analyze the performance of FSO systems under fog conditions and pointing errors in terms of outage probability. We then study the impact of several effective mitigation techniques that can improve system performance, including multi-hop, transmit laser selection (TLS) and hybrid RF/FSO transmission. Closed-form expressions for the outage probability are derived, and practical and comprehensive numerical examples are provided to assess the obtained results. We found that the FSO system has limited performance that prevents applying FSO in wireless microcells with a minimum cell radius of 500 m. The performance degrades further when pointing errors appear. Increasing the transmitted power can improve performance under light to moderate fog; however, under thick and dense fog the improvement is negligible. Mitigation techniques can play a major role in improving the range and outage probability.
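
    Outage probability here is P(total link loss > link margin). A toy Monte Carlo version, lumping turbulence and pointing error into a single Gaussian-in-dB fading term (the paper instead derives closed-form expressions):

        import numpy as np

        def fso_outage_probability(att_db_per_km, link_km, margin_db,
                                   sigma_db=2.0, n=200_000, seed=7):
            # Outage when deterministic fog attenuation plus random fading
            # (turbulence + pointing error, lumped) exceeds the link margin.
            rng = np.random.default_rng(seed)
            fading_db = rng.normal(0.0, sigma_db, n)
            total_loss = att_db_per_km * link_km + fading_db
            return np.mean(total_loss > margin_db)

        # Moderate fog (~20 dB/km) over 0.5 km with a 15 dB link margin
        print(fso_outage_probability(att_db_per_km=20.0, link_km=0.5,
                                     margin_db=15.0))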

  3. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  4. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day old child (transplacental transmission).

  5. Eliciting Subjective Probability Distributions on Continuous Variables

    Science.gov (United States)

    1975-08-01

    Approved for public release; distribution unlimited. Keywords: adjusting, proper scoring rule, fractile, subjective probability, uncertainty measures.

  6. Complexity of Fuzzy Probability Logics II

    Czech Academy of Sciences Publication Activity Database

    Hájek, Petr

    2007-01-01

    Roč. 158, č. 23 (2007), s. 2605-2611 ISSN 0165-0114 R&D Projects: GA AV ČR IAA100300503 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * probability * computational complexity Subject RIV: BA - General Mathematics Impact factor: 1.373, year: 2007

  7. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...

  8. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  9. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  10. Low Probability of Intercept Laser Range Finder

    Science.gov (United States)

    2017-07-19

    The below identified patent application is available for licensing. Requests for information should be addressed to... LOW PROBABILITY OF INTERCEPT LASER RANGE FINDER. STATEMENT OF GOVERNMENT INTEREST: [0001] The invention described herein may be... Processor 30 performs a continuous sweep over the photodetector 38 output to isolate and amplify the optical signals, and performs the signal processing.

  11. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  12. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing an internal source is defined in terms of a functional relation for the scattering function of the diffuse reflection problem. The Pade approximant technique is used to obtain numerical results which are compared with exact results. (author)
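
    SciPy can build a Padé approximant directly from Taylor coefficients; the example below applies the technique to exp(x) rather than to the slab scattering function, purely to illustrate the mechanics:

        import math
        from scipy.interpolate import pade

        # Taylor coefficients of exp(x): 1, 1, 1/2!, 1/3!, ...
        an = [1.0 / math.factorial(k) for k in range(6)]
        p, q = pade(an, 2)    # denominator degree 2, numerator degree 3

        x = 1.0
        print(p(x) / q(x), math.exp(x))   # rational approximation vs. exact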

  13. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  14. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  15. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)

  16. Exploring Concepts in Probability: Using Graphics Calculators

    Science.gov (United States)

    Ghosh, Jonaki

    2004-01-01

    This article describes a project in which certain key concepts in probability were explored using graphics calculators with year 10 students. The lessons were conducted in the regular classroom where students were provided with a Casio CFX 9850 GB PLUS graphics calculator with which they were familiar from year 9. The participants in the…

  17. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  18. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  19. The Britannica Guide to Statistics and Probability

    CERN Document Server

    2011-01-01

    By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction

  20. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the

  1. Reduction of Compound Lotteries with Objective Probabilities

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    The reduction of compound lotteries axiom (ROCL) has assumed a central role in the evaluation of behavior toward risk and uncertainty. We present experimental evidence on its validity in the domain of objective probabilities. Our battery of lottery pairs includes simple one-stage lotteries, two...

  2. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  3. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  4. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  5. Analysis of the probability of channel satisfactory state in P2P live ...

    African Journals Online (AJOL)

    In this paper a model based on user behaviour of P2P live streaming systems was developed in order to analyse one of the key QoS parameters of such systems, i.e. the probability of channel-satisfactory state; the impact of upload bandwidths and channels' popularity on the probability of channel-satisfactory state was also ...

  6. Analysis of the probability of channel satisfactory state in P2P live

    African Journals Online (AJOL)

    In this paper a model based on user behaviour of P2P live streaming systems was developed in order to analyse one of the key QoS parameters of such systems, i.e. the probability of channel-satisfactory state; the impact of upload bandwidths and channels' popularity on the probability of channel-satisfactory ...

  7. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
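
    The two combination ideas in the abstract (convolution for independent links, and a link-to-link conditional matrix for correlated links) look like this in miniature, with hypothetical discrete travel-time PMFs:

        import numpy as np

        # Independent-links baseline: route PMF is the convolution of link PMFs.
        p1 = np.array([0.2, 0.5, 0.3])    # P(link-1 time = 1, 2, 3 minutes)
        p2 = np.array([0.1, 0.6, 0.3])    # P(link-2 time = 1, 2, 3 minutes)
        print(np.convolve(p1, p2))        # P(route time = 2, 3, 4, 5, 6 minutes)

        # Correlated links: cond[i, j] = P(link-2 time = j+1 | link-1 time = i+1)
        cond = np.array([[0.5, 0.4, 0.1],
                         [0.1, 0.6, 0.3],
                         [0.0, 0.3, 0.7]])
        joint = p1[:, None] * cond        # joint PMF of (t1, t2)
        route = np.zeros(5)
        for i in range(3):
            for j in range(3):
                route[i + j] += joint[i, j]   # route time = (i+1) + (j+1) minutes
        print(route)                          # P(route time = 2, 3, 4, 5, 6 minutes)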

  8. Κ-electron capture probability in 167Tm

    International Nuclear Information System (INIS)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L.

    1990-01-01

    The Κ-electron capture probability in the decay of 167Tm for the first-forbidden transition 1/2⁺ → 3/2⁻ was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P_Κ value is found to be 0.835 ± 0.029, in agreement with the theoretical value of 0.829. (author)

  9. Κ-electron capture probability in 167Tm

    Energy Technology Data Exchange (ETDEWEB)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Variable Energy Cyclotron Centre, Calcutta (India))

    1990-07-01

    The Κ-electron capture probability in the decay of 167Tm for the first-forbidden transition 1/2⁺ → 3/2⁻ was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P_Κ value is found to be 0.835 ± 0.029, in agreement with the theoretical value of 0.829. (author).

  10. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  11. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.
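    As a toy version of the idea (not the paper's general formalism), one can pick the two-state reversible chain that maximizes the entropy rate subject to a fixed stationary magnetization; detailed balance then fixes one transition probability in terms of the other:

```python
import numpy as np

# Toy maximum-entropy problem: among two-state reversible chains with a
# fixed stationary magnetization m = pi(+) - pi(-), find the chain of
# maximum entropy rate. Hypothetical constraint value, for illustration.
m = 0.4
pi_plus = (1 + m) / 2          # fixed stationary distribution
pi_minus = 1 - pi_plus

def entropy_rate(a):
    # a = P(+ -> -); detailed balance pi(+) a = pi(-) b fixes b = P(- -> +).
    b = pi_plus * a / pi_minus
    if not (0.0 < b < 1.0):
        return -np.inf
    h = lambda p: -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return pi_plus * h(a) + pi_minus * h(b)

grid = np.linspace(1e-4, 1 - 1e-4, 10_000)
a_star = grid[np.argmax([entropy_rate(a) for a in grid])]
print("a* =", round(a_star, 3), " b* =", round(pi_plus * a_star / pi_minus, 3))
```

    The search returns a* ≈ 0.3 and b* ≈ 0.7, i.e. both rows equal to the stationary distribution: with only the stationary state constrained, the maximum-entropy chain is the i.i.d. one, a useful sanity check before adding the spin-model interactions treated in the paper.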

  12. Cisplatin and radiation: Interaction probabilities and therapeutic possibilities

    International Nuclear Information System (INIS)

    Begg, A.C.

    1990-01-01

    This paper examines the probability of interactions occurring between drug lesions and radiation lesions in DNA for the cytotoxic and radiosensitizing agent cisplatin. The numbers of cisplatin-induced DNA adducts and radiation-induced strand breaks after a given dose of each agent are known for given cell systems, from which the probability that these lesions will interact can be estimated. Results of these calculations indicate that the probability of interaction could be high, depending on the distance over which two lesions can interact and the probability of repair of the interaction lesion. Calculated lesion numbers have been compared with known data on radiation modification, including illustrations of inconsistencies. In the second part of the paper, ways in which combined therapy with cisplatin and radiation can be improved are described. Development of methods to predict which types of tumor, and which individual tumors within a given type, are sensitive to the cytotoxic and radiosensitizing effects of the drug would aid rational selection of patients for combination treatments. Immunocytochemical methods sensitive enough to monitor cisplatin-DNA interactions in patients are available and may be useful in this context. The delivery and maintenance of higher tumor concentrations of radiosensitizer offers a further possibility for improvement. Studies of intratumoral injection of cisplatin have shown promise for achieving this goal while limiting normal tissue toxicity. 46 references

  13. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Intuitionistic fuzzy (IF) evidence theory, as an extension of Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of enabling rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
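    A classical-evidence-theory analogue of the probability-estimation step is the pignistic transform, BetP(x) = Σ_{A∋x} m(A)/|A|, which the paper's IF belief functions generalize. A minimal sketch with a hypothetical mass assignment:

```python
# Pignistic transform: distribute each focal element's mass uniformly over
# its members to obtain a probability distribution. The mass assignment is
# hypothetical; the paper generalizes this idea to IF belief functions.
mass = {
    frozenset("a"): 0.4,
    frozenset("ab"): 0.3,
    frozenset("abc"): 0.3,
}

frame = set().union(*mass)
betp = {x: sum(m / len(A) for A, m in mass.items() if x in A) for x in sorted(frame)}
print(betp)   # {'a': 0.65, 'b': 0.25, 'c': 0.1}
```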

  14. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  15. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  16. Quantum Theory and Probability Theory: Their Relationship and Origin in Symmetry

    Directory of Open Access Journals (Sweden)

    Philip Goyal

    2011-04-01

    Quantum theory is a probabilistic calculus that enables the calculation of the probabilities of the possible outcomes of a measurement performed on a physical system. But what is the relationship between this probabilistic calculus and probability theory itself? Is quantum theory compatible with probability theory? If so, does it extend or generalize probability theory? In this paper, we answer these questions, and precisely determine the relationship between quantum theory and probability theory, by explicitly deriving both theories from first principles. In both cases, the derivation depends upon identifying and harnessing the appropriate symmetries that are operative in each domain. We prove, for example, that quantum theory is compatible with probability theory by explicitly deriving quantum theory on the assumption that probability theory is generally valid.

  17. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Galetovic, Alexander [Facultad de Ciencias Economicas y Empresariales, Universidad de los Andes, Santiago (Chile); Munoz, Cristian M. [Departamento de Ingenieria Electrica, Universidad de Chile, Mariano Sanchez Fontecilla 310, piso 3 Las Condes, Santiago (Chile)

    2009-02-15

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower. (author)

  18. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower. (author)
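    A stylized Monte Carlo illustrates the papers' point that modeling demand response lowers forecasted deficit probabilities. All capacities, elasticities and the scarcity-pricing rule below are hypothetical, not Chilean system data:

```python
import numpy as np

# Compare deficit probabilities with fixed vs. price-responsive demand.
rng = np.random.default_rng(1)
n = 200_000
thermal = 70.0                                     # firm thermal energy
hydro = rng.gamma(shape=4.0, scale=8.0, size=n)    # uncertain hydro energy
supply = thermal + hydro
demand0 = 100.0                                    # demand at reference price
elasticity = -0.2                                  # price elasticity of demand

# Declare a deficit when supply falls more than 5% short of demand.
p_fixed = np.mean(supply < 0.95 * demand0)

# Scarcity pricing: when supply is tight the price rises as (demand0/supply)^3
# and consumption contracts along a constant-elasticity demand curve.
price_ratio = np.clip(demand0 / supply, 1.0, None) ** 3
demand = demand0 * price_ratio ** elasticity
p_resp = np.mean(supply < 0.95 * demand)

print(f"deficit probability, fixed demand:      {p_fixed:.3f}")
print(f"deficit probability, responsive demand: {p_resp:.3f}")
```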

  19. Universal critical wrapping probabilities in the canonical ensemble

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2015-09-01

    Universal dimensionless quantities, such as Binder ratios and wrapping probabilities, play an important role in the study of critical phenomena. We study the finite-size scaling behavior of the wrapping probability for the Potts model in the random-cluster representation, under the constraint that the total number of occupied bonds is fixed, so that the canonical ensemble applies. We derive that, in the limit L → ∞, the critical values of the wrapping probability are different from those of the unconstrained model, i.e. the model in the grand-canonical ensemble, but still universal, for systems with 2y_t − d > 0, where y_t = 1/ν is the thermal renormalization exponent and d is the spatial dimension. Similar modifications apply to other dimensionless quantities, such as Binder ratios. For systems with 2y_t − d ≤ 0, these quantities share the same universal critical values in the two ensembles. It is also derived that new finite-size corrections are induced. These findings apply more generally to systems in the canonical ensemble, e.g. the dilute Potts model with a fixed total number of vacancies. Finally, we formulate an efficient cluster-type algorithm for the canonical ensemble, and confirm these predictions by extensive simulations.

  20. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  1. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  2. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  3. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights:
    • We study the dependence structure between default intensities.
    • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated.
    • We obtain the joint survival probability of the integrated intensities by using a copula.
    • We apply our theoretical result to pricing basket default swap spreads.
    Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
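    For intuition, a sketch with the plain (untruncated) FGM copula and hypothetical parameters: the joint survival probability of two exponential lifetimes has the closed form S1·S2·[1 + θ(1 − S1)(1 − S2)], which the snippet checks by sampling the copula through its conditional distribution:

```python
import numpy as np

# Plain FGM copula C(u, v) = u v [1 + theta (1 - u)(1 - v)]; its survival
# copula has the same form, giving a closed-form joint survival probability
# for exponential marginals. Parameters are hypothetical.
lam1, lam2, theta = 0.5, 0.8, 0.6

def joint_survival(x, y):
    s1, s2 = np.exp(-lam1 * x), np.exp(-lam2 * y)
    return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))

print("analytic   :", joint_survival(1.0, 1.0))

# Monte Carlo check, sampling the copula via its conditional distribution:
# F(v | u) = v [1 + theta (1 - 2u)(1 - v)] is quadratic in v and inverts to
# v = 2w / (1 + a + sqrt((1 + a)^2 - 4 a w)) with a = theta (1 - 2u).
rng = np.random.default_rng(0)
u = rng.uniform(size=500_000)
w = rng.uniform(size=500_000)
a = theta * (1 - 2 * u)
v = 2 * w / (1 + a + np.sqrt((1 + a) ** 2 - 4 * a * w))
x = -np.log(1 - u) / lam1              # inverse-cdf transform to exponentials
y = -np.log(1 - v) / lam2
print("Monte Carlo:", np.mean((x > 1.0) & (y > 1.0)))
```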

  4. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) for getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
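    The paper's estimator comes from a power-normal mixture; as a baseline for what P_kl(i, j) measures, the following sketch (synthetic ratings, hypothetical transition matrix) computes the standard count-based maximum-likelihood estimate from simulated quarterly rating paths:

```python
import numpy as np

# Count-based (cohort) estimate of a quarterly rating transition matrix.
# The true matrix and the simulated data are synthetic, for illustration.
rng = np.random.default_rng(3)
n_ratings, n_quarters, n_firms = 4, 60, 10

true_P = np.array([[0.90, 0.08, 0.02, 0.00],
                   [0.05, 0.85, 0.08, 0.02],
                   [0.01, 0.09, 0.80, 0.10],
                   [0.00, 0.02, 0.08, 0.90]])

# Simulate quarterly rating paths for each firm.
paths = np.zeros((n_firms, n_quarters), dtype=int)
for f in range(n_firms):
    for q in range(1, n_quarters):
        paths[f, q] = rng.choice(n_ratings, p=true_P[paths[f, q - 1]])

# Maximum-likelihood estimate: normalized transition counts.
counts = np.zeros((n_ratings, n_ratings))
for f in range(n_firms):
    for q in range(1, n_quarters):
        counts[paths[f, q - 1], paths[f, q]] += 1

P_hat = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1.0)
print(P_hat.round(3))
```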

  5. Collision probabilities in spatially stochastic media II

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2008-01-01

    An improved model for calculating collision probabilities in spatially stochastic media is described based upon a method developed by Cassell and Williams [Cassell, J.S., Williams, M.M.R., in press. An approximate method for solving radiation and neutron transport problems in spatially stochastic media. Annals of Nuclear Energy] and is applicable to three-dimensional problems. We shall show how to evaluate the collision probability in an arbitrarily shaped non-re-entrant lump, consisting of a random dispersal of two phases, for any form of autocorrelation function. Specific examples, with numerical values, are given for a sphere and a slab. In the case of the slab we allow the material to have different stochastic properties in the x, y and z directions

  6. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr]

  7. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  8. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Roč. 38, č. 8 (2017), s. 1685-1700 ISSN 0967-3334 R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : heart sounds * FFT * machine learning * signal averaging * probability assessment Subject RIV: FS - Medical Facilities ; Equipment OBOR OECD: Medical engineering Impact factor: 2.058, year: 2016

  9. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters. Monte Carlo simulation is readily used for practical calculations. However, an alternative approach is offered by possibility theory making use of possibility distributions such as intervals and fuzzy intervals. This approach is well suited to represent lack of knowledge or imprecision...

  10. Interaction probability value calculi for some scintillators

    International Nuclear Information System (INIS)

    Garcia-Torano Martinez, E.; Grau Malonda, A.

    1989-01-01

    Interaction probabilities for 17 gamma-ray energies between 1 and 1,000 keV have been computed and tabulated. The tables may be applied to the case of cylindrical vials with radius 1.25 cm and volumes of 5, 10 and 15 ml. Toluene, Toluene/Alcohol, Dioxane-Naphthalene, PCS, INSTAGEL and HISAFE II scintillators are considered. Graphical results for 10 ml are also given. (Author) 11 refs

  11. Classical and quantum probabilities as truth values

    Science.gov (United States)

    Döring, Andreas; Isham, Chris J.

    2012-03-01

    We show how probabilities can be treated as truth values in suitable sheaf topoi. The scheme developed in this paper is very general and applies both in classical and quantum physics. On the quantum side, the results naturally tie in with the topos approach to quantum theory that has been developed over the last 14 years by the authors and others. Earlier results on the representation of arbitrary quantum states are complemented with a purely logical perspective.

  12. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  13. Toward General Analysis of Recursive Probability Models

    OpenAIRE

    Pless, Daniel; Luger, George

    2013-01-01

    There is increasing interest within the research community in the design and use of recursive probability models. Although there still remains concern about computational complexity costs and the fact that computing exact solutions can be intractable for many nonrecursive models and impossible in the general case for recursive problems, several research groups are actively developing computational techniques for recursive stochastic languages. We have developed an extension to the traditional...

  14. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model, based on the axiomatic principles of quantum probability theory, that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.
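    The incompatibility mechanism can be shown in a few lines of linear algebra. In the toy sketch below (hypothetical state and projectors, not the paper's fitted model), two non-commuting projectors produce order-dependent sequential probabilities, which no single classical joint distribution can reproduce:

```python
import numpy as np

# Order effects from non-commuting projectors, via the Luders rule for
# sequential measurements: P("A yes, then B yes") = || P_B P_A psi ||^2.
psi = np.array([1.0, 0.0])                 # initial belief state |0>
A = np.outer([1.0, 1.0], [1.0, 1.0]) / 2   # projector onto (|0> + |1>)/sqrt(2)
B = np.outer([0.0, 1.0], [0.0, 1.0])       # projector onto |1>

def seq_prob(P1, P2, state):
    return float(np.linalg.norm(P2 @ (P1 @ state)) ** 2)

print("P(A then B) =", seq_prob(A, B, psi))   # 0.25
print("P(B then A) =", seq_prob(B, A, psi))   # 0.0
```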

  15. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  16. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
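    Since Bayes' theorem is among the topics listed, a short worked example may be useful; the event rate and detector characteristics below are invented for illustration:

```python
# Worked Bayes' theorem example: posterior probability that a detector's
# alarm reflects a real event, given a rare event and an imperfect detector.
p_event = 0.01            # prior probability of the event
p_alarm_event = 0.98      # P(alarm | event)
p_alarm_no_event = 0.05   # P(alarm | no event), the false-alarm rate

p_alarm = p_alarm_event * p_event + p_alarm_no_event * (1 - p_event)
p_event_given_alarm = p_alarm_event * p_event / p_alarm
print(round(p_event_given_alarm, 4))   # ~0.1653: most alarms are still false
```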

  17. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break
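    The four-factor structure reduces to a product of conditional probabilities. A sketch with order-of-magnitude placeholder values (the report's actual factors are not reproduced here):

```python
# Product of the four conditional factors described above; all values are
# hypothetical placeholders, not the evaluation's actual numbers.
p_igscc    = 1.0e-2   # (1) weld heat-affected zone contains IGSCC
p_miss_ut  = 1.0e-1   # (2) cracking escapes UT detection, given (1)
p_no_leak  = 1.0e-2   # (3) crack not detected by leakage, given (2)
p_unstable = 1.0e-3   # (4) crack grows to instability before the next exam

break_frequency = p_igscc * p_miss_ut * p_no_leak * p_unstable
print(f"large-break frequency per weld: {break_frequency:.1e}")   # 1.0e-08
```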

  18. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07; Calculo de la probabilidad de falla de tuberias del sistema RCIC de una central nuclear mediante el software WinPRAISE 07

    Energy Technology Data Exchange (ETDEWEB)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Garcia de la C, F. M., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Km 44.5 Carretera Cardel-Nautla, 91476 Laguna Verde, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    Crack growth and propagation by fatigue are a typical degradation mechanism found in the nuclear industry as well as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component, even one of high ductility. For this reason, programmed maintenance activities have been established in industry, using visual and/or ultrasonic inspection techniques at an established periodicity that allows crack growth to be followed up and its undesirable effects controlled; however, these activities increase operating costs and, in the particular case of the nuclear industry, increase the radiation exposure of the participating personnel. The use of mathematical methods that integrate concepts of uncertainty, material properties and the probability associated with inspection results has become a powerful tool for evaluating component reliability, reducing costs and exposure levels. This work presents the evaluation of the failure probability due to the growth of pre-existing fatigue cracks in pipes of the Reactor Core Isolation Cooling (RCIC) system of a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events), based on the principles of probabilistic fracture mechanics, was used. The failure probability values obtained evidenced good behavior of the analyzed pipes, with a maximum on the order of 1.0E-6; it is therefore concluded that the performance of these pipe lines is reliable even when the calculations are extrapolated to 10, 20, 30 and 40 years of service. (Author)
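    A generic probabilistic-fracture-mechanics calculation in the same spirit, though far simpler than WinPRAISE's models, samples an initial crack-depth distribution and propagates each crack with a Paris-law growth rate; the failure probability is the fraction of sampled cracks exceeding a critical depth. Every number below is hypothetical:

```python
import numpy as np

# Monte Carlo failure probability from Paris-law fatigue crack growth,
# da/dN = C (dK)^m, with a lognormal initial crack-depth distribution.
rng = np.random.default_rng(5)
n = 100_000
C, m = 1.0e-10, 3.0                  # Paris-law constants (hypothetical)
d_sigma = 80.0                       # cyclic stress range, MPa
a_crit = 0.012                       # critical crack depth, m
cycles_per_year, years = 2000, 40

a = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)   # initial depths, m
for _ in range(years):
    dK = d_sigma * np.sqrt(np.pi * a)                     # edge-crack Delta-K
    a = np.minimum(a + cycles_per_year * C * dK ** m, a_crit)

print("40-year failure probability ~", np.mean(a >= a_crit))
```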

  19. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and several influencing factors are established here using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
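    A minimal sketch of the binary logit setup on synthetic data follows; the variables and coefficients are invented to echo the article's findings, and plain gradient ascent only approximates the maximum-likelihood fit:

```python
import numpy as np

# Binary logit: probability of returning a loan from contract and borrower
# attributes, fit by gradient ascent on the log-likelihood. Synthetic data.
rng = np.random.default_rng(2)
n = 5_000
loan_sum = rng.uniform(1, 50, n)          # contract sum (thousands)
remoteness = rng.uniform(0, 100, n)       # distance of the borrower, km
birth_month = rng.integers(1, 13, n)

# Ground truth echoing the article: repayment odds rise with the sum and
# remoteness, and fall for later birth months.
true_logit = -0.5 + 0.04 * loan_sum + 0.01 * remoteness - 0.08 * birth_month
y = rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), loan_sum, remoteness, birth_month])
w = np.zeros(X.shape[1])
for _ in range(50_000):                   # gradient ascent on the log-likelihood
    p = 1 / (1 + np.exp(-X @ w))
    w += 1e-4 * X.T @ (y - p) / n
print("fitted coefficients:", w.round(3))
```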

  20. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be