WorldWideScience

Sample records for probable future large

  1. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Itô differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  2. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability
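    The abstract does not say which weighting function the authors adopt; a common one-parameter form in this literature is Tversky and Kahneman's (1992), sketched below with an illustrative parameter value:

```python
def tk_weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman one-parameter probability-weighting function.
    Small probabilities are overweighted, large ones underweighted."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

# A 1% chance is weighted as if it were ~5.5%; a 99% chance as ~91%.
w_low, w_high = tk_weight(0.01), tk_weight(0.99)
```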

  3. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.
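    As a stripped-down illustration of the idea (the paper's model works with the full money market yield curve, not this two-outcome simplification), a priced-in hike probability can be backed out of a FRA rate as follows:

```python
def hike_probability(fra_rate: float, rate_no_change: float,
                     rate_after_hike: float) -> float:
    """Risk-neutral probability of a hike implied by a forward rate,
    assuming only two outcomes: rates stay put or rise by one step."""
    return (fra_rate - rate_no_change) / (rate_after_hike - rate_no_change)

# Illustrative numbers: a FRA at 2.10% when the policy rate would be
# 2.00% unchanged or 2.25% after a 25 bp hike implies a ~40% probability.
p_hike = hike_probability(2.10, 2.00, 2.25)
```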

  4. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
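    For the exponential model, the annual probabilities quoted above follow directly from the mean repose time. A minimal sketch (the 7000-year mean repose is illustrative, chosen only to reproduce the order of magnitude, not the paper's fitted value):

```python
import math

def annual_eruption_probability(mean_repose_years: float) -> float:
    """P(at least one eruption in the next year) under an exponential
    (memoryless Poisson-process) repose-time model."""
    rate = 1.0 / mean_repose_years          # eruptions per year
    return 1.0 - math.exp(-rate)

# A mean repose on the order of 7000 years gives an annual probability
# of about 1.4e-4, the magnitude quoted for the Lassen Volcanic Center.
p_year = annual_eruption_probability(7000.0)
```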

  5. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate change are among the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the illumination of climate changes where the fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  6. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  7. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    Science.gov (United States)

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  8. On asymptotically efficient simulation of large deviation probabilities.

    NARCIS (Netherlands)

    Dieker, A.B.; Mandjes, M.R.H.

    2005-01-01

    ABSTRACT: Consider a family of probabilities for which the decay is governed by a large deviation principle. To find an estimate for a fixed member of this family, one is often forced to use simulation techniques. Direct Monte Carlo simulation, however, is often impractical, particularly if the

  9. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    Science.gov (United States)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times, and nearly half of those launches required 2 or more launch attempts. The historical record of 250 Space Shuttle launch countdowns provides a wealth of data that is important to analyze both for strictly historical purposes and for use in predicting the launch countdown performance of future launch vehicles. This paper provides a statistical analysis of all Space Shuttle launch attempts, including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions for future launch vehicles such as NASA's Space Shuttle-derived Space Launch System (SLS). Understanding the cumulative probability of launch is particularly important for missions to Mars, since the launch opportunities are relatively short in duration and one must wait 2 years before a subsequent attempt can begin.
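    The cumulative probability of launch can be sketched from the empirical per-attempt rate; the independence assumption below is a simplification of the paper's attempt-by-attempt analysis:

```python
def cumulative_launch_probability(p_attempt: float, attempts: int) -> float:
    """Probability of having launched within `attempts` tries, assuming
    independent attempts with a fixed per-attempt success probability."""
    return 1.0 - (1.0 - p_attempt) ** attempts

# Empirical per-attempt rate from the Shuttle record: 135 launches
# in 250 attempts.
p = 135 / 250
within_three = cumulative_launch_probability(p, 3)
```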

  10. Quantum probability, choice in large worlds, and the statistical structure of reality.

    Science.gov (United States)

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  11. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    Science.gov (United States)

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  12. Probably Not: Future Prediction Using Probability and Statistical Inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  13. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but increases more rapidly as the time of the earthquake approaches.
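    Under the independence assumption that the paper's new formula relaxes, the probability per unit time is simply the background rate multiplied by each precursor's probability gain (Aki, 1981). A sketch with made-up numbers, not the Chinese data:

```python
import math

def probability_per_unit_time(p0: float, gains: list) -> float:
    """Background probability per unit time scaled by the probability
    gain of each observed precursor, assuming mutual independence."""
    return p0 * math.prod(gains)

# Hypothetical: a background rate of 1e-4 per day and three precursors
# with gains 30, 10, and 3 combine to ~0.09 per day, the magnitude
# reported for the four earthquakes.
p_day = probability_per_unit_time(1e-4, [30, 10, 3])
```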

  14. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis.

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-06-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.

  15. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.
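    Combining a short-term variability distribution with an uncertain long-term mean change amounts to convolving the two densities. A toy sketch with placeholder Gaussians (the paper instead uses 46 years of tide-gauge observations and scenario ensembles; all numbers here are hypothetical):

```python
import numpy as np

levels = np.arange(-200, 401)                    # cm relative to present mean
# Placeholder short-term variability pdf (tides, surges, etc.):
short_term = np.exp(-0.5 * (levels / 40.0) ** 2)
short_term /= short_term.sum()
# Placeholder pdf for the long-term mean sea-level change by 2100:
msl_change = np.exp(-0.5 * ((levels - 30.0) / 15.0) ** 2)
msl_change /= msl_change.sum()

combined = np.convolve(short_term, msl_change)   # pdf of the sum
grid = np.arange(2 * levels.min(), 2 * levels.max() + 1)
p_flood = combined[grid > 150].sum()             # P(level > +150 cm)
```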

  16. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, S; Streit, R D; Chou, C K

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)
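    The report couples a deterministic fracture-mechanics model with sampled inputs; the toy Monte Carlo below mimics only that overall structure. All parameters are hypothetical, only the initial crack size is sampled, and no quantitative conclusion should be read off it:

```python
import math
import random

random.seed(1)
WALL_MM = 70.0                  # wall thickness (hypothetical)
C, M = 1e-8, 3.0                # Paris-law constants (hypothetical)
DK = 8.0                        # stress-intensity range factor (hypothetical)
LIFETIME_CYCLES = 100_000
STEPS = 100

failures = 0
trials = 10_000
for _ in range(trials):
    a = random.lognormvariate(math.log(2.0), 0.5)  # initial crack depth, mm
    for _ in range(STEPS):
        # da/dN = C * (DK * sqrt(a))^M, integrated with coarse Euler steps
        a += (LIFETIME_CYCLES / STEPS) * C * (DK * math.sqrt(a)) ** M
        if a >= WALL_MM:
            failures += 1
            break

rupture_rate = failures / trials   # very small, as in the report
```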

  17. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    International Nuclear Information System (INIS)

    Lu, S.; Streit, R.D.; Chou, C.K.

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  18. Fixation probability of a nonmutator in a large population of asexual mutators.

    Science.gov (United States)

    Jain, Kavita; James, Ananthu

    2017-11-21

    In an adapted population of mutators in which most mutations are deleterious, a nonmutator that lowers the mutation rate is under indirect selection and can sweep to fixation. Using a multitype branching process, we calculate the fixation probability of a rare nonmutator in a large population of asexual mutators. We show that when beneficial mutations are absent, the fixation probability is a nonmonotonic function of the mutation rate of the mutator: it first increases sublinearly and then decreases exponentially. We also find that beneficial mutations can enhance the fixation probability of a nonmutator. Our analysis is relevant to an understanding of recent experiments in which a reduction in the mutation rates has been observed.
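    The multitype calculation in the paper does not reduce to a snippet, but the classic single-type case conveys the mechanics: a rare lineage with Poisson offspring of mean 1 + s fixes with probability pi solving pi = 1 - exp(-(1 + s) pi), which gives Haldane's pi ≈ 2s for small s. A sketch (single-type stand-in, not the paper's model):

```python
import math

def fixation_probability(s: float, iterations: int = 5000) -> float:
    """Fixation probability of a rare beneficial lineage under a
    single-type branching process with Poisson(1 + s) offspring,
    found by fixed-point iteration of pi = 1 - exp(-(1 + s) * pi)."""
    pi = 0.5
    for _ in range(iterations):
        pi = 1.0 - math.exp(-(1.0 + s) * pi)
    return pi

# For small s this reproduces Haldane's approximation pi ~ 2s.
pi_small = fixation_probability(0.01)
```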

  19. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  20. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Science.gov (United States)

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to

  1. Genefer: Programs for Finding Large Probable Generalized Fermat Primes

    Directory of Open Access Journals (Sweden)

    Iain Arthur Bethune

    2015-11-01

    Genefer is a suite of programs for performing Probable Primality (PRP) tests of Generalised Fermat numbers b^(2^n) + 1 (GFNs) using a Fermat test. Optimised implementations are available for modern CPUs using single instruction, multiple data (SIMD) instructions, as well as for GPUs using CUDA or OpenCL. Genefer has been extensively used by PrimeGrid – a volunteer computing project searching for large prime numbers of various kinds, including GFNs. Genefer's architecture separates the high-level logic such as checkpointing and user interface from the architecture-specific performance-critical parts of the implementation, which are suitable for re-use. Genefer is released under the MIT license. Source and binaries are available from www.assembla.com/spaces/genefer.
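    The test itself is, in principle, one modular exponentiation; a naive sketch using Python's built-in three-argument pow (Genefer's contribution is the optimised large-number arithmetic, not shown here):

```python
def gfn(b: int, n: int) -> int:
    """Generalised Fermat number b^(2^n) + 1."""
    return b ** (2 ** n) + 1

def is_fermat_prp(m: int, base: int = 3) -> bool:
    """Fermat probable-primality test: m passes if base^(m-1) == 1 (mod m).
    Passing means 'probably prime'; some composites also pass."""
    return pow(base, m - 1, m) == 1

# 2^(2^4) + 1 = 65537 is a known Fermat prime, so it passes the test.
result = is_fermat_prp(gfn(2, 4))
```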

  2. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - T_A corresponds to that of a future-not-included theory with a proper inner product for large t - T_A. Hence, the CAT...

  3. Concepts for Future Large Fire Modeling

    Science.gov (United States)

    A. P. Dimitrakopoulos; R. E. Martin

    1987-01-01

    A small number of fires escape initial attack suppression efforts and become large, but their effects are significant and disproportionate. In 1983, of 200,000 wildland fires in the United States, only 4,000 exceeded 100 acres. However, these escaped fires accounted for roughly 95 percent of wildfire-related costs and damages (Pyne, 1984). Thus, future research efforts...

  4. Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake

    Directory of Open Access Journals (Sweden)

    Y. Dzierma

    2010-10-01

    A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ). Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose-time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts for the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future make it possible to distinguish possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
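    For the exponential model in the paper's comparison, the maximum-likelihood fit and the decade-ahead probability reduce to two lines. The repose times below are hypothetical, not the SVZ catalogue, and the paper's Weibull and log-logistic fits are not captured by this memoryless sketch:

```python
import math

def p_eruption_within(repose_years: list, horizon_years: float) -> float:
    """Exponential repose-time model fitted by maximum likelihood
    (rate = 1 / mean repose); returns P(at least one eruption
    within the horizon)."""
    rate = len(repose_years) / sum(repose_years)
    return 1.0 - math.exp(-rate * horizon_years)

# Hypothetical repose series with a 12.5-year mean:
p_decade = p_eruption_within([12.0, 8.0, 20.0, 10.0], 10.0)
```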

  5. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  6. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  7. Safeguarding future large-scale plutonium bulk handling facilities

    International Nuclear Information System (INIS)

    1979-01-01

    The paper reviews the current status, advantages, limitations and probable future developments of material accountancy and of containment and surveillance. The major limitations on the use of material accountancy in applying safeguards to future plants arise from the uncertainty with which flows and inventories can be measured (0.5 to 1.0%), and the necessity to carry out periodical physical inventories to determine whether material has been diverted. The use of plant instrumentation to determine in-process inventories has commenced and so has the development of statistical methods for the evaluations of the data derived from a series of consecutive material balance periods. The limitations of accountancy can be overcome by increased use of containment and surveillance measures which have the advantage that they are independent of the operator's actions. In using these measures it will be necessary to identify the credible diversion paths, build in sufficient redundancy to reduce false alarm rates, develop automatic data recording and alarming

  8. Greenland plays a large role in the gloomy picture painted of probable future sea-level rise

    Science.gov (United States)

    Hanna, Edward

    2012-12-01

    Antarctica by 3000 AD is no more than 94 cm. Antarctica remains relatively insensitive to future sea-level rise given a temperature increase of no more than 5-6 °C (quite a lot) above present levels. Oceanic thermal expansion and, especially, glacier melt seem very much second-order effects, compared with the Greenland sea-level contribution, for the next millennium. As expected, there are considerable differences between the outcomes of the model experiments depending on the time and level at which greenhouse gas emissions are stabilised. I am not quite sure why they 'prefer' the model version which reaches stabilisation at 2000 greenhouse gas levels, since those levels have since been significantly exceeded and show no signs of tailing off yet—quite the reverse. According to the famous Keeling et al dataset from Mauna Loa in Hawaii, atmospheric CO2 levels of about 369 parts per million by volume of the global atmosphere in 2000 have since risen to about 392 ppmv in 2012, and this increase shows no signs of abating. Realistically, it's going to be at least another decade or two (or longer) before we can effectively even begin to stabilise atmospheric greenhouse gas levels, assuming the political will is there: which at the moment it is not. Of course this does not commit us to the other three more extreme experimental results (from greenhouse gas stabilisation at 2100) reported in the study, but we are heading dangerously in that direction. In effect the simulations are sensitivity studies, which may be largely unrealistic but are still useful as a kind of guide to what might happen under future climate change. Naturally, many uncertainties remain, especially concerning how ice-sheet motion ('dynamics') is represented in the models (e.g. the absence of so-called 'higher order physics' including longitudinal (push-pull) stresses which can rapidly transfer peripheral ice velocity perturbations inland (Price et al 2011)).
Furthermore, the atmospheric model used in LOVECLIM is

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are quantum objects, but shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to evaluating the utilities of the considered prospects, real decision makers also assess their attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but also predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  10. Using unplanned fires to help suppressing future large fires in Mediterranean forests.

    Directory of Open Access Journals (Sweden)

    Adrián Regos

Full Text Available Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. 
We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire

  11. Using unplanned fires to help suppressing future large fires in Mediterranean forests.

    Science.gov (United States)

    Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís

    2014-01-01

    Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. 
We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be

  12. Exit probability of the one-dimensional q-voter model: Analytical results and simulations for large networks

    Science.gov (United States)

    Timpanaro, André M.; Prado, Carmen P. C.

    2014-05-01

We discuss the exit probability of the one-dimensional q-voter model and present tools to obtain estimates of this probability, both through simulations in large networks (around 10^7 sites) and analytically in the limit where the network is infinitely large. We argue that the result E(ρ) = ρ^q/[ρ^q + (1-ρ)^q], that was found in three previous works [F. Slanina, K. Sznajd-Weron, and P. Przybyła, Europhys. Lett. 82, 18006 (2008), 10.1209/0295-5075/82/18006; R. Lambiotte and S. Redner, Europhys. Lett. 82, 18007 (2008), 10.1209/0295-5075/82/18007, for the case q = 2; and P. Przybyła, K. Sznajd-Weron, and M. Tabiszewski, Phys. Rev. E 84, 031117 (2011), 10.1103/PhysRevE.84.031117, for q > 2] using small networks (around 10^3 sites), is a good approximation, but there are noticeable deviations that appear even for small systems and that do not disappear when the system size is increased (with the notable exception of the case q = 2). We also show that, under some simple and intuitive hypotheses, the exit probability must obey the inequality ρ^q/[ρ^q + (1-ρ)] ≤ E(ρ) ≤ ρ/[ρ + (1-ρ)^q] in the infinite size limit. We believe this settles in the negative the suggestion made [S. Galam and A. C. R. Martins, Europhys. Lett. 95, 48005 (2011), 10.1209/0295-5075/95/48005] that this result would be a finite size effect, with the exit probability actually being a step function. We also show how the result that the exit probability cannot be a step function can be reconciled with the Galam unified frame, which was also a source of controversy.
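
The closed-form approximation and the bounds quoted in the abstract are straightforward to evaluate directly. The sketch below transcribes both formulas as stated; the particular ρ and q values in any usage are arbitrary illustrations:

```python
def exit_probability(rho, q):
    """Approximate exit probability E(rho) = rho^q / (rho^q + (1-rho)^q)
    for the 1-D q-voter model, as quoted in the abstract."""
    return rho**q / (rho**q + (1.0 - rho)**q)

def exit_probability_bounds(rho, q):
    """Bounds on E(rho) in the infinite-size limit quoted in the abstract:
    rho^q/[rho^q + (1-rho)] <= E(rho) <= rho/[rho + (1-rho)^q]."""
    lower = rho**q / (rho**q + (1.0 - rho))
    upper = rho / (rho + (1.0 - rho)**q)
    return lower, upper
```

By symmetry, E(1/2) = 1/2 for every q, and the approximation always lies between the two bounds.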

  13. The probability and the management of human error

    International Nuclear Information System (INIS)

    Dufey, R.B.; Saull, J.W.

    2004-01-01

Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε) and stochastic occurrences, with a finite minimum rate: λ = 5×10^-5 + (1/ε - 5×10^-5) exp(-3ε). The future failure rate is entirely determined by the experience: thus the past defines the future
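
The quoted learning-curve equation can be evaluated directly. The constant 5×10^-5 and the decay factor exp(-3ε) are taken from the abstract as printed; nothing else is assumed:

```python
import math

LAMBDA_MIN = 5e-5  # finite minimum error rate quoted in the abstract

def human_error_rate(experience):
    """Learning-curve error rate lambda(eps) from the abstract:
    lambda = 5e-5 + (1/eps - 5e-5) * exp(-3*eps), with eps the accumulated
    experience. At small eps the 1/eps term (inexperience) dominates; as
    experience grows the rate decays toward the finite minimum."""
    return LAMBDA_MIN + (1.0 / experience - LAMBDA_MIN) * math.exp(-3.0 * experience)
```

The rate is monotonically decreasing in experience and flattens out at the 5×10^-5 floor, which is the "past defines the future" behaviour described above.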

  14. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  15. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  16. Survey of the Probability of and Factors Affecting Farmers' Participation in Future and Option Markets. Case Study: Cotton Production in Gonbad Kavus City

    Directory of Open Access Journals (Sweden)

    F. sakhi

    2016-03-01

.5 respectively. Multinomial Logit model estimation results for the probability of participation in the future and option markets showed that the variables of level of education, farm ownership, cotton acreage, non-farm income, work experience in agriculture, the index of willingness to use new technologies, the index of risk perception of the cotton market and the risk aversion index are statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture showed negative effects, and the other variables showed positive effects, on the probability of participation in these markets. The results are in line with previous studies. Conclusion: The purpose of the current study was to look at the possibility of farmers' participation in the future and option markets, which were presented as a means to reduce cotton price volatility. The dependent variable for this purpose had four categories: participation in neither market, participation in the future market, participation in the option market, and participation in both future and option markets. A Multinomial Logit regression model was used for data analysis. Results indicated that during the period 2014-2015, in the sample under study, 35% of cotton growers were unwilling to participate in the future and option markets. Farmers' willingness to participate in the future market and the option market was 19% and 21.5%, respectively. Multinomial Logit model estimation results for the probability of participation in the future and option markets showed that the variables of level of education, farm ownership, cotton acreage, non-farm income, work experience in agriculture, the index of willingness to use new technologies, the index of risk perception of the cotton market and the risk aversion index were statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture showed negative effects, and the other variables positive effects, on the probability of participation in these markets. 
The results are in line
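
A minimal sketch of the multinomial logit machinery behind such an analysis is given below. The covariate vector and coefficient values are illustrative placeholders only, not the study's estimated coefficients:

```python
import math

def participation_probabilities(x, betas):
    """Multinomial logit choice probabilities over four participation
    categories (base category: 'neither market', then futures-only,
    options-only, both). x is a covariate vector (e.g. education,
    ownership, acreage); betas holds one coefficient vector per non-base
    category. Names and numbers are hypothetical illustrations."""
    utilities = [0.0] + [sum(b_i * x_i for b_i, x_i in zip(b, x)) for b in betas]
    m = max(utilities)                      # subtract max to stabilize softmax
    expu = [math.exp(u - m) for u in utilities]
    z = sum(expu)
    return [e / z for e in expu]
```

The probabilities are positive and sum to one by construction; the sign of each coefficient determines whether a covariate raises or lowers the corresponding participation probability relative to the base category, matching the sign discussion in the abstract.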

  17. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. High-Energy Physics Strategies and Future Large-Scale Projects

    CERN Document Server

    Zimmermann, F

    2015-01-01

    We sketch the actual European and international strategies and possible future facilities. In the near term the High Energy Physics (HEP) community will fully exploit the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). Post-LHC options include a linear e+e- collider in Japan (ILC) or at CERN (CLIC), as well as circular lepton or hadron colliders in China (CepC/SppC) and Europe (FCC). We conclude with linear and circular acceleration approaches based on crystals, and some perspectives for the far future of accelerator-based particle physics.

  20. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
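
Prelec's function is simple to evaluate. The sketch below uses an illustrative α = 0.65 (the abstract does not fix a value) and exhibits the fixed points w(1/e) = 1/e and w(1) = 1:

```python
import math

def prelec_weight(p, alpha=0.65):
    """Prelec (1998) probability weighting function
    w(p) = exp(-(-ln p)^alpha), 0 < alpha < 1.
    alpha = 0.65 is an arbitrary illustrative choice. Small probabilities
    are overweighted and large ones underweighted; w(1/e) = 1/e and
    w(1) = 1 hold for every alpha."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))
```

Because -ln(1/e) = 1 and 1^α = 1 for any α, the fixed point w(1/e) = 1/e is parameter-independent, which is one of the distinctive features of Prelec's form.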

  1. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
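
The two-step mapping described above (a Gaussian-kernel spatial density per volcanological data set, then a weighted linear combination) can be sketched as follows. The weights and bandwidths are illustrative stand-ins for the expert-elicited values, which the abstract does not list:

```python
import numpy as np

def vent_opening_probability_map(grid, datasets, weights, bandwidths):
    """Sketch of the vent-opening mapping: (i) build a 2-D Gaussian kernel
    density from each data set of past vent locations, (ii) combine the
    densities as a weighted linear sum, (iii) normalize over the grid so
    the result is a discrete probability map. grid and each data set are
    (n, 2) arrays of map coordinates; weights and bandwidths are
    hypothetical inputs, not the elicited values."""
    combined = np.zeros(len(grid))
    for pts, w, h in zip(datasets, weights, bandwidths):
        sq_dist = ((grid[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
        density = np.exp(-sq_dist / (2.0 * h * h)).sum(axis=1)
        density /= 2.0 * np.pi * h * h * len(pts)   # normalized 2-D kernels
        combined += w * density
    return combined / combined.sum()  # probability mass over grid cells
```

With weights summing to one, each data set contributes in proportion to its elicited credibility, which is the role the structured expert judgment plays in the study.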

  2. Future development of large steam turbines

    International Nuclear Information System (INIS)

    Chevance, A.

    1975-01-01

An attempt is made to forecast the future of large steam turbines up to 1985. Three parameters affect the development of large turbines: 1) unit output: an output of 2000 to 2500 MW may be scheduled; 2) steam quality: two steam qualities may be considered: medium-pressure saturated or slightly superheated steam (light water, heavy water), with a small enthalpy drop; and high-pressure, high-temperature steam, with a large enthalpy drop; 3) the quality of the cooling supply: the largest range to be considered might be open-system cooling for sea sites, humid (wet) tower cooling and dry tower cooling; bi-fluid cooling cycles should also be mentioned. From the study of these influencing factors, it appears that for an output of about 2500 MW the constructor should have at his disposal the following: two construction technologies for inlet parts and for high- and intermediate-pressure parts, corresponding to the two steam qualities; and exhaust sections suitable for the different qualities of cooling supply. The two construction technologies with the two steam qualities already exist and involve no major developments, but the exhaust section raises the question of rotational speed [fr]

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving

  4. Ruin probability of the renewal model with risky investment and large claims

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

The ruin probability of the renewal risk model with an investment strategy for a capital market index is investigated in this paper. For claim sizes with a common distribution of extended regular variation, we study the asymptotic behaviour of the ruin probability. As a corollary, we establish a simple asymptotic formula for the ruin probability for the case of Pareto-like claims.

  5. Electric vehicles to support large wind power penetration in future danish power systems

    DEFF Research Database (Denmark)

    Pillai, Jayakrishnan Radhakrishna; Bak-Jensen, Birgitte; Thøgersen, Paul

    2012-01-01

Electric Vehicles (EVs) could play a major role in future intelligent grids to support a large penetration of renewable energy in Denmark, especially electricity production from wind turbines. Future power systems aim to phase out big conventional fossil-fueled generators in favour of a large number...... on low voltage residential networks. A significant number of EVs could be integrated in local distribution grids with the support of intelligent grid and smart charging strategies.

  6. Future changes in large-scale transport and stratosphere-troposphere exchange

    Science.gov (United States)

    Abalos, M.; Randel, W. J.; Kinnison, D. E.; Garcia, R. R.

    2017-12-01

Future changes in large-scale transport are investigated in long-term (1955-2099) simulations of the Community Earth System Model - Whole Atmosphere Community Climate Model (CESM-WACCM) under an RCP6.0 climate change scenario. We examine artificial passive tracers in order to isolate transport changes from future changes in emissions and chemical processes. The model suggests enhanced stratosphere-troposphere exchange (STE) in both directions, with decreasing concentrations of the tropospheric tracer and increasing concentrations of the stratospheric tracer in the troposphere. Changes in the different transport processes are evaluated using the Transformed Eulerian Mean continuity equation, including parameterized convective transport. Dynamical changes associated with the rise of the tropopause height are shown to play a crucial role in future transport trends.

  7. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  8. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change the component failure probability and, as a result, the core damage probability, and that this change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probability changes by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value; when American failure probability data are applied, however, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
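
A toy version of the sensitivity exercise described above can be sketched as follows. The single-sequence model and any example numbers are deliberately minimal illustrations, not the NRC precursor model itself:

```python
def sequence_core_damage_probability(initiating_event_freq, failure_probs):
    """Toy one-sequence PSA: core damage figure taken as the initiating
    event frequency times the product of the mitigating components'
    failure probabilities (a hypothetical stand-in for the simplified
    model described in the abstract)."""
    p = initiating_event_freq
    for q in failure_probs:
        p *= q
    return p

def sweep_component(initiating_event_freq, failure_probs, index, values):
    """Vary one component's failure probability (e.g. across 0..1) and
    return the resulting core damage figures, mimicking the sensitivity
    sweep described in the abstract."""
    results = []
    for v in values:
        probs = list(failure_probs)
        probs[index] = v
        results.append(sequence_core_damage_probability(initiating_event_freq, probs))
    return results
```

In this multiplicative structure the core damage figure scales linearly with each component's failure probability, so components sitting in dominant sequences (like the motor-driven pumps above) produce the largest swings when their base probability grows.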

  9. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  10. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined whether NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed that discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making.

  11. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
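
The abstract does not give the algorithm's exact form, so the following is only an illustrative decay rule: a Bayesian update of the initial forecast probability as time passes with no SEP event, driven by a historical delay-time distribution of the kind plotted in the paper:

```python
def dynamic_sep_probability(p0, delay_cdf, t_hours):
    """Decay an initial SEP event probability p0 as hours pass with no event.
    delay_cdf(t) is the fraction of historical X-ray-peak-to-SEP-onset
    delays shorter than t hours. The update below is a hypothetical
    illustration of the idea, not the paper's published algorithm:
    P(event still to come | none observed by t)."""
    f = delay_cdf(t_hours)
    return p0 * (1.0 - f) / (1.0 - p0 * f)
```

For example, with an initial forecast of 0.5 and a uniform 0-24 h delay distribution, the forecast drops to 1/3 after 12 event-free hours and to 0 once the full historical delay window has passed.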

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints), many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  13. Rainfall and net infiltration probabilities for future climate conditions at Yucca Mountain

    International Nuclear Information System (INIS)

    Long, A.; Childs, S.W.

    1993-01-01

    Performance assessment of repository integrity is a difficult task because it requires predicting the future. This challenge has occupied many scientists who realize that the best assessments are required to maximize the probability of successful repository siting and design. As part of a performance assessment effort directed by EPRI, the authors have used probabilistic methods to assess the magnitude and timing of net infiltration at Yucca Mountain. A previously published mathematical model for net infiltration incorporated a probabilistic treatment of climate, surface hydrologic processes, and a mathematical model of the infiltration process. In this paper, we present the details of the climatological analysis. The precipitation model is event-based, simulating characteristics of modern rainfall near Yucca Mountain and then extending the model to the most likely values for different degrees of pluvial climates. Next, the precipitation event model is fed into a process-based infiltration model that considers spatial variability in parameters relevant to net infiltration at Yucca Mountain. The model predicts that average annual net infiltration at Yucca Mountain will range from a mean of about 1 mm under present climatic conditions to a mean of at least 2.4 mm under full glacial (pluvial) conditions. Considerable variations about these means are expected to occur from year to year.

  14. Path Loss, Shadow Fading, and Line-Of-Sight Probability Models for 5G Urban Macro-Cellular Scenarios

    DEFF Research Database (Denmark)

    Sun, Shu; Thomas, Timothy; Rappaport, Theodore S.

    2015-01-01

    This paper presents key parameters including the line-of-sight (LOS) probability, large-scale path loss, and shadow fading models for the design of future fifth generation (5G) wireless communication systems in urban macro-cellular (UMa) scenarios, using data obtained from propagation measurements in Austin, US, and Aalborg, Denmark, at 2, 10, 18, and 38 GHz. A comparison of different LOS probability models is performed for the Aalborg environment. Both single-slope and dual-slope omnidirectional path loss models are investigated to analyze and contrast their root-mean-square (RMS) errors...
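    One widely used LOS probability family in UMa scenarios is the 3GPP form; the sketch below follows the TR 38.901 UMa expression (low-mounted user terminal assumed), which may differ from the models fitted in this paper:

```python
import math

def uma_los_probability(d2d_m: float) -> float:
    """LOS probability vs. 2D transmitter-receiver distance, in the
    3GPP TR 38.901 urban macro (UMa) form for UE heights <= 13 m."""
    if d2d_m <= 18.0:
        return 1.0  # within 18 m the link is taken as always LOS
    return (18.0 / d2d_m) * (1.0 - math.exp(-d2d_m / 63.0)) + math.exp(-d2d_m / 63.0)
```

    The probability decays monotonically with distance, which is the qualitative behavior measurement-based models must reproduce.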

  15. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  16. Probable damage to tundra biota through sulphur dioxide destruction of lichens

    Energy Technology Data Exchange (ETDEWEB)

    Schofield, E; Hamilton, W L

    1970-01-01

    Lichens, which are important components of many Arctic ecosystems, are extremely sensitive to SO₂ pollution. Recent oilfield development in Arctic North America seems likely to eliminate lichens from large areas because of a unique combination of biological and meteorological factors. Probable future oilfield development in Greenland and the Soviet Union indicates that SO₂ pollution will become an increasingly serious threat to Arctic ecosystems. Therefore, uncontrolled burning of crude oil, fuel oil, and natural gas should be avoided, and adequate sulphur-extraction facilities should be installed.

  17. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  18. Convergence of Transition Probability Matrix in CLVMarkov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of the MCM is its long-run behavior, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, this means finding the limit of the n-th power of the transition matrix as n tends to infinity. This limit is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of the transition probability matrix is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra, namely by diagonalizing the matrix. This method has a higher level of complexity because it requires performing the diagonalization, but it has the advantage of yielding a general form for the n-th power of the transition probability matrix. This form is useful for examining the transition matrix before it reaches stationarity. Example cases are taken from a CLV model based on the MCM, called the CLV-Markov model, and several transition probability matrices are analyzed to find their convergence forms. The result is that the convergence of the transition probability matrix obtained via diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
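    The diagonalization route described above can be sketched with a small, hypothetical transition matrix (not one from the paper):

```python
import numpy as np

# Hypothetical 3-state transition probability matrix (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Diagonalize: P = V diag(w) V^{-1}, hence P^n = V diag(w^n) V^{-1}.
w, V = np.linalg.eig(P)
V_inv = np.linalg.inv(V)

def n_step_matrix(n: int) -> np.ndarray:
    """General form of P^n obtained from the diagonalization."""
    return (V @ np.diag(w**n) @ V_inv).real

# As n grows, every row converges to the stationary distribution.
P_limit = n_step_matrix(100)
print(P_limit[0])
```

    Because one eigenvalue equals 1 and the others lie strictly inside the unit circle, their n-th powers vanish, which is exactly why the rows converge to the stationary form.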

  19. Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios

    Directory of Open Access Journals (Sweden)

    Okjeong Lee

    2016-01-01

    According to the IPCC Fifth Assessment Report, future air temperature and humidity are expected to gradually increase over current levels. In this study, future PMPs are estimated using future dew point temperature projections obtained from RCM data provided by the Korea Meteorological Administration. First, the bias in the future dew point temperature projections, which are provided on a daily basis, is corrected through a quantile-mapping method. Next, using a scale-invariance technique, the 12-hour-duration, 100-year-return-period dew point temperatures, which are essential input data for PMP estimation, are estimated from the bias-corrected future dew point temperature data. After estimating the future PMPs, it can be shown that PMPs are very likely to increase under all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5).
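    The quantile-mapping step can be sketched with a simple empirical variant on toy data; the study's exact implementation and any parametric choices are not reproduced here:

```python
import numpy as np

def quantile_map(values, obs_ref, model_ref):
    """Empirical quantile mapping (illustrative sketch).

    Each model value is replaced by the observed value at the same empirical
    quantile, removing systematic bias in the model's distribution.
    obs_ref and model_ref are samples from a common reference period.
    """
    values = np.asarray(values, dtype=float)
    q = np.searchsorted(np.sort(model_ref), values) / len(model_ref)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_ref, q)

# Toy example: the "model" dew points run 2 degrees too cold.
rng = np.random.default_rng(0)
obs = rng.normal(15.0, 3.0, 1000)     # observed reference dew points
mod = obs - 2.0                       # biased model reference run
projection = np.array([10.0, 13.0, 16.0])
print(quantile_map(projection, obs, mod))  # each value shifted up by ~2
```

    Correcting the whole distribution, rather than only the mean, is what makes quantile mapping suitable for tail-sensitive quantities such as the 100-year dew point temperature.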

  20. Expected Signal Observability at Future Experiments

    CERN Document Server

    Bartsch, Valeria

    2005-01-01

    Several methods to quantify the "significance" of an expected signal at future experiments have been used or suggested in the literature. In this note, comparisons are presented with a method based on the likelihood ratio of the "background hypothesis" and the "signal-plus-background hypothesis". A large number of Monte Carlo experiments are performed to investigate the properties of the various methods and to check whether the probability of a background fluctuation having produced the claimed significance of the discovery is properly described. In addition, the best possible separation between the two hypotheses should be provided; in other words, the discovery potential of a future experiment should be maximal. Finally, a practical method to apply a likelihood-based definition of the significance is suggested in this note. Signal and background contributions are determined from a likelihood fit based on shapes only, and the probability density distributions of the significance thus determined are found to be o...

  1. The 2011 M = 9.0 Tohoku oki earthquake more than doubled the probability of large shocks beneath Tokyo

    Science.gov (United States)

    Toda, Shinji; Stein, Ross S.

    2013-01-01

    The Kanto seismic corridor surrounding Tokyo has hosted four to five M ≥ 7 earthquakes in the past 400 years. Immediately after the Tohoku earthquake, the seismicity rate in the corridor jumped 10-fold, while the rate of normal focal mechanisms dropped by half. The seismicity rate decayed for 6–12 months, after which it steadied at three times the pre-Tohoku rate. The seismicity rate jump and decay to a new rate, as well as the focal mechanism change, can be explained by the static stress imparted by the Tohoku rupture and postseismic creep to Kanto faults. We therefore fit the seismicity observations to a rate/state Coulomb model, which we use to forecast the time-dependent probability of large earthquakes in the Kanto seismic corridor. We estimate a 17% probability of a M ≥ 7.0 shock over the 5-year prospective period 11 March 2013 to 10 March 2018, two-and-a-half times the probability had the Tohoku earthquake not struck.

  2. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  3. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated by a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  4. Assessing changes in failure probability of dams in a changing climate

    Science.gov (United States)

    Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.

    2017-12-01

    Dams are crucial infrastructure and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced a series of flooding events that terminated a 5-year drought and led to incidents such as the structural failure of Oroville Dam's spillway. Because of the large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for the design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationarity assumption. Yet changes in climate are anticipated to result in changes in the statistics of river flow (e.g., more extreme floods), possibly increasing the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in the failure probability of dams in a warming climate.
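    The return-period analysis can be illustrated with a Gumbel (EV1) fit to synthetic annual maxima; the estimator, parameter values, and data below are illustrative, not the study's:

```python
import numpy as np

EULER_GAMMA = 0.5772156649

def gumbel_fit(annual_maxima):
    """Method-of-moments Gumbel (EV1) fit: a simple stand-in for the
    extreme value analysis described in the abstract."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = np.sqrt(6.0) * x.std(ddof=1) / np.pi
    loc = x.mean() - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """Flow exceeded on average once every T years under the fitted Gumbel."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

def return_period(loc, scale, level):
    """Average recurrence interval (years) of a given flow level."""
    p_exceed = 1.0 - np.exp(-np.exp(-(level - loc) / scale))
    return 1.0 / p_exceed

# Hypothetical annual-maximum discharges (m^3/s): a current climate and a
# future climate with heavier extremes (values are illustrative only).
rng = np.random.default_rng(1)
current = 500.0 + 80.0 * rng.gumbel(size=60)
future = 600.0 + 160.0 * rng.gumbel(size=60)

loc_c, sc_c = gumbel_fit(current)
loc_f, sc_f = gumbel_fit(future)
q100 = return_level(loc_c, sc_c, 100.0)   # the current 100-year flood
print(return_period(loc_f, sc_f, q100))   # how often it recurs in the future
```

    A flood that is a 100-year event in the current climate recurs far more often under the heavier future distribution, which is exactly the kind of shift that raises dam failure probability.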

  5. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions. Branching Processes. Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  6. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation, or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation, and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, and indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  7. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  8. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    Science.gov (United States)

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to a small chance of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of the POFIG method via analysis of GWAS associations with Crohn's disease.
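    The FPRP calculation mentioned above can be written in a few lines (a textbook-style sketch of the Wacholder-style definition, not the proposed POFIG method):

```python
def fprp(alpha: float, prior: float, power: float) -> float:
    """False Positive Report Probability (in the spirit of Wacholder et al.):
    the chance a 'significant' result is spurious, given the significance
    level alpha (the observed P-value is often plugged in), the prior
    probability of a true association, and the study's power."""
    false_positives = alpha * (1.0 - prior)
    true_positives = power * prior
    return false_positives / (false_positives + true_positives)

# With a 1-in-10,000 prior, even P = 1e-4 leaves the finding more likely
# spurious than genuine:
print(fprp(1e-4, prior=1e-4, power=0.8))  # ≈ 0.56
```

    This illustrates the abstract's point: under the tiny priors typical of genome-wide scans, a "small" P-value alone is weak evidence that a finding is genuine.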

  9. The future of large old trees in urban landscapes.

    Science.gov (United States)

    Le Roux, Darren S; Ikin, Karen; Lindenmayer, David B; Manning, Adrian D; Gibbons, Philip

    2014-01-01

    Large old trees are disproportionate providers of structural elements (e.g. hollows, coarse woody debris), which are crucial habitat resources for many species. The decline of large old trees in modified landscapes is of global conservation concern. Once large old trees are removed, they are difficult to replace in the short term due to typically prolonged time periods needed for trees to mature (i.e. centuries). Few studies have investigated the decline of large old trees in urban landscapes. Using a simulation model, we predicted the future availability of native hollow-bearing trees (a surrogate for large old trees) in an expanding city in southeastern Australia. In urban greenspace, we predicted that the number of hollow-bearing trees is likely to decline by 87% over 300 years under existing management practices. Under a worst case scenario, hollow-bearing trees may be completely lost within 115 years. Conversely, we predicted that the number of hollow-bearing trees will likely remain stable in semi-natural nature reserves. Sensitivity analysis revealed that the number of hollow-bearing trees perpetuated in urban greenspace over the long term is most sensitive to the: (1) maximum standing life of trees; (2) number of regenerating seedlings ha−1; and (3) rate of hollow formation. We tested the efficacy of alternative urban management strategies and found that the only way to arrest the decline of large old trees requires a collective management strategy that ensures: (1) trees remain standing for at least 40% longer than currently tolerated lifespans; (2) the number of seedlings established is increased by at least 60%; and (3) the formation of habitat structures provided by large old trees is accelerated by at least 30% (e.g. artificial structures) to compensate for short-term deficits in habitat resources. Immediate implementation of these recommendations is needed to avert long term risk to urban biodiversity.

  10. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    Science.gov (United States)

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause increases, decreases, or extirpations of species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century, while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.

  11. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  12. Actual growth and probable future of the worldwide nuclear industry

    International Nuclear Information System (INIS)

    Bupp, I.C.

    1981-01-01

    Worldwide nuclear-power-reactor manufacturing capacity will exceed worldwide demand by a factor of two or more during the 1980s. Only in France and the Soviet bloc countries is it likely that the ambitious nuclear-power programs formulated in the mid-1970s will be implemented. In all other developed countries and in most developing countries, further delays and cancellations of previously announced programs are all but certain. The stalemate over the future of nuclear power is particularly deep in America. Administrative and personnel problems in the Nuclear Regulatory Commission, slow progress on radioactive waste disposal by the Department of Energy, severe financial problems for most electric utilities, and drastic reductions in the rate of electricity demand growth combine to make continuation of the five-year-old moratorium on reactor orders inevitable. Many of the ninety plants under construction may never operate, and some of the seventy in operation may shut down before the end of their economic life. Contrary to widespread belief, further oil price increases may not speed up world-wide reactor sales. It is possible that the world is heading for a worst of all possible outcomes: a large number of small nuclear power programs that do little to meet real energy needs but substantially complicate the problem of nuclear weapons proliferation. 24 references, 4 tables

  13. Planning of technical flood retention measures in large river basins under consideration of imprecise probabilities of multivariate hydrological loads

    Directory of Open Access Journals (Sweden)

    D. Nijssen

    2009-08-01

    As a result of the severe floods in Europe at the turn of the millennium, the ongoing shift from safety oriented flood control towards flood risk management was accelerated. With regard to technical flood control measures it became evident that the effectiveness of flood control measures depends on many different factors, which cannot be considered with single events used as design floods for planning. The multivariate characteristics of the hydrological loads have to be considered to evaluate complex flood control measures. The effectiveness of spatially distributed flood control systems differs for varying flood events. Event-based characteristics such as the spatial distribution of precipitation, the shape and volume of the resulting flood waves or the interactions of flood waves with the technical elements, e.g. reservoirs and flood polders, result in varying efficiency of these systems. Considering these aspects a flood control system should be evaluated with a broad range of hydrological loads to get a realistic assessment of its performance under different conditions. The consideration of this variety in flood control planning design was one particular aim of this study. Hydrological loads were described by multiple criteria. A statistical characterization of these criteria is difficult, since the data base is often not sufficient to analyze the variety of possible events. Hydrological simulations were used to solve this problem. Here a deterministic-stochastic flood generator was developed and applied to produce a large quantity of flood events which can be used as scenarios of possible hydrological loads. However, these simulations imply many uncertainties. The results will be biased by the basic assumptions of the modeling tools. In flood control planning probabilities are applied to characterize uncertainties. The probabilities of the simulated flood scenarios differ from probabilities which would be derived from long time series

  14. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  15. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  16. Evaluation of Presumed Probability-Density-Function Models in Non-Premixed Flames by using Large Eddy Simulation

    International Nuclear Information System (INIS)

    Cao Hong-Jun; Zhang Hui-Qiang; Lin Wen-Yi

    2012-01-01

Four kinds of presumed probability-density-function (PDF) models for non-premixed turbulent combustion are evaluated in flames with various stoichiometric mixture fractions by using large eddy simulation (LES). The LES code is validated against the experimental data of a classical turbulent jet flame (Sandia flame D). The mean and rms temperatures obtained by the presumed PDF models are compared with the LES results. The β-function model achieves a good prediction for different flames. The rms temperature predicted by the double-δ function model is very small and unphysical in the vicinity of the maximum mean temperature. The clip-Gaussian model and the multi-δ function model give worse predictions on the extremely fuel-rich or fuel-lean sides due to the clipping at the boundaries of the mixture fraction space. The results also show that the overall prediction performance of presumed PDF models is better at moderate stoichiometric mixture fractions than at very small or very large ones. (fundamental areas of phenomenology (including applications))
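The β-function model referenced above presumes a beta distribution for the mixture fraction, conventionally parameterized by matching its first two moments. A minimal sketch of that moment matching (the mean and variance values below are hypothetical illustrations, not taken from the paper):

```python
import math

def beta_pdf_params(z_mean, z_var):
    """Moment-matched shape parameters (a, b) of a presumed beta-PDF
    for the mixture fraction. Requires 0 < z_var < z_mean * (1 - z_mean)."""
    gamma = z_mean * (1.0 - z_mean) / z_var - 1.0
    return z_mean * gamma, (1.0 - z_mean) * gamma

def beta_pdf(z, a, b):
    """Beta density on (0, 1), normalized with the gamma function."""
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * z ** (a - 1.0) * (1.0 - z) ** (b - 1.0)

# Hypothetical Favre mean and variance of the mixture fraction
a, b = beta_pdf_params(0.3, 0.05)
```

Recovering the prescribed mean as a / (a + b) is a quick consistency check on the moment match.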

  17. Using crowdsourcing to compare temporal, social temporal, and probability discounting among obese and non-obese individuals.

    Science.gov (United States)

    Bickel, Warren K; George Wilson, A; Franck, Christopher T; Terry Mueller, E; Jarmolowicz, David P; Koffarnus, Mikhail N; Fede, Samantha J

    2014-04-01

    Previous research comparing obese and non-obese samples on the delayed discounting procedure has produced mixed results. The aim of the current study was to clarify these discrepant findings by comparing a variety of temporal discounting measures in a large sample of internet users (n=1163) obtained from a crowdsourcing service, Amazon Mechanical Turk (AMT). Measures of temporal, social-temporal (a combination of standard and social temporal), and probability discounting were obtained. Significant differences were obtained on all discounting measures except probability discounting, but the obtained effect sizes were small. These data suggest that larger-N studies will be more likely to detect differences between obese and non-obese samples, and may afford the opportunity, in future studies, to decompose a large obese sample into different subgroups to examine the effect of other relevant measures, such as the reinforcing value of food, on discounting. Copyright © 2013 Elsevier Ltd. All rights reserved.
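The temporal and probability discounting measures referenced above are commonly modeled with hyperbolic forms (Mazur-style for delay, odds-against for probability). A hedged sketch of those standard models, using hypothetical parameter values rather than the study's fitted estimates:

```python
def hyperbolic_discount(amount, delay, k):
    """Hyperbolic temporal discounting: V = A / (1 + k*D),
    where k is the individual's discount rate."""
    return amount / (1.0 + k * delay)

def probability_discount(amount, p, h):
    """Probability discounting on the odds against winning,
    theta = (1 - p) / p: V = A / (1 + h*theta)."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# Hypothetical parameters: $100 delayed 30 days with k = 0.05,
# and a 50% chance of $100 with h = 1.0
v_delayed = hyperbolic_discount(100.0, 30.0, 0.05)
v_risky = probability_discount(100.0, 0.5, 1.0)
```

Larger k (or h) means steeper discounting; group comparisons like the one in the abstract are typically made on these fitted parameters or on area-under-the-curve measures.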

  18. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Science.gov (United States)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10 nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.

  19. A future large-aperture UVOIR space observatory: reference designs

    Science.gov (United States)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-09-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  20. Observing trans-Planckian ripples in the primordial power spectrum with future large scale structure probes

    DEFF Research Database (Denmark)

    Hamann, Jan; Hannestad, Steen; Sloth, Martin Snoager

    2008-01-01

We revisit the issue of ripples in the primordial power spectra caused by trans-Planckian physics, and the potential for their detection by future cosmological probes. We find that for reasonably large values of the first slow-roll parameter epsilon (> 0.001), a positive detection of trans-Planckian ripples can be made even if the amplitude is as low as 10^-4. Data from the Large Synoptic Survey Telescope (LSST) and the proposed future 21 cm survey with the Fast Fourier Transform Telescope (FFTT) will be particularly useful in this regard. If the scale of inflation is close to its present upper bound...

  1. Knotting probability of self-avoiding polygons under a topological constraint

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-01

We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.

  2. Knotting probability of self-avoiding polygons under a topological constraint.

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-07

We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.

  3. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.

  4. Status and Future Developments in Large Accelerator Control Systems

    International Nuclear Information System (INIS)

    Karen S. White

    2006-01-01

    Over the years, accelerator control systems have evolved from small hardwired systems to complex computer controlled systems with many types of graphical user interfaces and electronic data processing. Today's control systems often include multiple software layers, hundreds of distributed processors, and hundreds of thousands of lines of code. While it is clear that the next generation of accelerators will require much bigger control systems, they will also need better systems. Advances in technology will be needed to ensure the network bandwidth and CPU power can provide reasonable update rates and support the requisite timing systems. Beyond the scaling problem, next generation systems face additional challenges due to growing cyber security threats and the likelihood that some degree of remote development and operation will be required. With a large number of components, the need for high reliability increases and commercial solutions can play a key role towards this goal. Future control systems will operate more complex machines and need to present a well integrated, interoperable set of tools with a high degree of automation. Consistency of data presentation and exception handling will contribute to efficient operations. From the development perspective, engineers will need to provide integrated data management in the beginning of the project and build adaptive software components around a central data repository. This will make the system maintainable and ensure consistency throughout the inevitable changes during the machine lifetime. Additionally, such a large project will require professional project management and disciplined use of well-defined engineering processes. Distributed project teams will make the use of standards, formal requirements and design and configuration control vital. Success in building the control system of the future may hinge on how well we integrate commercial components and learn from best practices used in other industries

  5. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
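The encounter probability discussed above, and the Rayleigh-based characteristic maximum individual wave height, can be sketched with the textbook formulas (a hedged illustration with hypothetical inputs, not the paper's calibrated procedure):

```python
import math

def encounter_probability(return_period, lifetime):
    """Probability that the T-year significant wave height is exceeded
    at least once during a structure lifetime of L years (independent years)."""
    return 1.0 - (1.0 - 1.0 / return_period) ** lifetime

def expected_max_wave(hs, n_waves):
    """Characteristic maximum individual wave height among n_waves
    Rayleigh-distributed waves with significant wave height hs."""
    return hs * math.sqrt(math.log(n_waves) / 2.0)

# Hypothetical design case: 100-yr significant wave height, 50-yr lifetime,
# and a design storm with Hs = 8 m containing about 1000 waves
p_encounter = encounter_probability(100, 50)
h_max = expected_max_wave(8.0, 1000)
```

Note how a 100-yr event has roughly a 40% chance of occurring within a 50-yr lifetime, which is exactly the distinction between return period and encounter probability that the abstract draws.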

  6. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwagi, H [Institute for Molecular Science, Okazaki, Aichi (Japan)

    1982-06-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience.

  7. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    International Nuclear Information System (INIS)

    Kashiwagi, H.

    1982-01-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience. (orig.)

  8. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations to synthesize the superheavy element Z = 124. Hence, we have identified the most probable projectile-target combination to synthesize the superheavy element Z = 124. To synthesize the superheavy element Z = 124, the most probable projectile-target combinations are Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may be a guide for the future experiments in the synthesis of superheavy nuclei Z = 124. (orig.)

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
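The core effect described above, that setting a threshold from plug-in parameter estimates inflates the realized failure frequency above its nominal level, can be illustrated with a small Monte Carlo sketch (log-normal losses with hypothetical parameters; this demonstrates the phenomenon qualitatively and is not the paper's exact calculation):

```python
import math
import random
import statistics

# Log-normal losses: log-losses are N(0, 1) (hypothetical true parameters).
# The decision-maker sets the threshold at the estimated 95% quantile from
# a small sample, then we measure how often a fresh loss exceeds it.
random.seed(0)
eps = 0.05                      # nominal failure probability
n = 20                          # estimation sample size
z95 = 1.6448536269514722        # standard normal 95% quantile
trials = 20000

failures = 0
for _ in range(trials):
    log_losses = [random.gauss(0.0, 1.0) for _ in range(n)]
    mu_hat = statistics.fmean(log_losses)
    sigma_hat = statistics.stdev(log_losses)
    threshold = mu_hat + z95 * sigma_hat      # naive plug-in quantile
    failures += random.gauss(0.0, 1.0) > threshold

realized = failures / trials    # exceeds eps due to parameter error
```

The realized frequency lands around 6% rather than the nominal 5%, which is the inflation the abstract attributes to parameter uncertainty; the paper's approach (1) would compensate by tightening the nominal level as a function of the sample size n.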

  10. Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables

    Directory of Open Access Journals (Sweden)

    Andrew Rosalsky

    2004-12-01

Full Text Available Let {X, X_n; n ≥ 1} be a sequence of real-valued i.i.d. random variables and let S_n = ∑_{i=1}^n X_i, n ≥ 1. In this paper, we study the probabilities of large deviations of the form P(S_n > t n^{1/p}), P(S_n < -t n^{1/p}), and P(|S_n| > t n^{1/p}), where t > 0 and 0 < p < 2. Under a condition relating the tail of X to a norming function ϕ, roughly of the form lim_{x→∞} x^{1/p}/ϕ(x) = 1, it is shown that for every t > 0, limsup_{n→∞} P(|S_n| > t n^{1/p})/(n ϕ(n)) = t^{-pα}.

  11. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  12. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test conditions for impact is approximately 10^-9 per mile.
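Given a per-mile rate like the one estimated above, the expected number of accidents for a given shipment mileage follows directly from a Poisson model of accidents along the route; a minimal sketch (the annual mileage figure is hypothetical, not from the record):

```python
import math

rate_per_mile = 5e-7      # estimated spent fuel accident rate (from the review)
annual_miles = 2.0e5      # hypothetical annual shipment mileage, for illustration

# Poisson mean number of accidents over the year's mileage
expected_accidents = rate_per_mile * annual_miles
# Probability of at least one accident in that year under the Poisson model
p_at_least_one = 1.0 - math.exp(-expected_accidents)
```

With these illustrative numbers the expected count is 0.1 accidents per year, i.e. roughly one accident per decade of shipping at that volume.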

  13. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  14. The future of the Large Hadron Collider and CERN.

    Science.gov (United States)

    Heuer, Rolf-Dieter

    2012-02-28

    This paper presents the Large Hadron Collider (LHC) and its current scientific programme and outlines options for high-energy colliders at the energy frontier for the years to come. The immediate plans include the exploitation of the LHC at its design luminosity and energy, as well as upgrades to the LHC and its injectors. This may be followed by a linear electron-positron collider, based on the technology being developed by the Compact Linear Collider and the International Linear Collider collaborations, or by a high-energy electron-proton machine. This contribution describes the past, present and future directions, all of which have a unique value to add to experimental particle physics, and concludes by outlining key messages for the way forward.

  15. Study for Safeguards Challenges to the Most Probably First Indonesian Future Power Plant of the Pebble Bed Modular Reactor

    International Nuclear Information System (INIS)

    Susilowati, E.

    2015-01-01

In the near future Indonesia, the fourth most populous country, plans to build a small-size power plant, most probably a Pebble Bed Modular Reactor (PBMR). This first nuclear power plant (NPP) is aimed at providing a clear picture to society of the performance and safety of nuclear power plant operation. The selection of the PBMR is based on several factors, including the combination of the small size of the reactor and the type of fuel, which allows the use of passive safety systems, resulting in essential advantages in nuclear plant design and less dependence on plant operators for safety. From the safeguards perspective, this type of reactor is also quite different from previous light water reactor (LWR) designs. The large number of small fuel elements present in the reactor, produced without individual serial numbers, combined with on-line refueling as in the CANDU reactor, poses a new challenge for the safeguards approach to this type of reactor. This paper discusses the set of safeguards measures that have to be prepared by the facility operator to support successful international nuclear material and facility verification, including elements of design relevant to safeguards that need to be settled in consultation with the regulatory body, the supplier or designer, and the Agency/IAEA, such as nuclear material balance areas and key measurement points; possible diversion scenarios and safeguards strategy; and design features relevant to the IAEA equipment that has to be installed at the reactor facility. It is expected that the results of this discussion will facilitate and support the Agency in developing the safeguards measures that may be applied to the construction and operation of the proposed first Indonesian power plant of the PBMR type. (author)

  16. Assessing the present and future probability of Hurricane Harvey’s rainfall

    OpenAIRE

    Emanuel, Kerry

    2017-01-01

Significance Natural disasters such as the recent Hurricanes Harvey, Irma, and Maria highlight the need for quantitative estimates of the risk of such disasters. Statistically based risk assessment suffers from short records of often poor quality, and in the case of meteorological hazards, from the fact that the underlying climate is changing. This study shows how a recently developed physics-based risk assessment method can be applied to assessing the probabilities of extreme hurricane rainfall.

  17. Safety related requirements on future nuclear power plants

    International Nuclear Information System (INIS)

    Niehaus, F.

    1991-01-01

Nuclear power has the potential to contribute significantly to the future energy supply. However, this requires continuous improvements in nuclear safety. Technological advancements and the implementation of safety culture will achieve, for future reactors of the present generation, a probability of core-melt of less than 10^-5 per year, and of less than 10^-6 per year for large releases of radioactive materials. There are older reactors which do not comply with present safety thinking. The paper reviews findings of a recent design review of WWER 440/230 plants. Advanced evolutionary designs might be capable of reducing the probability of significant off-site releases to less than 10^-7 per year. For such reactors there are inherent limitations to increasing safety further, due to the human element, the complexity of design and the capability of the containment function. Therefore, revolutionary designs are being explored with the aim of eliminating the potential for off-site releases. In this context it seems advisable to explore concepts where the ultimate safety barrier is the fuel itself. (orig.)

  18. A large set of potential past, present and future hydro-meteorological time series for the UK

    Science.gov (United States)

    Guillod, Benoit P.; Jones, Richard G.; Dadson, Simon J.; Coxon, Gemma; Bussi, Gianbattista; Freer, James; Kay, Alison L.; Massey, Neil R.; Sparrow, Sarah N.; Wallom, David C. H.; Allen, Myles R.; Hall, Jim W.

    2018-01-01

Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900-2006), (ii) five near-future scenarios (2020-2049) and (iii) five far-future scenarios (2070-2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period (> 3 months) and shorter-duration high precipitation (1-30 days), the time series generally represent past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09) but larger in magnitude. Both drought and high-precipitation events are projected to increase in frequency and intensity in most regions.

  19. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    Science.gov (United States)

    Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.
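The qualitative effect reported above, that averaging over sea-level-rise uncertainty shortens the return period of a fixed flood height, can be reproduced with a small Monte Carlo sketch. All numbers below (the Gumbel surge parameters and the SLR distribution) are hypothetical illustrations, not the paper's calibrated values:

```python
import math
import random

def gumbel_return_level(mu, beta, T):
    """Height exceeded with annual probability 1/T under a Gumbel
    model for annual-maximum storm surge."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def gumbel_exceedance(x, mu, beta):
    """Annual probability that the surge exceeds height x."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical surge climatology: Gumbel(mu = 1.0 m, beta = 0.15 m);
# hypothetical sea-level rise by 2100 ~ N(0.8 m, 0.5 m).
mu, beta = 1.0, 0.15
slr_mean, slr_sd = 0.8, 0.5

# 100-yr flood height using only the mean SLR projection
h_fixed = gumbel_return_level(mu, beta, 100) + slr_mean

# Average the exceedance probability of that height over SLR uncertainty
random.seed(1)
draws = 50000
p = sum(gumbel_exceedance(h_fixed - random.gauss(slr_mean, slr_sd), mu, beta)
        for _ in range(draws)) / draws
return_period = 1.0 / p   # far shorter than the nominal 100 yr
```

Because exceedance probability is convex in the flood height here, averaging over SLR scenarios raises the mean probability well above the probability at the mean scenario, which is the mechanism behind the paper's 100-yr-to-roughly-7-yr shift.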

  20. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    Directory of Open Access Journals (Sweden)

    Kelsey L Ruckert

Full Text Available Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.

  1. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended...

  2. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kenison, LaVesta [URS, Pittsburgh, PA (United States); Flanigan, Thomas [URS, Pittsburgh, PA (United States); Hagerty, Gregg [URS, Pittsburgh, PA (United States); Gorrie, James [Air Liquide, Kennesaw, GA (United States); Leclerc, Mathieu [Air Liquide, Kennesaw, GA (United States); Lockwood, Frederick [Air Liquide, Kennesaw, GA (United States); Falla, Lyle [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Macinnis, Jim [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Fedak, Mathew [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Yakle, Jeff [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Williford, Mark [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States); Wood, Paul [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States)

    2016-04-01

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission, an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant; thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives inclusive of front-end-engineering and design, and advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately the project did not proceed to construction due to insufficient time to complete necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to closeout its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will

  3. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are increasingly applied to data processing, the application failure probability has become an important indicator of application quality and an important consideration for operators. This paper presents a task-based method for analyzing the application failure probability in optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the different requirements of different clients can be satisfied. When an application expressed as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm guarantees the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
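
A task-based estimate of this kind can be sketched as follows, assuming independent task failures and one independent backup copy per replicated task. The task failure probabilities are hypothetical, not taken from the paper.

```python
def app_failure_prob(task_probs, replicated=()):
    """Application fails if any task fails; a replicated task fails only if
    both its primary and backup copies fail (independence assumed)."""
    ok = 1.0
    for i, p in enumerate(task_probs):
        p_eff = p * p if i in replicated else p  # backup halves the exponent, not the risk
        ok *= 1.0 - p_eff
    return 1.0 - ok

tasks = [0.02, 0.05, 0.01, 0.03]  # hypothetical per-task failure probabilities
print(f"no backup:         {app_failure_prob(tasks):.4f}")
print(f"task 1 replicated: {app_failure_prob(tasks, replicated={1}):.4f}")
```

Replicating the least reliable task gives the largest reduction, which is the kind of trade-off a scheduler like the paper's MDSA would weigh against the extra resource usage.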

  4. Production of 147Eu for gamma-ray emission probability measurement

    International Nuclear Information System (INIS)

    Katoh, Keiji; Marnada, Nada; Miyahara, Hiroshi

    2002-01-01

    Gamma-ray emission probability is one of the most important decay parameters of a radionuclide, and many researchers are working to improve its accuracy. The γ-ray emission probabilities for neutron-rich nuclides are being improved little by little, but those for proton-rich nuclides are still insufficiently known. Europium-147, which decays by electron capture or β+-particle emission, is a proton-rich nuclide, and the γ-ray emission probabilities evaluated by Mateosian and Peker have large uncertainties; they referred to only one report concerning γ-ray emission probabilities. Our final purpose is to determine precise γ-ray emission probabilities of 147Eu from disintegration rates and γ-ray intensities by using a 4πβ-γ coincidence apparatus. Impurity nuclides largely affect the determination of the disintegration rate; therefore, a highly pure 147Eu source is required. This short note describes the most suitable energy for 147Eu production through the 147Sm(p, n) reaction. (author)

  5. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
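
The inverted-S weighting the authors report can be illustrated with the one-parameter Tversky-Kahneman weighting function, a common parametric choice; the functional form and the γ value below are illustrative assumptions, not the paper's fitted model.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function; inverted-S for gamma < 1."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"p = {p:4.2f}  w(p) = {tk_weight(p):.3f}")
```

For γ < 1 the function crosses the diagonal once: small probabilities are overweighted (w(p) > p) and large ones underweighted (w(p) < p), exactly the pattern the monkeys' choices exhibited.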

  6. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    Science.gov (United States)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    Over the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damages to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects.
Then, it discusses some major challenges in future water planning

  7. Scenarios for future agriculture in Finland: a Delphi study among agri-food sector stakeholders

    Directory of Open Access Journals (Sweden)

    P. RIKKONEN

    2008-12-01

    This article presents alternative scenarios for future agriculture in Finland up to 2025. These scenarios are the results of a large Delphi study carried out among Finnish agri-food sector stakeholders. The Delphi panel members gave their future views on desirable and probable futures. From these two dimensions, three scenarios were elaborated through the future images, the subjective future paths, and the importance analysis. The scenarios represent a technology-optimistic "day-dream agriculture", a probable future as "industrialised agriculture", and an undesirable future path as "drifting agriculture". Two mini-scenarios are also presented. They are based on a discontinuity event, an unexpected impact of climate change, and an analogy event, an ecological breakdown due to expansive animal disease epidemics. In both mini-scenarios, the directions of the storylines are dramatically changed. The scenarios support strategic planning by introducing not just one forecast but alternative outcomes as a basis for future strategy and decisions. In this study the scenarios were constructed to address both the opportunities of a desired vision and the threats of an undesirable future in the agricultural sector. These results bring to the table a Finnish agri-food expert community view of the future directions of relevant key issues on the agricultural policy agenda.

  8. Failure frequencies and probabilities applicable to BWR and PWR piping

    International Nuclear Information System (INIS)

    Bush, S.H.; Chockie, A.D.

    1996-03-01

    This report deals with failure probabilities and failure frequencies of nuclear plant piping and the failure frequencies of flanges and bellows. Piping failure probabilities are derived from Piping Reliability Analysis Including Seismic Events (PRAISE) computer code calculations based on fatigue and intergranular stress corrosion as failure mechanisms. Values for both failure probabilities and failure frequencies are cited from several sources to yield a better evaluation of the spread in mean and median values as well as the widths of the uncertainty bands. A general conclusion is that the numbers from WASH-1400 often used in PRAs are unduly conservative. Failure frequencies for both leaks and large breaks tend to be higher than would be calculated using the failure probabilities, primarily because the frequencies are based on a relatively small number of operating years. Also, failure probabilities are substantially lower because of the probability distributions used in PRAISE calculations. A general conclusion is that large LOCA probability values calculated using PRAISE will be quite small, on the order of less than 1E-8 per year (<1E-8/year). The values in this report should be recognized as having inherent limitations and should be considered as estimates and not absolute values. 24 refs

  9. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

    EU energy strategy is strongly focused on the large scale integration of renewable energy sources. The most dominant part here is taken by variable sources - wind power plants. Grid integration of intermittent sources along with keeping the system stable and secure is one of the biggest challenges for the TSOs. This part is often neglected by the energy policy makers, so this paper deals with expected future conditions for secure power system operation with large scale wind integration. It gives an overview of expected wind integration development in EU, as well as expected P/f regulation and control needs. The paper is concluded with several recommendations. (author).

  10. Large acceptance spectrometers for invariant mass spectroscopy of exotic nuclei and future developments

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, T.; Kondo, Y.

    2016-06-01

    Large acceptance spectrometers at in-flight RI separators have played significant roles in investigating the structure of exotic nuclei. Such spectrometers are in particular useful for probing unbound states of exotic nuclei, using invariant mass spectroscopy with reactions at intermediate and high energies. We discuss here the key characteristic features of such spectrometers, by introducing the recently commissioned SAMURAI facility at the RIBF, RIKEN. We also investigate the issue of cross talk in the detection of multiple neutrons, which has become crucial for exploring further unbound states and nuclei beyond the neutron drip line. Finally we discuss future perspectives for large acceptance spectrometers at the new-generation RI-beam facilities.
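
The kinematics behind invariant mass spectroscopy can be sketched in a few lines: the mass of an unbound state is reconstructed from the measured four-momenta of its decay products, and the relative (decay) energy is the excess over the summed rest masses. The fragment and neutron four-vectors below are invented for illustration (units MeV, c = 1), not SAMURAI data.

```python
import math

def four_vector(m, px, py, pz):
    """On-shell (E, px, py, pz) four-vector for a particle of rest mass m."""
    return (math.sqrt(m * m + px * px + py * py + pz * pz), px, py, pz)

def invariant_mass(parts):
    """Invariant mass of a set of particles from their four-vectors."""
    E = sum(p[0] for p in parts)
    px = sum(p[1] for p in parts)
    py = sum(p[2] for p in parts)
    pz = sum(p[3] for p in parts)
    return math.sqrt(E * E - px * px - py * py - pz * pz)

m_frag, m_n = 20000.0, 939.565  # hypothetical fragment mass; neutron mass (MeV)
frag = four_vector(m_frag, 0.0, 0.0, 5000.0)   # beam-velocity fragment
neut = four_vector(m_n, 30.0, 0.0, 250.0)      # decay neutron, slightly deflected

m_inv = invariant_mass([frag, neut])
e_rel = m_inv - (m_frag + m_n)  # relative energy of the unbound state
print(f"invariant mass = {m_inv:.1f} MeV, relative energy = {e_rel:.2f} MeV")
```

Because the relative energy is a small difference of large numbers, its resolution hinges on precise momentum measurement of every decay product, which is why the multi-neutron cross-talk issue mentioned above matters so much.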

  11. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    Energy Technology Data Exchange (ETDEWEB)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui; Pinson, Pierre

    2017-07-01

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
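
The decision-dependent idea can be illustrated with a deliberately tiny two-stage example in which the scenario probabilities are a function of the first-stage decision. All costs and probabilities below are invented; the paper solves a far larger multistage mixed-integer model.

```python
# Toy decision-dependent problem: investing in wind (x = 1) shifts probability
# mass toward the favorable second-stage scenario.
scenarios = {"low_cost": 100.0, "high_cost": 180.0}  # second-stage costs

def probs(x):
    """Scenario distribution depends on the first-stage decision x in {0, 1}."""
    p_low = 0.3 + 0.4 * x
    return {"low_cost": p_low, "high_cost": 1.0 - p_low}

def expected_total(x, invest_cost=30.0):
    """First-stage investment cost plus expected second-stage cost."""
    p = probs(x)
    return invest_cost * x + sum(p[s] * c for s, c in scenarios.items())

best = min((0, 1), key=expected_total)
print({x: expected_total(x) for x in (0, 1)}, "-> best x =", best)
```

With a decision-independent distribution the investment would look unattractive here; it pays off only because the model credits it with changing the distribution itself, which is the nonlinearity the paper's quasi-exact reformulation handles at scale.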

  12. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a 'really big one' will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and easy-to-display spreadsheets. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
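
The two modeling choices described above (time-independent versus time-dependent recurrence) can be put side by side in a short script that mirrors the spreadsheet exercise. The Gaussian renewal form and all numbers are illustrative assumptions, not a published Cascadia forecast.

```python
import math

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def poisson_prob(window, mean_recurrence):
    """Time-independent model: chance of an event in the next `window` years,
    regardless of how long it has been quiet."""
    return 1.0 - math.exp(-window / mean_recurrence)

def renewal_prob(elapsed, window, mean, sd):
    """Time-dependent model (Gaussian renewal, an illustrative choice):
    chance of an event in the next `window` years given `elapsed` quiet years."""
    S = lambda t: 1.0 - Phi((t - mean) / sd)  # survival function of recurrence time
    return (S(elapsed) - S(elapsed + window)) / S(elapsed)

# Hypothetical Cascadia-like inputs: ~500-yr mean recurrence, ~320 yr since 1700.
print(f"Poisson, next 50 yr:  {poisson_prob(50, 500):.3f}")
print(f"Renewal, next 50 yr:  {renewal_prob(320, 50, 500, 200):.3f}")
```

Changing the assumed mean recurrence (cluster versus full record) or the elapsed time moves these numbers substantially, which is precisely the sensitivity the classroom exercise is designed to expose.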

  13. A large set of potential past, present and future hydro-meteorological time series for the UK

    Directory of Open Access Journals (Sweden)

    B. P. Guillod

    2018-01-01

    Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice, which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900–2006), (ii) five near-future scenarios (2020–2049), and (iii) five far-future scenarios (2070–2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period (> 3 months) and shorter-duration high precipitation (1–30 days), the time series generally represent past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09) but larger in magnitude. Both drought and high-precipitation events are projected to increase in frequency and

  14. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.
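
The "rates at which probabilities decay" can be made concrete for the simplest possible case, the sample mean of fair coin flips, where Cramér's theorem gives the decay rate in closed form. This classical example is an illustration of the definition, not material drawn from the book.

```python
import math

def binom_tail(n, p, a):
    """Exact P(sample mean of n Bernoulli(p) trials >= a)."""
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(k0, n + 1))

def rate(a, p):
    """Cramer rate function for Bernoulli(p): the large-deviation decay rate I(a)."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

p, a = 0.5, 0.7
decays = {n: -math.log(binom_tail(n, p, a)) / n for n in (50, 200, 800)}
for n, d in decays.items():
    print(f"n = {n:4d}   -ln P / n = {d:.4f}")
print(f"rate function I({a}) = {rate(a, p):.4f}")
```

The empirical decay rate -ln P / n approaches I(a) from above as n grows (the Chernoff bound guarantees it never falls below), which is the large deviation principle in its most elementary form.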

  15. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional

  16. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  17. Fixation Probability in a Haploid-Diploid Population.

    Science.gov (United States)

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.
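
For the classical fully diploid benchmark against which haploid-diploid results can be compared, the diffusion approximation for fixation probability (Kimura's formula) reduces to roughly 2s for a weakly beneficial mutation in a large population. The paper's haploid-diploid model modifies this, so the following is only the textbook baseline, with illustrative parameter values.

```python
import math

def kimura_fixation(N, s, p0=None):
    """Diffusion approximation for the fixation probability of an allele with
    selection coefficient s, starting at frequency p0, in a diploid population
    of size N. Defaults to a single new mutant copy, p0 = 1/(2N)."""
    if p0 is None:
        p0 = 1.0 / (2 * N)
    if s == 0:
        return p0  # neutral allele fixes with probability equal to its frequency
    return (1 - math.exp(-4 * N * s * p0)) / (1 - math.exp(-4 * N * s))

N, s = 10_000, 0.01
u = kimura_fixation(N, s)
print(f"diffusion approximation: {u:.5f};  branching-process 2s: {2 * s:.5f}")
```

The near-agreement of the two approximations in this regime (large N, weakly beneficial, non-recessive) mirrors the correspondence the authors report between their branching-process and diffusion results.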

  18. Modeling future power plant location patterns. Final report

    International Nuclear Information System (INIS)

    Eagles, T.W.; Cohon, J.L.; ReVelle, C.

    1979-04-01

    The locations of future energy facilities must be specified to assess the potential environmental impact of those facilities. A computer model was developed to generate probable locations for the energy facilities needed to meet postulated future energy requirements. The model is designed to cover a very large geographical region. The regional demand for baseload electric generating capacity associated with a postulated demand growth rate over any desired time horizon is specified by the user as an input to the model. The model uses linear programming to select the most probable locations within the region, based on physical and political factors. The linear program is multi-objective, with four objective functions based on transmission, coal supply, population proximity, and water supply considerations. Minimizing each objective function leads to a distinct set of locations. The user can select the objective function or weighted combination of objective functions most appropriate to his interest. Users with disparate interests can use the model to see the locational changes which result from varying weightings of the objective functions. The model has been implemented in a six-state mid-Atlantic region. The year 2000 was chosen as the study year, and a test scenario postulating 2.25% growth in baseload generating capacity between 1977 and 2000 was chosen. The scenario stipulated that this capacity be 50% nuclear and 50% coal-fired. Initial utility reaction indicates the objective based on transmission costs is most important for such a large-scale analysis.
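
The weighted multi-objective selection can be mimicked in miniature by enumeration; a real implementation solves a linear program, and the candidate sites, objective scores, and weights below are all invented for illustration.

```python
import itertools

# Hypothetical normalized cost scores per candidate site, one per objective:
# (transmission, coal supply, population proximity, water supply); lower is better.
sites = {
    "A": (0.2, 0.7, 0.5, 0.3),
    "B": (0.6, 0.2, 0.4, 0.5),
    "C": (0.3, 0.5, 0.9, 0.2),
    "D": (0.8, 0.3, 0.1, 0.6),
}

def total(combo, weights):
    """Weighted sum of objective scores over the chosen sites."""
    return sum(w * c for site in combo for w, c in zip(weights, sites[site]))

# Emphasize transmission only, echoing the mid-Atlantic finding; pick 2 of 4 sites.
weights = (1.0, 0.0, 0.0, 0.0)
best = min(itertools.combinations(sites, 2), key=lambda c: total(c, weights))
print("best pair under transmission-only weighting:", best)
```

Re-running with different weight vectors shows how the selected set shifts, which is exactly how users with disparate interests would exercise the model.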

  19. Defining Baconian Probability for Use in Assurance Argumentation

    Science.gov (United States)

    Graydon, Patrick J.

    2016-01-01

    The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.

  20. A fluctuation relation for the probability of energy backscatter

    Science.gov (United States)

    Vela-Martin, Alberto; Jimenez, Javier

    2017-11-01

    We simulate the large scales of an inviscid turbulent flow in a triply periodic box using a dynamic Smagorinsky model for the sub-grid stresses. The flow, which is forced to constant kinetic energy, is fully reversible and can develop a sustained inverse energy cascade. However, due to the large number of degrees of freedom, the probability of a spontaneous mean inverse energy flux is negligible. In order to quantify the probability of inverse energy cascades, we test a local fluctuation relation of the form log P(A) = -c(V, t) A, where P(A) = p(|Cs|_{V,t} = A) / p(|Cs|_{V,t} = -A), p is probability, and |Cs|_{V,t} is the average of the least-squares dynamic model coefficient over volume V and time t. This is confirmed when Cs is averaged over sufficiently large domains and long times, and c is found to depend linearly on V and t. In the limit in which V^(1/3) is of the order of the integral scale and t is of the order of the eddy-turnover time, we recover a global fluctuation relation that predicts a negligible probability of a sustained inverse energy cascade. For smaller V and t, the local fluctuation relation provides useful predictions on the occurrence of local energy backscatter. Funded by the ERC COTURB project.

  1. The Everett-Wheeler interpretation and the open future

    International Nuclear Information System (INIS)

    Sudbery, Anthony

    2011-01-01

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  2. Moxie matters: associations of future orientation with active life expectancy.

    Science.gov (United States)

    Laditka, Sarah B; Laditka, James N

    2017-10-01

    Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, age 55 through death. Life expectancy from age 55 for white men with high future orientation was age 77.6 (95% confidence interval 75.5-79.0), 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation 9.1% of remaining life from age 55 was disabled (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
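
The microsimulation step can be sketched with a toy three-state monthly Markov chain (healthy, disabled, dead). The transition probabilities below are invented and age-invariant, unlike the paper's age-varying multinomial logistic estimates, so the outputs are illustrative only.

```python
import random

# H = healthy, D = disabled, X = dead; hypothetical monthly transition probabilities.
P = {
    "H": {"H": 0.9960, "D": 0.0030, "X": 0.0010},
    "D": {"H": 0.0020, "D": 0.9940, "X": 0.0040},
}

def simulate_life(rng, start="H", max_months=1200):
    """Follow one individual month by month from age 55 until death (or a cap)."""
    state, alive, disabled = start, 0, 0
    while state != "X" and alive < max_months:
        alive += 1
        disabled += state == "D"
        r, acc = rng.random(), 0.0
        for nxt, p in P[state].items():
            acc += p
            if r < acc:
                state = nxt
                break
    return alive, disabled

rng = random.Random(1)
lives = [simulate_life(rng) for _ in range(2_000)]
years_lived = sum(m for m, _ in lives) / len(lives) / 12
pct_disabled = 100 * sum(d for _, d in lives) / sum(m for m, _ in lives)
print(f"mean further life: {years_lived:.1f} yr; share of life disabled: {pct_disabled:.1f}%")
```

Running separate chains whose transition probabilities were estimated for high versus low future orientation, as the paper does, yields the group differences in life expectancy and the disabled share of remaining life.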

  3. Rendering Future Vegetation Change across Large Regions of the US

    Science.gov (United States)

    Sant'Anna Dias, Felipe; Gu, Yuting; Agarwalla, Yashika; Cheng, Yiwei; Patil, Sopan; Stieglitz, Marc; Turk, Greg

    2015-04-01

    We use two Machine Learning techniques, Decision Trees (DT) and Neural Networks (NN), to provide classified images and photorealistic renderings of future vegetation cover at three large regions in the US. The training data used to generate current vegetation cover include Landsat surface reflectance images, USGS Land Cover maps, 50 years of mean annual temperature and precipitation for the period 1950 - 2000, elevation, aspect and slope data. Present vegetation cover was generated on a 100 m grid. Future vegetation cover for the period 2061-2080 was predicted using the 1 km resolution bias-corrected data from the NASA Goddard Institute for Space Studies Global Climate Model E simulation. The three test regions encompass a wide range of climatic gradients, topographic variation, and vegetation cover. The central Oregon site covers 19,182 square km and includes the Ochoco and Malheur National Forests. Vegetation cover is 50% evergreen forest and 50% shrubs and scrubland. The northwest Washington site covers 14,182 square km. Vegetation cover is 60% evergreen forest, 14% shrubs, 7% grassland, and 7% barren land. The remainder of the area includes deciduous forest, perennial snow cover, and wetlands. The third site, the Jemez mountain region of north central New Mexico, covers 5,500 square km. Vegetation cover is 47% evergreen forest, 31% shrubs, 13% grasses, and 3% deciduous forest. The remainder of the area includes developed and cultivated areas and wetlands. Using the above-mentioned data sets we first trained our DT and NN models to reproduce current vegetation. The land cover classified images were compared directly to the USGS land cover data. The photorealistic generated vegetation images were compared directly to the remotely sensed surface reflectance maps. For all three sites, similarity between generated and observed vegetation cover was quite remarkable.
The three trained models were then used to explore what the equilibrium vegetation would look like for

  4. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem is collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools is lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  5. Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-01-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…

  6. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding has been added, with illustrative comparisons of thei...

  7. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  8. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
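
The exponential decay described above suggests a simple way to estimate the characteristic length: fit a straight line to the log of the knotting probability versus N. A minimal sketch, using hypothetical probability values rather than the paper's data:

```python
import numpy as np

# Hypothetical knotting-probability estimates for one knot type at several
# polygon sizes N (illustrative numbers, not the paper's simulation results).
N = np.array([100, 200, 400, 800, 1600])
p_knot = np.array([0.062, 0.055, 0.043, 0.026, 0.010])

# The dominant factor is exp(-N/N_K); fit log p = c - N/N_K by least squares.
slope, intercept = np.polyfit(N, np.log(p_knot), 1)
N_K = -1.0 / slope  # characteristic length of the knotting probability
```

The fitted N_K is the negative reciprocal of the slope; in the paper this kind of fit would be applied to simulated SAP knotting probabilities for each knot K.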

  9. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    % in those with low probability. The prevalence of PE in patients with intermediate clinical probability was 41%. These results underscore the importance of incorporating the standardized reading of the electrocardiogram and of the chest radiograph into the clinical evaluation of patients with suspected PE. The interpretation of these laboratory data, however, requires experience. Future research is needed to develop standardized models, of varying degree of complexity, which may find application in different clinical settings to predict the probability of PE

  10. Non-equilibrium random matrix theory. Transition probabilities

    International Nuclear Information System (INIS)

    Pedro, Francisco Gil; Westphal, Alexander

    2016-06-01

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that, in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  11. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie]

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that, in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  12. Implications of synergetic indirect effects and increased flexibility for municipal solid waste management within future framework conditions

    DEFF Research Database (Denmark)

    Cimpan, Ciprian; Rothmann, Marianne; Wenzel, Henrik

    Life cycle assessments addressing municipal solid waste management systems (MSWMS) most often represent and evaluate these systems, or compare isolated technological and management solutions, in a much too simplistic interaction with their surroundings, accounting for a minimum of probable future... ...and compared against a large variety of background system scenarios, consisting of the most probable future development of the Danish energy system (and surrounding countries) towards 2050. Specific focus was placed on identification and modelling of possible indirect effects on adjoining systems that would... ...potential (GWP) of different waste management strategies. Within the study reported here, a number of alternative MSWMS were simulated and evaluated, comprising combinations of separate collection and different downstream treatment/handling approaches for remaining residual waste, including advanced...

  13. Determination of stability of epimetamorphic rock slope using Minimax Probability Machine

    Directory of Open Access Journals (Sweden)

    Manoj Kumar

    2016-01-01

    Full Text Available The article employs the Minimax Probability Machine (MPM) for the prediction of the stability status of epimetamorphic rock slope. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d), height (H), inclination (β), cohesion (c) and internal friction angle (φ) have been used as inputs of the MPM. This study uses the MPM as a classification technique. Two models, the Linear Minimax Probability Machine (LMPM) and the Kernelized Minimax Probability Machine (KMPM), have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slope.
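
For intuition, the worst-case bound that the MPM optimizes can be sketched from class means and covariances alone, following the standard minimax formulation: minimize the sum of the two covariance norms subject to a·(μ1 − μ2) = 1; the worst-case correct-classification probability is then α = κ²/(1 + κ²), where κ is the reciprocal of that minimum. The numbers below are hypothetical two-feature data, not the slope-stability inputs used in the article:

```python
import numpy as np

# Hypothetical class means and covariances (two features).
mu1 = np.array([2.0, 2.0]); S1 = np.array([[1.0, 0.2], [0.2, 1.0]])
mu2 = np.array([0.0, 0.0]); S2 = np.array([[1.0, -0.1], [-0.1, 0.5]])
d = mu1 - mu2

# Enforce the constraint a . d = 1 by parametrizing a = (t, (1 - d[0]*t)/d[1]),
# then minimize sqrt(a'S1a) + sqrt(a'S2a) by a dense grid search over t.
ts = np.linspace(-2.0, 2.0, 20001)
A = np.stack([ts, (1.0 - d[0] * ts) / d[1]], axis=1)
obj = np.sqrt(np.einsum('ij,jk,ik->i', A, S1, A)) + \
      np.sqrt(np.einsum('ij,jk,ik->i', A, S2, A))
kappa = 1.0 / obj.min()
alpha = kappa**2 / (1.0 + kappa**2)  # worst-case correct-classification bound
```

A grid search over the one remaining degree of freedom suffices in two dimensions; practical MPM implementations solve the same problem as a second-order cone program.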

  14. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  15. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  16. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
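
The analytic escape-probability expressions above involve exponential integral functions, for which rational approximations exist. A quick way to cross-check any such approximation is direct numerical evaluation of E1; the sketch below is an illustration (the function name is ours, and this is not the authors' escape-probability formula itself):

```python
import numpy as np

def exp_int_e1(x, upper=50.0, n=200_001):
    """Evaluate E1(x) = integral from x to infinity of exp(-t)/t dt
    by the trapezoid rule on a truncated interval."""
    t = np.linspace(x, x + upper, n)
    y = np.exp(-t) / t
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)
```

Values computed this way (e.g. E1(1) ≈ 0.2194) can be compared against a rational approximation before using it inside the escape-probability expressions.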

  17. Linker-dependent Junction Formation Probability in Single-Molecule Junctions

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Pil Sun; Kim, Taekyeong [Hankuk University of Foreign Studies, Yongin (Korea, Republic of)]

    2015-01-15

    We compare the junction formation probabilities of single-molecule junctions with different linker molecules by using a scanning tunneling microscope-based break-junction technique. We found that the junction formation probability varies as SH > SMe > NH2 for the benzene backbone molecule with different types of anchoring groups, through quantitative statistical analysis. These results are attributed to different bonding forces according to the linker groups formed with Au atoms in the electrodes, which is consistent with previous works. Our work allows a better understanding of the contact chemistry in the metal–molecule junction for future molecular electronic devices.

  18. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    Science.gov (United States)

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and the seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  19. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA

    Science.gov (United States)

    West, Amanda; Kumar, Sunil; Jarnevich, Catherine S.

    2016-01-01

    Regional analysis of large wildfire potential given climate change scenarios is crucial to understanding areas most at risk in the future, yet wildfire models are not often developed and tested at this spatial scale. We fit three historical climate suitability models for large wildfires (i.e. ≥ 400 ha) in Colorado and Wyoming using topography and decadal climate averages corresponding to wildfire occurrence at the same temporal scale. The historical models classified points of known large wildfire occurrence with high accuracies. Using a novel approach in wildfire modeling, we applied the historical models to independent climate and wildfire datasets, and the resulting sensitivities were 0.75, 0.81, and 0.83 for Maxent, Generalized Linear, and Multivariate Adaptive Regression Splines models, respectively. We projected the historical models into future climate space using data from 15 global circulation models and two representative concentration pathway scenarios. Maps from these geospatial analyses can be used to evaluate the changing spatial distribution of climate suitability of large wildfires in these states. April relative humidity was the most important covariate in all models, providing insight into the climate space of large wildfires in this region. These methods incorporate monthly and seasonal climate averages at a spatial resolution relevant to land management (i.e. 1 km²) and provide a tool that can be modified for other regions of North America, or adapted for other parts of the world.

  20. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  1. Reactor materials program process water component failure probability

    International Nuclear Information System (INIS)

    Daugherty, W. L.

    1988-01-01

    The maximum rate loss of coolant accident for the Savannah River Production Reactors is presently specified as the abrupt double-ended guillotine break (DEGB) of a large process water pipe. This accident is not considered credible in light of the low applied stresses and the inherent ductility of the piping materials. The Reactor Materials Program was initiated to provide the technical basis for an alternate, credible maximum rate LOCA. The major thrust of this program is to develop an alternate worst case accident scenario by deterministic means. In addition, the probability of a DEGB is also being determined, to show that in addition to being mechanistically incredible, it is also highly improbable. The probability of a DEGB of the process water piping is evaluated in two parts: failure by direct means, and indirectly-induced failure. These two areas have been discussed in other reports. In addition, the frequency of a large break (equivalent to a DEGB) in other process water system components is assessed. This report reviews the large break frequency for each component as well as the overall large break frequency for the reactor system

  2. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  3. Global pyrogeography: the current and future distribution of wildfire.

    Directory of Open Access Journals (Sweden)

    Meg A Krawchuk

    Full Text Available Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions.
Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research

  4. Global pyrogeography: the current and future distribution of wildfire.

    Science.gov (United States)

    Krawchuk, Meg A; Moritz, Max A; Parisien, Marc-André; Van Dorn, Jeff; Hayhoe, Katharine

    2009-01-01

    Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions. Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research on global

  5. Some uses of predictive probability of success in clinical drug development

    Directory of Open Access Journals (Sweden)

    Mauro Gasparini

    2013-03-01

    Full Text Available Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations imposed in the world of pharmaceutical development. Within a single trial, predictive probability of success can be identified with expected power, i.e. the evaluation of the success probability of the trial. Success means, for example, obtaining a significant result of a standard superiority test. Across trials, predictive probability of success can be the probability of a successful completion of an entire part of clinical development, for example a successful phase III development in the presence of phase II data. Calculations of predictive probability of success in the presence of normal data with known variance will be illustrated, both for within-trial and across-trial predictions.
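
For the normal-data, known-variance case mentioned above, the within-trial predictive probability of success has a simple closed form under a flat prior: the future z-statistic is predicted from the current effect estimate, with its uncertainty inflated by the current standard error. A minimal sketch (the function name and inputs are illustrative, not the article's notation):

```python
import math

def ppos(theta_hat, se_current, se_future, z_alpha=1.96):
    """Predictive probability that a future trial's z-statistic exceeds
    z_alpha, given a current estimate theta_hat with standard error
    se_current, assuming normal data with known variance and a flat prior."""
    num = theta_hat / se_future - z_alpha
    den = math.sqrt(1.0 + (se_current / se_future) ** 2)
    return 0.5 * (1.0 + math.erf(num / den / math.sqrt(2.0)))
```

With no current evidence (theta_hat = 0) the predictive probability sits well below the nominal significance level's complement, reflecting the extra uncertainty carried forward from the current data.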

  6. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  7. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
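
The shrinkage behaviour described above can be sketched with a conjugate toy model: treat a stated probability p as if it arose from n pseudo-observations and combine it with a Beta(a, b) prior; the posterior mean then overweights low p and underweights high p. All parameter values here are hypothetical, and this is only a caricature of the paper's model:

```python
def bayesian_weight(p, n=10.0, a=1.0, b=1.0):
    """Posterior mean of a probability after a stated value p is treated
    as the success rate of n pseudo-observations, under a Beta(a, b) prior."""
    return (n * p + a) / (n + a + b)
```

With a = b the function crosses the identity at p = 0.5, pushing p = 0 up and p = 1 down, which is the inverse-S shape reported in the empirical literature.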

  8. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  9. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

    Explores computer-intensive probability and statistics for ecosystem management decision making Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises - making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...

  10. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    Science.gov (United States)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems, enabling elastic bandwidth transmissions; OFDM-based optical networking is therefore a likely direction of future development. In OFDM-based optical networks, analysis of the blocking rate is of great significance for network assessment. Current research on WDM networks is largely based on fixed bandwidth; to accommodate future traffic and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Building on existing theory and algorithms, we apply mathematical analysis and theoretical derivation to study the blocking probability of variable-bandwidth optical networks, and then build a model for the blocking probability.
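
For context, the classic blocking-probability result for a fixed-bandwidth, circuit-switched link is the Erlang B formula; the sketch below is that baseline illustration (not the variable-bandwidth model developed in the article), computed with the standard numerically stable recursion:

```python
def erlang_b(servers, traffic):
    """Erlang B blocking probability for `servers` channels offered
    `traffic` Erlangs, via the stable recursion
    B(m) = A*B(m-1) / (m + A*B(m-1)), B(0) = 1."""
    b = 1.0
    for m in range(1, servers + 1):
        b = (traffic * b) / (m + traffic * b)
    return b
```

For example, erlang_b(2, 1.0) returns 0.2: one Erlang of offered traffic on two channels is blocked 20% of the time. Variable-bandwidth OFDM requests need richer models, but this recursion is the usual starting point.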

  11. Changes in the probability of co-occurring extreme climate events

    Science.gov (United States)

    Diffenbaugh, N. S.

    2017-12-01

    Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - have historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occurring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occurring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.
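
The compounding effect described here can be illustrated with a simple independence model: if an extreme occurs in each of k regions with per-season probability p, the chance that two or more regions are hit in the same season grows much faster than p itself. A minimal sketch (the numbers are illustrative, not estimates from the presentation):

```python
from math import comb

def p_cooccur(p, k, m=2):
    """Probability that at least m of k independent regions experience an
    extreme event in the same season, each with per-season probability p."""
    return 1.0 - sum(comb(k, j) * p**j * (1 - p)**(k - j) for j in range(m))

# Doubling the per-region probability more than doubles the co-occurrence risk.
base = p_cooccur(0.01, 10)
warmed = p_cooccur(0.02, 10)
```

Under this toy model, raising p from 1% to 2% nearly quadruples the probability of a co-occurring pair, which is why modest frequency changes can erode a geographic hedge.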

  12. Array coding for large data memories

    Science.gov (United States)

    Tranter, W. H.

    1982-01-01

    It is pointed out that an array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words having M symbols in each word. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining the probability of undetected error is given. Attention is given to the possibility of reading data into the array using a digital communication system with symbol error probability p. Two different schemes are found to be of interest. The analysis of array coding shows that the probability of undetected error is very small even for relatively large arrays.
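
    The abstract's undetected-error calculation depends on code details it does not give. As a hedged sketch, a generic upper bound follows from the fact that a code detecting every error pattern of weight below d can only be fooled by patterns with at least d symbol errors:

```python
from math import comb

def undetected_error_bound(n: int, d: int, p: float) -> float:
    """Upper bound on the undetected-error probability for a length-n
    code that detects every error pattern of weight < d: only patterns
    with at least d symbol errors can possibly escape detection."""
    return sum(comb(n, k) * p**k * (1.0 - p)**(n - k) for k in range(d, n + 1))
```

    Even a modest detection capability makes the bound small at realistic symbol error rates: with n = 100, d = 4 and p = 1e-4 the bound is below 1e-9, consistent with the abstract's conclusion that the undetected-error probability stays very small even for large arrays.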

  13. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
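
    The traditional 2D approximation mentioned above integrates a bivariate Gaussian over the combined hard-body disc in the encounter plane. A hedged Monte Carlo sketch of that 2D integral (diagonal covariance assumed for simplicity; parameter names are illustrative, not CARA's):

```python
import random

def pc_2d_collision(sx, sy, mx, my, radius, n=200000, seed=1):
    """Monte Carlo estimate of the 2D collision probability: the chance
    that the relative position in the encounter plane (Gaussian with
    standard deviations sx, sy and mean miss distance mx, my) falls
    inside the combined hard-body radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.gauss(mx, sx), rng.gauss(my, sy)
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n
```

    For a circular standard-normal miss distribution the closed form is 1 - exp(-r^2/2), a convenient check on the sampler.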

  14. Theoretical-probability evaluation of the fire hazard of coal accumulations

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, F F

    1978-01-01

    An evaluation is suggested for the fire hazard of coal accumulations, based on determining the probability of an endogenic fire. This probability is computed using the statistical characteristics of the temperature distribution of spontaneous heating in large accumulations, together with Gluzberg's fire-hazard criterion, which is determined by the coal's physico-chemical properties, the oxygen concentration, and the size of the accumulations. 4 references.

  15. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant-specific parameter (number of potential missiles N/sub p/). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common-cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets.

  16. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) to the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  17. Detection probability of least tern and piping plover chicks in a large river system

    Science.gov (United States)

    Roche, Erin A.; Shaffer, Terry L.; Anteau, Michael J.; Sherfy, Mark H.; Stucker, Jennifer H.; Wiltermuth, Mark T.; Dovichin, Colin M.

    2014-01-01

    Monitoring the abundance and stability of populations of conservation concern is often complicated by an inability to perfectly detect all members of the population. Mark-recapture offers a flexible framework in which one may identify factors contributing to imperfect detection, while at the same time estimating demographic parameters such as abundance or survival. We individually color-marked, recaptured, and re-sighted 1,635 federally listed interior least tern (Sternula antillarum; endangered) chicks and 1,318 piping plover (Charadrius melodus; threatened) chicks from 2006 to 2009 at 4 study areas along the Missouri River and investigated effects of observer-, subject-, and site-level covariates suspected of influencing detection. Increasing the time spent searching and crew size increased the probability of detecting both species regardless of study area and detection methods were not associated with decreased survival. However, associations between detection probability and the investigated covariates were highly variable by study area and species combinations, indicating that a universal mark-recapture design may not be appropriate.

  18. Potential economic benefits of adapting agricultural production systems to future climate change

    Science.gov (United States)

    Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.

    2010-01-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting

  19. Potential Economic Benefits of Adapting Agricultural Production Systems to Future Climate Change

    Science.gov (United States)

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E.; Williams, Jimmy R.

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to

  20. Potential economic benefits of adapting agricultural production systems to future climate change.

    Science.gov (United States)

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E; Williams, Jimmy R

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO(2) emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs

  1. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. Limited test data: The choice between confidence limits and inverse probability

    International Nuclear Information System (INIS)

    Nichols, P.

    1975-01-01

    For a unit which has been successfully designed to a high standard of reliability, any test programme of reasonable size will result in only a small number of failures. In these circumstances the failure rate estimated from the tests will depend on the statistical treatment applied. When a large number of units is to be manufactured, an unexpected high failure rate will certainly result in a large number of failures, so it is necessary to guard against optimistic unrepresentative test results by using a confidence limit approach. If only a small number of production units is involved, failures may not occur even with a higher than expected failure rate, and so one may be able to accept a method which allows for the possibility of either optimistic or pessimistic test results, and in this case an inverse probability approach, based on Bayes' theorem, might be used. The paper first draws attention to an apparently significant difference in the numerical results from the two methods, particularly for the overall probability of several units arranged in redundant logic. It then discusses a possible objection to the inverse method, followed by a demonstration that, for a large population and a very reasonable choice of prior probability, the inverse probability and confidence limit methods give the same numerical result. Finally, it is argued that a confidence limit approach is overpessimistic when a small number of production units is involved, and that both methods give the same answer for a large population. (author)
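
    The paper's zero-failure case can be made concrete. With zero failures in n tests, the classical one-sided upper confidence limit solves (1-p)^n = 1 - conf, while a Bayesian analysis with a uniform prior yields a Beta(1, n+1) posterior. The sketch below (an illustration, not the author's derivation) shows the two bounds converging for large n:

```python
def upper_confidence_limit(n: int, conf: float = 0.95) -> float:
    """Classical one-sided upper limit on failure probability p after
    n trials with zero failures: solve (1 - p)**n = 1 - conf."""
    return 1.0 - (1.0 - conf) ** (1.0 / n)

def bayes_upper_bound(n: int, conf: float = 0.95) -> float:
    """Bayesian upper credible bound with a uniform prior: the posterior
    after n failure-free trials is Beta(1, n+1), with CDF 1-(1-p)**(n+1)."""
    return 1.0 - (1.0 - conf) ** (1.0 / (n + 1))
```

    With a single failure-free test the classical limit is a pessimistic 0.95 while the Bayesian bound is about 0.78; by n = 1000 the two agree to within a few parts per million, matching the paper's observation that the methods coincide for a large population.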

  3. Quantum Zeno and anti-Zeno effects measured by transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenxian, E-mail: wxzhang@whu.edu.cn [School of Physics and Technology, Wuhan University, Wuhan, Hubei 430072 (China); Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Kavli Institute for Theoretical Physics China, CAS, Beijing 100190 (China); Kofman, A.G. [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States); Zhuang, Jun [Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); You, J.Q. [Beijing Computational Science Research Center, Beijing 10084 (China); Department of Physics, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Nori, Franco [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States)

    2013-10-30

    Using numerical calculations, we compare the transition probabilities of many spins in random magnetic fields, subject to either frequent projective measurements, frequent phase modulations, or a mix of modulations and measurements. For various distribution functions, we find the transition probability under frequent modulations is suppressed most if the pulse delay is short and the evolution time is larger than a critical value. Furthermore, decay freezing occurs only under frequent modulations as the pulse delay approaches zero. In the large pulse-delay region, however, the transition probabilities under frequent modulations are highest among the three control methods.

  4. Experimental evidence for the reducibility of multifragment emission probabilities

    International Nuclear Information System (INIS)

    Wozniak, G.J.; Tso, K.; Phair, L.

    1995-01-01

    Multifragmentation has been studied for 36Ar-induced reactions on a 197Au target at E/A = 80 and 110 MeV and for 129Xe-induced reactions on several targets (natCu, 89Y, 165Ho, 197Au) at E/A = 40, 50 and 60 MeV. The probability of emitting n intermediate-mass fragments is shown to be binomial at each transverse energy and reducible to an elementary binary probability p. For each target and at each bombarding energy, this probability p shows a thermal nature by giving linear Arrhenius plots. For the 129Xe-induced reactions, a nearly universal linear Arrhenius plot is observed at each bombarding energy, indicating a large degree of target independence.

  5. Use of probabilistic methods for estimating failure probabilities and directing ISI-efforts

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, F; Brickstad, B [University of Uppsala (Sweden)]

    1988-12-31

    Some general aspects of the role of Non Destructive Testing (NDT) efforts on the resulting probability of core damage is discussed. A simple model for the estimation of the pipe break probability due to IGSCC is discussed. It is partly based on analytical procedures, partly on service experience from the Swedish BWR program. Estimates of the break probabilities indicate that further studies are urgently needed. It is found that the uncertainties about the initial crack configuration are large contributors to the total uncertainty. Some effects of the inservice inspection are studied and it is found that the detection probabilities influence the failure probabilities. (authors).

  6. Probability distribution of long-run indiscriminate felling of trees in ...

    African Journals Online (AJOL)

    The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...

  7. Future fire probability modeling with climate change data and physical chemistry

    Science.gov (United States)

    Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey

    2014-01-01

    Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...

  8. Analysis of future nuclear power plants competitiveness with stochastic methods

    International Nuclear Information System (INIS)

    Feretic, D.; Tomsic, Z.

    2004-01-01

    To satisfy increased demand it is necessary to build new electrical power plants which can optimally meet the imposed acceptability criteria. The main criteria are the potential to supply the required energy, to supply this energy at minimal (or at least acceptable) cost, to satisfy licensing requirements, and to be acceptable to the public. The main competitors for unrestricted electricity production in the next few decades are fossil power plants (coal and gas) and nuclear power plants. New renewable power plants (solar, wind, biomass) are also important, but due to their limited energy supply potential and high costs they can only supplement the main generating units. Large hydropower plants would be competitive provided suitable sites exist for their construction. The paper describes the application of a stochastic method for comparing economic parameters of future electrical power generating systems, including conventional and nuclear power plants. The method is applied to establish the competitive specific investment costs of future nuclear power plants when compared with combined-cycle gas-fired units combined with wind electricity generators, using best-estimate and optimistic input data. The basis for the economic comparison of potential options is the plant-lifetime levelized electricity generating cost. The purpose is to assess the uncertainty in several key performance parameters and costs of electricity produced in coal-fired, gas-fired and nuclear power plants, developing the probability distribution of the levelized price of electricity from the different power plants, the cumulative probability of the levelized price of electricity for each technology, and the probability distribution of the cost difference between the technologies. The key parameters evaluated include: levelized electrical energy cost (USD/kWh), discount rate, interest rate for credit repayment, rate of expected increase of fuel cost, plant investment cost, fuel cost, constant annual
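
    A minimal sketch of the stochastic comparison described above, assuming hypothetical triangular cost distributions (the figures below are illustrative, not the paper's data): sample the levelized cost of two technologies and estimate the probability that one comes out cheaper.

```python
import random

def prob_cheaper(cost_a, cost_b, n=100000, seed=7):
    """Monte Carlo probability that technology A's sampled levelized
    cost comes out below technology B's; cost_a and cost_b are samplers
    taking a random.Random instance."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(n) if cost_a(rng) < cost_b(rng))
    return wins / n

# Illustrative (hypothetical) triangular cost distributions in USD/kWh;
# note random.triangular takes (low, high, mode).
p = prob_cheaper(lambda r: r.triangular(0.03, 0.06, 0.05),
                 lambda r: r.triangular(0.04, 0.08, 0.06))
```

    The same sampled cost differences can be binned to give the probability distribution of the cost difference between the technologies that the abstract mentions.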

  9. Future equivalent of 2010 Russian heatwave intensified by weakening soil moisture constraints

    Science.gov (United States)

    Rasmijn, L. M.; van der Schrier, G.; Bintanja, R.; Barkmeijer, J.; Sterl, A.; Hazeleger, W.

    2018-05-01

    The 2010 heatwave in eastern Europe and Russia ranks among the hottest events ever recorded in the region [1,2]. The excessive summer warmth was related to an anomalously widespread and intense quasi-stationary anticyclonic circulation anomaly over western Russia, reinforced by depletion of spring soil moisture [1,3-5]. At present, high soil moisture levels and strong surface evaporation generally tend to cap maximum summer temperatures [6-8], but these constraints may weaken under future warming [9,10]. Here, we use a data assimilation technique in which future climate model simulations are nudged to realistically represent the persistence and strength of the 2010 blocked atmospheric flow. In the future, synoptically driven extreme warming under favourable large-scale atmospheric conditions will no longer be suppressed by abundant soil moisture, leading to a disproportional intensification of future heatwaves. This implies that future mid-latitude heatwaves analogous to the 2010 event will become even more extreme than previously thought, with temperature extremes increasing by 8.4 °C over western Russia. Thus, the socioeconomic impacts of future heatwaves will probably be amplified beyond current estimates.

  10. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared

  11. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  12. Large scale scenario analysis of future low carbon energy options

    International Nuclear Information System (INIS)

    Olaleye, Olaitan; Baker, Erin

    2015-01-01

    In this study, we use a multi-model framework to examine a set of possible future energy scenarios resulting from R&D investments in Solar, Nuclear, Carbon Capture and Storage (CCS), Bio-fuels, Bio-electricity, and Batteries for Electric Transportation. Based on a global scenario analysis, we examine the impact on the economy of advancement in energy technologies, considering both individual technologies and the interactions between pairs of technologies, with a focus on the role of uncertainty. Nuclear and CCS have the most impact on abatement costs, with CCS mostly important at high levels of abatement. We show that CCS and Bio-electricity are complements, while most of the other energy technology pairs are substitutes. We also examine for stochastic dominance between R&D portfolios: given the uncertainty in R&D outcomes, we examine which portfolios would be preferred by all decision-makers, regardless of their attitude toward risk. We observe that portfolios with CCS tend to stochastically dominate those without CCS; and portfolios lacking CCS and Nuclear tend to be stochastically dominated by others. We find that the dominance of CCS becomes even stronger as uncertainty in climate damages increases. Finally, we show that there is significant value in carefully choosing a portfolio, as relatively small portfolios can dominate large portfolios. - Highlights: • We examine future energy scenarios in the face of R&D and climate uncertainty. • We examine the impact of advancement in energy technologies and pairs of technologies. • CCS complements Bio-electricity while most technology pairs are substitutes. • R&D portfolios without CCS are stochastically dominated by portfolios with CCS. • Higher damage uncertainty favors R&D development of CCS and Bio-electricity
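
    The stochastic-dominance screening of portfolios described above can be sketched in its simplest form (first-order dominance on empirical samples of outcomes; the paper's analysis is more general):

```python
import numpy as np

def fosd(a, b):
    """True if sample a first-order stochastically dominates sample b:
    a's empirical CDF lies at or below b's at every pooled value, so
    every decision-maker preferring more to less would choose a."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    grid = np.union1d(a, b)
    Fa = np.searchsorted(a, grid, side="right") / a.size
    Fb = np.searchsorted(b, grid, side="right") / b.size
    return bool(np.all(Fa <= Fb))
```

    Applied to simulated abatement-cost (or payoff) draws per R&D portfolio, a portfolio is dominated when some other portfolio passes this test against it regardless of risk attitude.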

  13. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
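
    One classic problem of this kind is the matching (derangement) problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e. A short sketch of the exact inclusion-exclusion calculation:

```python
def derangement_probability(n: int) -> float:
    """Exact probability that a uniform random permutation of n items
    has no fixed point: sum_{k=0}^{n} (-1)^k / k!, which tends to 1/e."""
    s, term = 0.0, 1.0
    for k in range(n + 1):
        if k > 0:
            term *= -1.0 / k  # term is now (-1)^k / k!
        s += term
    return s
```

    Convergence is extremely fast: already at n = 12 the result agrees with 1/e to ten decimal places.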

  14. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    Science.gov (United States)

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
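
    A minimal sketch of the idea, assuming the simplest nonstationary model (a Gaussian whose mean drifts linearly in time; the paper's parametric family is more general): fit by maximum likelihood, then extrapolate the density to a future time.

```python
import numpy as np

def fit_drifting_gaussian(t, x):
    """Maximum-likelihood fit of the nonstationary model
    x_t ~ N(a + b*t, sigma^2): for Gaussian noise the mean parameters
    coincide with the least-squares solution, and sigma^2 with the
    residual variance."""
    A = np.column_stack([np.ones_like(t), t])
    (a, b), *_ = np.linalg.lstsq(A, x, rcond=None)
    sigma2 = np.var(x - (a + b * t))
    return a, b, sigma2

def forecast_density(a, b, sigma2, t_future, x_grid):
    """Extrapolated Gaussian probability density at a future time."""
    mu = a + b * t_future
    return np.exp(-(x_grid - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

# Demo on synthetic data with a slowly drifting mean
t = np.arange(100.0)
x = 2.0 + 0.5 * t + np.random.default_rng(0).normal(size=100)
a, b, s2 = fit_drifting_gaussian(t, x)
```

    Parameter uncertainty (which the paper accounts for in full) could be layered on top, e.g. via the curvature of the likelihood at the fit.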

  15. Gravity and count probabilities in an expanding universe

    Science.gov (United States)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  16. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
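
    The post-processing step described above reduces, in its simplest form, to counting exceedances across the equally likely realizations. A hedged sketch:

```python
import numpy as np

def exceedance_probability(simulations, threshold):
    """Probability map of exceeding a contamination threshold, given a
    stack of equally likely geostatistical simulations with shape
    (n_realizations, n_cells): the per-cell fraction of realizations
    above the threshold."""
    sims = np.asarray(simulations, dtype=float)
    return (sims > threshold).mean(axis=0)
```

    The resulting per-cell probabilities can be compared directly against a clean-up or personnel-hazard threshold to flag parcels for remediation.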

  17. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  18. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  19. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
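The posterior the authors target is, in idealized form, Bayes' theorem applied to the copying indicator; their contribution is replacing the unknown inputs with estimable quantities. A hedged sketch of the idealized calculation (the numbers are illustrative, not from the article):

```python
def posterior_copying(prior, lik_copy, lik_no_copy):
    """Bayes' theorem for P(copying | answer-copying statistic); the paper's
    estimator substitutes estimable quantities for these unknown inputs."""
    numerator = prior * lik_copy
    return numerator / (numerator + (1.0 - prior) * lik_no_copy)

# Illustrative inputs: 1% base rate of copying; the observed statistic is
# 800x more likely under copying than under the null hypothesis.
p = posterior_copying(prior=0.01, lik_copy=0.8, lik_no_copy=0.001)
```

Even with a very small p value under the null, the posterior probability of copying depends strongly on the base rate, which is why the p value alone can mislead.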

  20. Statistical complexity without explicit reference to underlying probabilities

    Science.gov (United States)

    Pennini, F.; Plastino, A.

    2018-06-01

We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  1. Projecting the self into the future in individuals with schizophrenia: a preliminary cross-sectional study.

    Science.gov (United States)

    Raffard, Stéphane; Bortolon, Catherine; D'Argembeau, Arnaud; Gardes, Jeanne; Gely-Nargeot, Marie-Christine; Capdevielle, Delphine; Van der Linden, Martial

    2016-07-01

The ability to project oneself into the future contributes to the development and maintenance of a coherent sense of identity. While recent research has revealed that schizophrenia is associated with difficulties envisioning the future, little is known about patients' future self-representations. In this study, 27 participants with schizophrenia and 26 healthy controls were asked to simulate mental representations of plausible and highly significant future events (self-defining future projections, SDFPs) that they anticipated would happen in their personal future. The main results showed that schizophrenia patients had difficulties reflecting on the broader meaning and implications of imagined future events. In addition, and contrary to our hypothesis, a large majority of SDFPs in schizophrenia patients were positive events, including achievement, relationship, and leisure content. Interestingly, patients and controls did not differ on the perceived probability that these events will occur in the future. Our results suggest that schizophrenia patients have an exaggeratedly positive perception of their future selves. Together, these findings lend support to the idea that past and future self-defining representations have both similar and distinct characteristics in schizophrenia.

  2. Evidence of prehistoric flooding and the potential for future extreme flooding at Coyote Wash, Yucca Mountain, Nye County, Nevada

    International Nuclear Information System (INIS)

    Glancy, P.A.

    1994-01-01

Coyote Wash, an approximately 0.3-square-mile drainage on the eastern flank of Yucca Mountain, is the potential location for an exploratory shaft to evaluate the suitability of Yucca Mountain for construction of an underground repository for the storage of high-level radioactive wastes. An ongoing investigation is addressing the potential for hazards to the site and surrounding areas from flooding and related fluvial-debris movement. Unconsolidated sediments in and adjacent to the channel of North Fork Coyote Wash were examined for evidence of past floods. Trenches excavated across and along the valley bottom exposed multiple flood deposits, including debris-flow deposits containing boulders as large as 2 to 3 feet in diameter. Most of the alluvial deposition probably occurred during the late Quaternary. Deposits at the base of the deepest trench overlie bedrock and underlie stream terraces adjacent to the channel; these sediments are moderately indurated and probably were deposited during the late Pleistocene. Overlying nonindurated deposits clearly are younger and may be of Holocene age. This evidence of intense flooding during the past indicates that severe flooding and debris movement are possible in the future. Empirical estimates of large floods of the past range from 900 to 2,600 cubic feet per second from the 0.094-square-mile drainage area of North Fork Coyote Wash drainage at two proposed shaft sites. Current knowledge indicates that mixtures of water and debris are likely to flow from North Fork Coyote Wash at rates up to 2,500 cubic feet per second. South Fork Coyote Wash, which has similar basin area and hydraulic characteristics, probably will have concurrent floods of similar magnitudes. The peak flow of the two tributaries probably would combine near the potential sites for the exploratory shaft to produce future flow of water and accompanying debris potentially as large as 5,000 cubic feet per second.

  3. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
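The moment-to-bound step can be illustrated with Cantelli's one-sided inequality, one of the standard inequalities the abstract mentions. A sketch under assumed inputs (the mean and variance below are invented for illustration):

```python
import math

def cantelli_upper_bound(mean, variance, confidence=0.95):
    """Distribution-free upper confidence bound from the first two moments,
    via Cantelli's one-sided inequality P(X >= mean + k*sigma) <= 1/(1+k^2)."""
    tail = 1.0 - confidence
    k = math.sqrt(1.0 / tail - 1.0)  # solves 1/(1 + k^2) = tail
    return mean + k * math.sqrt(variance)

# Invented first two moments for a top-event probability (illustration only).
bound = cantelli_upper_bound(mean=1e-4, variance=1e-9)
```

Bounds of this kind are conservative because they use no shape information, which is why fitting an empirical family such as the Johnson distributions to all four moments can give tighter intervals.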

  4. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a conjured numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication
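The contrast between the mean/standard-deviation summary and the mode, median, and confidence interval of a lognormal variable is easy to make concrete. A sketch (parameter values are illustrative):

```python
import math

def lognormal_summary(mu, sigma, z=1.96):
    """Mode, median, mean, and an approximate 95% interval for
    X = exp(N(mu, sigma^2)). The symmetric interval in log space maps to
    an asymmetric, strictly positive interval for X itself."""
    mode = math.exp(mu - sigma ** 2)
    median = math.exp(mu)
    mean = math.exp(mu + 0.5 * sigma ** 2)
    interval = (math.exp(mu - z * sigma), math.exp(mu + z * sigma))
    return mode, median, mean, interval

# A large-uncertainty case (illustrative parameters): the three candidate
# "best values" disagree by orders of magnitude.
mode, median, mean, (lo, hi) = lognormal_summary(mu=0.0, sigma=2.0)
```

With sigma = 2 the mode, median, and mean are roughly 0.018, 1, and 7.4 respectively, which is the anomaly the abstract warns about when the mean is naively reported as the best value.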

  5. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories are considered, whether quantum or classical. The following points are discussed 1) the functions P(μ, Q) in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking, 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  6. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  7. Probable Maximum Precipitation in the U.S. Pacific Northwest in a Changing Climate

    Science.gov (United States)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    2017-11-01

The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate, and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics have not been fully investigated and thus differing PMP estimates are sometimes obtained without physics-based interpretations. In this study, we present a hybrid approach that takes advantage of both traditional engineering practice and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is modified and applied to five statistically downscaled CMIP5 model outputs, producing an ensemble of PMP estimates in the Pacific Northwest (PNW) during the historical (1970-2016) and future (2050-2099) time periods. The hybrid approach produced historical PMP estimates consistent with the traditional estimates. PMP in the PNW will increase by 50% ± 30% of the current design PMP by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability through increased sea surface temperature, with minor contributions from changes in storm efficiency in the future. Changes in moisture tracks tend to reduce the future PMP. Compared with extreme precipitation, PMP exhibits higher internal variability. Thus, long-term records of high-quality data in both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.

  8. Marine mimivirus relatives are probably large algal viruses

    Directory of Open Access Journals (Sweden)

    Claverie Jean-Michel

    2008-01-01

Full Text Available Abstract Background Acanthamoeba polyphaga mimivirus is the largest known ds-DNA virus and its 1.2 Mb-genome sequence has revealed many unique features. Mimivirus occupies an independent lineage among eukaryotic viruses and its known hosts include only species from the Acanthamoeba genus. The existence of mimivirus relatives was first suggested by the analysis of the Sargasso Sea metagenomic data. Results We now further demonstrate the presence of numerous "mimivirus-like" sequences using a larger marine metagenomic data set. We also show that the DNA polymerase sequences from three algal viruses (CeV01, PpV01, PoV01) infecting different marine algal species (Chrysochromulina ericina, Phaeocystis pouchetii, Pyramimonas orientalis) are very closely related to their homolog in mimivirus. Conclusion Our results suggest that the numerous mimivirus-related sequences identified in marine environments are likely to originate from diverse large DNA viruses infecting phytoplankton. Micro-algae thus constitute a new category of potential hosts in which to look for new species of Mimiviridae.

  9. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  10. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
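The nearest-neighbour variant of a probability machine is particularly simple: the estimated probability is the positive-label fraction among the k nearest training points. A toy sketch (not the paper's R implementations):

```python
def knn_probability(x, train, k=3):
    """Probability machine via k nearest neighbours: P(Y=1 | x) is estimated
    as the fraction of positive labels among the k closest training points."""
    nearest = sorted(train, key=lambda pt: abs(pt[0] - x))[:k]
    return sum(label for _, label in nearest) / k

# Toy 1-D training set: negatives near 0, positives near 1 (illustrative).
train = [(0.0, 0), (0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1), (1.0, 1)]
p_low = knn_probability(0.05, train)
p_high = knn_probability(0.95, train)
```

Because the estimator averages labels rather than voting, it returns a probability rather than only a binary classification, which is the paper's central point.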

  11. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability.

  12. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management
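Observation (i) is Jensen's inequality applied to the convex function (1-q)^n, and (ii) can be seen with two-point distributions sharing the same mean pfd. A toy numerical sketch (the priors below are invented for illustration and are not the paper's analysis):

```python
def survival(pfd, n):
    """Probability of zero failures over n independent demands."""
    return (1.0 - pfd) ** n

n = 100
# Two-point toy priors for the uncertain pfd, both with mean 0.01:
broad = [0.001, 0.019]   # "less predictable" system
narrow = [0.005, 0.015]  # same expected pfd, tighter distribution

broad_survival = sum(survival(q, n) for q in broad) / len(broad)
narrow_survival = sum(survival(q, n) for q in narrow) / len(narrow)
point_estimate = survival(0.01, n)  # plug in the expected pfd

# (i) the point estimate is pessimistic, since (1-q)^n is convex in q
#     (Jensen's inequality): (1 - E[q])^n <= E[(1-q)^n];
# (ii) the broader prior yields the higher survival probability.
```

The ordering point_estimate < narrow_survival < broad_survival illustrates both results: plugging in the mean pfd is safely conservative, while greater a priori uncertainty about pfd actually raises the expected survival probability.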

  13. Left passage probability of Schramm-Loewner Evolution

    Science.gov (United States)

    Najafi, M. N.

    2013-06-01

SLE(κ,ρ⃗) is a variant of Schramm-Loewner Evolution (SLE) which describes curves that are not conformally invariant, but are self-similar due to the presence of some other preferred points on the boundary. In this paper we study the left passage probability (LPP) of SLE(κ,ρ⃗) through a field theoretical framework and find the differential equation governing this probability. This equation is numerically solved for the special case κ=2 and hρ=0, in which hρ is the conformal weight of the boundary condition changing (bcc) operator. It may be related to the loop-erased random walk (LERW) and the Abelian sandpile model (ASM) with a sink on its boundary. For the curve which starts from ξ0 and is conditioned by a change of boundary conditions at x0, we find that this probability depends significantly on the factor x0-ξ0. We also present the perturbative general solution for large x0. As a prototype, we apply this formalism to SLE(κ,κ-6), which governs the curves that start from and end on the real axis.

  14. Large-Scale Controls and Characteristics of Fire Activity in Central Chile, 2001-2015

    Science.gov (United States)

    McWethy, D. B.; Pauchard, A.; García, R.; Holz, A.; González, M.; Veblen, T. T.; Stahl, J.

    2016-12-01

In recent decades, fire activity has increased in many ecosystems worldwide, even where fuel conditions and natural ignitions historically limited fire activity, and this increase raises the question of whether climate change, land-use change, and/or altered vegetation are responsible. Increased frequency of large fires in these settings has been attributed to drier-than-average summers and longer fire seasons as well as fuel accumulation related to ENSO events, raising concerns about the trajectory of post-fire vegetation dynamics and future fire regimes. In temperate and Mediterranean forests of central Chile, recent large fires associated with altered ecosystems, climate variability and land-use change highlight the risk and hazard of increasing fire activity, yet the causes and consequences are poorly understood. To better understand characteristics of recent fire activity, key drivers of fire occurrence and the spatial probability of wildfire we examined the relationship between fire activity derived from MODIS satellite imagery and biophysical, land-cover and land-use variables. The probability of fire occurrence and annual area burned was best predicted by seasonal precipitation, annual temperature and land cover type. The likelihood of fire occurrence was greatest in Matorral shrublands, agricultural lands (including pasture lands) and Pinus and Eucalyptus plantations, highlighting the importance of vegetation type and fuel flammability as a critical control on fire activity. Our results suggest that land-use change responsible for the widespread presence of highly flammable vegetation and projections for continued warming and drying will likely combine to promote the occurrence of large fires in central Chile in the future.

  15. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower.

  16. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break
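The headline calculation is simply the product of the four conditional factors. A sketch with invented placeholder values (the paper's actual factor estimates are not reproduced here):

```python
def large_break_frequency(p_igscc, p_miss_ut, p_no_leak_detect, p_unstable):
    """Per-weld large-break probability: the product of the four
    conditional factors listed in the abstract."""
    return p_igscc * p_miss_ut * p_no_leak_detect * p_unstable

# Placeholder values only -- chosen to show how quickly the product shrinks.
freq = large_break_frequency(p_igscc=1e-2,          # HAZ contains IGSCC
                             p_miss_ut=1e-1,        # crack escapes UT exam
                             p_no_leak_detect=1e-2, # not caught by leakage
                             p_unstable=1e-3)       # grows to instability
```

Because all four conditions must coexist for a large break, even moderately small individual probabilities multiply to an extremely low frequency, which is the qualitative conclusion the abstract reports.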

  17. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. 

  18. Framing of decision problem in short and long term and probability perception

    Directory of Open Access Journals (Sweden)

    Anna Wielicka-Regulska

    2010-01-01

Full Text Available Consumer preferences are dependent on problem framing and time perspective. For the experiment's participants, avoiding losses was judged less probable in a distant time perspective than in the near term. On the contrary, achieving gains in the near future was judged less probable than in the remote future. One may therefore expect different reactions when a problem is presented in terms of gains rather than in terms of losses. This can be exploited in the promotion of highly desirable social behaviours such as saving for retirement, keeping a good diet, investing in learning, and other advantageous activities that consumers usually put off.

  19. Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.

    Science.gov (United States)

    Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay

    2018-04-17

    In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possible ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. Those methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be most efficient, and data from on-going subjects can be utilized to improve efficiency. On the other hand, leveraging information from on-going subjects could allow an interim analysis to be potentially conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including Bayesian predictive probability, predictive power, and conditional power and also give closed-form solutions for predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than that using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
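For a single binary endpoint, the Bayesian predictive probability has a well-known beta-binomial closed form; the paper's longitudinal formulas generalize this idea to interim data from on-going subjects. A univariate sketch (the trial numbers are illustrative):

```python
import math

def predictive_prob_success(successes, failures, n_remaining,
                            n_success_needed, a=1.0, b=1.0):
    """Bayesian predictive probability that at least `n_success_needed` of
    the `n_remaining` future binary outcomes are successes, under a
    Beta(a, b) prior (beta-binomial posterior predictive)."""
    a_post, b_post = a + successes, b + failures

    def beta_fn(x, y):
        return math.gamma(x) * math.gamma(y) / math.gamma(x + y)

    return sum(math.comb(n_remaining, k)
               * beta_fn(a_post + k, b_post + n_remaining - k)
               / beta_fn(a_post, b_post)
               for k in range(n_success_needed, n_remaining + 1))

# Illustrative interim look: 12/20 responders so far, 20 patients to go,
# at least 10 further responses needed for trial success.
pp = predictive_prob_success(successes=12, failures=8,
                             n_remaining=20, n_success_needed=10)
```

At an interim analysis, `pp` would be compared against futility and efficacy cutoffs chosen to give the desired operating characteristics.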

  20. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  1. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g., agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
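The construction described, contract prices read as exceedance probabilities and inverted into a CDF with a best estimate by interpolation, can be sketched as follows (the thresholds and prices are invented for illustration):

```python
import numpy as np

# Hypothetical family of climate contracts: each pays off if the annual
# temperature anomaly exceeds the stated threshold (degrees C). The market
# price (0..1) of each contract is read as its exceedance probability.
thresholds = np.array([0.4, 0.5, 0.6, 0.7, 0.8])
prices     = np.array([0.95, 0.80, 0.50, 0.20, 0.05])  # P(anomaly > threshold)

cdf = 1.0 - prices  # market-based cumulative distribution function

# Best estimate: anomaly at which the CDF crosses 0.5 (the market median).
best_estimate = np.interp(0.5, cdf, thresholds)

# Uncertainty: spread between the 16th and 84th percentiles (~1 sigma).
lo = np.interp(0.16, cdf, thresholds)
hi = np.interp(0.84, cdf, thresholds)

print(best_estimate, hi - lo)
```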

  2. Transition probabilities of health states for workers in Malaysia using a Markov chain model

    Science.gov (United States)

    Samsuddin, Shamshimah; Ismail, Noriszura

    2017-04-01

    The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia, using a Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
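Prediction from such a model is a matrix-power computation: the state distribution after k years is the current distribution multiplied by the k-step transition matrix. The matrix below is made up for illustration; it only mimics the four-state structure of the study:

```python
import numpy as np

# Illustrative one-year transition matrix over the study's four health
# states: active, temporary disability, permanent disability, death.
# Rows are "from" states and must sum to 1.
P = np.array([
    [0.96, 0.02, 0.01, 0.01],  # active
    [0.50, 0.40, 0.07, 0.03],  # temporary disability
    [0.00, 0.00, 0.97, 0.03],  # permanent disability
    [0.00, 0.00, 0.00, 1.00],  # death (absorbing)
])

# A currently active worker: distribution over states after 5 years.
start = np.array([1.0, 0.0, 0.0, 0.0])
after5 = start @ np.linalg.matrix_power(P, 5)
print(after5)  # probability of each health state in 5 years
```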

  3. Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning.

    Science.gov (United States)

    Lechuga, Julia; Wiebe, John S

    2011-08-01

    A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed.

  4. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  5. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
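The reduced test the abstract describes, flagging an instant when the empirical variance of the predictive distribution is high, might look like the sketch below. The samples stand in for draws from the kernel-density predictive distribution, and the threshold is hypothetical (in practice it would be tuned via the ROC analysis the paper performs):

```python
import numpy as np

rng = np.random.default_rng(0)

def flag_large_error(predictive_samples, tau):
    """Flag a potential large prediction error when the empirical variance
    of the predictive random variable exceeds the threshold tau."""
    return np.var(predictive_samples) > tau

tau = 4.0  # mm^2, illustrative threshold

# Stand-ins for KDE samples of the future tumor position at two instants:
confident = rng.normal(0.0, 1.0, 500)   # low predictive uncertainty
uncertain = rng.normal(0.0, 4.0, 500)   # high predictive uncertainty

print(flag_large_error(confident, tau), flag_large_error(uncertain, tau))
```

This matches the intuition stated in the abstract: high predictive uncertainty signals a potentially large error, low uncertainty signals an accurate prediction.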

  6. Electricity distribution within the future residence

    Energy Technology Data Exchange (ETDEWEB)

    Breeze, J.E.

    1981-11-01

    This study examined present residential wiring systems and identified their shortcomings. A list of the desirable attributes for future wiring systems is proposed. The outlook for the application to wiring systems of solid-state electronic devices is assessed. As further background for a proposed new wiring concept, the residential use of energy today and probable future trends are reviewed. Lastly, the concept of a distributed bus is proposed and developed on a conceptual basis for the residential wiring system of the future. The distributed bus concept can lead to the development of a residential wiring system to meet the following requirements: adaptable to meet probable future energy requirements for residences including alternative energy sources and energy storage; flexibility for servicing loads both with respect to location in the residence and to the size of the load; improved economy in the use of materials; capability for development as a designed or engineered system with factory-assembled components and wiring harness; capability for expansion through the attachment of legs or auxiliary rings; adaptable to any probable architectural residential development; capability for development to meet the requirements for ease of use and maintenance and with recognition of the growing importance of do-it-yourself repairs and alterations; and adaptable to the full range of solid-state electronics and micro-computer devices and controls including the concept of load control and management through the use of a central control module. 66 refs., 15 figs., 1 tab.

  7. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  8. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
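The expected-accuracy gap between the two strategies mentioned in the abstract is a one-line calculation. For a binary outcome with probability p of the frequent event (p = 0.7 is a common lab setting, used here purely for illustration):

```python
# Expected accuracy of two prediction strategies for a binary outcome
# whose frequent event occurs with probability p.
p = 0.7

# Probability matching: predict the frequent event with probability p,
# so a prediction is correct with probability p*p + (1-p)*(1-p).
matching_accuracy = p * p + (1 - p) * (1 - p)

# Maximizing: always predict the more likely event.
maximizing_accuracy = max(p, 1 - p)

print(matching_accuracy, maximizing_accuracy)  # matching is strictly worse
```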

  9. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P(sub c)) for conjunction events. The model attempts to predict the location and magnitude of the peak P(sub c) value for an event by assuming the progression of P(sub c) values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak (P(sub c)) and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
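The first-order parabola model can be sketched directly: fit a quadratic to a short history of P(sub c) values and read off the vertex as the predicted peak location and magnitude. The data points below are invented, and this sketch omits the Bayesian prior-information step the abstract describes:

```python
import numpy as np

# Hypothetical sequence of Pc values as a conjunction event approaches
# (x = days relative to closest approach).
x  = np.array([-5.0, -4.0, -3.0, -2.0, -1.0])
pc = np.array([1e-6, 4e-6, 8e-6, 9e-6, 7e-6])

# First-order model: a downward-opening parabola fitted to the Pc values.
a, b, c = np.polyfit(x, pc, 2)
x_peak  = -b / (2 * a)                 # vertex: predicted time of peak Pc
pc_peak = np.polyval([a, b, c], x_peak)  # predicted peak magnitude

print(x_peak, pc_peak)
```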

  10. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

    The assessment of the probable spatial distribution of new eruptions is useful for managing and reducing volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by infrequent activity, as El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea", El Hierro had been the least studied volcanic island of the Canaries, with more historically devoted attention to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of the volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate and weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. Future eruptive activity on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.

  11. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in §4.1.3. 3. The scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  12. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to

  13. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  14. The large deviation approach to statistical mechanics

    International Nuclear Information System (INIS)

    Touchette, Hugo

    2009-01-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.
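The "exponential decay of probabilities of large fluctuations" has a classic concrete instance: by Cramér's theorem, the probability that the mean of n fair coin flips exceeds a > 1/2 decays like exp(-n I(a)), where I is the rate function. The sketch below compares the exact binomial tail with this exponential-order estimate (the specific n and a are arbitrary):

```python
import math

def rate(a):
    # Cramér rate function for the sample mean of fair coin flips:
    # I(a) = a*ln(2a) + (1-a)*ln(2(1-a)), for 0 < a < 1.
    return a * math.log(2 * a) + (1 - a) * math.log(2 * (1 - a))

def exact_tail(n, a):
    # exact P(S_n >= a*n) for S_n ~ Binomial(n, 1/2)
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2 ** n

n, a = 200, 0.7
print(exact_tail(n, a), math.exp(-n * rate(a)))
# -log(P)/n approaches I(a) as n grows: agreement to exponential order,
# up to a subexponential (polynomial-in-n) prefactor.
```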

  15. The large deviation approach to statistical mechanics

    Science.gov (United States)

    Touchette, Hugo

    2009-07-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein’s theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.

  16. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    Science.gov (United States)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

    We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posterior probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the Mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posterior PDF with them and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method without positivity constraint initially, and then damped to a physically reasonable range. This first-step MAP inversion brings the inversion close to the 'true' solution quickly and jumps over local maximum regions in high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on slip parameters using the Monte Carlo Inversion (MCI) technique, with all parameters obtained from step one as the initial solution. Then the slip artifacts are eliminated from slip models in the third-step MAP inversion with fault geometry parameters fixed. We first used a designed model with a 45-degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of

  17. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  18. Heads or tails an introduction to limit theorems in probability

    CERN Document Server

    Lesigne, Emmanuel

    2005-01-01

    Everyone knows some of the basics of probability, perhaps enough to play cards. Beyond the introductory ideas, there are many wonderful results that are unfamiliar to the layman, but which are well within our grasp to understand and appreciate. Some of the most remarkable results in probability are those that are related to limit theorems--statements about what happens when the trial is repeated many times. The most famous of these is the Law of Large Numbers, which mathematicians, engineers, economists, and many others use every day. In this book, Lesigne has made these limit theorems accessible by stating everything in terms of a game of tossing of a coin: heads or tails. In this way, the analysis becomes much clearer, helping establish the reader's intuition about probability. Moreover, very little generality is lost, as many situations can be modelled from combinations of coin tosses. This book is suitable for anyone who would like to learn more about mathematical probability and has had a one-year underg...
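The book's central example, the Law of Large Numbers stated in terms of coin tosses, is easy to demonstrate empirically: the running proportion of heads converges to 1/2 as the number of tosses grows. A minimal simulation:

```python
import random

random.seed(1)

def heads_fraction(n):
    # fraction of heads in n fair coin tosses
    return sum(random.randint(0, 1) for _ in range(n)) / n

# The proportion of heads settles toward 1/2 as n grows (Law of Large
# Numbers); typical deviations shrink on the order of 1/sqrt(n).
for n in (100, 10_000, 100_000):
    print(n, heads_fraction(n))
```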

  19. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  20. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to provide

  1. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
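A logit model of this kind maps a linear combination of the predictors through the logistic function. The coefficients below are hypothetical; only their signs follow the article's findings (probability rises with loan sum and borrower remoteness, falls with later birth month):

```python
import math

# Hypothetical fitted logit coefficients (signs follow the article's
# findings; the magnitudes are invented for illustration).
coef = {"intercept": 1.2, "sum_thousands": 0.004,
        "remote": 0.35, "birth_month": -0.05}

def prob_return(sum_thousands, remote, birth_month):
    """P(loan returned) via the logistic link applied to a linear score."""
    z = (coef["intercept"]
         + coef["sum_thousands"] * sum_thousands
         + coef["remote"] * remote          # 1 if borrower lives far away
         + coef["birth_month"] * birth_month)
    return 1 / (1 + math.exp(-z))

# Same loan, remote borrower: born in January vs. born in December.
print(prob_return(50, 1, 1), prob_return(50, 1, 12))
```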

  2. “Swimming Ducks Forecast the Coming of Spring”—The predictability of aggregate insider trading on future market returns in the Chinese market

    Directory of Open Access Journals (Sweden)

    Chafen Zhu

    2014-09-01

    Full Text Available This study systematically examines the ability of aggregate insider trading to predict future market returns in the Chinese A-share market. After controlling for the contrarian investment strategy, aggregate executive (large shareholder) trading conducted over the past six months can predict 66% (72.7%) of market returns twelve months in advance. Aggregate insider trading predicts future market returns very accurately and is stronger for insiders who have a greater information advantage (e.g., executives and controlling shareholders). Corporate governance also affects the predictability of insider trading. The predictability of executive trading is weakest in central state-owned companies, probably because the "quasi-official" status of the executives in those companies effectively curbs their incentives to benefit from insider trading. The predictive power of large shareholder trading in private-owned companies is higher than that in state-owned companies, probably due to their stronger profit motivation and higher involvement in business operations. This study complements the literature by examining an emerging market and investigating how the institutional context and corporate governance affect insider trading.

  3. “Swimming Ducks Forecast the Coming of Spring”—The predictability of aggregate insider trading on future market returns in the Chinese market

    Institute of Scientific and Technical Information of China (English)

    Chafen Zhu; Li Wang; Tengfei Yang

    2014-01-01

    This study systematically examines the ability of aggregate insider trading to predict future market returns in the Chinese A-share market. After controlling for the contrarian investment strategy, aggregate executive (large shareholder) trading conducted over the past six months can predict 66% (72.7%) of market returns twelve months in advance. Aggregate insider trading predicts future market returns very accurately and is stronger for insiders who have a greater information advantage (e.g., executives and controlling shareholders). Corporate governance also affects the predictability of insider trading. The predictability of executive trading is weakest in central state-owned companies, probably because the "quasi-official" status of the executives in those companies effectively curbs their incentives to benefit from insider trading. The predictive power of large shareholder trading in private-owned companies is higher than that in state-owned companies, probably due to their stronger profit motivation and higher involvement in business operations. This study complements the literature by examining an emerging market and investigating how the institutional context and corporate governance affect insider trading.

  4. Compact baby universe model in ten dimension and probability function of quantum gravity

    International Nuclear Information System (INIS)

    Yan Jun; Hu Shike

    1991-01-01

    The quantum probability functions are calculated for a ten-dimensional compact baby universe model. The authors find that the probability for the Yang-Mills baby universe to undergo spontaneous compactification down to a four-dimensional spacetime is greater than the probability that it remains in the original homogeneous multidimensional state. Some questions about the large-wormhole catastrophe are also discussed.

  5. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges that aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges that definition by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  6. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and can vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations of these calculations are discussed in light of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
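The two-parameter Weibull analysis described above can be sketched numerically. The following is a minimal illustration; the characteristic strength and Weibull modulus values are hypothetical and are not taken from the paper or the ASTM specification (volume effects are ignored for simplicity):

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull probability of fracture at a given service
    tensile stress: P_f = 1 - exp(-(stress/sigma_0)^m)."""
    if stress <= 0:
        return 0.0
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

def survival_probability(stress, sigma_0, m):
    """Reliability (probability of survival) = 1 - probability of fracture."""
    return 1.0 - weibull_failure_probability(stress, sigma_0, m)

# Illustrative values only: sigma_0 = 20 MPa characteristic strength,
# m = 10 Weibull modulus (order of magnitude typical for brittle materials).
for s in (5.0, 10.0, 15.0):
    pf = weibull_failure_probability(s, sigma_0=20.0, m=10.0)
    print(f"stress {s:5.1f} MPa -> P_f = {pf:.3e}")
```

Sweeping the stress values, as in the loop above, reproduces the kind of failure-probability-versus-service-stress table the abstract describes.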

  7. Targets of DNA-binding proteins in bacterial promoter regions present enhanced probabilities for spontaneous thermal openings

    International Nuclear Information System (INIS)

    Apostolaki, Angeliki; Kalosakas, George

    2011-01-01

    We mapped promoter regions of double-stranded DNA with respect to the probabilities of appearance of relatively large bubble openings due exclusively to thermal fluctuations at physiological temperatures. We analyzed five well-studied promoter regions of prokaryotic type and found a spatial correlation between the binding sites of transcription factors and the position of peaks in the probability pattern of large thermal openings. Other distinct peaks of the calculated patterns correlate with potential binding sites of DNA-binding proteins. These results suggest that a DNA molecule would more frequently expose the bases that participate in contacts with proteins, which would probably enhance the probability of the latter to reach their targets. They also support the use of this method as a means of analyzing DNA sequences based on their intrinsic thermal properties.

  8. Efficient simulation of tail probabilities of sums of correlated lognormals

    DEFF Research Database (Denmark)

    Asmussen, Søren; Blanchet, José; Juneja, Sandeep

    We consider the problem of efficient estimation of tail probabilities of sums of correlated lognormals via simulation. This problem is motivated by the tail analysis of portfolios of assets driven by correlated Black-Scholes models. We propose two estimators that can be rigorously shown to be efficient… optimize the scaling parameter of the covariance. The second estimator decomposes the probability of interest into two contributions and takes advantage of the fact that large deviations for a sum of correlated lognormals are (asymptotically) caused by the largest increment. Importance sampling…
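A crude Monte Carlo sketch (not the paper's efficient estimators) illustrates the quantity being estimated and the "largest increment" heuristic the abstract mentions; the bivariate setup, correlation, and threshold below are illustrative assumptions:

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tail_prob_sum_lognormals(x, rho, n=200_000, seed=1):
    """Crude Monte Carlo estimate of P(e^Z1 + e^Z2 > x) for a standard
    bivariate normal (Z1, Z2) with correlation rho."""
    rng = random.Random(seed)
    c = math.sqrt(1.0 - rho * rho)
    hits = 0
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + c * rng.gauss(0.0, 1.0)  # correlated second factor
        if math.exp(z1) + math.exp(z2) > x:
            hits += 1
    return hits / n

# Asymptotically the tail is driven by the largest increment, so for
# large x the probability behaves like 2 * P(e^Z > x) = 2 * (1 - phi(ln x)).
largest_increment_approx = 2.0 * (1.0 - phi(math.log(5.0)))
```

Crude Monte Carlo of this kind degrades rapidly as the threshold grows (the event becomes rare), which is exactly the motivation for the importance sampling estimators the record describes.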

  9. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework for a doctoral dissertation research in process in the field of Mathematics Education, in particular, in the subfield of teaching and learning basic concepts of Probability Theory at the College level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of these basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like Parcas and Moiras, often personified in the goddess Tyche—Fortuna for the Romans—, as regarded in Werner Jaeger’s “Paideia”. The second aspect treats the idea of hazard from two different approaches: the first approach deals with hazard, denoted by Plato with the already demythologized term ‘tyche’, from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called “phenomenological”, from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term ‘causal’ was opposed both to ‘casual’ and to ‘spontaneous’ (as used in the expression “spontaneous generation”), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  10. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-01-01

    of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs

  11. Reliability-Based Optimal Design for Very Large Floating Structure

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shu-hua(张淑华); FUJIKUBO Masahiko

    2003-01-01

    Costs and losses induced by possible future extreme environmental conditions and difficulties in repairing post-yielding damage strongly suggest the need for proper consideration in design rather than just life loss prevention. This can be addressed through the development of a design methodology that balances the initial cost of the very large floating structure (VLFS) against the expected potential losses resulting from future extreme wave-induced structural damage. Here, the development of a methodology for determining optimal, cost-effective design is presented and applied to a VLFS located in Tokyo Bay. Optimal design criteria are determined based on the total expected life-cycle cost and acceptable damage probability and curvature of the structure, and a set of sizes of the structure are obtained. The methodology and applications require expressions of the initial cost and the expected life-cycle damage cost as functions of the optimal design variables. This study includes the methodology, total life-cycle cost function, structural damage modeling, and reliability analysis.

  12. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton’s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world’s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  13. Stereotactic radiation therapy for large vestibular schwannomas

    International Nuclear Information System (INIS)

    Mandl, Ellen S.; Meijer, Otto W.M.; Slotman, Ben J.; Vandertop, W. Peter; Peerdeman, Saskia M.

    2010-01-01

    Background and purpose: To evaluate the morbidity and tumor-control rate in the treatment of large vestibular schwannomas (VS) after stereotactic radiation therapy in our institution. Material and methods: Twenty-five consecutive patients (17 men, 8 women) with large VS (diameter 3.0 cm or larger), treated with stereotactic radiotherapy (SRT) or stereotactic radiosurgery (SRS) between 1992 and 2007, were retrospectively studied after a mean follow-up period of three years with respect to tumor-control rate and complications. Results: An actuarial 5-year probability of 30% for maintenance of the pre-treatment hearing level was achieved. Five of 17 patients suffered permanent new facial nerve dysfunction. The actuarial 5-year facial nerve preservation probability was 80%. Permanent new trigeminal nerve neuropathy occurred in two of 15 patients, resulting in an actuarial 5-year trigeminal nerve preservation probability of 85%. Tumor progression occurred in four of 25 (16%) patients. The overall 5-year tumor control probability was 82%. Conclusion: Increased morbidity rates were found in patients with large VS treated with SRT or SRS compared to the published series on regular-sized VS and other smaller retrospective studies on large VS.

  14. Mapping fire probability and severity in a Mediterranean area using different weather and fuel moisture scenarios

    Science.gov (United States)

    Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.

    2009-04-01

    Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire season peaks coincide with extreme weather conditions (mainly strong winds, hot temperatures, low atmospheric water vapour content) and high tourist presence. Many works have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate change could affect the fuel load and the dead/live fuel ratio, and therefore could change the vegetation flammability. At the short-term scale, the increase of extreme weather events could directly affect fuel water status, and it could increase large fire occurrence. In this context, detecting the areas characterized by both high probability of large fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate change on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were input as grids with a resolution of 10 m; the wind data, obtained using a computational fluid-dynamic model, were input as a gridded file with a resolution of 50 m. The analysis revealed high fire probability and severity in

  15. Probability-neighbor method of accelerating geometry treatment in reactor Monte Carlo code RMC

    International Nuclear Information System (INIS)

    She, Ding; Li, Zeguang; Xu, Qi; Wang, Kan; Yu, Ganglin

    2011-01-01

    The probability neighbor method (PNM) is proposed in this paper to accelerate the geometry treatment of Monte Carlo (MC) simulation and is validated in the self-developed reactor Monte Carlo code RMC. During MC simulation by either the ray-tracking or the delta-tracking method, large amounts of time are spent in finding out which cell a particle is located in. The traditional way is to search cells one by one in a sequence defined in advance. However, this procedure becomes very time-consuming when the system contains a large number of cells. Considering that particles have different probabilities of entering different cells, the PNM method optimizes the searching sequence, i.e., the cells with larger probability are searched preferentially. The PNM method is implemented in the RMC code, and the numerical results show that considerable geometry-treatment time is saved in MC calculations for complicated systems, especially in delta-tracking simulation. (author)
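The core idea can be sketched in a few lines, under the assumption that each cell's historical hit count is a reasonable proxy for its entry probability; the class name and interface below are invented for illustration, not taken from RMC:

```python
class ProbabilityNeighborSearch:
    """Toy sketch of the probability-neighbor idea: try candidate cells
    in descending order of how often particles were previously found there."""

    def __init__(self, cells):
        self.cells = list(cells)              # cell identifiers
        self.counts = {c: 0 for c in cells}   # historical hit counts

    def locate(self, contains):
        """contains(cell) -> bool reports whether the particle is in the cell.
        Frequently hit cells are tested first, so the expected number of
        containment tests drops for skewed entry probabilities."""
        order = sorted(self.cells, key=lambda c: self.counts[c], reverse=True)
        for cell in order:
            if contains(cell):
                self.counts[cell] += 1
                return cell
        return None  # particle escaped the modeled geometry
```

With a skewed particle stream (most particles landing in one cell), that cell migrates to the front of the search order, which is the source of the speed-up the abstract reports.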

  16. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  17. Impact of MCNP Unresolved Resonance Probability-Table Treatment on Uranium and Plutonium Benchmarks

    International Nuclear Information System (INIS)

    Mosteller, R.D.; Little, R.C.

    1999-01-01

    A probability-table treatment recently has been incorporated into an intermediate version of the MCNP Monte Carlo code named MCNP4XS. This paper presents MCNP4XS results for a variety of uranium and plutonium criticality benchmarks, calculated with and without the probability-table treatment. It is shown that the probability-table treatment can produce small but significant reactivity changes for plutonium and 233U systems with intermediate spectra. More importantly, it can produce substantial reactivity increases for systems with large amounts of 238U and intermediate spectra.

  18. The future of primordial features with large-scale structure surveys

    International Nuclear Information System (INIS)

    Chen, Xingang; Namjoo, Mohammad Hossein; Dvorkin, Cora; Huang, Zhiqi; Verde, Licia

    2016-01-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, to new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys for the detection and constraint of these features. We classify primordial feature models into several classes, and for each class we present a simple template of the power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, due to the 3D information. For a broad range of models, these surveys will be able to reduce the errors on the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.

  19. The future of primordial features with large-scale structure surveys

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang; Namjoo, Mohammad Hossein [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Dvorkin, Cora [Department of Physics, Harvard University, Cambridge, MA 02138 (United States); Huang, Zhiqi [School of Physics and Astronomy, Sun Yat-Sen University, 135 Xingang Xi Road, Guangzhou, 510275 (China); Verde, Licia, E-mail: xingang.chen@cfa.harvard.edu, E-mail: dvorkin@physics.harvard.edu, E-mail: huangzhq25@sysu.edu.cn, E-mail: mohammad.namjoo@cfa.harvard.edu, E-mail: liciaverde@icc.ub.edu [ICREA and ICC-UB, University of Barcelona (IEEC-UB), Marti i Franques, 1, Barcelona 08028 (Spain)

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, to new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys for the detection and constraint of these features. We classify primordial feature models into several classes, and for each class we present a simple template of the power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, due to the 3D information. For a broad range of models, these surveys will be able to reduce the errors on the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.

  20. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that the memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.
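As a minimal illustration of one of the two measures, the order-2 cycling probability of a trajectory (the fraction of steps that return to the word visited two steps earlier, e.g. dog → cat → dog) can be computed directly; the function name and sample trajectory are ours:

```python
def order2_cycle_fraction(trajectory):
    """Fraction of steps in a word-association trajectory that close an
    order-2 cycle, i.e. revisit the word seen two steps earlier."""
    if len(trajectory) < 3:
        return 0.0
    cycles = sum(1 for i in range(2, len(trajectory))
                 if trajectory[i] == trajectory[i - 2])
    return cycles / (len(trajectory) - 2)

# Hypothetical trajectory: two of the three eligible steps close a cycle.
f = order2_cycle_fraction(['dog', 'cat', 'dog', 'mouse', 'dog'])
```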

  1. Anticipating future innovation pathways through large data analysis

    CERN Document Server

    Chiavetta, Denise; Porter, Alan; Saritas, Ozcan

    2016-01-01

    This book aims to identify promising future developmental opportunities and applications for Tech Mining. Specifically, the enclosed contributions pursue three converging themes: the increasing availability of electronic text data resources relating to Science, Technology & Innovation (ST&I); the multiple methods that are able to treat this data effectively and incorporate means to tap into human expertise and interests; and translating those analyses to provide useful intelligence on likely future developments of particular emerging S&T targets. Tech Mining can be defined as text analyses of ST&I information resources to generate Competitive Technical Intelligence (CTI). It combines bibliometrics and advanced text analytics, drawing on specialized knowledge pertaining to ST&I. Tech Mining may also be viewed as a special form of “Big Data” analytics because it searches on a target emerging technology (or key organization) of interest in global databases. One then downloads, typically, th...

  2. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
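The combination of a cheap exploration model and an unbiased importance-sampling correction can be sketched in one dimension; the limit-state functions, threshold, and biasing density below are invented for illustration and are not the authors' jet model:

```python
import math
import random

def normal_pdf(x, mu=0.0, s=1.0):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

def mfis_failure_probability(g_hi, g_lo, threshold, n=20_000, seed=7):
    """Two-stage sketch of multi-fidelity importance sampling:
    1) use the cheap model g_lo to find where failures concentrate,
    2) sample from a biasing normal centred there and reweight evaluations
       of the expensive model g_hi, which keeps the estimate unbiased."""
    rng = random.Random(seed)
    # Stage 1: cheap exploration with the low-fidelity model.
    lo_fail = [x for x in (rng.gauss(0.0, 1.0) for _ in range(n))
               if g_lo(x) < threshold]
    mu = sum(lo_fail) / len(lo_fail) if lo_fail else 0.0  # biasing centre
    # Stage 2: importance sampling against the high-fidelity model.
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu, 1.0)
        if g_hi(x) < threshold:
            total += normal_pdf(x) / normal_pdf(x, mu, 1.0)  # likelihood ratio
    return total / n

# Hypothetical models: failure when g(x) < 0; g_lo is a crude surrogate.
p = mfis_failure_probability(lambda x: 3.0 - x, lambda x: 2.9 - x, 0.0)
```

Here failure means x > 3 under a standard normal input, so the true failure probability is about 1.3e-3; the biased sampler reaches it with far fewer high-fidelity failures wasted than crude Monte Carlo would need.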

  3. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate in February 1980. Equipment, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Recommendations for future testing procedures are made. (GBn)

  4. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  5. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and high variability in the populations of chemical species. An approach to accelerating the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and the accuracy of this simulation approach, and even more so when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select the next reaction firing. A reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if this probability is set to one. Our new algorithm reduces the computational cost of selecting the next reaction firing and of updating reaction propensities.
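The rejection-based selection step that the abstract builds on can be sketched as follows; this is a generic sketch of selection via propensity upper bounds, not the authors' bounded-acceptance variant, and the function signature is ours:

```python
import random

def select_reaction(prop_bounds, propensity, rng):
    """Rejection-based reaction selection: pick a candidate with probability
    proportional to its propensity upper bound, then accept it with
    probability a_j / bound_j using the exact (lazily evaluated) propensity.
    The accepted reaction is distributed proportionally to the exact a_j."""
    total = sum(prop_bounds)
    while True:
        # Candidate selection from the (cheap, precomputed) bounds.
        r, acc = rng.random() * total, 0.0
        j = 0
        for j, b in enumerate(prop_bounds):
            acc += b
            if r <= acc:
                break
        # Acceptance test against the exact propensity.
        if rng.random() <= propensity(j) / prop_bounds[j]:
            return j
```

With exact propensities [1, 3] and bounds [2, 4], the long-run frequency of the second reaction approaches 3/4, even though the exact propensities are only evaluated on demand.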

  6. Analysis of Drop Call Probability in Well Established Cellular ...

    African Journals Online (AJOL)

    Technology in Africa has increased over the past decade. The increase in modern cellular networks requires stringent quality of service (QoS). Drop call probability is one of the most important indices of QoS evaluation in a large scale well-established cellular network. In this work we started from an accurate statistical ...

  7. Future impact of new technologies: Three scenarios, their competence gaps and research implications

    DEFF Research Database (Denmark)

    Harmsen, Hanne; Sonne, Anne-Mette; Jensen, Birger Boutrup

    What will the impact of science be ten years from now in the food industry? Large or overwhelming, most people will probably agree. But before we can be any more specific, we need to address the questions of what type or aspect of science or technology we have in mind and, secondly, what kind...... in the future. Our approach is to construct a number of likely pictures of the future and then look at the role and impact of technology and science in each of the pictures. We do this by using an industry-level scenario technique, in which we rely heavily on expert and industry inputs representing both the 'technology push' and 'market pull' representatives, whom we feel are both very important basic driving forces. The aim is to get an idea of the very different roles science or technology can take in the near future for a specific industry, in this case the Danish food industry, and present a methodological......

  8. Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.

    Science.gov (United States)

    Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang

    2017-06-17

    Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, and thus has great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiency of the LM scheme regarding network lifetime and storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby areas (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information is migrated to nodes at a longer distance from the sink to increase the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency compared with other schemes.
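The distance-dependent marking rule can be sketched with a simple linear schedule; the endpoint probabilities and the linear form are illustrative assumptions, not parameters from the UPLM paper:

```python
def marking_probability(hops_to_sink, max_hops, p_near=0.1, p_far=0.9):
    """Unequal marking probability: nodes far from the sink mark packets
    with high probability (p_far), nearby nodes with low probability
    (p_near), interpolated linearly with hop distance. All values here
    are illustrative, not taken from the UPLM scheme."""
    frac = hops_to_sink / max_hops
    return p_near + (p_far - p_near) * frac
```

Because traffic converges toward the sink, nearby nodes forward far more packets than remote ones; giving them a small marking probability is what balances storage and energy consumption across the tree.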

  9. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
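The convolution in question is the distribution of a sum of independent Bernoulli indicators with unequal success probabilities, and the PGF route amounts to multiplying the per-element generating functions G_i(z) = (1 - p_i) + p_i z, whose coefficients are exactly the pmf. A minimal sketch (the compliance rates below are hypothetical):

```python
def bundle_compliance_pmf(probs):
    """Exact pmf of the number of compliant bundle elements, computed by
    multiplying the Bernoulli probability generating functions
    G_i(z) = (1 - p_i) + p_i * z; polynomial coefficients = pmf values."""
    pmf = [1.0]                      # PGF of the empty product: G(z) = 1
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, c in enumerate(pmf):
            nxt[k] += c * (1.0 - p)  # this element non-compliant: count stays k
            nxt[k + 1] += c * p      # this element compliant: count becomes k+1
        pmf = nxt
    return pmf

# Example: a 3-element bundle with unequal compliance rates.
pmf = bundle_compliance_pmf([0.9, 0.8, 0.95])
p_full = pmf[-1]   # probability that the whole bundle is compliant
```

The quadratic cost of the coefficient multiplication stays modest even for bundles of hundreds of elements, which is what makes exact control limits practical compared with approximating the convolution's tails.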

  10. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety-critical applications, and methods for the estimation of human error probability (HEP) have been a topic of research for over a decade. The scarce data available on human errors and the large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety-critical functions of a nuclear power plant. The developed BBN model helps estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and its subsequent evaluation is provided with a relevant case study, and the model is expected to provide useful insights into risk assessment studies.

  11. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic function presentation. Additionally, considerable emphasis is placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  12. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  13. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.

  14. Supernova relic electron neutrinos and anti-neutrinos in future large-scale observatories

    International Nuclear Information System (INIS)

    Volpe, C.; Welzel, J.

    2007-01-01

    We investigate the signal from supernova relic neutrinos in future large-scale observatories, such as MEMPHYS (UNO, Hyper-K), LENA and GLACIER, at present under study. We argue that complementary information might be gained from the observation of supernova relic electron antineutrinos and neutrinos using the scattering on protons on one hand, and on nuclei such as oxygen, carbon or argon on the other hand. When determining the relic neutrino fluxes we also include, for the first time, the coupling of the neutrino magnetic moment to magnetic fields within the core-collapse supernova. We present numerical results on both the relic ν_e and ν̄_e fluxes and on the number of events for ν_e + ¹²C, ν_e + ¹⁶O, ν_e + ⁴⁰Ar and ν̄_e + p for various oscillation scenarios. The observation of supernova relic neutrinos might provide us with unique information on core-collapse supernova explosions, on the star formation history and on neutrino properties that still remain unknown. (authors)

  15. Nuclear fusion and its large potential for the future world energy supply

    Directory of Open Access Journals (Sweden)

    Ongena Jef

    2016-12-01

    An overview of the energy problem in the world is presented. The colossal task of 'decarbonizing' the current energy system, with ~85% of the primary energy produced from fossil sources, is discussed. There are at the moment only two options that can contribute to a solution: renewable energy (sun, wind, hydro, etc.) or nuclear fission. Their contributions, ~2% for sun and wind, ~6% for hydro and ~5% for fission, will need to be enormously increased in a relatively short time to meet the targets set by policy makers. The possible role and large potential for fusion to contribute to a solution in the future as a safe, nearly inexhaustible and environmentally compatible energy source is discussed. The principles of magnetic and inertial confinement are outlined, and the two main options for magnetic confinement, tokamak and stellarator, are explained. The status of magnetic fusion is summarized and the next steps in fusion research, ITER and DEMO, are briefly presented.

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  18. A Short History of Probability Theory and Its Applications

    Science.gov (United States)

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  19. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  20. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells, as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)

  1. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  2. Future Expectation for China's Nuclear Power

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    China: the future of nuclear power. Wang Yonggan: In terms of the highlighted issue of energy security, oil is of paramount importance, coal is the foundation and electricity is the pivot according to China's energy strategy. The national total installed power capacity will hit a record high of 900 GW in 2010, and will probably approach 1,500 GW in 2020, when coal-fired power will continue to dominate and alternative energy such as nuclear energy, hydro energy, wind energy, and others will take up only 30% at most. Therefore, China remains in dire need to create more room for alternative energy. To solve this problem, solutions should be found in the diversification of energy, especially large-scale development of alternative energy, by which a lowered, and ultimately zeroed, growth of coal-fired generating units could be realized, and the target of low, even zero, carbon emission could come true.

  3. Probability of causation tables and their possible implications for the practice of diagnostic radiology

    International Nuclear Information System (INIS)

    Gur, D.; Wald, N.

    1986-01-01

    In compliance with requirements in the Orphan Drug Act (97-414) of 1983, tables were recently constructed by an ad hoc committee of the National Institutes of Health (NIH) in which the probabilities that certain specific cancers are caused by previous radiation exposure are estimated. The reports of the NIH committee and a National Academy of Sciences oversight committee may have broad implications for the future practice of diagnostic radiology. The basis on which the probability of causation tables were established and some of the possible implications for diagnostic radiology are discussed.

  4. Acceptance Probability (P_a) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with C_pk > 1.33 as performing well within statistical control and those with C_pk < 1.33 as requiring attention; a C_pk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
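    Under the normality assumption noted above, the link between a capability index and an acceptance probability can be sketched directly; the specification limits and process parameters below are hypothetical:

```python
# Sketch: C_pk and per-unit acceptance probability for a normally
# distributed quality attribute with lower/upper specification limits.
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def cpk(mean, sd, lsl, usl):
    """Process capability index: nearest spec limit in 3-sigma units."""
    return min(usl - mean, mean - lsl) / (3.0 * sd)

def acceptance_probability(mean, sd, lsl, usl):
    """Probability a single unit falls inside the specification limits."""
    return norm_cdf((usl - mean) / sd) - norm_cdf((lsl - mean) / sd)

# Centered process with limits at +/- 4 sigma, i.e. C_pk = 1.33
print(cpk(100.0, 1.0, 96.0, 104.0))                           # 1.333...
print(1.0 - acceptance_probability(100.0, 1.0, 96.0, 104.0))  # ~6.3e-05, i.e. ~63 per million
```

    This reproduces the guideline figure quoted in the abstract: a centered C_pk of 1.33 corresponds to roughly 63 defects per million, an acceptance probability above 99.99%.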

  5. Application of escape probability to line transfer in laser-produced plasmas

    International Nuclear Information System (INIS)

    Lee, Y.T.; London, R.A.; Zimmerman, G.B.; Haglestein, P.L.

    1989-01-01

    In this paper the authors apply the escape probability method to the transfer of optically thick lines in laser-produced plasmas in plane-parallel geometry. They investigate the effect of self-absorption on the ionization balance and ion level populations. In addition, they calculate this effect on the laser gain in an exploding-foil target heated by an optical laser. Due to the large ion streaming motion in laser-produced plasmas, absorption of an emitted photon occurs only over the length in which the Doppler shift is equal to the line width. They find that the escape probability calculated with the Doppler shift is larger than the escape probability for a static plasma. Therefore, the ion streaming motion contributes significantly to the line transfer process in laser-produced plasmas. As examples, they have applied the escape probability method to calculate the transfer of optically thick lines in both ablating slab and exploding foil targets under irradiation by a high-power optical laser.

  6. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
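    The point about very few observed errors can be made concrete with an exact rare-event bound: with only k errors observed, a normal-approximation interval is useless, but the Poisson model for rare decoding errors yields an exact upper confidence bound on the error rate. A sketch (not the paper's construction; the trial count is hypothetical):

```python
# Exact 95% upper confidence bound on a rare error probability from a
# small observed error count, using the Poisson model for rare events.
from math import exp

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), summed term by term."""
    term, total = exp(-lam), exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def upper_bound_errors(k, conf=0.95):
    """Smallest mean lambda with P(X <= k; lambda) <= 1 - conf (bisection)."""
    lo, hi = 0.0, 10.0 * (k + 1)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(k, mid) > 1.0 - conf:
            lo = mid   # mid is too small to be excluded
        else:
            hi = mid
    return hi

n = 10**7   # hypothetical number of decoding trials
k = 2       # observed decoding errors
print(upper_bound_errors(k) / n)   # 95% upper bound on the error probability
```

    With k = 2 the bound on the Poisson mean is about 6.3, so two errors in ten million trials still only bound the error probability below roughly 6.3e-7, illustrating how much information even two errors carry.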

  7. Thinking about the future

    International Nuclear Information System (INIS)

    Lashof, D.; Schipper, L.

    1990-01-01

    The future evolution of global change and the atmosphere will depend largely on the paths of economic development and technological change, as well as on the physical, chemical, and biological processes of the Earth-atmosphere system. While we have no control over this system once gases enter the atmosphere, economic and technological change will be influenced by policy choices made at local, national, and international levels. This paper explores some of the paths the world might follow in the decades ahead and provides an indication of the relative climatic consequences under these alternatives. After a discussion of the economic and social factors that determine emissions, four scenarios of economic and technological development are presented. These scenarios cannot capture all the possibilities, of course; rather, they have been developed in order to explore the probable climatic effects under significantly different, but plausible, economic and technological conditions. The climatic implications of these scenarios are analyzed using an integrated framework. The chapter concludes with the results of this analysis and a comparison of these results with other studies

  8. The method of modular characteristic direction probabilities in MPACT

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Z. [School of Nuclear Science and Technology, Xi'an Jiaotong University, No. 28 Xianning west road, Xi'an, Shaanxi 710049 (China); Kochunas, B.; Collins, B.; Downar, T. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2200 Bonisteel, Ann Arbor, MI 48109 (United States); Wu, H. [School of Nuclear Science and Technology, Xi'an Jiaotong University, No. 28 Xianning west road, Xi'an, Shaanxi 710049 (China)

    2013-07-01

    The method of characteristic direction probabilities (CDP) is based on a modular ray tracing technique which combines the benefits of the collision probability method (CPM) and the method of characteristics (MOC). This past year CDP was implemented in the transport code MPACT for 2-D and 3-D transport calculations. By only coupling the fine mesh regions passed by the characteristic rays in the particular direction, the scale of the probabilities matrix is much smaller compared to the CPM. At the same time, the CDP has the same capacity for dealing with complicated geometries as the MOC, because the same modular ray tracing techniques are used. Results from the C5G7 benchmark problems are given for different cases to show the accuracy and efficiency of the CDP compared to MOC. For the cases examined, the CDP and MOC methods were seen to differ in k_eff by about 1-20 pcm, and the computational efficiency of the CDP appears to be better than the MOC for some problems. However, in other problems, particularly when the CDP matrices have to be recomputed from changing cross sections, the CDP does not perform as well. This indicates an area of future work. (authors)

  9. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  10. On the probability distribution of the stochastic saturation scale in QCD

    International Nuclear Information System (INIS)

    Marquet, C.; Soyez, G.; Xiao Bowen

    2006-01-01

    It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify and our results are confirmed by numerical simulations

  11. Large mass storage facility

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1978-01-01

    The report of a committee to study the questions surrounding possible acquisition of a large mass-storage device is presented. The current computing environment at BNL and justification for an online large mass storage device are briefly discussed. Possible devices to meet the requirements of large mass storage are surveyed, including future devices. The future computing needs of BNL are prognosticated. 2 figures, 4 tables

  12. Land use planning and wildfire: development policies influence future probability of housing loss

    Science.gov (United States)

    Syphard, Alexandra D.; Massada, Avi Bar; Butsic, Van; Keeley, Jon E.

    2013-01-01

    Increasing numbers of homes are being destroyed by wildfire in the wildland-urban interface. With projections of climate change and housing growth potentially exacerbating the threat of wildfire to homes and property, effective fire-risk reduction alternatives are needed as part of a comprehensive fire management plan. Land use planning represents a shift in traditional thinking from trying to eliminate wildfires, or even increasing resilience to them, toward avoiding exposure to them through the informed placement of new residential structures. For land use planning to be effective, it needs to be based on a solid understanding of where and how to locate and arrange new homes. We simulated three scenarios of future residential development and projected landscape-level wildfire risk to residential structures in a rapidly urbanizing, fire-prone region in southern California. We based all future development on an econometric subdivision model, but we varied the emphasis of subdivision decision-making based on three broad and common growth types: infill, expansion, and leapfrog. Simulation results showed that decision-making based on these growth types, when applied locally for subdivision of individual parcels, produced substantial landscape-level differences in pattern, location, and extent of development. These differences in development, in turn, affected the area and proportion of structures at risk from burning in wildfires. Scenarios with lower housing density and larger numbers of small, isolated clusters of development, i.e., resulting from leapfrog development, were generally predicted to have the highest predicted fire risk to the largest proportion of structures in the study area, and infill development was predicted to have the lowest risk. These results suggest that land use planning should be considered an important component of fire risk management and that consistently applied policies based on residential pattern may provide substantial benefits for

  13. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
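    A toy version of the inference described above can be sketched with exponentially forgotten ("leaky") transition counts over a binary stimulus stream; the decay constant and Laplace pseudo-count below are illustrative choices, not the model's fitted parameter:

```python
# Estimate time-varying transition probabilities of a binary sequence
# using leaky (exponentially forgotten) transition counts plus a
# Laplace pseudo-count, a crude stand-in for full Bayesian inference.

def transition_estimates(seq, decay=0.9, pseudo=1.0):
    """Return P(next=1 | prev=s) for s in {0, 1} after observing seq."""
    counts = {(a, b): 0.0 for a in (0, 1) for b in (0, 1)}
    for prev, nxt in zip(seq, seq[1:]):
        for key in counts:
            counts[key] *= decay          # forget old evidence
        counts[(prev, nxt)] += 1.0        # record this transition
    probs = {}
    for s in (0, 1):
        total = counts[(s, 0)] + counts[(s, 1)] + 2 * pseudo
        probs[s] = (counts[(s, 1)] + pseudo) / total
    return probs

seq = [0, 1, 0, 1, 0, 1, 0, 1]   # strongly alternating input
probs = transition_estimates(seq)
print(probs[0])   # high: 0 tends to be followed by 1
print(probs[1])   # low:  1 tends to be followed by 0
```

    The forgetting factor is what makes the estimate time-varying: recent transitions dominate, so the model keeps generating expectations (and surprise when they are violated) even for unpredictable input.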

  14. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    clinical covariates, the main effects of 98 biomarkers and 24 biomarker-by-treatment interactions, but there was high variability of the expected survival probabilities, with very large confidence intervals. Conclusion Based on our simulations, we propose a unified framework for: developing a prediction model with biomarker-by-treatment interactions in a high-dimensional setting and validating it in absence of external data; accurately estimating the expected survival probability of future patients with associated confidence intervals; and graphically visualizing the developed prediction model. All the methods are implemented in the R package biospear, publicly available on the CRAN.

  15. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  16. Large Instrument Development for Radio Astronomy

    Science.gov (United States)

    Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo

    2009-03-01

    This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.

  17. Estimating migratory connectivity of birds when re-encounter probabilities are heterogeneous

    Science.gov (United States)

    Cohen, Emily B.; Hostelter, Jeffrey A.; Royle, J. Andrew; Marra, Peter P.

    2014-01-01

    Understanding the biology and conducting effective conservation of migratory species requires an understanding of migratory connectivity – the geographic linkages of populations between stages of the annual cycle. Unfortunately, for most species, we are lacking such information. The North American Bird Banding Laboratory (BBL) houses an extensive database of marking, recaptures and recoveries, and such data could provide migratory connectivity information for many species. To date, however, few species have been analyzed for migratory connectivity largely because heterogeneous re-encounter probabilities make interpretation problematic. We accounted for regional variation in re-encounter probabilities by borrowing information across species and by using effort covariates on recapture and recovery probabilities in a multistate capture–recapture and recovery model. The effort covariates were derived from recaptures and recoveries of species within the same regions. We estimated the migratory connectivity for three tern species breeding in North America and over-wintering in the tropics, common (Sterna hirundo), roseate (Sterna dougallii), and Caspian terns (Hydroprogne caspia). For western breeding terns, model-derived estimates of migratory connectivity differed considerably from those derived directly from the proportions of re-encounters. Conversely, for eastern breeding terns, estimates were merely refined by the inclusion of re-encounter probabilities. In general, eastern breeding terns were strongly connected to eastern South America, and western breeding terns were strongly linked to the more western parts of the nonbreeding range under both models. Through simulation, we found this approach is likely useful for many species in the BBL database, although precision improved with higher re-encounter probabilities and stronger migratory connectivity. We describe an approach to deal with the inherent biases in BBL banding and re-encounter data to demonstrate

  18. Probability of success for phase III after exploratory biomarker analysis in phase II.

    Science.gov (United States)

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
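
    The weighting of power by a prior distribution of the treatment effect can be sketched numerically. Below is a minimal Monte Carlo illustration, not the authors' phase II/III simulation; the two-arm trial setup, the normal prior, and all parameter values are assumptions for demonstration only.

```python
import math
import random

def assurance(n_per_arm, sigma, prior_mean, prior_sd,
              n_sims=100_000, seed=1):
    """Probability of success (average power) for a two-arm trial:
    the power of a two-sided 5% z-test, averaged over a normal prior
    on the true treatment effect (normal approximation throughout)."""
    random.seed(seed)
    z_alpha = 1.959963984540054              # two-sided 5% critical value
    se = sigma * math.sqrt(2.0 / n_per_arm)  # SE of the effect estimate
    total_power = 0.0
    for _ in range(n_sims):
        delta = random.gauss(prior_mean, prior_sd)  # draw a "true" effect
        z = delta / se - z_alpha
        # power given this effect, via the standard normal CDF
        total_power += 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return total_power / n_sims
```

    With a nearly point-mass prior the result reduces to the ordinary power; widening the prior pulls the probability of success below the power evaluated at the prior mean, which is the "negative bias without selection" effect the abstract describes.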

  19. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study investigates whether experimentally induced prior beliefs affect the processing of evidence, including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of the priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed-priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that their judgments were based on the information given in the induced priors; consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance, while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.
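
    The normative baseline against which such behavior is judged, a prior whose influence shrinks as evidence accumulates, can be illustrated with a conjugate Beta-Bernoulli model. This is a textbook sketch, not the slot-machine task itself; the prior parameters and win counts are arbitrary.

```python
from fractions import Fraction

def posterior_mean(prior_a, prior_b, successes, trials):
    """Posterior mean of a Beta(prior_a, prior_b) belief about a win
    probability after observing `successes` wins in `trials` plays."""
    return Fraction(prior_a + successes, prior_a + prior_b + trials)

# An "optimistic" prior Beta(8, 2) vs. a flat prior Beta(1, 1):
# with little data the priors pull the estimates apart; with much
# data both converge toward the observed win rate.
```

    With 30 wins in 100 plays the two posterior means are 38/110 ≈ 0.345 and 31/102 ≈ 0.304; after 3000 wins in 10 000 plays both lie within 0.001 of the observed rate 0.3, which is the Bayesian convergence the participants' value judgments broadly showed.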

  20. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π in calculating the particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  1. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    Science.gov (United States)

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made for the required fertiliser inputs to algae parks, and how this may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²-1)-dimensional volume and (N²-2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric, and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained from the 35-dimensional volumes appear, independently of the metric employed (each of the seven inducing Haar measure), to be twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate, 33.9982, of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.

  3. An Engineering Design Reference Mission for a Future Large-Aperture UVOIR Space Observatory

    Science.gov (United States)

    Thronson, Harley A.; Bolcar, Matthew R.; Clampin, Mark; Crooke, Julie A.; Redding, David; Rioux, Norman; Stahl, H. Philip

    2016-01-01

    From the 2010 NRC Decadal Survey and the NASA Thirty-Year Roadmap, Enduring Quests, Daring Visions, to the recent AURA report, From Cosmic Birth to Living Earths, multiple community assessments have recommended development of a large-aperture UVOIR space observatory capable of achieving a broad range of compelling scientific goals. Of these priority science goals, the most technically challenging is the search for spectroscopic biomarkers in the atmospheres of exoplanets in the solar neighborhood. Here we present an engineering design reference mission (EDRM) for the Advanced Technology Large-Aperture Space Telescope (ATLAST), which was conceived from the start as capable of breakthrough science paired with an emphasis on cost control and cost effectiveness. An EDRM allows the engineering design trade space to be explored in depth to determine what are the most demanding requirements and where there are opportunities for margin against requirements. Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. The ATLAST observatory is designed to operate at a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Our reference designs have emphasized a serviceable 36-segment 9.2 m aperture telescope that stows within a five-meter diameter launch vehicle fairing. As part of our cost-management effort, this particular reference mission builds upon the engineering design for JWST. Moreover, it is scalable to a variety of launch vehicle fairings. Performance needs developed under the study are traceable to a variety of additional reference designs, including options for a monolithic primary mirror.

  4. Probability of criminal acts of violence: a test of jury predictive accuracy.

    Science.gov (United States)

    Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D

    2013-01-01

    The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.

  5. Measuring CP nature of top-Higgs couplings at the future Large Hadron electron Collider

    Directory of Open Access Journals (Sweden)

    Baradhwaj Coleppa

    2017-07-01

    We investigate the sensitivity to the top-Higgs coupling by considering the associated vertex as CP-phase (ζt) dependent, through the process pe−→t¯hνe at the future Large Hadron electron Collider. In particular, the decay modes are taken to be h→bb¯ and the leptonic mode for t¯. Several distinct ζt-dependent features are demonstrated by considering observables like cross sections, top-quark polarisation, the rapidity difference between h and t¯, and different angular asymmetries. Luminosity (L) dependent exclusion limits are obtained for ζt by considering significance based on fiducial cross sections at different σ-levels. For electron and proton beam energies of 60 GeV and 7 TeV respectively, at L = 100 fb⁻¹, the region π/5 < ζt ≤ π is excluded at the 2σ confidence level, which reflects better sensitivity than expected at the Large Hadron Collider. With an appropriate error-fitting methodology, we find that the accuracy of the SM top-Higgs coupling could be measured to be κ = 1.00 ± 0.17 (0.08) at √s = 1.3 (1.8) TeV for an ultimate L = 1 ab⁻¹.

  6. Predicting the current and future potential distributions of lymphatic filariasis in Africa using maximum entropy ecological niche modelling.

    Directory of Open Access Journals (Sweden)

    Hannah Slater

    Modelling the spatial distributions of human parasite species is crucial to understanding the environmental determinants of infection as well as for guiding the planning of control programmes. Here, we use ecological niche modelling to map the current potential distribution of the macroparasitic disease lymphatic filariasis (LF) in Africa, and to estimate how future changes in climate and population could affect its spread and burden across the continent. We used 508 community-specific infection presence data points collated from the published literature in conjunction with five predictive environmental/climatic and demographic variables, and a maximum entropy niche modelling method to construct the first ecological niche maps describing the potential distribution and burden of LF in Africa. We also ran the best-fit model against climate projections made by the HADCM3 and CCCMA models for 2050 under the A2a and B2a scenarios to simulate the likely distribution of LF under future climate and population changes. We predict a broad geographic distribution of LF in Africa extending from west to east across the middle region of the continent, with high probabilities of occurrence in Western Africa compared to large areas of medium probability interspersed with smaller areas of high probability in Central and Eastern Africa and in Madagascar. We uncovered complex relationships between predictor ecological niche variables and the probability of LF occurrence. We show for the first time that predicted climate change and population growth will expand both the range and risk of LF infection (and ultimately disease) in an endemic region. We estimate that the population at risk of LF may range between 543 and 804 million currently, and that this could rise to between 1.65 and 1.86 billion in the future, depending on the climate scenario used and the thresholds applied to signify infection presence.

  7. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    Science.gov (United States)

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924

  8. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  9. Potential Future Igneous Activity at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Cline, M.; Perry, F.; Valentine, G.; Smistad, E.

    2005-01-01

    Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgment, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into and eruption through a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository. The probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7 × 10⁻⁸ per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of new information and to estimate the probability of one or more volcanic conduits located in the proposed repository along a dike that intersects the proposed repository. US Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive event (igneous event) with annual probability greater than 1 × 10⁻⁸ be evaluated. Two consequence scenarios are considered: (1) an igneous intrusion-groundwater transport case and (2) a volcanic eruptive case. These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of

  10. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large scale hydrogen production plants will be needed. In this context, the development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the electrolysis modules currently available was made. A review of the large scale electrolysis plants that have been installed in the world was also carried out, and the main projects related to large scale electrolysis were listed. The economics of large scale electrolysers were discussed, and the influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)

  11. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    A comparative decision making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing the computational time.
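
    The decision confidence probability itself is straightforward to estimate by Monte Carlo. The following is a minimal sketch of that baseline method only (the FORM approximation is not shown); modelling each option's impact as an independent normal, and all parameter values, are illustrative assumptions.

```python
import random

def decision_confidence(mean_a, sd_a, mean_b, sd_b,
                        n_sims=200_000, seed=42):
    """Monte Carlo estimate of the decision confidence probability:
    P(impact of option A < impact of option B), with each option's
    uncertain environmental impact modelled as an independent normal."""
    rng = random.Random(seed)
    wins = sum(
        rng.gauss(mean_a, sd_a) < rng.gauss(mean_b, sd_b)
        for _ in range(n_sims)
    )
    return wins / n_sims
```

    For example, with option A at 10 ± 2 and option B at 12 ± 2 (arbitrary impact units), the estimate is close to the analytic value Φ(2/√8) ≈ 0.76. The abstract's point is that FORM reaches a comparable number with far fewer model evaluations.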

  12. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  13. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography of each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest numbers of men and women at high risk were in Asia (55 %). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with a high probability of osteoporotic fracture comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  14. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
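
    The closing approximation can be wrapped in a small helper. This sketch encodes only the rule stated in the abstract (zero events → roughly 1/(2.5n), nonzero events → the MLE); the exact minimax derivation in Quigley and Revie's paper is not reproduced here.

```python
def rare_event_estimate(events, trials):
    """Point estimate of an event probability from `events` occurrences
    in `trials` independent trials.  For zero observed events the MLE
    is 0; instead return the 1/(2.5 n) approximation to the minimax
    estimate stated in the abstract.  Otherwise return the MLE, which
    the minimax estimate closely approximates for nonzero data."""
    if trials <= 0:
        raise ValueError("trials must be positive")
    if events == 0:
        return 1.0 / (2.5 * trials)
    return events / trials
```

    For example, 0 failures in 100 trials gives 0.004 rather than the MLE's 0, while 3 failures in 100 trials gives the usual 0.03.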

  15. Optimism as a Prior Belief about the Probability of Future Reward

    Science.gov (United States)

    Kalra, Aditi; Seriès, Peggy

    2014-01-01

    Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e. its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly. PMID:24853098

  16. Optimism as a prior belief about the probability of future reward.

    Directory of Open Access Journals (Sweden)

    Aistis Stankevicius

    2014-05-01

    Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e. its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly.

  17. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  18. Sample-path large deviations in credit risk

    NARCIS (Netherlands)

    Leijdekker, V.J.G.; Mandjes, M.R.H.; Spreij, P.J.C.

    2011-01-01

    The event of large losses plays an important role in credit risk. As these large losses are typically rare, and portfolios usually consist of a large number of positions, large deviation theory is the natural tool to analyze the tail asymptotics of the probabilities involved. We first derive a

  19. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes exponentially long as the noise approaches zero, and the majority of that time is wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise but may induce certain deviations for large noise. Finally, some possible ways to improve the method are discussed.

  20. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in the probability of receiving them. In economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are the traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already well established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
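    As a concrete illustration of the kind of w(p) discussed above, the one-parameter Tversky-Kahneman (1992) form, one of the parametric weighting functions already established in the literature (not the generalized function proposed in this article), can be sketched as follows; the parameter value γ = 0.61 is their commonly cited estimate:

    ```python
    # Tversky-Kahneman (1992) cumulative probability weighting function,
    # an illustrative parametric instance of w(p); gamma = 0.61 is their
    # commonly cited estimate for gains.
    def w(p, gamma=0.61):
        """Inverse-S-shaped weighting: overweights small p, underweights moderate-to-large p."""
        num = p ** gamma
        return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

    # Small probabilities are overweighted, large ones underweighted:
    print(round(w(0.01), 3), round(w(0.5), 3), round(w(0.99), 3))
    ```

    For γ = 1 the function reduces to the identity w(p) = p, recovering the linear weighting assumed by the EUT.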

  1. Supernova relic electron neutrinos and anti-neutrinos in future large-scale observatories

    Energy Technology Data Exchange (ETDEWEB)

    Volpe, C.; Welzel, J. [Institut de Physique Nucléaire, 91 - Orsay (France)

    2007-07-01

    We investigate the signal from supernova relic neutrinos in future large-scale observatories, such as MEMPHYS (UNO, Hyper-K), LENA and GLACIER, at present under study. We argue that complementary information might be gained from the observation of supernova relic electron antineutrinos and neutrinos using the scattering on protons on one hand, and on nuclei such as oxygen, carbon or argon on the other hand. When determining the relic neutrino fluxes we also include, for the first time, the coupling of the neutrino magnetic moment to magnetic fields within the core-collapse supernova. We present numerical results on both the relic ν_e and ν̄_e fluxes and on the number of events for ν_e + ¹²C, ν_e + ¹⁶O, ν_e + ⁴⁰Ar and ν̄_e + p for various oscillation scenarios. The observation of supernova relic neutrinos might provide us with unique information on core-collapse supernova explosions, on the star formation history and on neutrino properties that still remain unknown. (authors)

  2. WKB theory of large deviations in stochastic populations

    Science.gov (United States)

    Assaf, Michael; Meerson, Baruch

    2017-06-01

    Stochasticity can play an important role in the dynamics of biologically relevant populations. These span a broad range of scales: from intra-cellular populations of molecules to population of cells and then to groups of plants, animals and people. Large deviations in stochastic population dynamics—such as those determining population extinction, fixation or switching between different states—are presently in a focus of attention of statistical physicists. We review recent progress in applying different variants of dissipative WKB approximation (after Wentzel, Kramers and Brillouin) to this class of problems. The WKB approximation allows one to evaluate the mean time and/or probability of population extinction, fixation and switches resulting from either intrinsic (demographic) noise, or a combination of the demographic noise and environmental variations, deterministic or random. We mostly cover well-mixed populations, single and multiple, but also briefly consider populations on heterogeneous networks and spatial populations. The spatial setting also allows one to study large fluctuations of the speed of biological invasions. Finally, we briefly discuss possible directions of future work.
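    The flavor of such WKB calculations can be illustrated with a standard textbook example, not specific to this review: for a single well-mixed population with per-capita birth rate b(x) and per-capita death rate d(x), where x = n/N and N is the typical population size, the extinction action is S₀ = ∫₀^{x*} ln(b(x)/d(x)) dx, with x* the deterministic attractor, and the mean extinction time scales as exp(N S₀). A minimal numerical sketch, assuming a Verhulst-type model:

    ```python
    import math

    # WKB extinction action for a stochastic logistic (Verhulst-type) model,
    # a standard textbook example. Per-capita birth rate b(x) = B, per-capita
    # death rate d(x) = 1 + (B - 1) x, with x = n/N; the deterministic
    # attractor is x* = 1.
    B = 2.0
    b = lambda x: B
    d = lambda x: 1.0 + (B - 1.0) * x

    # S0 = integral_0^{x*} ln(b(x)/d(x)) dx, here via the midpoint rule;
    # the mean extinction time then scales as exp(N * S0).
    M = 100_000
    S0 = sum(math.log(b((k + 0.5) / M) / d((k + 0.5) / M)) for k in range(M)) / M
    print(round(S0, 4))  # for B = 2 the closed form is 1 - ln 2 ≈ 0.3069
    ```

    The exponential dependence exp(N S₀) is exactly why direct Monte Carlo estimation of extinction times becomes impractical for large N, motivating the WKB machinery the review surveys.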

  3. WKB theory of large deviations in stochastic populations

    International Nuclear Information System (INIS)

    Assaf, Michael; Meerson, Baruch

    2017-01-01

    Stochasticity can play an important role in the dynamics of biologically relevant populations. These span a broad range of scales: from intra-cellular populations of molecules to population of cells and then to groups of plants, animals and people. Large deviations in stochastic population dynamics—such as those determining population extinction, fixation or switching between different states—are presently in a focus of attention of statistical physicists. We review recent progress in applying different variants of dissipative WKB approximation (after Wentzel, Kramers and Brillouin) to this class of problems. The WKB approximation allows one to evaluate the mean time and/or probability of population extinction, fixation and switches resulting from either intrinsic (demographic) noise, or a combination of the demographic noise and environmental variations, deterministic or random. We mostly cover well-mixed populations, single and multiple, but also briefly consider populations on heterogeneous networks and spatial populations. The spatial setting also allows one to study large fluctuations of the speed of biological invasions. Finally, we briefly discuss possible directions of future work. (topical review)

  4. Towards a large deviation theory for strongly correlated systems

    International Nuclear Information System (INIS)

    Ruiz, Guiomar; Tsallis, Constantino

    2012-01-01

    A large-deviation connection of statistical mechanics is provided by N independent binary variables, the (N→∞) limit yielding Gaussian distributions. The probability of n≠N/2 out of N throws is governed by e^{−Nr}, with r related to the entropy. Large deviations for a strongly correlated model characterized by indices (Q,γ) are studied, the (N→∞) limit yielding Q-Gaussians (Q→1 recovers a Gaussian). Its large deviations are governed by e_q^{−N r_q} (∝ 1/N^{1/(q−1)} for q>1), with q=(Q−1)/(γ[3−Q])+1. This illustration opens the door towards a large-deviation foundation of nonextensive statistical mechanics. -- Highlights: ► We introduce the formalism of relative entropy for a single random binary variable and its q-generalization. ► We study a model of N strongly correlated binary random variables and their large-deviation probabilities. ► The large-deviation probability of the strongly correlated model exhibits a q-exponential decay whose argument is proportional to N, as extensivity requires. ► Our results point to a q-generalized large deviation theory and suggest a large-deviation foundation of nonextensive statistical mechanics.
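    For reference, the q-exponential appearing in the decay law above is conventionally defined in nonextensive statistical mechanics as

    ```latex
    e_q^{x} \;\equiv\; \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\,1/(1-q)},
    \qquad e_1^{x} = e^{x},
    ```

    so that for q>1 the function e_q^{-N r_q} decays as the power law (N r_q (q-1))^{-1/(q-1)} ∝ 1/N^{1/(q-1)} at large N, consistent with the asymptotics quoted in the abstract.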

  5. Future climate and wildfire: ecosystem projections of area burned in the western US

    Science.gov (United States)

    Littell, J. S.; Duffy, P.; Battisti, D. S.; McKenzie, D.; Peterson, D. L.

    2010-12-01

    The area burned by fire in ecosystems of the western United States has been closely linked to climate in both the paleoecological record and the modern record. Statistical models of area burned show that the climatic controls on area burned vary with vegetation type (Littell et al. 2009). In more arid systems (grasslands, shrublands, woodlands), antecedent climatic controls on fire were associated first with the production of fuels and secondarily with drought in the year of fire. These relationships typically manifested as wetter and sometimes cooler conditions in the seasons prior to the fire season. Area burned in forest ecosystems and some woodlands was primarily associated with drought conditions, specifically increased temperature and decreased precipitation in the year of fire and the seasons leading up to the fire season. These climatic controls indicate the role of climate in drying existing fuels. Statistical fire models trained on the late 20th century for ecoprovinces in the West would be useful for projecting area burned, at least until vegetation type conversion driven by climate and disturbance occurs. To that end, we used ~2.5-degree gridded future climate fields derived for a multi-GCM ensemble of 1 °C and 2 °C temperature-increase forcing to develop future ecoprovince monthly and seasonal average temperature and associated precipitation, and used these as predictors in statistical fire models of future projected area burned. We also conducted modeling scenarios with the ensemble temperature increase paired with historical precipitation. Most ecoprovinces had increases in area burned, with a range of ~67% to over 600%. Ecoprovinces that are primarily sensitive to precipitation changes exhibit smaller increases than those most sensitive to temperature (forest systems). We also developed exceedance probabilities. Some ecoprovinces show large increases in area burned but low exceedance probabilities, suggesting that the area burned is concentrated more
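    The kind of statistical fire model described above can be sketched schematically: regress log(area burned) on seasonal climate predictors, then project area burned under a warming scenario. The data, variable choices and coefficients below are entirely synthetic and illustrative, not those of Littell et al.:

    ```python
    import numpy as np

    # Hypothetical sketch of a statistical fire model: ordinary least squares
    # of log(area burned) on fire-season temperature and antecedent
    # precipitation, fit on synthetic data, then a +1 C projection.
    rng = np.random.default_rng(42)
    n = 30                                       # years of record
    temp = rng.normal(18.0, 1.2, n)              # fire-season mean temperature (C)
    precip = rng.normal(120.0, 25.0, n)          # antecedent precipitation (mm)
    log_area = 0.4 * temp - 0.01 * precip + rng.normal(0.0, 0.3, n)

    X = np.column_stack([np.ones(n), temp, precip])
    beta, *_ = np.linalg.lstsq(X, log_area, rcond=None)

    base = np.exp(X @ beta).mean()
    warm = np.exp(np.column_stack([np.ones(n), temp + 1.0, precip]) @ beta).mean()
    print(f"projected change in mean area burned: {100 * (warm / base - 1):.0f}%")
    ```

    Because the response is modeled on a log scale, a fixed temperature increment multiplies projected area burned by exp(β_T), which is how modest warming can translate into large percentage increases in area burned.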

  6. A new proposed approach for future large-scale de-carbonization coal-fired power plants

    International Nuclear Information System (INIS)

    Xu, Gang; Liang, Feifei; Wu, Ying; Yang, Yongping; Zhang, Kai; Liu, Wenyi

    2015-01-01

    The post-combustion CO2 capture technology provides a feasible and promising method for large-scale CO2 capture in coal-fired power plants. However, large-scale CO2 capture in conventionally designed coal-fired power plants is confronted with various problems, such as the selection of the steam extraction point and steam parameter mismatch. To resolve these problems, an improved design idea for the future coal-fired power plant with large-scale de-carbonization is proposed. A main characteristic of the proposed design is the adoption of a back-pressure steam turbine, which extracts suitable steam for CO2 capture and ensures the stability of the integrated system. A new let-down steam turbine generator is introduced to retrieve the surplus energy from the exhaust steam of the back-pressure steam turbine when CO2 capture is cut off. Results show that the net plant efficiency of the improved design is 2.56 percentage points higher than that of the conventional one when the CO2 capture ratio reaches 80%. Meanwhile, the net plant efficiency of the improved design stays at the same level as that of the conventional design when CO2 capture is cut off. Finally, the match between the extracted steam and the heat demand of the reboiler is significantly improved, which solves the steam parameter mismatch problem. The techno-economic analysis indicates that the proposed design is a cost-effective approach for large-scale CO2 capture in coal-fired power plants. - Highlights: • Problems caused by CO2 capture in the power plant are deeply analyzed. • An improved design idea for coal-fired power plants with CO2 capture is proposed. • Thermodynamic, exergy and techno-economic analyses are quantitatively conducted. • Energy-saving effects are found in the proposed coal-fired power plant design idea.

  7. Beyond the Large Hadron Collider: A First Look at Cryogenics for CERN Future Circular Colliders

    Science.gov (United States)

    Lebrun, Philippe; Tavian, Laurent

    Following the first experimental discoveries at the Large Hadron Collider (LHC) and the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. The study, conducted with the collaborative participation of interested institutes world-wide, considers several options for very high energy hadron-hadron, electron-positron and hadron-electron colliders to be installed in a quasi-circular underground tunnel in the Geneva basin, with a circumference of 80 km to 100 km. All these machines would make intensive use of advanced superconducting devices, i.e. high-field bending and focusing magnets and/or accelerating RF cavities, thus requiring large helium cryogenic systems operating at 4.5 K or below. Based on preliminary sets of parameters and layouts for the particle colliders under study, we discuss the main challenges of their cryogenic systems and present first estimates of the cryogenic refrigeration capacities required, with emphasis on the qualitative and quantitative steps to be accomplished with respect to the present state-of-the-art.

  8. Camera-Model Identification Using Markovian Transition Probability Matrix

    Science.gov (United States)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the brands and models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG-compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of the Y and Cb components from JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by the image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
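    The feature construction can be sketched for one of the four directions: form a horizontal difference array, clip it to a small range [-T, T], and estimate the transition probabilities between adjacent difference values. The threshold T = 3 and the tiny random 8x8 "image" below are illustrative choices only; the paper's pipeline operates on the JPEG 2-D arrays of the Y and Cb components, not raw pixels:

    ```python
    import numpy as np

    # Sketch of Markov transition-probability-matrix features on a
    # thresholded horizontal difference array (illustrative, not the
    # authors' exact pipeline).
    T = 3
    rng = np.random.default_rng(0)
    arr = rng.integers(0, 256, size=(8, 8)).astype(int)   # stand-in for a JPEG 2-D array

    diff = arr[:, :-1] - arr[:, 1:]          # horizontal difference array
    diff = np.clip(diff, -T, T)              # thresholding to [-T, T]

    # count transitions between horizontally adjacent difference values
    tpm = np.zeros((2 * T + 1, 2 * T + 1))
    src, dst = diff[:, :-1] + T, diff[:, 1:] + T          # shift values into 0..2T
    for s, t in zip(src.ravel(), dst.ravel()):
        tpm[s, t] += 1
    row_sums = tpm.sum(axis=1, keepdims=True)
    tpm = np.divide(tpm, row_sums, out=np.zeros_like(tpm), where=row_sums > 0)

    features = tpm.ravel()                   # (2T+1)^2 = 49 features for the SVM
    print(features.shape)
    ```

    Repeating this for the four directional Markov processes and for both the Y and Cb components, then concatenating the matrices, yields the full feature vector fed to the multi-class SVM.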

  9. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  10. Actions and Beliefs : Estimating Distribution-Based Preferences Using a Large Scale Experiment with Probability Questions on Expectations

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2005-01-01

    We combine the choice data of proposers and responders in the ultimatum game, their expectations elicited in the form of subjective probability questions, and the choice data of proposers ("dictator") in a dictator game to estimate a structural model of decision making under uncertainty. We use a

  11. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  12. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenged in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals, and such environments have received less study. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  13. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  14. Time dependent non-extinction probability for prompt critical systems

    International Nuclear Information System (INIS)

    Gregson, M. W.; Prinja, A. K.

    2009-01-01

    The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)

  15. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  16. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  17. Towards a sustainably certifiable futures contract for biofuels

    International Nuclear Information System (INIS)

    Mathews, John A.

    2008-01-01

    How are biofuels to be certified as produced in a sustainable and responsible fashion? In the global debate over this issue, one party to the proceedings seems rarely to be mentioned: namely, the commodities exchanges through which a global biofuels market is being created. In this contribution, I propose a solution to the problem of sustainability certification through a biofuels futures contract equipped with 'proof of origin' documentation. The proposal does not call for any radical break with current practice, extending existing certification procedures with a requirement for the vendor to provide documentation, probably in barcoded form, of the history of the biofuel offered for sale, including the plantation and biorefinery where the biofuel was produced and any subsequent blendings it may have undergone. The proposal is thus compatible with the blending practices of large global traders, whose activities are the source of the difficulties of other approaches to certification. It is argued that if such a sustainable futures contract for bioethanol (in the first instance) were to be introduced, then it would likely trade at a premium and become the primary vehicle for North-South trade in biofuels.

  18. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite-state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite-state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.
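    A toy numerical illustration of the uniform Cesaro limit for a symmetric chain can be sketched as follows. The construction below builds edge states |ψ_i⟩ = |i⟩ ⊗ Σ_j √(P_ij)|j⟩ and uses the single-reflection convention W = S(2Π - 1), one of several equivalent formulations in the literature; the 3-state circulant chain and the uniform edge-superposition start are illustrative choices, not the paper's general setting:

    ```python
    import numpy as np

    # Szegedy-style quantum walk from a symmetric (circulant) 3-state chain.
    n = 3
    P = np.array([[0.5, 0.25, 0.25],
                  [0.25, 0.5, 0.25],
                  [0.25, 0.25, 0.5]])     # symmetric transition matrix

    # edge states |psi_i> = |i> (x) sum_j sqrt(P_ij) |j>, one per chain state
    psi = np.zeros((n, n * n))
    for i in range(n):
        for j in range(n):
            psi[i, i * n + j] = np.sqrt(P[i, j])

    Pi = psi.T @ psi                       # projector onto span{|psi_i>}
    R = 2 * Pi - np.eye(n * n)             # reflection about that span
    S = np.zeros((n * n, n * n))           # swap |i,j> -> |j,i>
    for i in range(n):
        for j in range(n):
            S[j * n + i, i * n + j] = 1.0
    W = S @ R                              # one walk step

    state = psi.sum(axis=0) / np.sqrt(n)   # uniform superposition of edge states
    cesaro = np.zeros(n)
    T = 200
    for _ in range(T):
        state = W @ state
        amp = np.abs(state) ** 2
        cesaro += amp.reshape(n, n).sum(axis=1)   # marginal on the first register
    cesaro /= T
    print(cesaro)                          # close to uniform for this symmetric chain
    ```

    For this circulant chain the uniform result follows already from shift symmetry; the paper's theorem covers general symmetric transition matrices, where the Cesaro averaging is genuinely needed.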

  20. QBism the future of quantum physics

    CERN Document Server

    von Baeyer, Hans Christian

    2016-01-01

    Measured by the accuracy of its predictions and the scope of its technological applications, quantum mechanics is one of the most successful theories in science--as well as one of the most misunderstood. The deeper meaning of quantum mechanics remains controversial almost a century after its invention. Providing a way past quantum theory's paradoxes and puzzles, QBism offers a strikingly new interpretation that opens up for the nonspecialist reader the profound implications of quantum mechanics for how we understand and interact with the world. Short for Quantum Bayesianism, QBism adapts many of the conventional features of quantum mechanics in light of a revised understanding of probability. Bayesian probability, unlike the standard "frequentist probability," is defined as a numerical measure of the degree of an observer's belief that a future event will occur or that a particular proposition is true. Bayesianism's advantages over frequentist probability are that it is applicable to singular events, its pro...

  1. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Owing to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  2. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  3. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  4. Incidents in nuclear research reactor examined by deterministic probability and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Lopes, Valdir Maciel

    2010-01-01

    This study aims to evaluate the potential risks posed by incidents in nuclear research reactors. For its development, two databases of the International Atomic Energy Agency (IAEA) were used: the Incident Report System for Research Reactors and the Research Reactor Data Base. The assessment used Probabilistic Safety Analysis (PSA), at a 90% confidence level, and Deterministic Probability Analysis (DPA). The PSA probability calculations followed the theory and equations of IAEA TECDOC-636 and were implemented in Scilab version 5.1.1, a freely available program executable on Windows and Linux platforms. A dedicated routine for the Fisher and chi-square distributions, both at the 90% confidence level, was developed within Scilab 5.1.1. Using the Sordi equations and the Origin 6.0 program, the maximum admissible doses satisfying the risk limits established by the International Commission on Radiological Protection (ICRP) were obtained, both numerically and graphically (figure 1), from the calculated probabilities versus maximum admissible doses. It was found that the reliability of the probability results is related to the operational experience (reactor-years and fractions thereof): the larger it is, the greater the confidence in the outcome. Finally, a list of suggested future work to complement this paper is given. (author)
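The flavour of the chi-square probability calculation mentioned above can be sketched without reproducing the TECDOC-636 equations. The classical chi-square link between an observed event count and an upper confidence bound on a Poisson failure rate is shown below; the Wilson–Hilferty approximation stands in for a statistics library's chi-square quantile, and the incident counts and exposure are hypothetical:

```python
import math

def chi2_quantile(z: float, k: int) -> float:
    """Wilson-Hilferty approximation to the chi-square quantile with
    k degrees of freedom; z is the standard-normal quantile for the level."""
    return k * (1 - 2 / (9 * k) + z * math.sqrt(2 / (9 * k))) ** 3

Z95 = 1.6449  # standard-normal 95% quantile

def poisson_rate_upper(n_events: int, exposure: float, z: float = Z95) -> float:
    """One-sided upper confidence bound on a failure rate, from n_events
    observed over `exposure` (e.g. reactor-years), via the chi-square
    link to the Poisson distribution."""
    return chi2_quantile(z, 2 * (n_events + 1)) / (2 * exposure)

# hypothetical record: 3 incidents observed over 100 reactor-years
print(round(poisson_rate_upper(3, 100.0), 4))
```

For 3 events in 100 reactor-years the bound is near 0.077 per reactor-year; the exact chi-square quantile (15.507 at 8 degrees of freedom) differs from the approximation only in the third decimal.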

  5. Probability-of-Superiority SEM (PS-SEM)—Detecting Probability-Based Multivariate Relationships in Behavioral Research

    Directory of Open Access Journals (Sweden)

    Johnson Ching-Hong Li

    2018-06-01

    In behavioral research, exploring bivariate relationships between variables X and Y based on the concept of probability-of-superiority (PS) has received increasing attention. Unlike the conventional, linear-based bivariate relationship (e.g., Pearson's correlation), PS states that X and Y can be related based on their likelihood, e.g., a student who is above the mean in SAT has a 63% likelihood of achieving an above-mean college GPA. Despite this increasing attention, the concept of PS has been restricted to the simple bivariate scenario (X-Y pair), which hinders the development and application of PS in popular multivariate modeling such as structural equation modeling (SEM). Therefore, this study presents a simulation study that explores the potential of detecting PS-based relationships in SEM, called PS-SEM. The simulation results showed that the proposed PS-SEM method can detect and identify PS-based relationships when data follow them, thereby providing a useful method for researchers to explore PS-based SEM in their studies. Conclusions, implications, and future directions based on the findings are also discussed.
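The SAT/GPA example above can be reproduced by simulation. The sketch assumes a bivariate normal model with a hypothetical SAT-GPA correlation of 0.4, and estimates the probability of an above-mean GPA given an above-mean SAT score:

```python
import math
import random

random.seed(7)

RHO = 0.40   # hypothetical SAT-GPA correlation
N = 200_000  # Monte Carlo draws

above_sat = above_both = 0
for _ in range(N):
    z_sat = random.gauss(0.0, 1.0)
    # construct a GPA score correlated with the SAT score
    z_gpa = RHO * z_sat + math.sqrt(1 - RHO ** 2) * random.gauss(0.0, 1.0)
    if z_sat > 0:
        above_sat += 1
        if z_gpa > 0:
            above_both += 1

ps = above_both / above_sat  # P(GPA above mean | SAT above mean)
print(round(ps, 3))
```

For a bivariate normal the exact value is 1/2 + arcsin(ρ)/π ≈ 0.631 at ρ = 0.4, matching the 63% figure quoted in the abstract.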

  6. Future impact of new technologies: Three scenarios, their competence gap and research implications

    DEFF Research Database (Denmark)

    Harmsen, Hanne; Sonne, Anne-Mette; Jensen, Birger Boutrup

    2004-01-01

    What will the impact of science in the food industry be 10 years from now? Large or overwhelming, most people will probably agree. But if we want to be more specific, we might start by defining what type of technology we have in mind and, secondly, what kind of impact we are talking about. Since..., who represents both "technology push" and "market pull", which we feel are both very important basic driving forces. The aim is to get an idea of the very different roles science or technology can play in the near future for a specific industry, in this case the Danish food industry, and present...

  7. Assumed Probability Density Functions for Shallow and Deep Convection

    OpenAIRE

    Steven K Krueger; Peter A Bogenschutz; Marat Khairoutdinov

    2010-01-01

    The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PD...

  8. Probability of expected climate stresses in North America in the next one My

    International Nuclear Information System (INIS)

    Kukla, G.

    1979-01-01

    Climates one million years ahead were predicted upon the assumption that the natural climate variability during the past My will continue. Response of environment and climate in the Basin and Range province of the western USA to global fluctuations was reconstructed; the most remarkable change was the filling of closed basins with large freshwater lakes. Probabilities of permanent ice cover and floods are discussed. It is believed that a site with minimal probability of climate-related breach can be selected

  9. The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise

    Science.gov (United States)

    Plag, H.; Bye, B.

    2011-12-01

    would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.

  10. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
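The structural parallel invoked here can be made explicit: conditional entropy obeys a chain rule that mirrors the definition of conditional probability,

```latex
H(Y \mid X) \;=\; -\sum_{x,y} p(x,y)\,\log p(y \mid x) \;=\; H(X,Y) - H(X),
```

in direct analogy with $p(y \mid x) = p(x,y)/p(x)$. Exchanging the roles of $X$ and $Y$ swaps $H(Y \mid X)$ for $H(X \mid Y)$, and the two differ in general, which is the asymmetry the abstract's argument turns on.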

  11. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education of future teachers. We have conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. Centered on the following four questions, the status of statistics and probability was examined. Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  12. The probability of containment failure by steam explosion in a PWR

    International Nuclear Information System (INIS)

    Briggs, A.J.

    1983-12-01

    The study of the risk associated with operation of a PWR includes assessment of severe accidents in which a combination of faults results in melting of the core. Probabilistic methods are used in such assessment, hence it is necessary to estimate the probability of key events. One such event is the occurrence of a large steam explosion when molten core debris slumps into the base of the reactor vessel. This report considers recent information, and recommends an upper limit to the range of probability values for containment failure by steam explosion for risk assessment for a plant such as the proposed Sizewell B station. (U.K.)

  13. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia)

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area's hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of uncertainty affecting hydrological parameter estimation. This paper's conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may be an obligatory procedure in the future. Its potential lies in treating scarce information and it represents a robust modelling strategy for non-seasonal stochastic modelling conditions.
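A much cruder stand-in than the paper's p-box machinery conveys the flavour of interval estimates under scarce data: a Wilson score interval for the annual probability of exceeding the 70 mm threshold. The record length and exceedance count below are hypothetical, not the Manizales data:

```python
import math

def wilson_interval(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval (95% by default) for a binomial
    proportion estimated from k successes in n trials."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    halfwidth = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - halfwidth, centre + halfwidth

# hypothetical record: 12 of 30 years saw at least one day above 70 mm
lo, hi = wilson_interval(12, 30)
print(round(lo, 3), round(hi, 3))
```

Even this simple interval is wide for a 30-year record, which is consistent with the paper's point that short, non-uniform series leave large uncertainty on rainfall probabilities; imprecise probability formalizes that spread instead of collapsing it to a point estimate.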

  14. Future energy, exotic energy

    Energy Technology Data Exchange (ETDEWEB)

    Dumon, R

    1974-01-01

    The Detroit Energy Conference has highlighted the declining oil reserves, estimated worldwide at 95 billion tons vs. an annual rate of consumption of over 3 billion tons. The present problem is one of price; also, petroleum seems too valuable to be simply burned. New sources must come into action before 1985. The most abundant is coal, with 600 billion tons of easily recoverable reserves; then comes oil shale with a potential of 400 billion tons of oil. Exploitation at a rate of 55 to 140 million tons/yr is planned in the U.S. after 1985. More exotic and impossible to estimate quantitatively are such sources as wind, tides, and the thermal energy of the oceans--these are probably far in the future. The same is true of solar and geothermal energy in large amounts. The only other realistic energy source is nuclear energy: the European Economic Community looks forward to covering 60% of its energy needs from nuclear energy in the year 2000. Even today, from 400 MW upward, a nuclear generating plant is more economical than a fossil-fueled one. Conservation will become the byword, and profound changes in society are to be expected.

  15. Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation

    International Nuclear Information System (INIS)

    Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto

    2015-01-01

    It is very important to grasp failure probability and failure mode appropriately to carry out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed by changing the values of the stress range, stress ratio, stress components and threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, stress ratio (mean stress condition) and threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance might be applicable from the viewpoint of risk to the plant. (author)

  16. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  17. Large-Ensemble modeling of past and future variations of the Antarctic Ice Sheet with a coupled ice-Earth-sea level model

    Science.gov (United States)

    Pollard, David; DeConto, Robert; Gomez, Natalya

    2016-04-01

    To date, most modeling of the Antarctic Ice Sheet's response to future warming has been calibrated using recent and modern observations. As an alternate approach, we apply a hybrid 3-D ice sheet-shelf model to the last deglacial retreat of Antarctica, making use of geologic data of the last ~20,000 years to test the model against the large-scale variations during this period. The ice model is coupled to a global Earth-sea level model to improve modeling of the bedrock response and to capture ocean-ice gravitational interactions. Following several recent ice-sheet studies, we use Large Ensemble (LE) statistical methods, performing sets of 625 runs from 30,000 years to present with systematically varying model parameters. Objective scores for each run are calculated using modern data and past reconstructed grounding lines, relative sea level records, cosmogenic elevation-age data and uplift rates. The LE results are analyzed to calibrate 4 particularly uncertain model parameters that concern marginal ice processes and interaction with the ocean. LEs are extended into the future with climates following RCP scenarios. An additional scoring criterion tests the model's ability to reproduce estimated sea-level high stands in the warm mid-Pliocene, for which drastic retreat mechanisms of hydrofracturing and ice-cliff failure are needed in the model. The LE analysis provides future sea-level-rise envelopes with well-defined parametric uncertainty bounds. Sensitivities of future LE results to Pliocene sea-level estimates, coupling to the Earth-sea level model, and vertical profiles of Earth properties, will be presented.

  18. Large deviations

    CERN Document Server

    Deuschel, Jean-Dominique; Deuschel, Jean-Dominique

    2001-01-01

    This is the second printing of the book first published in 1988. The first four chapters of the volume are based on lectures given by Stroock at MIT in 1987. They form an introduction to the basic ideas of the theory of large deviations and make a suitable package on which to base a semester-length course for advanced graduate students with a strong background in analysis and some probability theory. A large selection of exercises presents important material and many applications. The last two chapters present various non-uniform results (Chapter 5) and outline the analytic approach that allow

  19. System Geometries and Transit/Eclipse Probabilities

    Directory of Open Access Journals (Sweden)

    Howard A.

    2011-02-01

    Transiting exoplanets provide access to data to study the mass-radius relation and internal structure of extrasolar planets. Long-period transiting planets allow insight into planetary environments similar to the Solar System where, in contrast to hot Jupiters, planets are not constantly exposed to the intense radiation of their parent stars. Observations of secondary eclipses additionally permit studies of exoplanet temperatures and large-scale exo-atmospheric properties. We show how transit and eclipse probabilities are related to planet-star system geometries, particularly for long-period, eccentric orbits. The resulting target selection and observational strategies represent the principal ingredients of our photometric survey of known radial-velocity planets with the aim of detecting transit signatures (TERMS).
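The geometry described above reduces, for a single planet, to the standard expression p = ((R* + Rp)/a) · (1 + e sin ω)/(1 − e²). A minimal sketch with hypothetical stellar and planetary parameters:

```python
import math

R_SUN = 6.957e8  # solar radius, m
AU = 1.496e11    # astronomical unit, m

def transit_probability(r_star: float, r_planet: float, a: float,
                        e: float = 0.0, omega_deg: float = 90.0) -> float:
    """Geometric transit probability for an orbit with semi-major axis a,
    eccentricity e and argument of periastron omega:
    p = ((R* + Rp) / a) * (1 + e*sin(omega)) / (1 - e**2)."""
    omega = math.radians(omega_deg)
    return (r_star + r_planet) / a * (1 + e * math.sin(omega)) / (1 - e ** 2)

# hypothetical hot Jupiter (a = 0.05 AU) vs. Jupiter analogue (a = 5.2 AU)
p_hot = transit_probability(R_SUN, 7.1e7, 0.05 * AU)
p_cold = transit_probability(R_SUN, 7.1e7, 5.2 * AU)
print(p_hot, p_cold)
```

The roughly 10% probability for the hot Jupiter versus roughly 0.1% for the Jupiter analogue illustrates why long-period surveys such as the one described lean on eccentricity and periastron geometry to pick favourable targets.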

  20. Phenomenology of future neutrino experiments with large θ13

    International Nuclear Information System (INIS)

    Minakata, Hisakazu

    2013-01-01

    The question “how small is the lepton mixing angle θ13?” had a convincing answer in a surprisingly short time: θ13 ≃ 9°, a large value comparable to the Chooz limit. It defines a new epoch in the program of determining the lepton mixing parameters, opening the door to the search for lepton CP violation of the Kobayashi-Maskawa type. I discuss the influence of the large value of θ13 on the search for CP violation and the determination of the neutrino mass hierarchy, the remaining unknowns in the standard three-flavor mixing scheme of neutrinos. I emphasize the following two points: (1) Large θ13 makes determination of the mass hierarchy easier. It stimulates new ideas and necessitates quantitative reexamination of practical ways to explore it. (2) However, large θ13 does not quite make CP measurement easier, so we do need a “guaranteeing machine” to measure the CP phase δ.

  1. Probability and uncertainty in nuclear safety decisions

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1986-01-01

    In this paper, we examine some problems posed by the use of probabilities in nuclear safety decisions. We discuss some of the theoretical difficulties due to the collective nature of regulatory decisions and, in particular, the calibration and aggregation of risk information (e.g., experts' opinions). We argue that, if one chooses numerical safety goals as a regulatory basis, one can reduce the constraints to an individual safety goal and a cost-benefit criterion. We show the relevance of risk uncertainties in this kind of regulatory framework. We conclude that, whereas expected values of future failure frequencies are adequate to show compliance with economic constraints, the use of a fractile (e.g., 95%) to be specified by the regulatory agency is justified to treat hazard uncertainties for the individual safety goal. (orig.)

  2. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  3. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  4. Hyperbolic Discounting of the Far-Distant Future

    OpenAIRE

    Anchugina, Nina; Ryan, Matthew; Slinko, Arkadii

    2017-01-01

    We prove an analogue of Weitzman's (1998) famous result that an exponential discounter who is uncertain of the appropriate exponential discount rate should discount the far-distant future using the lowest (i.e., most patient) of the possible discount rates. Our analogous result applies to a hyperbolic discounter who is uncertain about the appropriate hyperbolic discount rate. In this case, the far-distant future should be discounted using the probability-weighted harmonic mean of the possible...
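The stated result can be put concretely: if the hyperbolic discount rate is h_i with probability p_i, the far-distant future is discounted at the probability-weighted harmonic mean of the h_i. A minimal sketch with hypothetical rates and weights:

```python
def far_future_hyperbolic_rate(rates, probs):
    """Probability-weighted harmonic mean of candidate hyperbolic discount
    rates -- the far-distant-future rate in the result stated above."""
    assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"
    return 1.0 / sum(p / h for p, h in zip(probs, rates))

# hypothetical: rates of 1% or 5%, each believed equally likely
rate = far_future_hyperbolic_rate([0.01, 0.05], [0.5, 0.5])
print(rate)  # 1/60, i.e. about 1.67%
```

Note the harmonic mean (≈1.67%) lies below the arithmetic mean (3%), echoing Weitzman's message that uncertainty pushes the effective far-future rate toward the most patient candidate.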

  5. On the ratio probability of the smallest eigenvalues in the Laguerre unitary ensemble

    Science.gov (United States)

    Atkin, Max R.; Charlier, Christophe; Zohren, Stefan

    2018-04-01

    We study the probability distribution of the ratio between the second smallest and smallest eigenvalue in the Laguerre unitary ensemble. The probability that this ratio is greater than r > 1 is expressed in terms of a Hankel determinant with a perturbed Laguerre weight. The limiting probability distribution for the ratio as is found as an integral over containing two functions q1(x) and q2(x). These functions satisfy a system of two coupled Painlevé V equations, which are derived from a Lax pair of a Riemann-Hilbert problem. We compute asymptotic behaviours of these functions as and , as well as large n asymptotics for the associated Hankel determinants in several regimes of r and x.

  6. How to Commission, Operate and Maintain a Large Future Accelerator Complex From Far Remote Sites

    International Nuclear Information System (INIS)

    Phinney, Nan

    2001-01-01

    A study on future large accelerators [1] has considered a facility, which is designed, built and operated by a worldwide collaboration of equal partner institutions, and which is remote from most of these institutions. The full range of operation was considered including commissioning, machine development, maintenance, troubleshooting and repair. Experience from existing accelerators confirms that most of these activities are already performed remotely. The large high-energy physics experiments and astronomy projects already involve international collaborations of distant institutions. Based on this experience, the prospects for a machine operated remotely from far sites are encouraging. Experts from each laboratory would remain at their home institution but continue to participate in the operation of the machine after construction. Experts are required to be on site only during initial commissioning and for particularly difficult problems. Repairs require an on-site non-expert maintenance crew. Most of the interventions can be made without an expert and many of the rest resolved with remote assistance. There appears to be no technical obstacle to controlling an accelerator from a distance. The major challenge is to solve the complex management and communication problems

  7. How to Commission, Operate and Maintain a Large Future Accelerator Complex From Far Remote Sites

    Energy Technology Data Exchange (ETDEWEB)

    Phinney, Nan

    2001-12-07

    A study on future large accelerators [1] has considered a facility, which is designed, built and operated by a worldwide collaboration of equal partner institutions, and which is remote from most of these institutions. The full range of operation was considered including commissioning, machine development, maintenance, troubleshooting and repair. Experience from existing accelerators confirms that most of these activities are already performed 'remotely'. The large high-energy physics experiments and astronomy projects, already involve international collaborations of distant institutions. Based on this experience, the prospects for a machine operated remotely from far sites are encouraging. Experts from each laboratory would remain at their home institution but continue to participate in the operation of the machine after construction. Experts are required to be on site only during initial commissioning and for particularly difficult problems. Repairs require an on-site non-expert maintenance crew. Most of the interventions can be made without an expert and many of the rest resolved with remote assistance. There appears to be no technical obstacle to controlling an accelerator from a distance. The major challenge is to solve the complex management and communication problems.

  8. Survival probabilities for branching Brownian motion with absorption

    OpenAIRE

    Harris, John; Harris, Simon

    2007-01-01

    We study a branching Brownian motion (BBM) with absorption, in which particles move as Brownian motions with drift $-\rho$, undergo dyadic branching at rate $\beta>0$, and are killed on hitting the origin. In the case $\rho>\sqrt{2\beta}$ the extinction time for this process, $\zeta$, is known to be finite almost surely. The main result of this article is a large-time asymptotic formula for the survival probability $P^x(\zeta>t)$ in the case $\rho>\sqrt{2\beta}$, where $P^x$ is...

  9. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  10. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric......-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing...... for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...

  11. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while large differences appear in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  12. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  13. Domain walls, near-BPS bubbles, and probabilities in the landscape

    International Nuclear Information System (INIS)

    Ceresole, Anna; Dall'Agata, Gianguido; Giryavets, Alexander; Kallosh, Renata; Linde, Andrei

    2006-01-01

    We develop a theory of static Bogomol'nyi-Prasad-Sommerfield (BPS) domain walls in stringy landscape and present a large family of BPS walls interpolating between different supersymmetric vacua. Examples include Kachru, Kallosh, Linde, Trivedi models, STU models, type IIB multiple flux vacua, and models with several Minkowski and anti-de Sitter vacua. After the uplifting, some of the vacua become de Sitter (dS), whereas some others remain anti-de Sitter. The near-BPS walls separating these vacua may be seen as bubble walls in the theory of vacuum decay. As an outcome of our investigation of the BPS walls, we found that the decay rate of dS vacua to a collapsing space with a negative vacuum energy can be quite large. The parts of space that experience a decay to a collapsing space, or to a Minkowski vacuum, never return back to dS space. The channels of irreversible vacuum decay serve as sinks for the probability flow. The existence of such sinks is a distinguishing feature of the landscape. We show that it strongly affects the probability distributions in string cosmology

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  15. Use of cyclotrons in medical research: Past, present, future

    Science.gov (United States)

    Smathers, James B.; Myers, Lee T.

    1985-05-01

    The use of cyclotrons in medical research started in the late 1930s, with the most prominent use being neutron irradiation in cancer therapy. Due to a lack of understanding of the biological effect of neutrons, the results were less than encouraging. In the 1940s and 1950s, small cyclotrons were used for isotope production, and in the mid-1960s the biological effect of neutrons was more thoroughly studied, with the result that a second trial of neutron therapy was initiated at Hammersmith Hospital, England. Concurrent with this, work on the use of high-energy charged particles, initially protons and alphas, was initiated in Sweden and Russia and at Harvard and Berkeley. The English success in neutron therapy led to some pilot studies in the USA using physics cyclotrons of various energies and targets. These results in turn led to the series of machines presently being installed at M.D. Anderson Hospital (42 MeV), Seattle (50 MeV) and UCLA (46 MeV). The future probably bodes well for cyclotrons at the two extremes of the energy range. For nuclear medicine the shift is away from the use of multiple isotopes, which requires a large range of particles and energies, toward 11C, 13N, 15O, and 18F, which can be incorporated in metabolically specific compounds and be made with small 8-10 MeV p+ "table top" cyclotrons. For tumor therapy, machines of 60 MeV or so will probably be the choice for the future, as they allow the treatment of deep-seated tumors with neutrons, and the charged particles have sufficient range to allow the treatment of ocular tumors.

  16. Future climate

    International Nuclear Information System (INIS)

    La Croce, A.

    1991-01-01

    According to George Woodwell, founder of the Woods Hole Research Center, due the combustion of fossil fuels, deforestation and accelerated respiration, the net annual increase of carbon, in the form of carbon dioxide, to the 750 billion tonnes already present in the earth's atmosphere, is in the order of 3 to 5 billion tonnes. Around the world, scientists, investigating the probable effects of this increase on the earth's future climate, are now formulating coupled air and ocean current models which take account of water temperature and salinity dependent carbon dioxide exchange mechanisms acting between the atmosphere and deep layers of ocean waters

  17. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    Science.gov (United States)

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
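    The weighting idea for a homoscedastic normal exposure can be sketched in a few lines: the stabilized weight is a marginal normal density over a fitted conditional normal density. This is an illustrative sketch with assumed variable names and simulated data, not the authors' simulation code:

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Simulate a confounder L and a homoscedastic normal exposure A
n = 2000
L = [random.gauss(0.0, 1.0) for _ in range(n)]
A = [1.0 + 0.5 * l + random.gauss(0.0, 1.0) for l in L]

# Denominator model: fit E[A|L] by simple least squares
mL, mA = sum(L) / n, sum(A) / n
slope = sum((l - mL) * (a - mA) for l, a in zip(L, A)) / sum((l - mL) ** 2 for l in L)
inter = mA - slope * mL
resid_sd = math.sqrt(sum((a - (inter + slope * l)) ** 2 for l, a in zip(L, A)) / (n - 2))

# Numerator model: marginal normal density of A
marg_sd = math.sqrt(sum((a - mA) ** 2 for a in A) / (n - 1))

# Stabilized weight = marginal density / conditional density
weights = [normal_pdf(a, mA, marg_sd) / normal_pdf(a, inter + slope * l, resid_sd)
           for l, a in zip(L, A)]

print(sum(weights) / n)  # stabilized weights should average close to 1
```

    A mean weight far from 1 is a standard diagnostic that the denominator model is misspecified, which is where the heteroscedastic and quantile-binning alternatives above come in.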

  18. FutureCoast: "Listen to your futures"

    Science.gov (United States)

    Pfirman, S. L.; Eklund, K.; Thacher, S.; Orlove, B. S.; Diane Stovall-Soto, G.; Brunacini, J.; Hernandez, T.

    2014-12-01

    Two science-arts approaches are emerging as effective means to convey "futurethinking" to learners: systems gaming and experiential futures. FutureCoast exemplifies the latter: by engaging participants with voicemails supposedly leaking from the cloud of possible futures, the storymaking game frames the complexities of climate science in relatable contexts. Because participants make the voicemails themselves, FutureCoast opens up creative ways for people to think about possibly climate-changed futures and personal ways to talk about them. FutureCoast is a project of the PoLAR Partnership with a target audience of informal adult learners primarily reached via mobile devices and online platforms. Scientists increasingly use scenarios and storylines as ways to explore the implications of environmental change and societal choices. Stories help people make connections across experiences and disciplines and link large-scale events to personal consequences. By making the future seem real today, FutureCoast's framework helps people visualize and plan for future climate changes. The voicemails contributed to FutureCoast are spread through the game's intended timeframe (2020 through 2065). Based on initial content analysis of voicemail text, common themes include ecosystems and landscapes, weather, technology, societal issues, governance and policy. Other issues somewhat less frequently discussed include security, food, industry and business, health, energy, infrastructure, water, economy, and migration. Further voicemail analysis is examining: temporal dimensions (salient time frames, short vs. long term issues, intergenerational, etc.), content (adaptation vs. mitigation, challenges vs. opportunities, etc.), and emotion (hopeful, resigned, etc. and overall emotional context). FutureCoast also engaged audiences through facilitated in-person experiences, geocaching events, and social media (Tumblr, Twitter, Facebook, YouTube). Analysis of the project suggests story

  19. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
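    The maximum entropy assignment under a mean constraint can be carried out numerically: over discrete levels it yields the Gibbs form p_i ∝ exp(-β e_i), with β fixed by the constraint. The energy levels and target mean below are illustrative assumptions, not values from the paper:

```python
import math

def maxent_distribution(energies, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy probabilities over discrete 'energies' subject to a
    fixed mean; the solution has the Gibbs form p_i ∝ exp(-beta * e_i),
    with the Lagrange multiplier beta found by bisection."""
    def mean_for(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / z

    # mean_for is decreasing in beta, so bisect on beta
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w], beta

energies = [0.0, 1.0, 2.0, 3.0]
probs, beta = maxent_distribution(energies, target_mean=1.0)
print([round(p, 4) for p in probs], round(beta, 4))
```

    Since the target mean (1.0) is below the uniform-distribution mean (1.5), the solver returns a positive β, i.e. lower levels are favored.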

  20. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  1. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at the present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
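    The two laws combine into a simple recipe for such estimates: integrate the Omori-Utsu rate over the forecast window, rescale by the Gutenberg-Richter factor, and convert the expected count to a probability under a Poisson assumption. The sketch below uses illustrative parameter values, not the values fitted in the study:

```python
import math

def expected_aftershocks(t1, t2, K, c, p, m_min, m_target, b=1.0):
    """Expected number of events with magnitude >= m_target in (t1, t2],
    combining the Omori-Utsu rate n(t) = K / (t + c)^p (fitted at the
    reference magnitude m_min) with Gutenberg-Richter magnitude scaling."""
    if abs(p - 1.0) < 1e-12:
        n_time = K * (math.log(t2 + c) - math.log(t1 + c))
    else:
        n_time = K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))
    return n_time * 10.0 ** (-b * (m_target - m_min))

def prob_at_least_one(mu):
    """Poisson probability of observing one or more events."""
    return 1.0 - math.exp(-mu)

# Illustrative values only (times in days; K, c assumed, p = 0.5 as in the abstract)
mu = expected_aftershocks(t1=0.0, t2=365.0, K=50.0, c=0.1, p=0.5,
                          m_min=4.0, m_target=7.0)
print(round(prob_at_least_one(mu), 3))
```

    Note how slow decay (p well below 1) keeps the time integral, and hence the probability, elevated over long forecast windows.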

  2. Stochastic Economic Dispatch with Wind using Versatile Probability Distribution and L-BFGS-B Based Dual Decomposition

    DEFF Research Database (Denmark)

    Huang, Shaojun; Sun, Yuanzhang; Wu, Qiuwei

    2018-01-01

    This paper focuses on economic dispatch (ED) in power systems with intermittent wind power, which is a very critical issue in future power systems. A stochastic ED problem is formed based on the recently proposed versatile probability distribution (VPD) of wind power. The problem is then analyzed...

  3. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  4. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
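    A brute-force Monte Carlo estimate is the natural benchmark for such integral-equation approximations. The sketch below simulates a lightly damped linear oscillator under Gaussian white noise and counts barrier crossings; the oscillator parameters, barrier level, and noise intensity are all assumed for illustration:

```python
import math
import random

random.seed(1)

def first_passage_prob(barrier, T=5.0, dt=0.01, trials=500,
                       omega=2.0 * math.pi, zeta=0.05, S0=1.0):
    """Monte Carlo estimate of the first-passage probability
    P(max_{0<=t<=T} X(t) > barrier) for a lightly damped oscillator
    driven by Gaussian white noise (semi-implicit Euler integration)."""
    sigma = math.sqrt(2.0 * math.pi * S0)  # white-noise intensity (assumed)
    crossed = 0
    for _ in range(trials):
        x, v = 0.0, 0.0
        for _ in range(int(T / dt)):
            a = -2.0 * zeta * omega * v - omega ** 2 * x
            v += a * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
            x += v * dt
            if x > barrier:
                crossed += 1
                break
    return crossed / trials

p_hat = first_passage_prob(barrier=0.5)
print(p_hat)
```

    The integral-equation approach described above replaces this expensive sampling with an analytical approximation of the same quantity.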

  5. New measurements of spontaneous transition probabilities for beryllium-like ions

    International Nuclear Information System (INIS)

    Lang, J.; Hardcastle, R.A.; McWhirter, R.W.P.; Spurrett, P.H.

    1986-06-01

    The authors describe measurements of spectral line intensities for pairs of transitions having common upper levels and thus derive the branching ratios of their spontaneous radiative transition probabilities. These are then combined with the results of measurements of the radiative lifetimes of the upper levels by other authors to obtain values of the individual transition probabilities. The results are for transitions in NIV, OV and NeVII and are given with a claimed accuracy of between 7% and 38%. These are compared with values calculated theoretically. For some of the simpler electric dipole transitions good agreement is found. On the other hand for some of the other transitions which in certain cases are only possible because of configuration interaction disparities between the present measurements and theory are as large as x5. (author)
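    The arithmetic of combining the two measurements is straightforward: for two transitions sharing an upper level, the intensity ratio gives A1/A2, and the lifetime gives A1 + A2 = 1/τ. The values below are illustrative, not the measured NIV, OV or NeVII data:

```python
def transition_probabilities(intensity_ratio, lifetime):
    """Recover individual spontaneous transition probabilities from a
    measured branching ratio and an upper-level radiative lifetime:
    A1/A2 = I1/I2 (common upper level) and A1 + A2 = 1/lifetime."""
    A_total = 1.0 / lifetime
    A1 = A_total * intensity_ratio / (1.0 + intensity_ratio)
    A2 = A_total - A1
    return A1, A2

# Hypothetical example: intensity ratio 3:1, lifetime 2 ns
A1, A2 = transition_probabilities(intensity_ratio=3.0, lifetime=2.0e-9)
print(A1, A2)
```

    Uncertainties in both the intensity ratio and the lifetime propagate into A1 and A2, which is why the quoted accuracies range from 7% to 38%.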

  6. Sharp probability estimates for Shor's order-finding algorithm

    OpenAIRE

    Bourdon, P. S.; Williams, H. T.

    2006-01-01

    Let N be a large positive integer, let b > 1 be an integer relatively prime to N, and let r be the order of b modulo N. Finally, let QC be a quantum computer whose input register has the size specified in Shor's original description of his order-finding algorithm. We prove that when Shor's algorithm is implemented on QC, then the probability P of obtaining a (nontrivial) divisor of r exceeds 0.7 whenever N exceeds 2^{11}-1 and r exceeds 39, and we establish that 0.7736 is an asymptotic lower...
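    The order r that Shor's algorithm finds quantumly can, for small N, be computed by brute force, which makes the quantity in the probability bound concrete. A minimal classical sketch (feasible only for small N, unlike the quantum algorithm):

```python
from math import gcd

def multiplicative_order(b, N):
    """Order r of b modulo N: the smallest r >= 1 with b^r ≡ 1 (mod N).
    Brute force; this is the quantity Shor's algorithm extracts."""
    if gcd(b, N) != 1:
        raise ValueError("b must be coprime to N")
    r, x = 1, b % N
    while x != 1:
        x = (x * b) % N
        r += 1
    return r

print(multiplicative_order(2, 15))  # -> 4, since 2^4 = 16 ≡ 1 (mod 15)
```

    Shor's algorithm returns r (or a divisor of it) with the probability bounded in the paper; classically, the loop above takes up to r iterations, which is exponential in the bit length of N.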

  7. Impact of large scale wind power on the Nordic electricity system

    International Nuclear Information System (INIS)

    Holttinen, Hannele

    2006-01-01

    Integration costs of wind power depend on how much wind power is added and where, and on the power system: load, generation flexibility, interconnections. When wind power is added to a large interconnected power system there is a considerable smoothing effect on the production. The increase in reserve requirements stays at a low level. A 10 percent penetration of wind power is not a problem in the Nordic countries, as long as wind power is built in all four countries. Increasing the share of wind power will increase the integration costs. A 20 percent penetration would need more flexibility in the system. That will not happen in the near future for Nordel, and the power system will probably also contain more flexible elements at that stage, like producing fuel for vehicles (ml)

  8. How Often Is p[subscript rep] Close to the True Replication Probability?

    Science.gov (United States)

    Trafimow, David; MacDonald, Justin A.; Rice, Stephen; Clason, Dennis L.

    2010-01-01

    Largely due to dissatisfaction with the standard null hypothesis significance testing procedure, researchers have begun to consider alternatives. For example, Killeen (2005a) has argued that researchers should calculate p[subscript rep] that is purported to indicate the probability that, if the experiment in question were replicated, the obtained…

  9. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
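    One of the fitting techniques surveyed, the method of moments, reduces to matching sample moments to the parametric ones. For a gamma distribution with shape k and scale θ (mean kθ, variance kθ²), a sketch on synthetic data (the true parameters below are assumed for illustration):

```python
import random
import statistics

random.seed(2)

def fit_gamma_moments(data):
    """Method-of-moments fit of a gamma distribution: solve
    mean = k * theta and variance = k * theta^2 for (k, theta)."""
    m = statistics.fmean(data)
    v = statistics.variance(data)
    theta = v / m
    k = m / theta
    return k, theta

# Synthetic sample from Gamma(shape=3, scale=2): mean 6, variance 12
sample = [random.gammavariate(3.0, 2.0) for _ in range(20000)]
k_hat, theta_hat = fit_gamma_moments(sample)
print(round(k_hat, 2), round(theta_hat, 2))
```

    Maximum likelihood estimation, also covered in the report, is typically more efficient but requires numerical optimization, whereas the moment equations are closed-form.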

  10. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  11. Radiation processing. Current status and future possibilities

    International Nuclear Information System (INIS)

    Woods, R.J.

    2000-01-01

    Radiation processing developed following the Second World War and employs gamma- or electron-irradiation to process polymers, cure alkene-based inks and coatings, sterilize medical supplies, irradiate food, and manage wastes. The current status of these applications is described, along with the probable direction of future developments. (author)

  12. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  13. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
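    The inverted S-shaped weighting family referred to above has a standard one-parameter representative, the Tversky-Kahneman form; the sketch below uses an illustrative γ (0.61 is a commonly cited estimate, not a value fitted in this study):

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function, an inverted
    S-shape: overweights small probabilities, underweights large ones.
    Individual differences correspond to different gamma values."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

for p in (0.01, 0.5, 0.99):
    print(p, round(tk_weight(p), 3))
```

    Letting γ vary per subject, drawn from a weakly informative prior, is exactly the hierarchical treatment of individual differences that the study advocates.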

  14. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  15. Global mega forces: Implications for the future of natural resources

    Science.gov (United States)

    George H. Kubik

    2012-01-01

    The purpose of this paper is to provide an overview of leading global mega forces and their importance to the future of natural resource decisionmaking, policy development, and operation. Global mega forces are defined as a combination of major trends, preferences, and probabilities that come together to produce the potential for future high-impact outcomes. These...

  16. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  17. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
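    For the multinomial logit RUM, the CPGF is the log-sum-exp of the utilities, and its gradient recovers the choice probabilities, which illustrates the characterization described above in the simplest case:

```python
import math

def choice_probs_from_cpgf(utilities):
    """For multinomial logit, the CPGF is G(u) = log(sum_j exp(u_j));
    its gradient dG/du_j gives the choice probabilities. Computed with
    the max-shift trick for numerical stability."""
    m = max(utilities)
    expu = [math.exp(u - m) for u in utilities]
    z = sum(expu)
    return [e / z for e in expu]  # = dG/du_j

probs = choice_probs_from_cpgf([1.0, 2.0, 0.5])
print([round(p, 3) for p in probs])
```

    The gradient components are nonnegative and sum to one, the defining properties the paper requires of a CPGF.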

  18. Measuring inequity aversion in a heterogeneous population using experimental decisions and subjective probabilities

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2008-01-01

    We combine choice data in the ultimatum game with the expectations of proposers elicited by subjective probability questions to estimate a structural model of decision making under uncertainty. The model, estimated using a large representative sample of subjects from the Dutch population, allows

  19. Design and performance of large-pixel-size high-fill-fraction TES arrays for future X-ray astrophysics missions

    International Nuclear Information System (INIS)

    Figueroa-Feliciano, E.; Bandler, S.R.; Chervenak, J.; Finkbeiner, F.; Iyomoto, N.; Kelley, R.L.; Kilbourne, C.A.; Porter, F.S.; Saab, T.; Sadleir, J.; White, J.

    2006-01-01

    We have designed, modeled, fabricated and tested a 600 μm high-fill-fraction microcalorimeter array that will be a good match to the requirements of future X-ray missions. Our devices use transition-edge sensors coupled to overhanging bismuth/copper absorbers to produce arrays with 97% or higher fill fraction. An extensive modeling effort was undertaken in order to accommodate large pixel sizes (500-1000 μm) and maintain the best energy resolution possible. The finite thermalization time of the large absorber and the associated dependence of the pulse shape on absorption position constrain the time constants of the system for a desired energy-resolution performance. We show the results of our analysis and our new pixel design, consisting of a novel TES-on-the-side architecture which creates a controllable TES-absorber conductance.

  20. p-adic probability prediction of correlations between particles in the two-slit and neutron interferometry experiments

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1998-01-01

    The author starts from Feynman's idea to use negative probabilities to describe the two-slit experiment and other quantum interference experiments. Formally, by using negative probability distributions, the author can explain the results of the two-slit experiment on the basis of the pure corpuscular picture of quantum mechanics. However, negative probabilities are absurd objects in the framework of the standard Kolmogorov theory of probability. The author presents a large class of non-Kolmogorovean probability models in which negative probabilities are well defined on the frequency basis. These are models with probabilities that belong to the so-called field of p-adic numbers. However, these models are characterized by correlations between trials. Therefore, the author predicts correlations between particles in interference experiments. In fact, the predictions are similar to those of the so-called nonergodic interpretation of quantum mechanics, which was proposed by V. Buonomano. The author proposes concrete experiments (in particular, in the framework of neutron interferometry) to verify these predictions on the correlations

  1. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  2. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  3. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisson...

  4. A framework to estimate probability of diagnosis error in NPP advanced MCR

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Kim, Jong Hyun; Jang, Inseok; Seong, Poong Hyun

    2018-01-01

    Highlights: •As new type of MCR has been installed in NPPs, the work environment is considerably changed. •A new framework to estimate operators’ diagnosis error probabilities should be proposed. •Diagnosis error data were extracted from the full-scope simulator of the advanced MCR. •Using Bayesian inference, a TRC model was updated for use in advanced MCR. -- Abstract: Recently, a new type of main control room (MCR) has been adopted in nuclear power plants (NPPs). The new MCR, known as the advanced MCR, consists of digitalized human-system interfaces (HSIs), computer-based procedures (CPS), and soft controls while the conventional MCR includes many alarm tiles, analog indicators, hard-wired control devices, and paper-based procedures. These changes significantly affect the generic activities of the MCR operators, in relation to diagnostic activities. The aim of this paper is to suggest a framework to estimate the probabilities of diagnosis errors in the advanced MCR by updating a time reliability correlation (TRC) model. Using Bayesian inference, the TRC model was updated with the probabilities of diagnosis errors. Here, the diagnosis error data were collected from a full-scope simulator of the advanced MCR. To do this, diagnosis errors were determined based on an information processing model and their probabilities were calculated. However, these calculated probabilities of diagnosis errors were largely affected by context factors such as procedures, HSI, training, and others, known as PSFs (Performance Shaping Factors). In order to obtain the nominal diagnosis error probabilities, the weightings of PSFs were also evaluated. Then, with the nominal diagnosis error probabilities, the TRC model was updated. This led to the proposal of a framework to estimate the nominal probabilities of diagnosis errors in the advanced MCR.
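
    The Bayesian updating step the abstract describes can be illustrated with a minimal conjugate Beta-Binomial sketch. The prior and error counts below are hypothetical placeholders, not the paper's simulator data, and the full TRC model with PSF weighting is not reproduced:

```python
def beta_binomial_update(alpha, beta, errors, trials):
    """Conjugate Bayesian update of a Beta(alpha, beta) prior on an
    error probability, given `errors` failures in `trials` diagnoses."""
    return alpha + errors, beta + (trials - errors)

def beta_mean(alpha, beta):
    """Posterior (or prior) mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Hypothetical prior, roughly centered on a nominal 1% diagnosis-error rate
a0, b0 = 1.0, 99.0
# Hypothetical simulator data: 3 diagnosis errors in 120 scenarios
a1, b1 = beta_binomial_update(a0, b0, errors=3, trials=120)
print(round(beta_mean(a0, b0), 4))  # prior mean
print(round(beta_mean(a1, b1), 4))  # posterior mean, pulled toward the data
```

    The same update applies regardless of how the raw error counts are adjusted for performance shaping factors beforehand.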

  5. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
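
    The total-probability and conditional-probability (Bayes) calculations that such chemical modules implement can be written out numerically; a toy sketch with invented genotype priors and likelihoods (not taken from the paper):

```python
def total_probability(priors, likelihoods):
    """Law of total probability: P(E) = sum_i P(H_i) * P(E | H_i)."""
    return sum(p * l for p, l in zip(priors, likelihoods))

def posterior(priors, likelihoods, i):
    """Bayes' rule: P(H_i | E) = P(H_i) * P(E | H_i) / P(E)."""
    return priors[i] * likelihoods[i] / total_probability(priors, likelihoods)

# Hypothetical two-genotype diagnosis: prior genotype probabilities
# and P(symptom | genotype)
priors = [0.3, 0.7]
likelihoods = [0.9, 0.2]
print(round(total_probability(priors, likelihoods), 3))  # 0.41
print(round(posterior(priors, likelihoods, 0), 3))       # 0.659
```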

  6. Future development of large superconducting generators

    International Nuclear Information System (INIS)

    Singh, S.K.; Mole, C.J.

    1989-01-01

    Large superconducting generators are being developed worldwide. The use of superconductors to reduce the electrical power dissipation in power equipment has been a technological possibility ever since the discovery of superconductivity, even though their use in power equipment remained an impractical dream for a long time. However, scientific and technological progress in superconductivity and cryogenics has brought this dream much closer to reality. Results obtained so far establish the technical feasibility of these machines. Analytical developments have been providing a sound basis for the design of superconducting machines and results of these design studies have shown improvements in power density of up to a factor of 10 higher than the power density for conventional machines. This paper describes the recently completed USA programs, the current foreign and USA programs, and then proposes a USA development program to maintain leadership in the field

  7. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy (a × 10^20 ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9) and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area; this shows that the proposed techniques and models yield good forecasting accuracy.
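
    The fit-then-compute-exceedance procedure can be sketched for one of the four candidate models; here is a stdlib-only lognormal version. The energy values are invented, and the other three distributions would be fit and ranked by ln L in the same way:

```python
from math import erf, log, pi, sqrt

def lognormal_mle(data):
    """Maximum-likelihood lognormal parameters from positive observations."""
    logs = [log(x) for x in data]
    mu = sum(logs) / len(logs)
    sigma = sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

def lognormal_loglik(data, mu, sigma):
    """ln L, used to rank candidate distributions (higher is better)."""
    return sum(-log(x * sigma * sqrt(2 * pi))
               - (log(x) - mu) ** 2 / (2 * sigma ** 2) for x in data)

def exceedance(E, mu, sigma):
    """P(released energy > E) under the fitted lognormal."""
    z = (log(E) - mu) / (sigma * sqrt(2))
    return 0.5 * (1 - erf(z))

# Invented per-event energies, in units of 10^20 ergs
energies = [0.4, 1.1, 2.5, 0.8, 3.9, 1.7, 0.6, 2.2]
mu, sigma = lognormal_mle(energies)
p = exceedance(2.0, mu, sigma)  # P(next event releases > 2e20 ergs)
```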

  8. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
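
    A toy version of the sample-reuse idea: estimate the POF once by Monte Carlo, then probe the sensitivity to a POD-curve region by re-evaluating the same samples with a perturbed curve, so no new sampling is needed. The POD shape, growth law, and critical size below are all hypothetical, and a finite difference stands in for the paper's analytic first-order sensitivity:

```python
import random

def pod(a, shift=0.0):
    """Toy probability-of-detection curve vs crack size a (hypothetical
    log-logistic shape); `shift` perturbs the curve's midpoint."""
    return 1.0 / (1.0 + ((0.5 + shift) / max(a, 1e-9)) ** 3)

def pof(samples, shift=0.0):
    """Probability of failure: a sampled crack fails if it grows past
    a_crit = 1.0 and was missed at inspection (same samples reused)."""
    fails = 0
    for a, u in samples:
        detected = u < pod(a, shift)
        if not detected and a * 1.8 > 1.0:  # toy growth law to a_crit
            fails += 1
    return fails / len(samples)

rng = random.Random(42)
samples = [(rng.lognormvariate(-1.0, 0.5), rng.random()) for _ in range(20000)]
base = pof(samples)
# Finite-difference sensitivity using common random numbers (no new samples)
d = 0.05
sens = (pof(samples, d) - pof(samples, -d)) / (2 * d)
```

    Shifting the POD midpoint up lowers detection everywhere, so the sensitivity comes out positive: a worse curve in that region raises the POF.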

  9. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  10. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  11. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions and so on. Toda (2013) pointed out differences in the conditional probability of strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture with the following procedure. Fault geometry was determined from randomly generated magnitudes based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault-plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, a result that coincides with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, a trend similar to the conditional probabilities of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. In the case of Japanese reverse faults, however, the conditional probability of reverse faults with fewer low-dip-angle earthquakes may be low, similar to that of strike faults (i.e. Takao et al., 2013).
    In the future, numerical simulation considering the failure condition of the surface by the source

  12. Game interrupted: The rationality of considering the future

    Directory of Open Access Journals (Sweden)

    Brandon Almy

    2013-09-01

    The "problem of points", introduced by Paccioli in 1494 and solved by Pascal and Fermat 160 years later, inspired the modern concept of probability. Incidentally, the problem also shows that rational decision-making requires the consideration of future events. We show that naive responses to the problem of points are more future oriented, and thus more rational in this sense, when the problem itself is presented in a future frame instead of the canonical past frame. A simple nudge is sufficient to make decisions more rational. We consider the implications of this finding for hypothesis testing and predictions of replicability.
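
    The Pascal-Fermat solution itself is short enough to compute directly: the interrupted game is settled by the probability of winning the at most r+s-1 rounds that could still be played:

```python
from math import comb

def fair_share(r, s, p=0.5):
    """Pascal-Fermat solution to the problem of points: the share owed
    to a player who needs r more wins while the opponent needs s more,
    with per-round win probability p. At most r+s-1 further rounds
    decide the match; the player wins iff they take at least r of them."""
    n = r + s - 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

# Interrupted at 'first to 3 wins': the leader needs 1 more, the trailer 2
print(fair_share(1, 2))  # 0.75, i.e. the classical 3:1 split
```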

  13. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Introduction. Translation practice has a heuristic nature and involves the cognitive structures of consciousness of any interpreter. When preparing translators, special attention is paid to developing their skill of probable forecasting. The aim of the present publication is to understand the process of anticipation from the position of the cognitive model of translation, and to develop exercises aimed at building the prognostic abilities of students and interpreters working with newspaper articles that contain metaphorical headlines. Methodology and research methods. The study is based on the competence approach to training student translators and on a complex of interrelated scientific methods, chief among them the psycholinguistic experiment. Using quantitative data, the features of the perception of newspaper texts from their metaphorical titles are characterized. Results and scientific novelty. On the basis of the experiment conducted to predict the content of newspaper articles with metaphorical headlines, it is concluded that the main condition of predictability is expectation. Probable forecasting, as a professional competence of a future translator, is formed in the process of training by integrating the efforts of various departments of a language university. Specific exercises for developing students' anticipation while studying translation and interpretation are offered. Practical significance. The results of the study can be used by foreign-language teachers of both language and non-language universities in teaching students of different specialties to translate foreign texts.

  15. submitter Training Behavior of the Main Dipoles in the Large Hadron Collider

    CERN Document Server

    Todesco, Ezio; Bajko, Marta; Bottura, Luca; Bruning, Oliver; De Rijk, Gijs; Fessia, Paolo; Hagen, Per; Naour, Sandrine Le; Modena, Michele; Perez, Juan Carlos; Rossi, Lucio; Schmidt, Rudiger; Siemko, Andrzej; Tock, Jean-Philippe; Tommasini, Davide; Verweij, Arjan; Willering, Gerard

    2017-01-01

    In 2015, the 1232 Nb-Ti dipole magnets in the Large Hadron Collider (LHC) were commissioned to a 7.8 T operational field, with 172 quenches. More than 80% of these quenches occurred in magnets from one of the three cold-mass assemblers (the 3000 series), confirming what was already observed in 2008. In this paper, the recent analysis of the quench performance of the Large Hadron Collider dipole magnets is reported, including the individual reception tests and the 2008 and 2015 commissioning campaigns, to better understand the above-mentioned anomaly and to give an outlook for future operation and a possible increase of the operational field. The lower part of the quench probability spectrum is compatible with Gaussian distributions; therefore, the training curve can be fit with error functions. An essential ingredient in this analysis is the estimate of the error to be associated with the training data due to the sampling of rare events, which allows different hypotheses to be tested. Using this approach, an es...
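
    A minimal sketch of fitting a training curve with an error function, as the abstract describes. The quench levels and trained fractions below are invented, and a crude least-squares grid search stands in for the paper's fit with sampling errors:

```python
from math import erf, sqrt

def training_cdf(q, mu, sigma):
    """Gaussian-CDF (error-function) model of the fraction of magnets
    trained below quench level q."""
    return 0.5 * (1.0 + erf((q - mu) / (sigma * sqrt(2))))

def fit_by_grid(levels, fractions):
    """Least-squares grid search for (mu, sigma) in tenths of a tesla."""
    best = None
    for mu10 in range(60, 100):      # mu in [6.0, 10.0) T
        for s10 in range(1, 30):     # sigma in (0.0, 3.0) T
            mu, sigma = mu10 / 10, s10 / 10
            err = sum((training_cdf(q, mu, sigma) - f) ** 2
                      for q, f in zip(levels, fractions))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    return best[1], best[2]

# Invented training data: field level (T) vs fraction quenched below it
levels = [7.0, 7.4, 7.8, 8.2, 8.6]
fractions = [0.02, 0.15, 0.50, 0.85, 0.98]
mu, sigma = fit_by_grid(levels, fractions)
```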

  16. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  17. The oscillation probability of GeV solar neutrinos of all active species

    International Nuclear Information System (INIS)

    Gouvea, Andre de

    2001-01-01

    I discuss the oscillation probability of O(GeV) neutrinos of all active flavours produced inside the Sun and detected at the Earth. In the GeV energy regime, matter effects are potentially important both for the "1-3" system and the "1-2" system. A numerical scan of the multidimensional three-flavour parameter space is presented. One curiosity is that in the three-flavour oscillation case P_αβ ≠ P_βα for a large portion of the parameter space, even if the MNS matrix is real. Oscillation effects computed here may play a large role in interpreting solar WIMP search data from large neutrino telescopes

  18. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
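
    The contrast the experiment probes, pattern probabilities versus first-order transitional probabilities, can be computed directly from a tone sequence. A toy sketch with an H-L-H standard stream and one reversal deviant (the sequence is illustrative, not the study's stimulus protocol):

```python
from collections import Counter

def pattern_probs(seq, n=3):
    """Probability of each n-tone pattern (triplet) in the stream."""
    grams = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def transition_probs(seq):
    """First-order transitional probabilities P(next tone | current tone)."""
    pairs = Counter(zip(seq, seq[1:]))
    firsts = Counter(seq[:-1])
    return {(a, b): c / firsts[a] for (a, b), c in pairs.items()}

# Toy stream: H-L-H standard triplets with one reversal deviant (L-H-L)
seq = list("HLH" * 6 + "LHL" + "HLH" * 6)
pp = pattern_probs(seq)
tp = transition_probs(seq)
# The reversal deviant is a rare *pattern*, yet all of its internal
# transitions (L->H, H->L) are the common ones -- the key contrast.
```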

  19. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  20. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals' health condition, income, and elderly status, and estimates changes in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and, alternatively, 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the financial burden of out-of-pocket spending, and the most recent upward trends in it, underscore the need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act's success in reducing Americans' exposure to large medical bills.
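
    Once a logistic regression of this kind is fitted, the probability for a given individual follows from the usual inverse-logit formula; a sketch with invented coefficients (not the article's estimates):

```python
from math import exp

def logistic_prob(coefs, x):
    """Inverse logit: p = 1 / (1 + exp(-(b0 + b1*x1 + ...)))."""
    z = coefs[0] + sum(b * v for b, v in zip(coefs[1:], x))
    return 1.0 / (1.0 + exp(-z))

# Hypothetical coefficients: intercept, then indicators for
# low income, poor health, and elderly status
coefs = [-2.2, 1.1, 1.4, 0.6]

# A nonelderly, low-income individual in poor health
p = logistic_prob(coefs, [1, 1, 0])
```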

  1. Stochastic Analysis and Applied Probability(3.3.1): Topics in the Theory and Applications of Stochastic Analysis

    Science.gov (United States)

    2015-08-13

    Critical Catalyst Reactant Branching Processes with Controlled Immigration, Annals of Applied Probability (03 2012), Amarjit Budhiraja, Rami Atar ... Markus Fischer, Large Deviation Properties of Weakly Interacting Processes via Weak Convergence Methods, Annals of Probability (10 2010), Rami Atar ... Dimensional Forward-Backward Stochastic Differential Equations and the KPZ Equation, Electron. J. Probab., 19 (2014), no. 40, 121. [2] R. Atar and A

  2. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  3. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The weighted average of the two average sample numbers is minimized under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases in which the random variate follows a normal law as well as a Bernoullian law.
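
    Wald's sequential probability ratio test, the procedure being modified here, can be sketched in a few lines. The Bernoulli example and the error rates below are illustrative; the paper's optimization over the constrained sum of error probabilities is not reproduced:

```python
from math import log

def sprt_bounds(alpha, beta):
    """Wald's decision thresholds on the accumulated log-likelihood ratio."""
    return log(beta / (1 - alpha)), log((1 - beta) / alpha)

def sprt(samples, llr, alpha=0.05, beta=0.05):
    """Accumulate log LR(H1 : H0) per observation until a bound is crossed."""
    lo, hi = sprt_bounds(alpha, beta)
    s = 0.0
    for n, x in enumerate(samples, 1):
        s += llr(x)
        if s >= hi:
            return "accept H1", n
        if s <= lo:
            return "accept H0", n
    return "undecided", len(samples)

# Bernoulli example, H0: p = 0.5 vs H1: p = 0.8, with alpha + beta = 0.1
llr = lambda x: log(0.8 / 0.5) if x else log(0.2 / 0.5)
decision, n = sprt([1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1], llr)
```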

  4. Large local reactions to insect envenomation.

    Science.gov (United States)

    Carlson, John; Golden, David B K

    2016-08-01

    Insect stings often induce large local reactions (LLRs) that result in morbidity. These reactions do have an immunologic basis; however, patients presenting with LLRs should be managed differently than those with systemic allergic reactions, as described in this review. Morbidity results from the inflammation itself along with the iatrogenic consequences of treatment. The prescription of antihistamine medications and the use of antibiotics are generally not indicated for patients with LLRs because of the risks/side-effects of these medications and the low probability of benefit. Some patients are also concerned over the possibility that a future sting will evolve into a life-threatening reaction. Although these reactions do involve IgE, patients are not at sufficient risk to warrant prescription of autoinjectable epinephrine. Venom-specific immunotherapy can be considered when LLRs are frequent and associated with significant impairment. Clinicians can reduce morbidity from LLRs by reassuring the patients, avoiding medications that result in side-effects when they are not indicated, and referring to an allergist when there are additional concerns, such as frequent impairment.

  5. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical natures”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  6. Knot probability of polygons subjected to a force: a Monte Carlo study

    International Nuclear Information System (INIS)

    Rensburg, E J Janse van; Orlandini, E; Tesi, M C; Whittington, S G

    2008-01-01

We use Monte Carlo methods to study the knot probability of lattice polygons on the cubic lattice in the presence of an external force f. The force is coupled to the span of the polygons along a lattice direction, say the z-direction. If the force is negative polygons are squeezed (the compressive regime), while positive forces tend to stretch the polygons along the z-direction (the tensile regime). For sufficiently large positive forces we verify that the Pincus scaling law in the force-extension curve holds. At a fixed number of edges n the knot probability is a decreasing function of the force. For a fixed force the knot probability approaches unity as 1 - exp(-α_0(f)n + o(n)), where α_0(f) is positive and a decreasing function of f. We also examine the average of the absolute value of the writhe, and we verify the square-root growth law (known for f = 0) for all values of f.

  7. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  8. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity, or in predicting future VF changes, in patients with classically defined preperimetric glaucoma (PPG). In this longitudinal study, 43 eyes of 43 PPG patients that were followed up every 6 months for at least 2 years were analyzed. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlapped with SAP VF test points could be generated. We evaluated the vulnerable VF points with SS-OCT probability maps as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were shown in superior and inferior arcuate patterns near the central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes using SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  9. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. 
(c) All operational models should be evaluated

  10. Smoothing and projecting age-specific probabilities of death by TOPALS

    Directory of Open Access Journals (Sweden)

    Joop de Beer

    2012-10-01

    Full Text Available BACKGROUND TOPALS is a new relational model for smoothing and projecting age schedules. The model is operationally simple, flexible, and transparent. OBJECTIVE This article demonstrates how TOPALS can be used for both smoothing and projecting age-specific mortality for 26 European countries and compares the results of TOPALS with those of other smoothing and projection methods. METHODS TOPALS uses a linear spline to describe the ratios between the age-specific death probabilities of a given country and a standard age schedule. For smoothing purposes I use the average of death probabilities over 15 Western European countries as standard, whereas for projection purposes I use an age schedule of 'best practice' mortality. A partial adjustment model projects how quickly the death probabilities move in the direction of the best-practice level of mortality. RESULTS On average, TOPALS performs better than the Heligman-Pollard model and the Brass relational method in smoothing mortality age schedules. TOPALS can produce projections that are similar to those of the Lee-Carter method, but can easily be used to produce alternative scenarios as well. This article presents three projections of life expectancy at birth for the year 2060 for 26 European countries. The Baseline scenario assumes a continuation of the past trend in each country, the Convergence scenario assumes that there is a common trend across European countries, and the Acceleration scenario assumes that the future decline of death probabilities will exceed that in the past. The Baseline scenario projects that average European life expectancy at birth will increase to 80 years for men and 87 years for women in 2060, whereas the Acceleration scenario projects an increase to 90 and 93 years respectively. CONCLUSIONS TOPALS is a useful new tool for demographers for both smoothing age schedules and making scenarios.
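TOPALS, as described above, models the ratios between a country's age-specific death probabilities and a standard schedule with a linear spline. The sketch below is a simplification: it works on the log-ratio scale and replaces the fitted spline ordinates with local averages of the log-ratio around each knot, then interpolates linearly:

```python
import numpy as np

def topals_smooth(qx_obs, qx_std, knots):
    """TOPALS-style smoothing sketch: describe the log-ratio of observed to
    standard death probabilities by a linear spline. Spline ordinates are
    approximated here by local averages (+/- 2 ages around each knot),
    a simplification of the actual fitting procedure."""
    ages = np.arange(len(qx_obs))
    log_ratio = np.log(np.asarray(qx_obs) / np.asarray(qx_std))
    ordinates = [log_ratio[max(k - 2, 0):k + 3].mean() for k in knots]
    spline = np.interp(ages, knots, ordinates)  # linear spline over ages
    return qx_std * np.exp(spline)
```

For projection, the same machinery applies with a 'best practice' schedule as the standard and a partial adjustment of the spline ordinates over time.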

  11. Large Coil Program magnetic system design study

    International Nuclear Information System (INIS)

    Moses, S.D.; Johnson, N.E.

    1977-01-01

The primary objective of the Large Coil Program (LCP) is to demonstrate the reliable operation of large superconducting coils to provide a basis for the design principles, materials, and fabrication techniques proposed for the toroidal magnets of The Next Step (TNS) and other future tokamak devices. This paper documents a design study of the Large Coil Test Facility (LCTF) in which the structural response of the Toroidal Field (TF) Coils and the supporting structure was evaluated under simulated reactor conditions. The LCP test facility structural system consists of six TF Coils, twelve coil-to-coil torsional restraining beams (torque rings), a central bucking post with base, and a Pulse Coil system. The NASTRAN finite element structural analysis computer code was utilized to determine the distribution of deflections, forces, and stresses for each of the TF Coils, torque rings, and the central bucking post. Eleven load conditions were selected to represent probable test operations. Pulse Coils suspended in the bore of the test coil were energized to simulate the pulsed field environment characteristic of the TNS reactor system. The TORMAC computer code was utilized to develop the magnetic forces in the TF Coils for each of the eleven loading conditions examined, with or without the Pulse Coils energized. The TORMAC output forces were used directly as input load conditions for the NASTRAN analyses. Results are presented which demonstrate the reliability of the LCTF under simulated reactor operating conditions.

  12. Entanglement transitions induced by large deviations

    Science.gov (United States)

    Bhosale, Udaysinh T.

    2017-12-01

The probability of large deviations of the smallest Schmidt eigenvalue for random pure states of bipartite systems, denoted as A and B, is computed analytically using a Coulomb gas method. It is shown that this probability, for large N, goes as exp[-βN^2 Φ(ζ)], where the parameter β is the Dyson index of the ensemble, ζ is the large deviation parameter, and the rate function Φ(ζ) is calculated exactly. The corresponding equilibrium Coulomb charge density is derived for its large deviations. Effects of the large deviations of the extreme (largest and smallest) Schmidt eigenvalues on the bipartite entanglement are studied using the von Neumann entropy. The effect of these deviations is also studied on the entanglement between subsystems 1 and 2, obtained by further partitioning the subsystem A, using the properties of the partial transpose ρ_12^Γ of the density matrix. The density of states of ρ_12^Γ is found to be close to Wigner's semicircle law under these large deviations. The entanglement properties are captured very well by a simple random matrix model for the partial transpose. The model predicts an entanglement transition across a critical large deviation parameter ζ. Log negativity is used to quantify the entanglement between subsystems 1 and 2. Analytical formulas for it are derived using the simple model. Numerical simulations are in excellent agreement with the analytical results.
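The smallest Schmidt eigenvalue whose large deviations are studied above is easy to sample numerically: a random bipartite pure state corresponds to a normalised complex Gaussian coefficient matrix, and the Schmidt eigenvalues are the squared singular values of that matrix (equivalently, the eigenvalues of the reduced density matrix):

```python
import numpy as np

def smallest_schmidt_eigenvalue(N, rng):
    """Sample a random pure state of an N x N bipartite system and return
    its smallest Schmidt eigenvalue. The state's coefficient matrix is a
    normalised complex Gaussian; Schmidt eigenvalues are s_i^2 from its SVD
    and sum to 1."""
    G = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    G /= np.linalg.norm(G)  # Frobenius norm 1 -> normalised pure state
    s = np.linalg.svd(G, compute_uv=False)
    return (s ** 2).min()
```

Repeated sampling of this quantity gives the empirical distribution whose exp[-βN²Φ(ζ)] tail the paper computes analytically.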

  13. Use of soft probabilities in evaluating physical-security systems

    International Nuclear Information System (INIS)

    Green, J.N.

    1982-03-01

    The complexity of evaluating how a physical security system would perform against a broad array of threat situations dictates the use by an inspector of methods which are not completely rigorous. Intuition and judgment based on experience have a large role to play. The use of soft probabilities can give meaningful results when the nature of the situation to which they are applied is sufficiently understood. Although the scoring method proposed is based on complex theory, it is feasible to apply on an intuitive basis. 6 figures

  14. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
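Independently of the truth-table derivation the paper proposes, the standard two-hypothesis form of Bayes' rule that it reworks can be computed directly:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule for a two-hypothesis partition:
    P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence
```

For instance, a prior of 0.01 with likelihoods 0.9 and 0.05 yields a posterior of roughly 0.154, the familiar base-rate result that such derivations formalize.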

  15. Estimating the Probabilities of Low-Weight Differential and Linear Approximations on PRESENT-like Ciphers

    DEFF Research Database (Denmark)

    Abdelraheem, Mohamed Ahmed

    2012-01-01

    We use large but sparse correlation and transition-difference-probability submatrices to find the best linear and differential approximations respectively on PRESENT-like ciphers. This outperforms the branch and bound algorithm when the number of low-weight differential and linear characteristics...

  16. Large electrostatic accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1984-01-01

    The paper is divided into four parts: a discussion of the motivation for the construction of large electrostatic accelerators, a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year

  17. Activity in inferior parietal and medial prefrontal cortex signals the accumulation of evidence in a probability learning task.

    Directory of Open Access Journals (Sweden)

    Mathieu d'Acremont

    Full Text Available In an uncertain environment, probabilities are key to predicting future events and making adaptive choices. However, little is known about how humans learn such probabilities and where and how they are encoded in the brain, especially when they concern more than two outcomes. During functional magnetic resonance imaging (fMRI, young adults learned the probabilities of uncertain stimuli through repetitive sampling. Stimuli represented payoffs and participants had to predict their occurrence to maximize their earnings. Choices indicated loss and risk aversion but unbiased estimation of probabilities. BOLD response in medial prefrontal cortex and angular gyri increased linearly with the probability of the currently observed stimulus, untainted by its value. Connectivity analyses during rest and task revealed that these regions belonged to the default mode network. The activation of past outcomes in memory is evoked as a possible mechanism to explain the engagement of the default mode network in probability learning. A BOLD response relating to value was detected only at decision time, mainly in striatum. It is concluded that activity in inferior parietal and medial prefrontal cortex reflects the amount of evidence accumulated in favor of competing and uncertain outcomes.

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
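Simultaneous intervals of the kind described above can be approximated by simulation. The sketch below uses a Bonferroni-style (conservative) allocation of α across the n order statistics rather than the paper's exact construction:

```python
import numpy as np

def simultaneous_bands(n, alpha=0.05, sims=5000, seed=0):
    """Conservative simultaneous bands for the order statistics of a
    standard normal sample of size n, via simulation. Each order statistic
    gets pointwise level alpha/n (Bonferroni), so joint coverage is at
    least 1 - alpha. A sketch, not the paper's exact intervals."""
    rng = np.random.default_rng(seed)
    order_stats = np.sort(rng.standard_normal((sims, n)), axis=1)
    lo = np.quantile(order_stats, alpha / (2 * n), axis=0)
    hi = np.quantile(order_stats, 1 - alpha / (2 * n), axis=0)
    return lo, hi
```

A (standardised) sorted sample then "falls close to the line" if and only if every point lies inside its band: `np.all((lo <= s) & (s <= hi))`.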

  19. Flood protection diversification to reduce probabilities of extreme losses.

    Science.gov (United States)

    Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor

    2012-11-01

    Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.

  20. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    Science.gov (United States)

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
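A stripped-down sketch of the partial estimator described above: probability-scale residuals (PSRs) from models of X on Z and of Y on Z are correlated. Ordinary linear-normal working models stand in here for the semiparametric cumulative probability models used in the paper; under a fitted normal model the PSR is 2·F(x) − 1 = 2·Φ((x − x̂)/σ) − 1:

```python
import numpy as np
from math import erf, sqrt

def psr_normal(v, vhat, sigma):
    """Probability-scale residual 2*F(v) - 1 under a fitted normal model;
    erf(z/sqrt(2)) equals 2*Phi(z) - 1."""
    return np.array([erf((a - m) / (sigma * sqrt(2))) for a, m in zip(v, vhat)])

def partial_spearman(x, y, z):
    """Partial Spearman's correlation of x and y adjusted for z: the
    Pearson correlation of PSRs from (here) simple linear-normal models
    of x on z and of y on z. A sketch of the estimator's structure only."""
    bx = np.polyfit(z, x, 1)
    by = np.polyfit(z, y, 1)
    xhat, yhat = np.polyval(bx, z), np.polyval(by, z)
    px = psr_normal(x, xhat, (x - xhat).std())
    py = psr_normal(y, yhat, (y - yhat).std())
    return np.corrcoef(px, py)[0, 1]
```

This mirrors the analogy drawn in the abstract: where partial Pearson correlates observed-minus-expected residuals, partial Spearman correlates probability-scale residuals.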

  1. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea used is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed, and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
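The three goodness-of-fit tests compared in the abstract are available in scipy.stats; a small sketch on synthetic data (note that estimating the parameters from the same data makes the Kolmogorov-Smirnov p-value only approximate, the Lilliefors caveat):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=500)
mu, sigma = data.mean(), data.std(ddof=1)

# Kolmogorov-Smirnov distance to the fitted normal CDF
ks = stats.kstest(data, "norm", args=(mu, sigma))

# Anderson-Darling: scipy reports the statistic plus critical values
# (no p-value) for the normal case
ad = stats.anderson(data, dist="norm")

# Chi-squared on ten equal-count bins vs expected normal frequencies;
# ddof=2 accounts for the two estimated parameters
edges = np.quantile(data, np.linspace(0.0, 1.0, 11))
observed = np.histogram(data, bins=edges)[0]
probs = np.diff(stats.norm.cdf(edges, loc=mu, scale=sigma))
chi2 = stats.chisquare(observed, probs / probs.sum() * observed.sum(), ddof=2)
```

Ranking candidate distributions by their test statistics (smaller distance is better) is exactly the model-comparison feature the article highlights.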

  2. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  3. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by ""the Newton of France"" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences.Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory ""to the most important questions of life, which are, in effect, for the most part, problems in probability."" Thus, without the use of higher mathematics, he demonstrates the application

  4. Exact asymptotics of probabilities of large deviations for Markov chains: the Laplace method

    Energy Technology Data Exchange (ETDEWEB)

    Fatalov, Vadim R [M. V. Lomonosov Moscow State University, Faculty of Mechanics and Mathematics, Moscow (Russian Federation)

    2011-08-31

We prove results on exact asymptotics as n → ∞ for the expectations E_a exp{-θ Σ_{k=0}^{n-1} g(X_k)} and probabilities P_a{(1/n) Σ_{k=0}^{n-1} g(X_k) < d}, where {X_k}, k ≥ 1, is the corresponding random walk on R, g(x) is a positive continuous function satisfying certain conditions, and d > 0, θ > 0, a ∈ R are fixed numbers. Our results are obtained using a new method which is developed in this paper: the Laplace method for the occupation time of discrete-time Markov chains. For g(x) one can take |x|^p, log(|x|^p + 1), p > 0, |x| log(|x| + 1), or e^{α|x|} - 1, 0 < α < 1/2, x ∈ R, for example. We give a detailed treatment of the case g(x) = |x|, using Bessel functions to make explicit calculations.

  5. Impacts of Changing Climatic Drivers and Land use features on Future Stormwater Runoff in the Northwest Florida Basin: A Large-Scale Hydrologic Modeling Assessment

    Science.gov (United States)

    Khan, M.; Abdul-Aziz, O. I.

    2017-12-01

Potential changes in climatic drivers and land cover features can significantly influence the stormwater budget in the Northwest Florida Basin. We investigated the hydro-climatic and land use sensitivities of stormwater runoff by developing a large-scale, process-based rainfall-runoff model for the basin using the EPA Storm Water Management Model (SWMM 5.1). Climatic and hydrologic variables, as well as land use/cover features, were incorporated into the model to account for the key processes of coastal hydrology and its dynamic interactions with groundwater and sea levels. We calibrated and validated the model against historical daily streamflow observations during 2009-2012 at four major rivers in the basin. Downscaled climatic drivers (precipitation, temperature, solar radiation) projected by twenty GCMs-RCMs under CMIP5, along with projected future land use/cover features, were also incorporated into the model. The basin storm runoff was then simulated for the historical period (2000s = 1976-2005) and two future periods (2050s = 2030-2059, and 2080s = 2070-2099). Comparative evaluation of the historical and future scenarios leads to important guidelines for stormwater management in Northwest Florida and similar regions under a changing climate and environment.

  6. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If the climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood for microevolution to play a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting the future concerning probabilities of extreme population events.

  7. Large area thinned planar sensors for future high-luminosity-LHC upgrades

    International Nuclear Information System (INIS)

    Wittig, T.; Lawerenz, A.; Röder, R.

    2016-01-01

Planar hybrid silicon sensors are a well proven technology for past and current particle tracking detectors in HEP experiments. However, the future high-luminosity upgrades of the inner trackers at the LHC experiments pose big challenges to the detectors. A first challenge is an expected radiation damage level of up to 2·10^16 n_eq/cm^2. For planar sensors, one way to counteract the charge loss and thus increase the radiation hardness is to decrease the thickness of their active area. A second challenge is the large detector area, which has to be built as cost-efficiently as possible. The CiS research institute has accomplished a proof-of-principle run with n-in-p ATLAS-Pixel sensors in which a cavity is etched into the sensor's back side to reduce its thickness. One advantage of this technology is the fact that thick frames remain at the sensor edges and guarantee mechanical stability on wafer level while the sensor is left on the resulting thin membrane. For this cavity etching technique, no handling wafers are required, which represents a benefit in terms of process effort and cost savings. The membranes, with areas of up to ~4 × 4 cm^2 and thicknesses of 100 and 150 μm, feature a sufficiently good homogeneity across the whole wafer area. The processed pixel sensors show good electrical behaviour with an excellent yield for such a prototype run. The first sensors with electroless Ni- and Pt-UBM have already been successfully assembled with read-out chips.

  9. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  10. Future-Oriented LCA

    DEFF Research Database (Denmark)

    Olsen, Stig Irving; Borup, Mads; Andersen, Per Dannemand

    2018-01-01

    LCA is often applied for decision-making that concerns actions reaching near or far into the future. However, traditional life cycle assessment methodology must be adjusted for prospective and change-oriented purposes, and no standardised way of doing this has emerged yet. In this chapter some challenges are described and some learnings are derived. Many of the future-oriented LCAs published so far perform relatively short-term predictions of simple comparisons, but for more long-term time horizons foresight methods can be of help. Scenarios established by qualified experts about future technological and economic developments are indispensable in future technology assessments. The uncertainties in future-oriented LCAs are to a large extent qualitative, and it is important to emphasise that LCA of future technologies will provide a set of answers and not 'the' answer.

  11. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  12. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  13. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.

  14. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  15. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
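    The conditional probability and Bayes' theorem concepts surveyed in this report can be illustrated with a short sketch. The event probabilities below are invented for illustration only and are not taken from the report:

    ```python
    # Bayes' theorem with made-up numbers: a detector flags a part
    # (event B) and we ask how likely the part is actually defective
    # (event A) given the flag.
    p_a = 0.02              # prior P(A): part is defective
    p_b_given_a = 0.95      # sensitivity P(B|A)
    p_b_given_not_a = 0.10  # false-positive rate P(B|not A)

    # Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

    # Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
    p_a_given_b = p_b_given_a * p_a / p_b
    print(round(p_a_given_b, 3))
    ```

    Even with 95% sensitivity, the low prior means a flagged part is defective only about 16% of the time, the classic base-rate effect that conditional probability makes explicit.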

  16. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  17. The effects of a Danish legal reform on divorce probabilities and pension savings

    DEFF Research Database (Denmark)

    Amilon, Anna

    2012-01-01

    This paper investigates how a recent Danish legal reform that changed how pension savings are shared upon divorce influenced divorce probabilities and pension savings. A simple theoretical model predicts that the reform should cause couples either to get divorced or to sign postnuptial agreements. Using a large panel of Danish register data, I find that the probability of getting divorced increased by between 0.1 and 0.3% for couples that were affected by the reform, compared to couples that were not. Although these increases might appear small, given that the average divorce rate for the couples in the sample was less than 1%, the relative increase in divorce probabilities caused by the reform was between 12 and 40%. The effects are largest when the wife is the one gaining from getting divorced. Moreover, the reform influenced wives', but not husbands', pension savings and caused…

  18. Cluster Validity Classification Approaches Based on Geometric Probability and Application in the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    LI Jian-Wei

    2014-08-01

    On the basis of the cluster validity function based on geometric probability proposed in [1, 2], we propose a cluster analysis method based on geometric probability to process large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement: first into categories, then into subcategories. At each clustering level, the cluster validity function based on geometric probability is first used to determine the clusters and the gathering direction, and then the cluster centres and cluster borders are determined. Through TM remote sensing image classification examples, the method is compared with the supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in [2]. Results show that the proposed method can significantly improve the classification accuracy.

  19. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  20. Decommissioning of nuclear reprocessing plants French past experience and approach to future large scale operations

    International Nuclear Information System (INIS)

    Jean Jacques, M.; Maurel, J.J.; Maillet, J.

    1994-01-01

    Over the years, France has built up significant experience in dismantling nuclear fuel reprocessing facilities and various types of units representative of a modern reprocessing plant. However, only small- or medium-scale operations have been carried out so far. To prepare for the future decommissioning of large industrial facilities such as UP1 (Marcoule) and UP2 (La Hague), new technologies must be developed to maximize waste recycling and optimize direct operations by operators, taking the integrated dose and cost aspects into account. The decommissioning and dismantling methodology comprises: a preparation phase for inventory, choice and installation of tools, and arrangement of working areas; a dismantling phase with decontamination; and a final contamination control phase. Detailed descriptions of the dismantling operations of the MA Pu finishing facility (La Hague) and of the RM2 radiometallurgical laboratory (CEA-Fontenay-aux-Roses) are given as examples. (J.S.). 3 tabs

  1. Vicious random walkers in the limit of a large number of walkers

    International Nuclear Information System (INIS)

    Forrester, P.J.

    1989-01-01

    The vicious random walker problem on a line is studied in the limit of a large number of walkers. The multidimensional integral representing the probability that the p walkers will survive a time t (denoted P_t^(p)) is shown to be analogous to the partition function of a particular one-component Coulomb gas. By assuming the existence of the thermodynamic limit for the Coulomb gas, one can deduce asymptotic formulas for P_t^(p) in the large-p, large-t limit. A straightforward analysis gives rigorous asymptotic formulas for the probability that after a time t the walkers are in their initial configuration (this event is termed a reunion). Consequently, asymptotic formulas for the conditional probability of a reunion, given that all walkers survive, are derived. Also, an asymptotic formula for the conditional probability density that any walker will arrive at a particular point in time t, given that all p walkers survive, is calculated in the limit t >> p

  2. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower-probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  3. Energy future 2050

    Energy Technology Data Exchange (ETDEWEB)

    Syri, S; Kainiemi, L; Riikonen, V [Aalto Univ. School of Engineering, Espoo (Finland). Dept. of Energy Technology

    2011-07-01

    The track was organized by the Department of Energy Technology, School of Engineering, at Aalto University. The Energy Future 2050 track introduced participants to the global long-term challenges of achieving a sustainable energy supply. According to the Intergovernmental Panel on Climate Change (IPCC), effective climate change mitigation would require global greenhouse gas emissions to be reduced by 50-85% from the present level by 2050. For industrialized countries, this would probably mean a practically carbon-neutral economy and energy supply, as developing countries need more room for growth and will probably enter stricter emission reduction commitments with some delay. At the beginning of the workshop, students were introduced to global energy scenarios and the challenge of climate change mitigation. Students worked in three groups on the following topics: how to gain public acceptance of carbon dioxide capture and storage (CCS); personal emissions trading as a tool to achieve deep emission cuts; and how to get rid of fossil fuel subsidies, with peat use in Finland and Sweden as Nordic cases. (orig.)

  4. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, burst waiting time on a pulse reactor, bursting time on a pulse reactor, etc., is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space from probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolvement of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference between the DSNP and Partisn results is less than 2%. On the Baker model, over a range of about 1 μs after first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases; the dynamic evolvement curve of the initiation probability stays within 5% of the static curve when K_eff is more than 1.2. The cumulative probability curve also indicates that the difference of integral results between the dynamic and static calculations decreases from 35% to 5% as K_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important difference of the dynamic results near first criticality from the static ones. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of…

  5. Symposium report: the Waters Bioanalysis World Tour: the broadening impact and future of the DMPK laboratory--addressing large-molecule therapeutics.

    Science.gov (United States)

    De Vooght-Johnson, Ryan

    2011-03-01

    An evening symposium was held at the Museu de Historia de Catalunya (Barcelona, Spain) as a precursor to the European Bioanalysis Forum meeting, as part of the Waters Corporation Bioanalysis World Tour. The symposium was chaired by Robert Plumb and Jing Lin (Waters Corporation, MA, USA) with a focus on the future of the DMPK laboratory and its role in addressing large-molecule therapeutics and biomarkers. Lieve Dillen (Johnson and Johnson, Belgium) spoke on ultra-sensitive peptide quantification, Richard Kay (Quotient Bioresearch, UK) discussed quantifying proteins and peptides in plasma, Ian Wilson (AstraZeneca, UK) covered metabolic biomarkers, and Robert Plumb concluded the evening with a presentation on the future of MS in DMPK studies. Following the presentations, all the speakers took questions from the audience and continued the lively discussion over a cocktail and canapé reception.

  6. Spacetime quantum probabilities II: Relativized descriptions and Popperian propensities

    Science.gov (United States)

    Mugur-Schächter, M.

    1992-02-01

    In the first part of this work(1) we explicated the spacetime structure of the probabilistic organization of quantum mechanics. We showed that each quantum mechanical state, in consequence of the spacetime characteristics of the epistemic operations by which the observer produces the state to be studied and the processes of qualification of these, brings in a tree-like spacetime structure, a "quantum mechanical probability tree," that transgresses the theory of probabilities as it now stands. In this second part we develop the general implications of these results. Starting from the lowest level of cognitive action and creating an appropriate symbolism, we construct a "relativizing epistemic syntax," a "general method of relativized conceptualization" where, systematically, each description is explicitly referred to the epistemic operations by which the observer produces the entity to be described and obtains qualifications of it. The method generates a typology of increasingly complex relativized descriptions where the question of realism admits of a particularly clear pronouncement. Inside this typology, the epistemic processes that lie universally at the basis of any conceptualization reveal a tree-like spacetime structure. It appears in particular that the spacetime structure of the relativized representation of a probabilistic description, which transgresses the present-day theory of probabilities, is the general mould of which the quantum mechanical probability trees are only particular realizations. This entails a clear definition of the descriptional status of quantum mechanics, while the recognition of the universal cognitive content of the quantum mechanical formalism opens up vistas toward mathematical developments of the relativizing epistemic syntax. The relativized representation of a probabilistic description leads with inner necessity to a "morphic" interpretation of probabilities that can be regarded as a formalized and

  7. Probability of Accurate Heart Failure Diagnosis and the Implications for Hospital Readmissions.

    Science.gov (United States)

    Carey, Sandra A; Bass, Kyle; Saracino, Giovanna; East, Cara A; Felius, Joost; Grayburn, Paul A; Vallabhan, Ravi C; Hall, Shelley A

    2017-04-01

    Heart failure (HF) is a complex syndrome with inherent diagnostic challenges. We studied the scope of possibly inaccurately documented HF in a large health care system among patients assigned a primary diagnosis of HF at discharge. Through a retrospective record review and a classification schema developed from published guidelines, we assessed the probability of the documented HF diagnosis being accurate and determined factors associated with HF-related and non-HF-related hospital readmissions. An arbitration committee of 3 experts reviewed a subset of records to corroborate the results. We assigned a low probability of accurate diagnosis to 133 (19%) of the 712 patients. A subset of patients were also reviewed by an expert panel, which concluded that 13% to 35% of patients probably did not have HF (inter-rater agreement, kappa = 0.35). Low-probability HF was predictive of being readmitted more frequently for non-HF causes (p = 0.018), as well as documented arrhythmias (p = 0.023), and age >60 years (p = 0.006). Documented sleep apnea (p = 0.035), percutaneous coronary intervention (p = 0.006), non-white race (p = 0.047), and B-type natriuretic peptide >400 pg/ml (p = 0.007) were determined to be predictive of HF readmissions in this cohort. In conclusion, approximately 1 in 5 patients documented to have HF were found to have a low probability of actually having it. Moreover, the determination of low-probability HF was twice as likely to result in readmission for non-HF causes and, thus, should be considered a determinant for all-cause readmissions in this population. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  9. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  10. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  11. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
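    The slice-to-probability mapping behind such a wheel can be sketched in a few lines. The function name and the degree-based input are hypothetical, not the app's actual interface:

    ```python
    # A probability wheel shows two colored slices; the fraction of the
    # wheel a slice covers encodes the probability assigned to its
    # outcome.  The angle here is a hypothetical user input in degrees.
    def slice_probabilities(deg_outcome_a: float) -> tuple[float, float]:
        """Map slice A's angle (0-360 degrees) to the two probabilities."""
        if not 0.0 <= deg_outcome_a <= 360.0:
            raise ValueError("slice angle must be within 0-360 degrees")
        p_a = deg_outcome_a / 360.0
        return p_a, 1.0 - p_a

    print(slice_probabilities(90.0))  # a quarter of the wheel -> (0.25, 0.75)
    ```

    Because the two probabilities are constrained to sum to one, adjusting a single slice is enough, which is what lets the prop attenuate anchoring bias during elicitation.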

  12. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  13. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  14. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  15. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, together with the techniques to process these queries. The paper also presents methods for computing the probability distributions, which enable pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate processing for the kind of multidimensional data analysis considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  16. Systematics of the breakup probability function for ⁶Li and ⁷Li projectiles

    Energy Technology Data Exchange (ETDEWEB)

    Capurro, O.A., E-mail: capurro@tandar.cnea.gov.ar [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); Pacheco, A.J.; Arazi, A. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Carnelli, P.F.F. [CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); Fernández Niello, J.O. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); and others

    2016-01-15

    Experimental non-capture breakup cross sections can be used to determine the probability of projectile and ejectile fragmentation in nuclear reactions involving weakly bound nuclei. Recently, the probability of both types of dissociation has been analyzed in nuclear reactions involving ⁹Be projectiles on various heavy targets at sub-barrier energies. In the present work we extend this kind of systematic analysis to the case of ⁶Li and ⁷Li projectiles with the purpose of investigating general features of projectile-like breakup probabilities for reactions induced by stable weakly bound nuclei. For that purpose we have obtained the probabilities of projectile and ejectile breakup for a large number of systems, starting from a compilation of the corresponding reported non-capture breakup cross sections. We parametrize the results in accordance with the previous studies for the case of beryllium projectiles, and we discuss their systematic behavior as a function of the projectile, the target mass and the reaction Q-value.

  17. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic physics. I try to outline a framework for fundamental physics where the concept of probability is replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  18. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  19. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
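
    The product-space structure described above is what drives the familiar "double loop" sampling used in such performance assessments. As a minimal sketch (all distributions and parameter values below are invented for illustration, not taken from the WIPP PA), an outer loop samples the subjective (epistemic) space and an inner loop samples the stochastic (aleatory) space, yielding one complementary cumulative distribution function (CCDF) per outer sample:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical illustration of the two-probability-space structure:
# an outer (subjective) loop samples an uncertain model parameter,
# an inner (stochastic) loop samples possible futures, and each
# outer sample yields one CCDF of the consequence measure.
n_epistemic, n_aleatory = 20, 1000
release_levels = np.linspace(0.0, 5.0, 50)

ccdfs = []
for _ in range(n_epistemic):
    mean_release = rng.uniform(0.5, 2.0)                   # subjective uncertainty
    releases = rng.exponential(mean_release, n_aleatory)   # stochastic futures
    # CCDF: probability that the release exceeds each level
    ccdfs.append([(releases > r).mean() for r in release_levels])

ccdfs = np.array(ccdfs)          # family of CCDFs, one per epistemic sample
mean_ccdf = ccdfs.mean(axis=0)   # expected CCDF over subjective uncertainty
```

The spread between the curves in the CCDF family displays subjective uncertainty, while each individual curve summarizes stochastic uncertainty.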

  20. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  1. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability corresponds closely to the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability

  2. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  3. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  4. Preliminary Evaluation of the Effects of Buried Volcanoes on Estimates of Volcano Probability for the Proposed Repository Site at Yucca Mountain, Nevada

    Science.gov (United States)

    Hill, B. E.; La Femina, P. C.; Stamatakos, J.; Connor, C. B.

    2002-12-01

    Probability models that calculate the likelihood of new volcano formation in the Yucca Mountain (YM) area depend on the timing and location of past volcanic activity. Previous spatio-temporal patterns indicated a 10^-4 to 10^-3 probability of volcanic disruption of the proposed radioactive waste repository site at YM during the 10,000 year post-closure performance period (Connor et al. 2000, JGR 105:1). A recent aeromagnetic survey (Blakely et al. 2000, USGS OFR 00-188), however, identified up to 20 anomalies in alluvium-filled basins, which have characteristics indicative of buried basalt (O'Leary et al. 2002, USGS OFR 02-020). Independent evaluation of these data, combined with new ground magnetic surveys, shows that these anomalies may represent at least ten additional buried basaltic volcanoes, which have not been included in previous probability calculations. This interpretation, if true, nearly doubles the number of basaltic volcanoes within 30 km [19 mi] of YM. Moreover, the magnetic signature of about half of the recognized basaltic volcanoes in the YM area cannot be readily identified in areas where bedrock also produces large amplitude magnetic anomalies, suggesting that additional volcanoes may be present but undetected in the YM area. In the absence of direct age information, we evaluate the potential effects of alternative age assumptions on spatio-temporal probability models. Interpreted burial depths of >50 m [164 ft] suggest ages >2 Ma, based on sedimentation rates typical for these alluvial basins (Stamatakos et al., 1997, J. Geol. 105). Defining volcanic events as individual points, previous probability models generally used recurrence rates of 2-5 volcanoes/million years (v/Myr). If the identified anomalies are buried volcanoes that are all >5 Ma or uniformly distributed between 2-10 Ma, calculated probabilities of future volcanic disruption at YM change by <30%. 
However, a uniform age distribution between 2-5 Ma for the presumed buried volcanoes
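
    The quoted disruption probabilities are consistent with a simple homogeneous-Poisson recurrence model. The sketch below uses invented numbers (a recurrence rate of 4 v/Myr and a 1% chance that any new volcano disrupts the repository footprint) purely to illustrate the arithmetic, not values from the study:

```python
import math

# Illustrative Poisson recurrence model (all numbers are assumptions):
# new volcanoes form at `rate_per_myr` volcanoes per million years, and
# a fraction `p_hit` of new volcanoes would disrupt the repository.
def disruption_probability(rate_per_myr, p_hit, years):
    lam = rate_per_myr * (years / 1e6) * p_hit   # expected disruptive events
    return 1.0 - math.exp(-lam)                  # P(at least one event)

# 4 v/Myr, 1% hit chance, 10,000-year performance period:
p = disruption_probability(4.0, 0.01, 10_000)    # ~4e-4, inside 10^-4..10^-3
```

For such small expected counts the probability is nearly linear in the recurrence rate, which is why doubling the number of known volcanoes matters directly to the hazard estimate.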

  5. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)
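
    The Popescu-Rohrlich box referred to in such analyses is easy to write down explicitly. The sketch below verifies its defining properties: it is no-signalling, yet it attains the algebraic maximum S = 4 of the CHSH expression, beyond the quantum (Tsirelson) bound of 2√2:

```python
import itertools

# The Popescu-Rohrlich (PR) box: P(a,b|x,y) = 1/2 when a XOR b = x AND y,
# and 0 otherwise, for binary settings x, y and outcomes a, b.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    # E(x,y) = sum_{a,b} (-1)^(a+b) P(a,b|x,y)
    return sum((-1) ** (a + b) * pr_box(a, b, x, y)
               for a, b in itertools.product((0, 1), repeat=2))

def marginal_a(a, x, y):
    # Alice's marginal; no-signalling requires it not to depend on y.
    return sum(pr_box(a, b, x, y) for b in (0, 1))

# CHSH expression: algebraic maximum is 4, quantum maximum 2*sqrt(2) ~ 2.83
S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
```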

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  7. A Future with Hope :China Agriculture Outlook 2007

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    China's macro economy has remained in a good and stable condition overall, experiencing annual GDP growth of over 10% for several consecutive years. Against this background, the main focus of the Outlook is China's current grain and oil supply and demand, together with probable future prices.

  8. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed; quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  9. Impact of a Cosmic Body into Earth's Ocean and the Generation of Large Tsunami Waves: Insight from Numerical Modeling

    Science.gov (United States)

    Wünnemann, K.; Collins, G. S.; Weiss, R.

    2010-12-01

    The strike of a cosmic body into a marine environment differs in several respects from impact on land. Oceans cover approximately 70% of the Earth's surface, implying not only that oceanic impact is a very likely scenario for future impacts but also that most impacts in Earth's history must have happened in marine environments. Therefore, the study of oceanic impact is imperative in two respects: (1) to quantify the hazard posed by future oceanic impacts, including the potential threat of large impact-generated tsunami-like waves, and (2) to reconstruct Earth's impact record by accounting for the large number of potentially undiscovered crater structures in the ocean crust. Reconstruction of the impact record is of crucial importance both for assessing the frequency of collision events in the past and for better predicting the probability of future impact. We summarize the advances in the study of oceanic impact over the last decades and focus in particular on how numerical models have improved our understanding of cratering in the oceanic environment and the generation of waves by impact. We focus on insight gleaned from numerical modeling studies into the deceleration of the projectile by the water, cratering of the ocean floor, the late stage modification of the crater due to gravitational collapse, and water resurge. Furthermore, we discuss the generation and propagation of large tsunami-like waves as a result of a strike of a cosmic body in marine environments.

  10. A practical method for accurate quantification of large fault trees

    International Nuclear Information System (INIS)

    Choi, Jong Soo; Cho, Nam Zin

    2007-01-01

    This paper describes a practical method to accurately quantify top event probability and importance measures from incomplete minimal cut sets (MCS) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainties exist in MCS-based fault tree analysis. The paper is focused on quantification of the following two sources of uncertainties: (1) the truncation neglecting low-probability cut sets and (2) the approximation in quantifying MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate probability of the discarded MCSs and the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides capability to accurately quantify the two uncertainties and estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on the two example fault trees
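
    A minimal sketch of the two quantification approaches being compared can be written on an invented three-event fault tree: the usual cut-set sum (a rare-event, upper-bound-style estimate) versus a direct Monte Carlo estimate of the exact top-event probability. Nothing here reproduces the paper's CUTREE implementation; the events and probabilities are illustrative assumptions:

```python
import math
import random

random.seed(1)

# Toy fault tree given as minimal cut sets (MCS) over independent
# basic events; names and probabilities are invented for illustration.
p_event = {"A": 0.10, "B": 0.20, "C": 0.05}
mcs = [{"A", "B"}, {"C"}]   # top event = (A and B) or C

# Rare-event approximation: sum of cut-set probabilities.
rare_event = sum(math.prod(p_event[e] for e in cut) for cut in mcs)

# Direct Monte Carlo estimate of the exact top-event probability
# (here the exact value is 0.02 + 0.05 - 0.02*0.05 = 0.069).
def monte_carlo_top(n_samples=200_000):
    hits = 0
    for _ in range(n_samples):
        state = {e: random.random() < p for e, p in p_event.items()}
        if any(all(state[e] for e in cut) for cut in mcs):
            hits += 1
    return hits / n_samples

estimate = monte_carlo_top()
```

Even on this toy tree the cut-set sum (0.07) overstates the exact answer (0.069); the paper's contribution is controlling such approximation errors, plus truncation error, on trees far too large for direct simulation of every configuration.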

  11. The Importance of Studying Past Extreme Floods to Prepare for Uncertain Future Extremes

    Science.gov (United States)

    Burges, S. J.

    2016-12-01

    Hoyt and Langbein (1955), in their book 'Floods', wrote: "...meteorologic and hydrologic conditions will combine to produce superfloods of unprecedented magnitude. We have every reason to believe that in most rivers past floods may not be an accurate measure of ultimate flood potentialities. It is this superflood with which we are always most concerned". I provide several examples to offer some historical perspective on assessing extreme floods. In one example, flooding in the Miami Valley, OH in 1913 claimed 350 lives. The engineering and socio-economic challenges that the Morgan Engineering Co. faced in mitigating future flood damage and loss of life with limited information provide guidance about ways to face an uncertain hydroclimate future, particularly one of a changed climate. A second example forces us to examine mixed flood populations and illustrates the huge uncertainty in assigning flood magnitude and exceedance probability to extreme floods in such cases. There is large uncertainty in flood frequency estimates; knowledge of the total flood hydrograph, not the peak flood flow rate alone, is what is needed for hazard mitigation assessment or design. Some challenges in estimating the complete flood hydrograph in an uncertain future climate, including demands on hydrologic models and their inputs, are addressed.

  12. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  13. History and future of remote sensing technology and education

    Science.gov (United States)

    Colwell, R. N.

    1980-01-01

    A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training is described.

  14. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  15. Demystifying Managed Futures

    DEFF Research Database (Denmark)

    Hurst,, Brian; Hua Ooi, Yao; Heje Pedersen, Lasse

    2013-01-01

    We show that the returns of Managed Futures funds and CTAs can be explained by time series momentum strategies and we discuss the economic intuition behind these strategies. Time series momentum strategies produce large correlations and high R-squares with Managed Futures indices and individual m...... of implementation issues relevant to time series momentum strategies, including risk management, risk allocation across asset classes and trend horizons, portfolio rebalancing frequency, transaction costs, and fees....

  16. Multi-path transportation futures study : vehicle characterization and scenario analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Plotkin, S. E.; Singh, M. K.; Energy Systems; TA Engineering; ORNL

    2009-12-03

    Projecting the future role of advanced drivetrains and fuels in the light vehicle market is inherently difficult, given the uncertainty (and likely volatility) of future oil prices, inadequate understanding of likely consumer response to new technologies, the relative infancy of several important new technologies with inevitable future changes in their performance and costs, and the importance - and uncertainty - of future government marketplace interventions (e.g., new regulatory standards or vehicle purchase incentives). This Multi-Path Transportation Futures (MP) Study has attempted to improve our understanding of this future role by examining several scenarios of vehicle costs, fuel prices, government subsidies, and other key factors. These are projections, not forecasts, in that they try to answer a series of 'what if' questions without assigning probabilities to most of the basic assumptions.

  17. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including

  18. Quantifying Detection Probabilities for Proliferation Activities in Undeclared Facilities

    International Nuclear Information System (INIS)

    Listner, C.; Canty, M.; Niemeyer, I.; Rezniczek, A.; Stein, G.

    2015-01-01

    International Safeguards is currently in an evolutionary process to increase the effectiveness and efficiency of the verification system. This is an obvious consequence of the failure to detect Iraq's clandestine nuclear weapons programme in the early 1990s. With the adoption of Programme 93+2, this led to the development of Integrated Safeguards and the State-level concept. Moreover, the IAEA's focus was extended to proliferation activities outside the State's declared facilities. The effectiveness of safeguards activities within declared facilities can be, and has been, quantified with respect to costs and detection probabilities. In contrast, when verifying the absence of undeclared facilities, this quantification has been avoided in the past because it was considered impossible. However, when balancing the allocation of budget between the declared and the undeclared field, explicit reasoning is needed for why safeguards effort is distributed in a given way. Such reasoning can be given by a holistic, information- and risk-driven approach to Acquisition Path Analysis comprising declared and undeclared facilities. Regarding the input, this approach relies on the quantification of several factors, i.e., costs and attractiveness values for specific proliferation activities, potential safeguards measures and detection probabilities for these measures, also for the undeclared field. In order to overcome the lack of quantification of detection probabilities in undeclared facilities, the authors of this paper propose a general verification error model. Based on this model, four different approaches are explained and assessed with respect to their advantages and disadvantages: the analogy approach, the Bayes approach, the frequentist approach and the process approach. The paper concludes with a summary and an outlook on potential future research activities. (author)
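
    Of the four approaches named, the Bayes approach admits a compact illustration. The toy update below (prior, per-inspection detection probability and inspection count are all hypothetical assumptions, not figures from the paper) shows how repeated clean inspections shrink the assessed probability of an undeclared activity:

```python
# Toy "Bayes approach" sketch (all numbers hypothetical): update the
# probability that an undeclared activity exists after n inspections
# that each would detect it with probability d and found nothing.
def posterior_after_clean_inspections(prior, d, n):
    miss = (1.0 - d) ** n                    # P(no detection | activity)
    num = prior * miss
    return num / (num + (1.0 - prior))      # P(activity | no detection)

p0 = 0.05                                    # assumed prior
p3 = posterior_after_clean_inspections(p0, d=0.4, n=3)   # ~0.011
```

The rate at which the posterior falls depends entirely on the per-inspection detection probability d, which is exactly the quantity the paper's verification error model tries to pin down for the undeclared field.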

  19. Fixation Probability in a Two-Locus Model by the Ancestral Recombination–Selection Graph

    Science.gov (United States)

    Lessard, Sabin; Kermany, Amir R.

    2012-01-01

    We use the ancestral influence graph (AIG) for a two-locus, two-allele selection model in the limit of a large population size to obtain an analytic approximation for the probability of ultimate fixation of a single mutant allele A. We assume that this new mutant is introduced at a given locus into a finite population in which a previous mutant allele B is already segregating with a wild type at another linked locus. We deduce that the fixation probability increases as the recombination rate increases if allele A is either in positive epistatic interaction with B and allele B is beneficial or in no epistatic interaction with B and then allele A itself is beneficial. This holds at least as long as the recombination fraction and the selection intensity are small enough and the population size is large enough. In particular this confirms the Hill–Robertson effect, which predicts that recombination renders more likely the ultimate fixation of beneficial mutants at different loci in a population in the presence of random genetic drift even in the absence of epistasis. More importantly, we show that this is true from weak negative epistasis to positive epistasis, at least under weak selection. In the case of deleterious mutants, the fixation probability decreases as the recombination rate increases. This supports Muller’s ratchet mechanism to explain the accumulation of deleterious mutants in a population lacking recombination. PMID:22095080
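
    A reduced, single-locus sketch (deliberately simpler than the paper's two-locus ancestral-graph analysis) shows the baseline such results refine: a Wright-Fisher simulation of a beneficial mutant's fixation probability against Kimura's diffusion approximation, which is roughly 2s for a single new mutant when Ns is large. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Single-locus haploid Wright-Fisher model: fixation probability of one
# new mutant copy with selection advantage s in a population of size N.
def fixation_probability(N=100, s=0.05, replicates=10_000):
    fixed = 0
    for _ in range(replicates):
        count = 1                              # one new mutant copy
        while 0 < count < N:
            p = count / N
            p_sel = p * (1 + s) / (1 + p * s)  # viability selection
            count = rng.binomial(N, p_sel)     # binomial drift
        fixed += count == N
    return fixed / replicates

N, s = 100, 0.05
sim = fixation_probability(N, s)
kimura = (1 - np.exp(-2 * s)) / (1 - np.exp(-2 * N * s))  # ~2s for Ns >> 1
```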

  20. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. 
In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience

  1. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By
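
    The practical consequence of per-occasion detection probabilities of 0.07 versus 0.18 is easy to make concrete. Assuming independent surveys (a simplifying assumption, since the study itself documents individual and temporal variation), the chance an individual is seen at least once in n surveys is 1 - (1 - p)^n, so the effort needed for a target detection probability follows directly:

```python
import math

# Surveys needed to detect a given individual with probability `target`,
# assuming independent surveys with per-survey detection probability p.
def surveys_needed(p, target=0.95):
    return math.ceil(math.log(1 - target) / math.log(1 - p))

n_low = surveys_needed(0.07)   # overall rate reported in the abstract -> 42
n_high = surveys_needed(0.18)  # rate under optimal circumstances      -> 16
```

Nearly tripling the per-survey detection rate cuts the required search effort by almost two-thirds, which is why the covariates of detectability matter for eradication planning.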

  2. High-efficiency wavefunction updates for large scale Quantum Monte Carlo

    Science.gov (United States)

    Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed

    Within ab initio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank-k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order of magnitude speedups can be obtained on both multi-core CPU and on GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
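
    The rank-1 baseline that the proposed rank-k scheme generalizes can be sketched in a few lines (a generic random matrix stands in for a Slater matrix here): replacing row k of A with a proposed row v gives the determinant ratio as an O(N) dot product, and the inverse is refreshed in O(N^2) instead of O(N^3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of the rank-1 Sherman-Morrison update: moving one electron
# replaces one row of the Slater matrix A. The acceptance ratio only
# needs det(A')/det(A), an O(N) dot product against the cached inverse.
n, k = 6, 2
A = rng.standard_normal((n, n))
Ainv = np.linalg.inv(A)
v = rng.standard_normal(n)            # proposed new row k

ratio = v @ Ainv[:, k]                # det(A') / det(A)

# Accepted move: rank-1 Sherman-Morrison refresh of the inverse, O(N^2).
u = v - A[k]                          # A' = A + e_k u^T
Ainv_new = Ainv - np.outer(Ainv[:, k], u @ Ainv) / ratio
A_new = A.copy()
A_new[k] = v
```

In the paper's scheme, k such accepted row changes are accumulated and applied together as a rank-k update, trading these memory-bound rank-1 operations for higher-arithmetic-intensity block operations.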

  3. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  4. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
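The contrast between interpretations can be made concrete in a few lines of code: the classical (aprioristic) probability of drawing an ace is fixed by symmetry of the deck, while the frequentist value emerges as a long-run relative frequency of repeated draws. A minimal sketch, in which the deck encoding and trial count are illustrative choices rather than part of the activity described above:

```python
import random

# Classical (aprioristic) interpretation: P(ace) = favourable cases / possible cases.
classical_p = 4 / 52

# Frequentist interpretation: relative frequency over many repeated draws.
random.seed(0)
deck = ["ace"] * 4 + ["other"] * 48
n_trials = 100_000
hits = sum(random.choice(deck) == "ace" for _ in range(n_trials))
frequentist_p = hits / n_trials

print(classical_p, frequentist_p)  # the two values should be close
```

A Bayesian treatment would instead start from a prior over the deck's composition and update it as draws are observed, which is the third interpretation the activity targets.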

  5. Changing Drought Risk in a Warming World - Using Event Attribution Methods to Explore Changing Likelihoods of Drought in East Africa in the Past, Present and Future

    Science.gov (United States)

    O'Keefe, S. A.; Li, S.; Otto, F. E. L.

    2017-12-01

East Africa is particularly vulnerable to potential impacts of anthropogenic climate change, due to the particular climatic forces at play in the region and the population's dependence on rain-fed agriculture. However, large natural inter-annual variability in the region has made the detection and attribution of anthropogenic forcing a challenge. Making use of the design and implementation of the HAPPI project (happimip.org), in which large ensembles of atmosphere-only models are run under historic, 1.5 °C and 2 °C conditions (Mitchell et al., 2017), we estimate current and future changes in the probability of drought occurring in different regions. Attribution of present-day changes is examined alongside future conditions, allowing for the first time a seamless analysis of how the risk of drought in this highly vulnerable region changes. The large-ensemble multi-model framework of the HAPPI design allows for a more robust estimation of extremes than ever before, while at the same time providing a confidence estimate depending on the specific model used.

  6. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
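The Kleisli composition underlying this account can be illustrated concretely for the finite distribution monad: a morphism X → Dist(Y) composes with a morphism Y → Dist(Z) by pushing probabilities forward and summing over intermediate outcomes. A minimal Python sketch, using a hypothetical coin-then-die chain; the paper itself works categorically, and this only illustrates the composition rule:

```python
# A finite distribution represented as a dict {outcome: probability}.
def unit(x):
    """Unit of the distribution monad: the point mass at x."""
    return {x: 1.0}

def kleisli_compose(f, g):
    """Compose f: X -> Dist(Y) with g: Y -> Dist(Z) into X -> Dist(Z)."""
    def h(x):
        out = {}
        for y, p in f(x).items():
            for z, q in g(y).items():
                out[z] = out.get(z, 0.0) + p * q  # sum over intermediate outcomes
        return out
    return h

# Hypothetical two-step chain: a biased coin, then a step whose law depends on it.
coin = lambda _: {"H": 0.6, "T": 0.4}
step = lambda s: {1: 0.5, 2: 0.5} if s == "H" else {1: 0.2, 2: 0.8}
chain = kleisli_compose(coin, step)
print(chain(None))  # marginal distribution of the second stage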

  7. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
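The two-level structure described above, an outer probability space for subjective uncertainty and an inner one for stochastic uncertainty, naturally produces a family of complementary cumulative distribution functions (CCDFs): one CCDF over stochastic uncertainty per subjective sample. A toy sketch, with an exponential outcome model and a uniformly uncertain rate chosen purely for illustration:

```python
import random

random.seed(1)

def ccdf(values, thresholds):
    """Empirical complementary CDF: P(outcome > t) for each threshold t."""
    n = len(values)
    return [sum(v > t for v in values) / n for t in thresholds]

# Toy two-level sampling: an outer (subjective) loop over an uncertain rate
# and an inner (stochastic) loop, yielding one CCDF per subjective sample.
thresholds = [0.5, 1.0, 2.0, 4.0]
family = []
for _ in range(10):                     # subjective uncertainty: uncertain rate
    rate = random.uniform(0.5, 2.0)
    outcomes = [random.expovariate(rate) for _ in range(2000)]  # stochastic
    family.append(ccdf(outcomes, thresholds))

print(family[0])  # one CCDF; the full family conveys subjective uncertainty
```

Regulatory comparisons of the kind discussed in the abstract are then made against the whole family of CCDFs (or summary curves derived from it), not against a single curve.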

  8. Using Extreme Tropical Precipitation Statistics to Constrain Future Climate States

    Science.gov (United States)

    Igel, M.; Biello, J. A.

    2017-12-01

    Tropical precipitation is characterized by a rapid growth in mean intensity as the column humidity increases. This behavior is examined in both a cloud resolving model and with high-resolution observations of precipitation and column humidity from CloudSat and AIRS, respectively. The model and the observations exhibit remarkable consistency and suggest a new paradigm for extreme precipitation. We show that the total precipitation can be decomposed into a product of contributions from a mean intensity, a probability of precipitation, and a global PDF of column humidity values. We use the modeling and observational results to suggest simple, analytic forms for each of these functions. The analytic representations are then used to construct a simple expression for the global accumulated precipitation as a function of the parameters of each of the component functions. As the climate warms, extreme precipitation intensity and global precipitation are expected to increase, though at different rates. When these predictions are incorporated into the new analytic expression for total precipitation, predictions for changes due to global warming to the probability of precipitation and the PDF of column humidity can be made. We show that strong constraints can be imposed on the future shape of the PDF of column humidity but that only weak constraints can be set on the probability of precipitation. These are largely imposed by the intensification of extreme precipitation. This result suggests that understanding precisely how extreme precipitation responds to climate warming is critical to predicting other impactful properties of global hydrology. The new framework can also be used to confirm and discount existing theories for shifting precipitation.
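The decomposition of total precipitation into a mean conditional intensity, a probability of precipitation, and a PDF of column humidity can be written as P_total = ∫ I(w) Π(w) f(w) dw. A numerical sketch with made-up analytic forms (a quadratic ramp above a critical humidity, a sigmoid precipitating probability, and a Gaussian humidity PDF); these are assumptions for illustration, not the paper's fitted functions:

```python
import math

# Hypothetical component functions over normalized column humidity w in [0, 1]:
w_c = 0.7                                                  # critical humidity
intensity = lambda w: max(0.0, w - w_c) ** 2 * 50.0        # conditional mean rate
prob_precip = lambda w: 1.0 / (1.0 + math.exp(-30 * (w - w_c)))  # precipitating fraction
pdf = lambda w: math.exp(-((w - 0.6) ** 2) / (2 * 0.01)) / math.sqrt(2 * math.pi * 0.01)

# Total precipitation as the product integral over column humidity.
dw = 0.001
total = sum(intensity(w) * prob_precip(w) * pdf(w) * dw
            for w in (i * dw for i in range(1001)))
print(total)
```

The point of such a decomposition is that a climate-change perturbation to any one factor (e.g. intensification of extremes via I, or a shifted humidity PDF via f) propagates to the total through this single expression, which is what allows the constraints discussed in the abstract.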

  9. Decisions under risk in Parkinson's disease: preserved evaluation of probability and magnitude.

    Science.gov (United States)

    Sharp, Madeleine E; Viswanathan, Jayalakshmi; McKeown, Martin J; Appel-Cresswell, Silke; Stoessl, A Jon; Barton, Jason J S

    2013-11-01

Unmedicated Parkinson's disease patients tend to be risk-averse, while dopaminergic treatment causes a tendency to take risks. While dopamine agonists may result in clinically apparent impulse control disorders, treatment with levodopa also causes a shift in behaviour associated with an enhanced response to rewards. Two important determinants in decision-making are how subjects perceive the magnitude and probability of outcomes. Our objective was to determine if patients with Parkinson's disease on or off levodopa showed differences in their perception of value when making decisions under risk. The Vancouver Gambling task presents subjects with a choice between one prospect with a larger outcome and a second with a higher probability. Eighteen age-matched controls and eighteen patients with Parkinson's disease before and after levodopa were tested. In the Gain Phase, subjects chose between one prospect with a higher probability and another with a larger reward to maximize their gains. In the Loss Phase, subjects played to minimize their losses. Patients with Parkinson's disease, on or off levodopa, were similar to controls when evaluating gains. However, in the Loss Phase before levodopa, they were more likely to avoid the prospect with lower probability but larger loss, as indicated by the steeper slope of their group psychometric function (t(24) = 2.21, p = 0.04). Modelling with prospect theory suggested that this was attributable to a 28% overestimation of the magnitude of loss, rather than an altered perception of its probability. While pre-medicated patients with Parkinson's disease show risk-aversion for large losses, patients on levodopa have normal perception of magnitude and probability for both loss and gain. The finding of accurate and normally biased decisions under risk in medicated patients with PD is important because it indicates that, if there is indeed anomalous risk-seeking behaviour in such a cohort, it may derive from abnormalities in components of

  10. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
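The key point, that probabilistic sensitivities can be obtained by reusing the samples already drawn for the POF estimate, can be sketched for a toy fracture problem. Here POD(a) = Φ((ln a − μ)/σ) and a component fails if an undetected crack exceeds a critical size; the crack-size distribution and all parameter values are illustrative assumptions, not the paper's examples:

```python
import math, random

Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))     # standard normal CDF
phi = lambda z: math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def pof_and_sensitivity(samples, mu, sigma_pod, a_crit):
    """POF = E[(1 - POD(a)) * 1{a > a_crit}] and dPOF/dmu from the SAME samples.
    With POD(a) = Phi((ln a - mu)/sigma_pod), dPOD/dmu = -phi(z)/sigma_pod."""
    pof = dpof = 0.0
    for a in samples:
        if a > a_crit:                       # crack large enough to fail
            z = (math.log(a) - mu) / sigma_pod
            pof += 1 - Phi(z)                # failure requires a missed detection
            dpof += phi(z) / sigma_pod       # -(dPOD/dmu), same sample reused
    n = len(samples)
    return pof / n, dpof / n

random.seed(2)
samples = [math.exp(random.gauss(0.0, 0.5)) for _ in range(50_000)]  # lognormal cracks
mu, sig, a_crit = 0.3, 0.4, 1.5
pof, sens = pof_and_sensitivity(samples, mu, sig, a_crit)

# Verify against a finite difference that also reuses the same samples.
h = 1e-4
fd = (pof_and_sensitivity(samples, mu + h, sig, a_crit)[0]
      - pof_and_sensitivity(samples, mu - h, sig, a_crit)[0]) / (2 * h)
print(pof, sens, fd)
```

Because the sensitivity is an average over the same Monte Carlo samples, no additional model evaluations are needed, which is the "negligible cost" claim in the abstract. The positive sign is expected: raising μ shifts the POD curve toward larger cracks, so more failing cracks go undetected.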

  11. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  12. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  13. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
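The collective idea, a single shared hop probability chosen so that the ensemble fractions track the quantum populations with as few hops as possible, can be caricatured for a two-state ensemble. This toy sketch illustrates only the bookkeeping, not the CP algorithm of the paper:

```python
import random

def collective_hops(states, target_pops):
    """Toy 'collective' step for a two-state ensemble: hop the minimal number
    of trajectories so the ensemble fractions track the target quantum
    populations, with one hop probability shared by all trajectories."""
    n = len(states)
    n0 = states.count(0)
    target0 = round(target_pops[0] * n)
    if n0 > target0:          # surplus on state 0: hop some 0 -> 1
        p_hop = (n0 - target0) / n0
        return [1 if s == 0 and random.random() < p_hop else s for s in states]
    elif n0 < target0:        # deficit on state 0: hop some 1 -> 0
        p_hop = (target0 - n0) / (n - n0)
        return [0 if s == 1 and random.random() < p_hop else s for s in states]
    return states             # already matched: no hops at all

random.seed(3)
states = [0] * 1000           # all trajectories start on the lower surface
states = collective_hops(states, target_pops=(0.8, 0.2))
frac0 = states.count(0) / len(states)
print(frac0)                  # close to the target population 0.8
```

Note that trajectories only hop in the direction needed to correct the population imbalance, which is the sense in which the number of hops is kept to a minimum.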

  14. Future of Atmospheric Neutrino Measurements

    International Nuclear Information System (INIS)

    Choubey, Sandhya

    2013-01-01

Discovery of large θ_13 has opened up the possibility of determining the neutrino mass hierarchy and the θ_23 octant through earth matter effects. Atmospheric neutrinos pick up large earth matter effects both in the ν_e and ν_μ channels, which, if observed, could lead to the determination of the mass hierarchy and the θ_23 octant using this class of experiments in the near future. In this talk I review the status and prospects of future atmospheric neutrino measurements in determining the mass hierarchy and the octant of θ_23.

  15. B physics, now and future

    CERN Document Server

    CERN. Geneva

    1995-01-01

The b-quark has surprised us twice already. Unexpectedly large B-meson lifetimes told us that the third quark generation mixes very little with the others. The later discovery of large B0-B̄0 mixing showed us that the top quark mass was much larger than expected. There could be a third surprise, maybe in CP violation. In this lecture series, we quickly review the past and present of B physics. This is followed by a more detailed discussion of the future, in particular CP violation studies with future accelerators.

  16. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  17. Impact of large-scale circulation changes in the North Atlantic sector on the current and future Mediterranean winter hydroclimate

    Science.gov (United States)

    Barcikowska, Monika J.; Kapnick, Sarah B.; Feser, Frauke

    2018-03-01

The Mediterranean region, located in the transition zone between the dry subtropical and wet European mid-latitude climate, is very sensitive to changes in the global mean climate state. Projecting future changes of the Mediterranean hydroclimate under global warming therefore requires dynamic climate models to reproduce the main mechanisms controlling regional hydroclimate with sufficiently high resolution to realistically simulate climate extremes. To assess future winter precipitation changes in the Mediterranean region we use the Geophysical Fluid Dynamics Laboratory high-resolution general circulation model for control simulations with pre-industrial greenhouse gas and aerosol concentrations, which are compared to future scenario simulations. Here we show that the coupled model is able to reliably simulate the large-scale winter circulation, including the North Atlantic Oscillation and Eastern Atlantic patterns of variability, and its associated impacts on the mean Mediterranean hydroclimate. The model also realistically reproduces the regional features of daily heavy rainfall, which are absent in lower-resolution simulations. A five-member future projection ensemble, which assumes comparatively high greenhouse gas emissions (RCP8.5) until 2100, indicates a strong winter decline in Mediterranean precipitation for the coming decades. Consistent with dynamical and thermodynamical consequences of a warming atmosphere, the derived changes feature a distinct bipolar behavior, i.e. wetting in the north and drying in the south. Changes are most pronounced over the northwest African coast, where the projected winter precipitation decline reaches 40% of present values. Despite a decrease in mean precipitation, heavy rainfall indices show drastic increases across most of the Mediterranean, except the North African coast, which is under the strong influence of the cold Canary Current.

  18. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  19. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
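One kind of inter-distribution relation such a catalogue records, a limiting relation between families, is easy to check numerically; for example, Binomial(n, λ/n) converges to Poisson(λ) as n grows. A small self-contained check (the chosen λ, values of n, and tolerances are illustrative):

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability mass function."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Classic limiting relation: Binomial(n, lam/n) -> Poisson(lam) as n -> infinity.
lam = 3.0
errs = []
for n in (10, 100, 10_000):
    err = max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam)) for k in range(10))
    errs.append(err)
    print(n, err)   # the maximum pointwise discrepancy shrinks as n grows
```

The Distributome's relation graph encodes many such limits and transformations between families; each edge can in principle be validated by a check of this form.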

  20. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.