WorldWideScience

Sample records for analyses generated probability

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models...
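
    A minimal illustration of the CPGF idea described in the three records above (standard notation, not taken from the papers themselves): for the multinomial logit model, the ARUM with i.i.d. Gumbel-distributed additive shocks, the generating function is the familiar log-sum, and its gradient returns the choice probabilities.

    % Log-sum CPGF for the multinomial logit case (shown for illustration only)
    \[
      G(u) \;=\; \log \sum_{j=1}^{J} e^{u_j},
      \qquad
      \frac{\partial G(u)}{\partial u_i}
      \;=\; \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}}
      \;=\; P(\text{alternative } i \mid u).
    \]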

  4. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses

  5. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    Science.gov (United States)

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

    We tested the predictions of HyGene (Thomas et al., 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  6. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  7. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
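
    The PGF mechanism behind this kind of exact calculation can be sketched as follows. Assuming, purely for illustration, that the composite measure is a sum of independent per-element compliance indicators with unequal probabilities (the paper's exact formulation may differ), the PGF of the sum is the product of the element PGFs, and the exact distribution is read off the polynomial coefficients.

    import numpy as np

    def exact_pmf_from_pgfs(probs):
        """Exact distribution of a sum of independent Bernoulli indicators.

        Each indicator i has PGF G_i(z) = (1 - p_i) + p_i*z; the PGF of the sum
        is the product of the G_i, and the PMF is read off the polynomial
        coefficients (a Poisson-binomial distribution)."""
        pmf = np.array([1.0])                      # PGF of the empty sum: G(z) = 1
        for p in probs:
            pmf = np.convolve(pmf, [1.0 - p, p])   # multiply by (1 - p) + p*z
        return pmf                                  # pmf[k] = P(sum == k)

    # Hypothetical bundle of four elements with unequal compliance probabilities
    pmf = exact_pmf_from_pgfs([0.95, 0.90, 0.85, 0.80])
    print(pmf, pmf.sum())  # exact probabilities; cumulative sums give exact control limits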

  8. Sharp Bounds by Probability-Generating Functions and Variable Drift

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten

    2011-01-01

    We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al. (GECCO 2010) in several respects. First, the upper bound on the expected running time of the most successful quasirandom evolutionary algorithm for the OneMax function is improved from 1.28 n ln n to 0.982 n ln n, which breaks the barrier of n ln n posed by coupon-collector processes. Compared to the classical...

  9. A Probability Analysis of the Generating Cost for APR1000+

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Gag-Hyeon; Kim, Dae-Hun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, cutting CO2 emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the APR1000+ development is to ensure export competitiveness in developing countries in the Middle East and Southeast Asia. To that end, the APR1000+ (1,000 MWe, generation 3.5) will be developed based on the APR+ (1,500 MWe, generation 3.5). Compared to the OPR1000 (Korean Standard Nuclear Power Plant, generation 2.5), the APR1000+ has many additional design features, such as a 60-year design lifetime, a comprehensive site requirement of 0.3 g seismic design, improved stability and operability, and provisions for severe accidents. In this simulation, the generating cost for the APR1000+ preliminary conceptual design, estimated using a probability method, was 48.37 ~ 74.22 won/kWh (median value 56.51 won/kWh); that of the OPR1000 was 42.08 ~ 61.77 won/kWh (median value 48.63 won/kWh). The median generating cost of the APR1000+ is thus about 16.2% higher than that of the OPR1000, mainly because of the additional safety design features.

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  11. The relative impact of sizing errors on steam generator tube failure probability

    International Nuclear Information System (INIS)

    Cizelj, L.; Dvorsek, T.

    1998-01-01

    The Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting the steam generator tubes made of Inconel 600. This has led to the development and licensing of degradation-specific maintenance approaches, which address two main failure modes of the degraded piping: tube rupture and excessive leakage through degraded tubes. A methodology aiming at assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out the better performance of the degradation-specific over generic approaches in terms of (1) a lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates and (3) fewer tubes plugged. A sensitivity analysis was also performed, pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent to the regression models used to correlate the defect size and tube burst pressure. The uncertainties, which can be estimated from the in-service inspections, are further analysed in this paper. The defect growth was found to have a significant and to some extent unrealistic impact on the probability of single tube rupture. Since the defect growth estimates were based on the past inspection records, they strongly depend on the sizing errors. Therefore, an attempt was made to filter out the sizing errors and to arrive at more realistic estimates of the defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used are obtained from a series of inspection results from the Krsko NPP, which has two Westinghouse D-4 steam generators. The results obtained are considered useful in the safety assessment and maintenance of affected steam generators. (author)

  12. Application of a weighted spatial probability model in GIS to analyse landslides in Penang Island, Malaysia

    Directory of Open Access Journals (Sweden)

    Samy Ismail Elmahdy

    2016-01-01

    In the current study, Penang Island, which is one of several mountainous areas in Malaysia that are often subjected to landslide hazard, was chosen for further investigation. A multi-criteria evaluation and a weighted spatial probability approach, implemented with a model builder, were applied to map and analyse landslides on Penang Island. A set of automated algorithms was used to construct new essential geological and morphometric thematic maps from remote sensing data. The maps were ranked using the weighted spatial probability model based on their contribution to the landslide hazard. The results showed that sites at an elevation of 100–300 m, with steep slopes of 10°–37° and a slope direction (aspect) towards the E and SE, were areas of very high and high probability of landslide occurrence; the corresponding areas were 21.393 km² (11.84%) and 58.690 km² (32.48%), respectively. The obtained map was verified by comparing variogram models of the mapped and the observed landslide locations, and showed a strong correlation with the locations of occurred landslides, indicating that the proposed method can successfully predict landslide hazard. The method is time and cost effective and can be used as a reference by geological and geotechnical engineers.

  13. Application of at-site peak-streamflow frequency analyses for very low annual exceedance probabilities

    Science.gov (United States)

    Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.

    2017-07-17

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10⁻³ in scientific notation or for brevity 10⁻³). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10⁻³ to 10⁻⁶. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized
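
    A minimal sketch of the extrapolation idea described above, assuming a log-Pearson Type III fit by simple method of moments (it omits the Bulletin 17B/EMA refinements such as regional skew weighting, low-outlier screening and uncertainty quantification; the streamflow record below is synthetic):

    import numpy as np
    from scipy import stats

    def lp3_quantiles(annual_peaks, aeps=(1e-2, 1e-3, 1e-4, 1e-5, 1e-6)):
        """Log-Pearson Type III quantiles by simple method of moments: fit the
        Pearson III distribution to base-10 logs of annual peaks, then evaluate
        quantiles at very low annual exceedance probabilities (AEPs)."""
        logq = np.log10(np.asarray(annual_peaks, dtype=float))
        mean, std = logq.mean(), logq.std(ddof=1)
        g = stats.skew(logq, bias=False)                       # station skew of the logs
        return {aep: 10 ** stats.pearson3.ppf(1.0 - aep, g, loc=mean, scale=std)
                for aep in aeps}

    # Synthetic record of annual instantaneous peak streamflows (illustrative only)
    peaks = np.random.default_rng(1).lognormal(mean=9.0, sigma=0.6, size=80)
    print(lp3_quantiles(peaks))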

  14. Steam generator tubes rupture probability estimation - study of the axially cracked tube case

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.; Roussel, G.

    1992-01-01

    The objective of the present study is to estimate the probability of a steam generator tube rupture due to the unstable propagation of axial through-wall cracks during a hypothetical accident. For this purpose, a probabilistic fracture mechanics model was developed, taking into account statistical distributions of the influencing parameters. A numerical example, considering a typical steam generator seriously affected by axial stress corrosion cracking in the roll transition area, is presented; it indicates the change of rupture probability under different assumptions, focusing mostly on the tubesheet reinforcing factor, crack propagation rate and crack detection probability. 8 refs., 4 figs., 4 tabs

  15. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM

  16. Generating prior probabilities for classifiers of brain tumours using belief networks

    Directory of Open Access Journals (Sweden)

    Arvanitis Theodoros N

    2007-09-01

    Background: Numerous methods for classifying brain tumours based on magnetic resonance spectra and imaging have been presented in the last 15 years. Generally, these methods use supervised machine learning to develop a classifier from a database of cases for which the diagnosis is already known. However, little has been published on developing classifiers based on mixed modalities, e.g. combining imaging information with spectroscopy. In this work a method of generating probabilities of tumour class from anatomical location is presented. Methods: The method of "belief networks" is introduced as a means of generating probabilities that a tumour is any given type. The belief networks are constructed using a database of paediatric tumour cases consisting of data collected over five decades; the problems associated with using this data are discussed. To verify the usefulness of the networks, an application of the method is presented in which prior probabilities were generated and combined with a classification of tumours based solely on MRS data. Results: Belief networks were constructed from a database of over 1300 cases. These can be used to generate a probability that a tumour is any given type. Networks are presented for astrocytoma grades I and II, astrocytoma grades III and IV, ependymoma, pineoblastoma, primitive neuroectodermal tumour (PNET), germinoma, medulloblastoma, craniopharyngioma and a group representing rare tumours, "other". Using the network to generate prior probabilities for classification improves the accuracy when compared with generating prior probabilities based on class prevalence. Conclusion: Bayesian belief networks are a simple way of using discrete clinical information to generate probabilities usable in classification. The belief network method can be robust to incomplete datasets. Inclusion of a priori knowledge is an effective way of improving classification of brain tumours by non-invasive methods.
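
    The combination step described above amounts to Bayes' rule with a location-derived prior; a minimal sketch (class names and numbers below are purely illustrative, not from the study):

    import numpy as np

    def combine_prior_with_classifier(prior_from_location, likelihood_from_mrs):
        """Bayes' rule: multiply location-derived prior class probabilities by the
        class likelihoods from an MRS-only classifier and renormalize."""
        posterior = np.asarray(prior_from_location) * np.asarray(likelihood_from_mrs)
        return posterior / posterior.sum()

    # Hypothetical three-class example; the numbers are purely illustrative
    prior = [0.50, 0.30, 0.20]        # P(class | anatomical location), from the belief network
    likelihood = [0.10, 0.60, 0.30]   # classifier output based solely on MRS data
    print(combine_prior_with_classifier(prior, likelihood))  # -> approx. [0.172, 0.621, 0.207]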

  17. L’Analyse du Risque Géopolitique: du Plausible au Probable [Geopolitical Risk Analysis: From the Plausible to the Probable]

    OpenAIRE

    Adib Bencherif

    2015-01-01

    This paper is going to explore the logical process behind risk analysis, particularly in geopolitics. The main goal is to demonstrate the ambiguities behind risk calculation and to highlight the continuum between plausibility and probability in risk analysis. To demonstrate it, the author introduces two notions: the inference of abduction, often neglected in the social sciences literature, and the Bayesian calculation. Inspired by the works of Louise Amoore, this paper tries to go further by ...

  18. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
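
    A generic conjugate-update sketch of the Bayesian step mentioned above (illustrative only; the paper's generalised impact-vector treatment of assessment uncertainties is more elaborate): with a Gamma(a, b) prior on a constant CCF rate λ, an effective number of events n and an effective observation time T give the plant-specific posterior

    \[
      \lambda \mid \text{data} \;\sim\; \mathrm{Gamma}(a + n,\; b + T),
      \qquad
      \mathrm{E}[\lambda \mid \text{data}] \;=\; \frac{a + n}{b + T}.
    \]

    A basic-event probability for a periodically tested standby component with test interval τ can then be approximated by the usual time-average unavailability λτ/2, which is one way such rates enter single-pass fault tree quantification.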

  19. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating the failure probability and assessment of the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM) considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluation of crack growth behavior considering certain crack location and orientation patterns, and evaluation of failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code

  20. Some possible causes and probability of leakages in LMFBR steam generators

    International Nuclear Information System (INIS)

    Bolt, P.R.

    1984-01-01

    Relevant operational experience with steam generators for process and conventional plant and thermal and fast reactors is reviewed. Possible causes of water/steam leakages into sodium/gas are identified and data is given on the conditions necessary for failure, leakage probability and type of leakage path. (author)

  1. Morphometric analyses of hominoid crania, probabilities of conspecificity and an approximation of a biological species constant.

    Science.gov (United States)

    Thackeray, J F; Dykes, S

    2016-02-01

    Thackeray has previously explored the possibility of using a morphometric approach to quantify the "amount" of variation within species and to assess probabilities of conspecificity when two fossil specimens are compared, instead of "pigeon-holing" them into discrete species. In an attempt to obtain a statistical (probabilistic) definition of a species, Thackeray has recognized an approximation of a biological species constant (T=-1.61) based on the log-transformed standard error of the coefficient m (log sem) in regression analysis of cranial and other data from pairs of specimens of conspecific extant species, associated with regression equations of the form y=mx+c where m is the slope and c is the intercept, using measurements of any specimen A (x axis), and any specimen B of the same species (y axis). The log-transformed standard error of the co-efficient m (log sem) is a measure of the degree of similarity between pairs of specimens, and in this study shows central tendency around a mean value of -1.61 and standard deviation 0.10 for modern conspecific specimens. In this paper we focus attention on the need to take into account the range of difference in log sem values (Δlog sem or "delta log sem") obtained from comparisons when specimen A (x axis) is compared to B (y axis), and secondly when specimen A (y axis) is compared to B (x axis). Thackeray's approach can be refined to focus on high probabilities of conspecificity for pairs of specimens for which log sem is less than -1.61 and for which Δlog sem is less than 0.03. We appeal for the adoption of a concept here called "sigma taxonomy" (as opposed to "alpha taxonomy"), recognizing that boundaries between species are not always well defined. Copyright © 2015 Elsevier GmbH. All rights reserved.
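
    A minimal sketch of the pairwise regression statistic described above, assuming the measurements of the two specimens are paired variable by variable (the synthetic numbers below are only for illustration; the thresholds -1.61 and 0.03 are those quoted in the record):

    import numpy as np
    from scipy import stats

    def log_sem_pair(a, b):
        """log10 of the standard error of the regression slope for a pair of specimens,
        computed in both directions (A on the x-axis, then B on the x-axis), plus the
        difference (delta log sem) between the two directions."""
        ab = np.log10(stats.linregress(a, b).stderr)   # A on x-axis, B on y-axis
        ba = np.log10(stats.linregress(b, a).stderr)   # B on x-axis, A on y-axis
        return ab, ba, abs(ab - ba)

    # Synthetic paired measurements for two specimens (same variables, same order)
    rng = np.random.default_rng(0)
    a = rng.uniform(20.0, 150.0, size=12)
    b = a * rng.normal(1.0, 0.03, size=12)             # a conspecific-like second specimen
    print(log_sem_pair(a, b))
    # The record's criteria: high probability of conspecificity when log sem < -1.61
    # in both directions and delta log sem < 0.03.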

  2. Considerations of "Combined Probability of Injury" in the next-generation USA frontal NCAP.

    Science.gov (United States)

    Laituri, Tony R; Henry, Scott; Sullivan, Kaye; Nutt, Marvin

    2010-08-01

    The numerical basis for assigning star ratings in the next-generation USA New Car Assessment Program (NCAP) for frontal impacts was assessed. That basis, the Combined Probability of Injury, or CPI, is the probability of an occupant sustaining an injury to any of the specified body regions. For an NCAP test, a CPI value is computed by (a) using risk curves to convert body-region responses from a test dummy into body-region risks and (b) using a theoretical, overarching CPI equation to convert those separate body-region risks into a single CPI value. Though the general concept of applying a CPI equation to assign star ratings has existed since 1994, there will be numerous changes to the 2011 frontal NCAP: there will be two additional body regions (n = 4 vs. 2), the injury probabilities will be evaluated for lower-severity (more likely) injury levels, and some of the occupant responses will change. These changes could yield more disperse CPIs that could yield more disperse ratings. However, the reasons for this increased dispersion should be consistent with real-world findings. Related assessments were the topic of this two-part study, focused on drivers. In Part 1, the CPI equation was assessed without applying risk curves. Specifically, field injury probabilities for the four body regions were used as inputs to the CPI equation, and the resulting equation-produced CPIs were compared with the field CPIs. In Part 2, subject to analyses of test dummy responses from recent NCAP tests, the effect of risk curve choice on CPIs was assessed. Specifically, dispersion statistics were compared for CPIs based on various underlying risk curves applied to data from 2001-2005 model year vehicles (n = 183). From Part 1, the theoretical CPI equation for four body regions demonstrated acceptable fidelity when provided field injury rates (R² = 0.92), with the equation-based CPIs being approximately 12 percent lower than those of ideal correlation. From Part 2, the 2011 NCAP protocol
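
    The overarching CPI form referred to above combines body-region risks under an independence assumption; a minimal sketch (the four regions named and the example risk values are illustrative, and the actual NCAP protocol specifies particular risk curves and injury levels):

    def combined_probability_of_injury(p_head, p_neck, p_chest, p_leg):
        """Probability of injury to at least one body region, treating the
        region-level risks as independent (the overarching CPI form)."""
        p_none = 1.0
        for p in (p_head, p_neck, p_chest, p_leg):
            p_none *= (1.0 - p)
        return 1.0 - p_none

    print(combined_probability_of_injury(0.05, 0.03, 0.10, 0.04))  # ~0.204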

  3. Quantifying and analysing food waste generated by Indonesian undergraduate students

    Science.gov (United States)

    Mandasari, P.

    2018-03-01

    Despite the fact that the environmental consequences of food waste are widely known, studies on the amount of food waste and its influencing factors have received relatively little attention. Addressing this shortage, this paper aimed to quantify the monthly avoidable food waste generated by Indonesian undergraduate students and analyse factors influencing the occurrence of avoidable food waste. Based on data from 106 undergraduate students, descriptive statistics and logistic regression were applied in this study. The results indicated that 4,987.5 g of food waste was generated in a month (equal to 59,850 g yearly), or 47.05 g per person monthly (equal to 564.62 g per person per year). Meanwhile, eating-out frequency and gender were found to be significant predictors of food waste occurrence.

  4. Indian Point 2 steam generator tube rupture analyses

    International Nuclear Information System (INIS)

    Dayan, A.

    1985-01-01

    Analyses were conducted with RETRAN-02 to study the consequences of steam generator tube rupture (SGTR) events. The Indian Point, Unit 2, power plant (IP2, PWR) was modeled as two asymmetric loops, consisting of 27 volumes and 37 junctions. The break section was modeled once, conservatively, as a 150% flow area opening at the wall of the steam generator cold leg plenum, and once as a 200% double-ended tube break. Results revealed a 60% overprediction of break flow rates by the traditional conservative model. Two SGTR transients were studied, one with a low-pressure reactor trip and one with an earlier reactor trip via over-temperature ΔT. The former is more typical of a plant with a low reactor average temperature such as IP2. Transient analyses for a single tube break event over 500 seconds indicated continued primary subcooling and no need for steam line pressure relief. In addition, SGTR transients with reactor trip while the pressurizer still contains water were found to favorably reduce depressurization rates. Comparison of the conservative results with independent LOFTRAN predictions showed good agreement.

  5. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  6. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas eMaziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and several ongoing projects aimed at its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
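
    A compact illustration of how such objects are commonly generated (a Python analogue written for this summary, not the authors' Fortran code; the constructions shown are the standard normalized-exponential, QR-of-Ginibre and Hilbert-Schmidt recipes):

    import numpy as np
    rng = np.random.default_rng()

    def random_prob_vector(d):
        """Uniform on the probability simplex (equivalent to Dirichlet(1,...,1))."""
        x = rng.exponential(size=d)
        return x / x.sum()

    def random_unitary(d):
        """Haar-distributed unitary from the QR decomposition of a complex Ginibre matrix."""
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        q, r = np.linalg.qr(g)
        return q * (np.diag(r) / np.abs(np.diag(r)))   # fix the phases of R's diagonal

    def random_density_matrix(d):
        """Random mixed state from the Hilbert-Schmidt ensemble: rho = G G†/tr(G G†)."""
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        rho = g @ g.conj().T
        return rho / np.trace(rho)

    u = random_unitary(3)
    print(random_prob_vector(4))
    print(np.allclose(u @ u.conj().T, np.eye(3)), np.trace(random_density_matrix(2)).real)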

  7. Review of Ontario Hydro Pickering 'A' and Bruce 'A' nuclear generating stations' accident analyses

    International Nuclear Information System (INIS)

    Serdula, K.J.

    1988-01-01

    Deterministic safety analyses for the Pickering 'A' and Bruce 'A' nuclear generating stations were reviewed. The methodology used in the evaluation and assessment was based on the concept of 'N' critical parameters defining an N-dimensional safety parameter space. The reviewed accident analyses were evaluated and assessed based on their demonstrated safety coverage for credible values and trajectories of the critical parameters within this N-dimensional safety parameter space. The reported assessment did not consider the probability of occurrence of events. The reviewed analyses were extensive for the potential occurrence of accidents under normal steady-state operating conditions. These analyses demonstrated an adequate assurance of safety for the analyzed conditions. However, even for these reactor conditions, items have been identified for consideration of review and/or further study, which would provide a greater assurance of safety in the event of an accident. Accident analyses based on a plant in a normal transient operating state or in an off-normal condition but within the allowable operating envelope are not as extensive. Improvements in demonstrations and/or justifications of safety upon the potential occurrence of accidents would provide further assurance of the adequacy of safety under these conditions. Some events under these conditions have not been analyzed because of their judged low probability; however, accident analyses in this area should be considered. Recommendations relating to these items are presented; further study of the Pickering 'A' special safety systems is also recommended.

  8. Sensitivity analyses on in-vessel hydrogen generation for KNGR

    International Nuclear Information System (INIS)

    Kim, See Darl; Park, S.Y.; Park, S.H.; Park, J.H.

    2001-03-01

    Sensitivity analyses of in-vessel hydrogen generation, using the MELCOR program, are described in this report for the Korean Next Generation Reactor. The typical accident sequences of a station blackout and a large LOCA scenario are selected. A lower head failure model, a Zircaloy oxidation reaction model and a B4C reaction model are considered as the sensitivity parameters. For the base case, a failure temperature of 1273.15 K for the penetrations or the lower head, the Urbanic-Heidrich correlation for the Zircaloy oxidation reaction model, and the B4C reaction model are used. Case 1 used 1650 K as the failure temperature for the penetrations, and Case 2 considered creep rupture instead of penetration failure. Case 3 used the MATPRO-EG&G correlation for the Zircaloy oxidation reaction model, and Case 4 turned off the B4C reaction model. The results of the studies are summarized below: (1) When the penetration failure temperature is higher, or the creep rupture failure model is considered, the amount of hydrogen increases for both sequences. (2) When the MATPRO-EG&G correlation for the Zircaloy oxidation reaction is used, the amount of hydrogen is less than with the Urbanic-Heidrich correlation (base case) for both scenarios. (3) When the B4C reaction model is turned off, the amount of hydrogen decreases for both sequences

  9. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based

  10. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
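
    The maximum-entropy formalism referred to above has a standard closed form; the beta example is shown under the assumption that the constraints include the [0,1] support and logarithmic moments (the record does not spell out the four constraints it used):

    % Maximize H[p] = -\int p(x)\ln p(x)\,dx subject to normalization and E[f_i(x)] = F_i;
    % the stationary point of the Lagrangian is the exponential family
    \[
      p(x) \;=\; \frac{1}{Z(\lambda)}\exp\!\Big(-\sum_i \lambda_i f_i(x)\Big),
      \qquad
      Z(\lambda) \;=\; \int \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big)\,dx .
    \]
    % With support [0,1] and constraints on E[ln x] and E[ln(1-x)], this reduces to a beta density:
    \[
      p(x) \;=\; \frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}, \qquad 0 < x < 1 .
    \]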

  11. Concise method for evaluating the probability distribution of the marginal cost of power generation

    International Nuclear Information System (INIS)

    Zhang, S.H.; Li, Y.Z.

    2000-01-01

    In the developing electricity market, many questions on electricity pricing and the risk modelling of forward contracts require the evaluation of the expected value and probability distribution of the short-run marginal cost of power generation at any given time. A concise forecasting method is provided, which is consistent with the definitions of marginal costs and the techniques of probabilistic production costing. The method embodies clear physical concepts, so that it can be easily understood theoretically and computationally realised. A numerical example has been used to test the proposed method. (author)

  12. A Probability Analysis of the Generating Cost for EU-APR1400 Single Unit

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Gak Hyeon; Kim, Sung Hwan [KHNP CRI, Seoul (Korea, Republic of)

    2014-10-15

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, reducing CO2 emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the EU-APR1400 development is to ensure export competitiveness in the European countries. To this end, the EU-APR1400 has been developed based on the APR1400 (Advanced Power Reactor, GEN III type). The EU-APR1400 adds many advanced design features to its predecessor, as outlined below in Table 1. In this simulation, the results of the generating cost of the EU-APR1400 single unit were determined using the probability cost analysis technique; the generating cost range was shown to be 56.16 ∼ 70.92 won/kWh.

  13. Probability of a steam generator tube rupture due to the presence of axial through wall cracks

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.

    1991-01-01

    Using the Leak-Before-Break (LBB) approach to define tube plugging criteria, the possibility of operating with through-wall crack(s) in steam generator tubes may be considered. This fact may imply an increase in tube rupture probability. Improved examination techniques (in addition to the 100% tube examination) have been developed and introduced to counterbalance the associated risk. However, no estimates of the total increase or decrease of risk due to the introduction of LBB have been made. A scheme to predict this change of risk is proposed in the paper, based on probabilistic fracture mechanics analysis of axial cracks combined with available data on steam generator tube nondestructive examination reliability. (author)

  14. A Probability Analysis of the Generating Cost for EU-APR1400 Single Unit

    International Nuclear Information System (INIS)

    Ha, Gak Hyeon; Kim, Sung Hwan

    2014-01-01

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, reducing CO2 emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the EU-APR1400 development is to ensure export competitiveness in the European countries. To this end, the EU-APR1400 has been developed based on the APR1400 (Advanced Power Reactor, GEN III type). The EU-APR1400 adds many advanced design features to its predecessor, as outlined below in Table 1. In this simulation, the results of the generating cost of the EU-APR1400 single unit were determined using the probability cost analysis technique; the generating cost range was shown to be 56.16 ∼ 70.92 won/kWh.

  15. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  16. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    Science.gov (United States)

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
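
    The link from this Poisson lesion count to the linear-quadratic survival curve is simply the zero-class probability (shown here for clarity; it follows directly from the model described above):

    \[
      S(D) \;=\; P(\text{no lethal lesion}) \;=\; e^{-(\alpha D + \beta D^{2})}.
    \]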

  17. Power generation in India: analysing trends and outlook

    International Nuclear Information System (INIS)

    2011-01-01

    The objective of this report is to provide up-to-date data, critical analysis and information encompassing all aspects of power generation in India. The report provides historic and future outlook for power generation in India. It also provides an evaluation of private participation in power generation segment of India and investment opportunities in Indian power sector. In addition, the report examines policies, regulatory framework and financing of power generation in India. It also highlights key issues and challenges that are restricting the accelerated development of this sector. The report has thirteen chapters in total. (author)

  18. Radioactivity release vs probability for a steam generator tube rupture accident

    International Nuclear Information System (INIS)

    Buslik, A.J.; Hall, R.E.

    1978-01-01

    A calculation of the probability of obtaining various radioactivity releases from a steam generator tube rupture (SGTR) is presented. The only radioactive isotopes considered are Iodine-131 and Xe-133. The particular accident path considered consists of a double-ended guillotine SGTR followed by loss of offsite power (LOSP). If there is no loss of offsite power, and no system fault other than the SGTR, it is judged that the consequences will be minimal, since the amount of iodine released through the condenser air ejector is expected to be quite small; this is a consequence of the fact that the concentration of iodine in the vapor released from the condenser air ejector is very small compared to that dissolved in the condensate water. In addition, in some plants the condenser air ejector flow is automatically diverted to containment on a high-activity alarm. The analysis presented here is for a typical Westinghouse PWR such as described in RESAR-3S

  19. ANALYSING SOLAR-WIND HYBRID POWER GENERATING SYSTEM

    Directory of Open Access Journals (Sweden)

    Mustafa ENGİN

    2005-02-01

    In this paper, a solar-wind hybrid power generating system that will be used for security lighting was designed. The hybrid system was installed, and the performance values of the solar cells, wind turbine, battery bank, charge regulators and inverter were measured throughout the whole year. Using the measured values, the overall system efficiency, reliability and cost per kWh of the demanded energy were calculated, and the percentage of generated energy by resource was determined. We also include in the paper a discussion of new strategies to improve hybrid power generating system performance and the cost per kWh of demanded energy.

  20. Thermo hydrodynamical analyses of steam generator of nuclear power plant

    International Nuclear Information System (INIS)

    Petelin, S.; Gregoric, M.

    1984-01-01

    The SMUP computer code, a stationary model of a U-tube steam generator of a PWR nuclear power plant, was developed. Feed water flow can enter through the main and auxiliary paths. The computer code is based on a one-dimensional mathematical model. Among the results that give an insight into the physical processes along the tubes of the steam generator are the distributions of temperatures, water qualities and heat transfer rates. Parametric analysis permits conclusions on the advantages of each design solution regarding heat transfer effects and the safety of the steam generator. (author)

  1. The probability of false positives in zero-dimensional analyses of one-dimensional kinematic, force and EMG trajectories.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2016-06-14

    A false positive is the mistake of inferring an effect when none exists, and although α controls the false positive (Type I error) rate in classical hypothesis testing, a given α value is accurate only if the underlying model of randomness appropriately reflects experimentally observed variance. Hypotheses pertaining to one-dimensional (1D) (e.g. time-varying) biomechanical trajectories are most often tested using a traditional zero-dimensional (0D) Gaussian model of randomness, but variance in these datasets is clearly 1D. The purpose of this study was to determine the likelihood that analyzing smooth 1D data with a 0D model of variance will produce false positives. We first used random field theory (RFT) to predict the probability of false positives in 0D analyses. We then validated RFT predictions via numerical simulations of smooth Gaussian 1D trajectories. Results showed that, across a range of public kinematic, force/moment and EMG datasets, the median false positive rate was 0.382 and not the assumed α=0.05, even for a simple two-sample t test involving N=10 trajectories per group. The median false positive rate for experiments involving three-component vector trajectories was p=0.764. This rate increased to p=0.945 for two three-component vector trajectories, and to p=0.999 for six three-component vectors. This implies that experiments involving vector trajectories have a high probability of yielding 0D statistical significance when there is, in fact, no 1D effect. Either (a) explicit a priori identification of 0D variables or (b) adoption of 1D methods can more tightly control α. Copyright © 2016 Elsevier Ltd. All rights reserved.
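
    A minimal simulation of the validation step described above, assuming smooth 1D Gaussian noise is generated by filtering white noise (the FWHM, trajectory length and trial counts below are arbitrary placeholders, not the study's values):

    import numpy as np
    from scipy import ndimage, stats

    def false_positive_rate_0d(n_sim=2000, n_per_group=10, n_time=101, fwhm=20.0, alpha=0.05):
        """Fraction of null simulations in which a pointwise (0D) two-sample t test
        declares at least one time node significant, for smooth 1D Gaussian trajectories."""
        rng = np.random.default_rng(0)
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # convert FWHM to Gaussian sigma
        hits = 0
        for _ in range(n_sim):
            y = rng.normal(size=(2 * n_per_group, n_time))    # white noise, no true effect
            y = ndimage.gaussian_filter1d(y, sigma, axis=1)   # smooth -> 1D correlated noise
            _, p = stats.ttest_ind(y[:n_per_group], y[n_per_group:], axis=0)
            hits += bool(np.any(p < alpha))                   # any pointwise "significant" node
        return hits / n_sim

    print(false_positive_rate_0d())  # typically far above the nominal alpha = 0.05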

  2. Results of Analyses of the Next Generation Solvent for Parsons

    International Nuclear Information System (INIS)

    Peters, T.; Washington, A.; Fink, S.

    2012-01-01

    Savannah River National Laboratory (SRNL) prepared a nominal 150 gallon batch of Next Generation Solvent (NGS) for Parsons. This material was then analyzed and tested for cesium mass transfer efficiency. The bulk of the results indicate that the solvent is qualified as acceptable for use in the upcoming pilot-scale testing at Parsons Technology Center. This report describes the analysis and testing of a batch of Next Generation Solvent (NGS) prepared in support of pilot-scale testing in the Parsons Technology Center. A total of ∼150 gallons of NGS solvent was prepared in late November of 2011. Details for the work are contained in a controlled laboratory notebook. Analysis of the Parsons NGS solvent indicates that the material is acceptable for use. SRNL is continuing to improve the analytical method for the guanidine.

  3. On the Generation of Random Ensembles of Qubits and Qutrits: Computing Separability Probabilities for Fixed Rank States

    Directory of Open Access Journals (Sweden)

    Khvedelidze Arsen

    2018-01-01

    The generation of random mixed states is discussed, aiming for the computation of probabilistic characteristics of composite finite dimensional quantum systems. In particular, we consider the generation of random Hilbert-Schmidt and Bures ensembles of qubit and qutrit pairs and compute the corresponding probabilities to find a separable state among the states of a fixed rank.
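
    A small sketch of the kind of computation described: full-rank two-qubit states drawn from the Hilbert-Schmidt ensemble are tested with the Peres-Horodecki (PPT) criterion, which is equivalent to separability for 2x2 and 2x3 systems. The fixed-rank and Bures constructions of the paper are not reproduced here.

    import numpy as np
    rng = np.random.default_rng(7)

    def random_hs_state(d):
        """Random density matrix from the Hilbert-Schmidt ensemble: rho = G G†/tr(G G†)."""
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        rho = g @ g.conj().T
        return rho / np.trace(rho)

    def is_ppt(rho, da, db):
        """Peres-Horodecki test: the partial transpose on the second subsystem is positive."""
        r = rho.reshape(da, db, da, db).transpose(0, 3, 2, 1).reshape(da * db, da * db)
        return np.min(np.linalg.eigvalsh(r)) >= -1e-12

    n = 10000
    frac = np.mean([is_ppt(random_hs_state(4), 2, 2) for _ in range(n)])
    print(frac)  # two-qubit Hilbert-Schmidt separability probability; literature reports ~8/33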

  4. 78 FR 53483 - Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 3

    Science.gov (United States)

    2013-08-29

    ... NUCLEAR REGULATORY COMMISSION [Docket No. 052-00025; NRC-2008-0252] Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 3 AGENCY: Nuclear Regulatory Commission. ACTION: Determination of inspections, tests, analyses, and acceptance criteria (ITAAC) completion...

  5. 78 FR 53484 - Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 4

    Science.gov (United States)

    2013-08-29

    ... NUCLEAR REGULATORY COMMISSION [Docket No. 052-00026; NRC-2008-0252] Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 4 AGENCY: Nuclear Regulatory Commission. ACTION: Determination of inspections, tests, analyses, and acceptance criteria (ITAAC) completion...

  6. 78 FR 65007 - Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 3

    Science.gov (United States)

    2013-10-30

    ... NUCLEAR REGULATORY COMMISSION [Docket No. 052-00026; NRC-2008-0252] Inspections, Tests, Analyses, and Acceptance Criteria; Vogtle Electric Generating Plant, Unit 3 AGENCY: Nuclear Regulatory Commission. ACTION: Determination of inspections, tests, analyses, and acceptance criteria completion...

  7. Next generation sequencing and comparative analyses of Xenopus mitogenomes

    Directory of Open Access Journals (Sweden)

    Lloyd Rhiannon E

    2012-09-01

    Protein-coding genes were shown to be under strong negative (purifying) selection, with the genes under the strongest pressure (Complex 4) also being the most highly expressed, highlighting their potentially crucial functions in the mitochondrial respiratory chain. Conclusions: Next generation sequencing of long-PCR amplicons using single-taxon or multi-taxon approaches enabled two new species of Xenopus mtDNA to be fully characterized. We anticipate our complete mitochondrial genome amplification methods to be applicable to other amphibians, helpful for identifying the most appropriate markers for differentiating species and populations and for resolving phylogenies, a pressing need since amphibians are undergoing drastic global decline. Our mtDNAs also provide templates for conserved primer design and the assembly of RNA and DNA reads following high-throughput “omic” techniques such as RNA- and ChIP-seq. These could help us better understand how processes such as mitochondrial replication and gene expression influence Xenopus growth and development, as well as how they evolved and are regulated.

  8. Probability function of breaking-limited surface elevation. [wind generated waves of ocean

    Science.gov (United States)

    Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.

    1989-01-01

    The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking, ζ_b(t), is first related to the original wave elevation ζ(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for ζ(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of ζ_b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.

  9. Analyses of moments in pseudorapidity intervals at √s = 546 GeV by means of two probability distributions in pure-birth process

    International Nuclear Information System (INIS)

    Biyajima, M.; Shirane, K.; Suzuki, N.

    1988-01-01

    Moments in pseudorapidity intervals at the CERN Sp̄pS collider (√s = 546 GeV) are analyzed by means of two probability distributions in the pure-birth stochastic process. Our results show that a probability distribution obtained from the Poisson distribution as an initial condition is more useful than that obtained from the Kronecker δ function. Analyses of moments by Koba-Nielsen-Olesen scaling functions derived from solutions of the pure-birth stochastic process are also made. Moreover, analyses of preliminary data at √s = 200 and 900 GeV are added
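
    As a pointer to the kind of formalism this record refers to, here is a minimal textbook sketch in LaTeX of a linear pure-birth process and its moments; it is generic (the rate λ and the initial conditions are placeholders) and does not reproduce the authors' specific two distributions or their KNO scaling functions.

        % Generating function G(z,t) = sum_n P_n(t) z^n of a linear pure-birth
        % process with per-particle birth rate \lambda:
        \frac{\partial G}{\partial t} \;=\; \lambda\, z\,(z-1)\,\frac{\partial G}{\partial z},
        % so a Poisson initial condition, G(z,0) = e^{\mu(z-1)}, and a Kronecker-delta
        % initial condition, G(z,0) = z^{n_0}, evolve into different multiplicity
        % distributions. Normalized moments in a pseudorapidity window follow from
        % derivatives at z = 1:
        C_q \;=\; \frac{\langle n^q \rangle}{\langle n \rangle^{q}}
            \;=\; \frac{1}{\langle n \rangle^{q}}
                  \left[\Bigl(z\,\frac{\partial}{\partial z}\Bigr)^{\!q} G(z,t)\right]_{z=1}.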

  10. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs which are encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method in this study is expected to provide a simple but realistic means to quantify the software failure probability based on input profile and system dynamics.
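
    As a rough illustration of the idea described above (not the authors' actual procedure), the following sketch weights per-input test outcomes by a hypothetical discrete input profile to estimate an overall failure probability on demand; the input bins, profile weights and test outcomes are invented for the example.

        import numpy as np

        # Hypothetical discrete input profile: probability that each digitized
        # sensor-input bin occurs on a demand (weights must sum to 1).
        input_profile = np.array([0.50, 0.30, 0.15, 0.05])

        # Test outcome per bin: 1 if the software failed for that input, else 0.
        # Each distinct digital input needs to be tested only once, because the
        # software response is deterministic for a given input value.
        failed = np.array([0, 0, 1, 0])

        # Failure probability on demand = profile-weighted fraction of failing inputs.
        p_failure = float(np.dot(input_profile, failed))
        print(f"estimated software failure probability: {p_failure:.3f}")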

  11. Making Heads or Tails of Probability: An Experiment with Random Generators

    Science.gov (United States)

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  12. Pedigree analyses of yeast cells recovering from DNA damage allow assignment of lethal events to individual post-treatment generations

    International Nuclear Information System (INIS)

    Klein, F.; Karwan, A.; Wintersberger, U.

    1990-01-01

    Haploid cells of Saccharomyces cerevisiae were treated with different DNA-damaging agents at various doses. A study of the progeny of such individual cells allowed the assignment of lethal events to distinct post-treatment generations. By microscopically inspecting those cells that were not able to form visible colonies, the authors could discriminate between cells dying from immediately effective lethal hits and those generating microcolonies, probably as a consequence of lethal mutation(s). The experimentally obtained numbers of lethal events were mathematically transformed into mean probabilities of lethal fixations taking place in cells of certain post-treatment generations. Such analyses give detailed insight into the kinetics of lethality as a consequence of different kinds of DNA damage. For example, X-irradiated cells lost viability mainly through lethal hits; only at a higher dose did lethal mutations also occur, fixed in the cells that were in direct contact with the mutagen but not in later generations. Ethyl methanesulfonate (EMS)-treated cells were hit by 00-fixations in a dose-dependent manner. The distribution of all sorts of lethal fixations taken together, which occurred in the EMS-damaged cell families, was not random. For comparison, analyses of cells treated with methyl methanesulfonate, N-methyl-N'-nitro-N-nitrosoguanidine and nitrous acid are also reported

  13. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to traditional vibration-based methods; the study therefore proposes an evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting early deterioration condition and in tracking further deterioration of the critical components.

  14. Use of heterogeneous finite elements generated by collision probability solutions to calculate a pool reactor core

    International Nuclear Information System (INIS)

    Calabrese, C.R.; Grant, C.R.

    1990-01-01

    This work presents comparisons between fluxes measured by activation of manganese foils in the light-water, enriched-uranium research pool reactor RA-2 (MTR (Materials Testing Reactor) fuel elements) and fluxes calculated by the finite element method (FEM) using the DELFIN code, and describes the heterogeneous finite elements by a set of solutions of the transport equations for several different configurations obtained using the collision probability code HUEMUL. The agreement between calculated and measured fluxes is good, and the advantage of using FEM is shown: to obtain the flux distribution with the same detail, a usual diffusion calculation would require 12000 mesh points against the 2000 points that FEM uses, so the processing time is reduced by a factor of ten. An interesting alternative for use in MTR fuel management is presented. (Author) [es

  15. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    International Nuclear Information System (INIS)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A.; Myers, Adam D.; Hennawi, Joseph F.; McMahon, Richard G.; Schiminovich, David; Sheldon, Erin S.; Brinkmann, Jon; Schneider, Donald P.

    2012-01-01

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
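
    The record's extreme-deconvolution density model is not reproduced here, but the sketch below illustrates the underlying idea of fitting a density in (colour, redshift) space and integrating it over redshift for a single object. It uses a plain Gaussian mixture on synthetic, noiseless data, so unlike the authors' method it does not account for flux uncertainties or missing photometry.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Synthetic training set: two colours plus redshift for mock quasars.
        n = 2000
        z = rng.uniform(0.3, 3.0, n)
        colours = np.column_stack([0.5 * z + rng.normal(0, 0.2, n),
                                   -0.3 * z + rng.normal(0, 0.2, n)])
        X = np.column_stack([colours, z])      # joint (colour, redshift) space

        # Fit a mixture model to the joint density (stand-in for extreme deconvolution).
        gmm = GaussianMixture(n_components=8, random_state=0).fit(X)

        # For one observed object, evaluate the density on a redshift grid to get an
        # (unnormalized) redshift posterior, then normalize it.
        obs = np.array([1.1, -0.7])            # observed colours of the object
        zgrid = np.linspace(0.0, 4.0, 401)
        pts = np.column_stack([np.tile(obs, (zgrid.size, 1)), zgrid])
        pdf = np.exp(gmm.score_samples(pts))
        pdf /= np.trapz(pdf, zgrid)
        print("photometric redshift estimate:", zgrid[np.argmax(pdf)])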

  16. Using Probability of Exceedance to Compare the Resource Risk of Renewable and Gas-Fired Generation

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-08-01

    Of the myriad risks surrounding long-term investments in power plants, resource risk is one of the most difficult to mitigate, and is also perhaps the risk that most clearly distinguishes renewable generation from natural gas-fired generation. For renewable generators like wind and solar projects, resource risk manifests as a quantity risk—i.e., the risk that the quantity of wind and insolation will be less than expected. For gas-fired generators (i.e., a combined-cycle gas turbine or “CCGT”), resource risk manifests primarily as a price risk—i.e., the risk that natural gas will cost more than expected. Most often, resource risk—and natural gas price risk in particular—falls disproportionately on utility ratepayers, who are typically not well-equipped to manage this risk. As such, it is incumbent upon utilities, regulators, and policymakers to ensure that resource risk is taken into consideration when making or approving resource decisions, or enacting policies that influence the development of the electricity sector more broadly.
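
    To make the exceedance-probability framing concrete, here is a small illustrative calculation with made-up distribution parameters (they are not figures from the report) showing P50/P90/P99 levels for annual wind generation and for a natural gas price.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical distributions: annual wind energy (GWh, roughly normal) and
        # natural gas price ($/MMBtu, roughly lognormal).
        wind_gwh = rng.normal(loc=350.0, scale=35.0, size=100_000)
        gas_price = rng.lognormal(mean=np.log(4.0), sigma=0.35, size=100_000)

        def exceedance(samples, p):
            """Value exceeded with probability p (e.g. P90 = exceeded 90% of the time)."""
            return np.quantile(samples, 1.0 - p)

        for p in (0.50, 0.90, 0.99):
            print(f"P{int(p * 100)} wind output: {exceedance(wind_gwh, p):7.1f} GWh/yr")

        # For a cost variable the adverse tail is the high side, so one usually quotes
        # the price that is not exceeded with probability p instead.
        for p in (0.50, 0.90, 0.99):
            print(f"P{int(p * 100)} gas price:   {np.quantile(gas_price, p):7.2f} $/MMBtu")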

  17. WE-F-304-04: Radiosurgery for Vestibular Schwannomas: Tumor Control Probability Analyses and Recommended Reporting Standards

    Energy Technology Data Exchange (ETDEWEB)

    Soltys, S. [Stanford Univ. (United States)

    2015-06-15

    Stereotactic Body Radiation Therapy (SBRT) was introduced clinically more than twenty years ago, and many subsequent publications have reported safety and efficacy data. The AAPM Working Group on Biological Effects of Hypofractionated Radiotherapy/SBRT (WGSBRT) extracted published treatment outcomes data from extensive literature searches to summarize and construct tumor control probability (TCP) and normal tissue complication probability (NTCP) models for six anatomical regions: Cranial, Head and Neck, Thoracic, Abdominal, Pelvic, and Spinal. In this session, we present the WGSBRT’s work for cranial sites and recurrent head and neck cancer. From literature-based data and associated models, guidelines to aid with safe and effective hypofractionated radiotherapy treatment are being determined. Further, the ability of existing and proposed radiobiological models to fit these data is considered, with attention to whether the data can distinguish between the linear-quadratic model and alternative radiobiological models such as secondary cell death from vascular damage, immunogenic, or bystander effects. Where appropriate, specific model parameters are estimated. As described in “The lessons of QUANTEC” (1), lack of adequate reporting standards continues to limit the amount of useful quantitative information that can be extracted from peer-reviewed publications. Recommendations regarding reporting standards are considered, to enable such reviews to achieve a more complete characterization of clinical outcomes. 1. Jackson A, Marks LB, Bentzen SM, Eisbruch A, Yorke ED, Ten Haken RK, Constine LS, Deasy JO. The lessons of QUANTEC: recommendations for reporting and gathering data on dose-volume dependencies of treatment outcome. Int J Radiat Oncol Biol Phys. 2010 Mar 1;76(3 Suppl):S155–60. Learning Objectives: Describe the techniques, types of cancer and dose schedules used in treating recurrent H&N cancers with SBRT. List the radiobiological models that compete with the linear-quadratic model.

  18. WE-F-304-04: Radiosurgery for Vestibular Schwannomas: Tumor Control Probability Analyses and Recommended Reporting Standards

    International Nuclear Information System (INIS)

    Soltys, S.

    2015-01-01

    Stereotactic Body Radiation Therapy (SBRT) was introduced clinically more than twenty years ago, and many subsequent publications have reported safety and efficacy data. The AAPM Working Group on Biological Effects of Hypofractionated Radiotherapy/SBRT (WGSBRT) extracted published treatment outcomes data from extensive literature searches to summarize and construct tumor control probability (TCP) and normal tissue complication probability (NTCP) models for six anatomical regions: Cranial, Head and Neck, Thoracic, Abdominal, Pelvic, and Spinal. In this session, we present the WGSBRT’s work for cranial sites and recurrent head and neck cancer. From literature-based data and associated models, guidelines to aid with safe and effective hypofractionated radiotherapy treatment are being determined. Further, the ability of existing and proposed radiobiological models to fit these data is considered, with attention to whether the data can distinguish between the linear-quadratic model and alternative radiobiological models such as secondary cell death from vascular damage, immunogenic, or bystander effects. Where appropriate, specific model parameters are estimated. As described in “The lessons of QUANTEC” (1), lack of adequate reporting standards continues to limit the amount of useful quantitative information that can be extracted from peer-reviewed publications. Recommendations regarding reporting standards are considered, to enable such reviews to achieve a more complete characterization of clinical outcomes. 1. Jackson A, Marks LB, Bentzen SM, Eisbruch A, Yorke ED, Ten Haken RK, Constine LS, Deasy JO. The lessons of QUANTEC: recommendations for reporting and gathering data on dose-volume dependencies of treatment outcome. Int J Radiat Oncol Biol Phys. 2010 Mar 1;76(3 Suppl):S155–60. Learning Objectives: Describe the techniques, types of cancer and dose schedules used in treating recurrent H&N cancers with SBRT. List the radiobiological models that compete with the linear-quadratic model.

  19. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer today from drastic reductions of the maximum allowable stress values during the design phase, as well as from costly recurrent inspections during the life-cycle phase, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). To relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate possible flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability Of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for the evaluation of the detection threshold of a guided-waves-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as the system detection capability, using this approach. It is not possible to replace numerically the part of the experimental tests aimed at POD in which the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.
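
    As a generic illustration of how a probability-of-detection curve can be estimated from hit/miss inspection data (a standard approach, not necessarily the exact statistical procedure used by the authors), with synthetic delamination sizes and detection outcomes:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)

        # Synthetic inspection data: delamination size (mm) and hit/miss outcome.
        size_mm = rng.uniform(2.0, 40.0, 300)
        true_pod = 1.0 / (1.0 + np.exp(-(size_mm - 12.0) / 3.0))   # hidden "true" curve
        detected = rng.random(300) < true_pod

        # Fit POD(a) as a logistic function of log flaw size.
        X = np.log(size_mm).reshape(-1, 1)
        model = LogisticRegression().fit(X, detected)

        # Report the flaw size detected with 90% probability (a90) on a size grid.
        grid = np.linspace(2.0, 40.0, 500)
        pod = model.predict_proba(np.log(grid).reshape(-1, 1))[:, 1]
        a90 = grid[np.argmax(pod >= 0.90)]
        print(f"estimated a90 (size detected with 90% probability): {a90:.1f} mm")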

  20. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
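
    A minimal sketch of the kind of genetic algorithm loop the record describes; the cost function below is a simple stand-in rather than the COP testability measure from the paper, and the population size, generation count and mutation rate are arbitrary.

        import numpy as np

        rng = np.random.default_rng(3)
        n_inputs, pop_size, n_gen = 8, 30, 100

        def cost(p):
            # Stand-in for a testability-based cost of a vector of signal probabilities.
            return float(np.sum((p - 0.7) ** 2) + 0.1 * np.sum(np.abs(p - 0.5)))

        pop = rng.random((pop_size, n_inputs))        # each row = one set of probabilities
        for _ in range(n_gen):
            fitness = np.array([1.0 / (1e-9 + cost(ind)) for ind in pop])
            parents = pop[rng.choice(pop_size, size=pop_size, p=fitness / fitness.sum())]
            cut = rng.integers(1, n_inputs, size=pop_size)          # one-point crossover
            children = np.array([np.concatenate([parents[i, :cut[i]],
                                                 parents[(i + 1) % pop_size, cut[i]:]])
                                 for i in range(pop_size)])
            mutate = rng.random(children.shape) < 0.05              # mutation
            children[mutate] = rng.random(np.count_nonzero(mutate))
            pop = children

        best = pop[np.argmin([cost(ind) for ind in pop])]
        print("optimized input signal probabilities:", np.round(best, 2))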

  1. Multi-scale ancient DNA analyses confirm the western origin of Michelsberg farmers and document probable practices of human sacrifice.

    Directory of Open Access Journals (Sweden)

    Alice Beau

    In Europe, the Middle Neolithic is characterized by an important diversification of cultures. In northeastern France, the appearance of the Michelsberg culture has been correlated with major cultural changes and interpreted as the result of the settlement of new groups originating from the Paris Basin. This cultural transition has been accompanied by the expansion of particular funerary practices involving inhumations within circular pits and individuals in "non-conventional" positions (deposited in the pits without any particular treatment). While the status of such individuals has been highly debated, the sacrifice hypothesis has been retained for the site of Gougenheim (Alsace). At the regional level, the analysis of the Gougenheim mitochondrial gene pool (SNPs and HVR-I sequence analyses) permitted us to highlight a major genetic break associated with the emergence of the Michelsberg in the region. This genetic discontinuity appeared to be linked to new affinities with farmers from the Paris Basin, correlated with a noticeable hunter-gatherer legacy. All of the evidence gathered supports (i) the western origin of the Michelsberg groups and (ii) the potential implication of this migration in the progression of the hunter-gatherer legacy from the Paris Basin to Alsace / Western Germany at the beginning of the Late Neolithic. At the local level, we noted some differences in the maternal gene pool of individuals in "conventional" vs. "non-conventional" positions. The relative genetic isolation of these sub-groups nicely echoes both their social distinction and the hypothesis of sacrifices retained for the site. Our investigation demonstrates that a multi-scale aDNA study of ancient communities offers a unique opportunity to disentangle the complex relationships between cultural and biological evolution.

  2. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of the plant for combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics for power generation systems. The results show that the exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air-cooling and natural gas preheating for increasing the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection.

  3. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    International Nuclear Information System (INIS)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J.

    2012-01-01

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.
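
    As an illustration of producing a best-fit probability distribution function for an emission-factor sample (the data and candidate distributions below are hypothetical and are not the curves used in GREET):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        # Hypothetical sample of NOx emission factors (g/kWh) for one EGU technology.
        sample = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=500)

        # Fit a few candidate distributions and keep the one with the lowest
        # Kolmogorov-Smirnov statistic.
        candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma,
                      "weibull_min": stats.weibull_min}
        best_name, best_ks, best_params = None, np.inf, None
        for name, dist in candidates.items():
            params = dist.fit(sample)
            ks = stats.kstest(sample, dist.cdf, args=params).statistic
            if ks < best_ks:
                best_name, best_ks, best_params = name, ks, params

        print(f"best-fit PDF: {best_name} (KS statistic {best_ks:.3f}), "
              f"parameters: {np.round(best_params, 3)}")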

  4. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.

  5. Task 4.1: Development of a framework for creating a databank to generate probability density functions for process parameters

    International Nuclear Information System (INIS)

    Burgazzi, Luciano

    2011-01-01

    PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to the passive system performance evaluation, given the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for the development of a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to the passive system performance evaluation. In order to achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems or which generic databases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement derived from a well structured procedure could be used to envisage sound probability distributions for the parameters of interest.

  6. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  7. Advanced exergy-based analyses applied to a system including LNG regasification and electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Morosuk, Tatiana; Tsatsaronis, George; Boyano, Alicia; Gantiva, Camilo [Technische Univ. Berlin (Germany)

    2012-07-01

    Liquefied natural gas (LNG) will contribute more in the future than in the past to the overall energy supply in the world. The paper discusses the application of advanced exergy-based analyses to a recently developed LNG-based cogeneration system. These analyses include advanced exergetic, advanced exergoeconomic, and advanced exergoenvironmental analyses in which thermodynamic inefficiencies (exergy destruction), costs, and environmental impacts have been split into avoidable and unavoidable parts. With the aid of these analyses, the potentials for improving the thermodynamic efficiency and for reducing the overall cost and the overall environmental impact are revealed. The objectives of this paper are to demonstrate (a) the potential for generating electricity while regasifying LNG and (b) some of the capabilities associated with advanced exergy-based methods. The most important subsystems and components are identified, and suggestions for improving them are made. (orig.)

  8. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
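
    A small sketch of the cumulative-probability-distribution idea for granulating a price series into linguistic intervals; the series, the number of intervals and the labels are invented, and the paper's minimize-entropy-principle granulation and rough-set rule generation are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(5)
        prices = 100 + np.cumsum(rng.normal(0, 1, 250))   # mock daily closing prices

        # Cumulative-probability-distribution approach: cut points at equal probability
        # mass, so each linguistic interval holds the same share of observations.
        n_intervals = 5
        cuts = np.quantile(prices, np.linspace(0, 1, n_intervals + 1))
        labels = ["very low", "low", "medium", "high", "very high"]

        # Map each observation to its linguistic term.
        idx = np.clip(np.searchsorted(cuts, prices, side="right") - 1, 0, n_intervals - 1)
        terms = [labels[i] for i in idx]
        print("last five days:", terms[-5:])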

  9. Application of probability generating function to the essentials of nondestructive nuclear materials assay system using neutron correlation

    International Nuclear Information System (INIS)

    Hosoma, Takashi

    2017-01-01

    In the previous research (JAEA-Research 2015-009), the essentials of neutron multiplicity counting mathematics were reconsidered, taking into account experience obtained at the Plutonium Conversion Development Facility, and formulae of the multiplicity distribution were algebraically derived up to septuplets using a probability generating function as a strategic move for the future. The principle was reported by K. Böhnel in 1985, but such a high-order expansion was the first of its kind because of its increasing complexity. In this research, characteristics of the high-order correlations were investigated. It was found that higher-order correlation increases rapidly in response to the increase of leakage multiplication, crossing and leaving lower-order correlations behind when leakage multiplication is > 1.3, a value that depends on detector efficiency and counter setting. In addition, the fission rates and doubles count rates by fast neutrons and by thermal neutrons in their coexisting system were algebraically derived, again using a probability generating function. The principle was reported by I. Pázsit and L. Pál in 2012, but such a physical interpretation, i.e. associating their stochastic variables with fission rate, doubles count rate and leakage multiplication, is the first of its kind. From the Rossi-alpha combined distribution and the measured ratio of each area obtained by Differential Die-Away Self-Interrogation (DDSI) and conventional assay data, it is possible to estimate: the number of induced fissions per unit time by fast neutrons and by thermal neutrons; the number of induced fissions (< 1) by one source neutron; and the individual doubles count rates. During the research, a hypothesis introduced in their report was proved to be true. Provisional calculations were done for UO_2 of 1∼10 kgU containing ∼0.009 wt% 244Cm. (author)
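
    For orientation, a minimal LaTeX sketch of the probability-generating-function bookkeeping behind multiplicity counting; these are generic textbook relations only, far short of the septuplet-order expansions and DDSI-specific results the record describes.

        % Probability generating function of the number n of neutrons detected per event:
        G(z) \;=\; \sum_{n=0}^{\infty} P(n)\, z^{\,n}, \qquad G(1) = 1 .
        % Derivatives at z = 1 give the factorial moments; for an event rate F (with
        % detector efficiency and leakage multiplication folded into P(n)) the first two
        % determine the singles and doubles count rates:
        S \;=\; F\, G'(1) \;=\; F\,\langle n \rangle, \qquad
        D \;=\; \frac{F}{2}\, G''(1) \;=\; \frac{F}{2}\,\langle n(n-1) \rangle .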

  10. Analyses of an air conditioning system with entropy generation minimization and entransy theory

    International Nuclear Information System (INIS)

    Wu Yan-Qiu; Cai Li; Wu Hong-Juan

    2016-01-01

    In this paper, based on the generalized heat transfer law, an air conditioning system is analyzed with the entropy generation minimization and the entransy theory. Taking the coefficient of performance (denoted as COP) and the heat flow rate Q_out which is released into the room as the optimization objectives, we discuss the applicabilities of the entropy generation minimization and entransy theory to the optimizations. Five numerical cases are presented. Combining the numerical results and theoretical analyses, we can conclude that the optimization applicabilities of the two theories are conditional. If Q_out is the optimization objective, a larger entransy increase rate always leads to a larger Q_out, while a smaller entropy generation rate does not. If we take COP as the optimization objective, neither the entropy generation minimization nor the concept of entransy increase is always applicable. Furthermore, we find that the concept of entransy dissipation is not applicable for the discussed cases. (paper)
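
    For readers unfamiliar with the two quantities being compared, a short LaTeX note with the definitions commonly used in this literature (the paper's generalized heat transfer law and its specific cases are not reproduced here):

        % Entropy generated when heat Q flows from a hot body at T_h to a cold body at T_c:
        S_{\mathrm{gen}} \;=\; Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) \;\ge\; 0 .
        % Entransy of a body of heat capacity C at temperature T, and the entransy
        % dissipated by the same finite-temperature-difference heat transfer:
        G \;=\; \tfrac{1}{2}\, C\, T^{2}, \qquad
        \Delta G_{\mathrm{dis}} \;=\; Q\,\bigl(T_h - T_c\bigr) \;\ge\; 0 .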

  11. Impaired mismatch negativity (MMN) generation in schizophrenia as a function of stimulus deviance, probability, and interstimulus/interdeviant interval.

    Science.gov (United States)

    Javitt, D C; Grochowski, S; Shelley, A M; Ritter, W

    1998-03-01

    Schizophrenia is a severe mental disorder associated with disturbances in perception and cognition. Event-related potentials (ERP) provide a mechanism for evaluating potential mechanisms underlying neurophysiological dysfunction in schizophrenia. Mismatch negativity (MMN) is a short-duration auditory cognitive ERP component that indexes operation of the auditory sensory ('echoic') memory system. Prior studies have demonstrated impaired MMN generation in schizophrenia along with deficits in auditory sensory memory performance. MMN is elicited in an auditory oddball paradigm in which a sequence of repetitive standard tones is interrupted infrequently by a physically deviant ('oddball') stimulus. The present study evaluates MMN generation as a function of deviant stimulus probability, interstimulus interval, interdeviant interval and the degree of pitch separation between the standard and deviant stimuli. The major findings of the present study are first, that MMN amplitude is decreased in schizophrenia across a broad range of stimulus conditions, and second, that the degree of deficit in schizophrenia is largest under conditions when MMN is normally largest. The pattern of deficit observed in schizophrenia differs from the pattern observed in other conditions associated with MMN dysfunction, including Alzheimer's disease, stroke, and alcohol intoxication.

  12. Measurement of electromagnetic fields generated by air traffic control radar systems with spectrum analysers.

    Science.gov (United States)

    Barellini, A; Bogi, L; Licitra, G; Silvi, A M; Zari, A

    2009-12-01

    Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. The measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser is investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar.

  13. Analysing and evaluating the task of automatic tweet generation: Knowledge to business

    OpenAIRE

    Lloret, Elena; Palomar, Manuel

    2016-01-01

    In this paper a study concerning the evaluation and analysis of natural language tweets is presented. Based on our experience in text summarisation, we carry out a deep analysis of users' perception through the evaluation of tweets manually and automatically generated from news. Specifically, we consider two key issues of a tweet: its informativeness and its interestingness. Therefore, we analyse: (1) do users equally perceive manual and automatic tweets?; (2) what linguistic features a good tw...

  14. Analysing The Thermalhydraulic Parameters Of VVER-1000 Reactor For The Accident Of Steam Generator Tube Rupture

    International Nuclear Information System (INIS)

    Luu Nam Hai; Truong Cong Thang

    2011-01-01

    To ensure the safe operation of a nuclear power plant (NPP), many postulated accident scenarios have been considered and analysed. This research chose and analysed the accident of steam generator tube rupture (SGTR) under actual plant conditions by using the simulation program PCTRAN. The SGTR accident occurs when the NPP is operating at steady-state conditions (power of 3000 MWth, primary pressure of 157 bar and secondary pressure of 63 bar). The accident is initiated by creating a break with an equivalent diameter of 100 mm in the area of the lower row of heat-exchanging tubes. The result of the analysis is compared with the calculation of Shiraz University, Iran, which used the thermal-hydraulics code RELAP5/mod3.2, and with the report in the PSAR data of the VVER-1000. This comparison shows that it is possible to use PCTRAN to analyse accidents of the VVER-1000 reactor. (author)

  15. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Thermodynamic and carbon analyses of micro-generators for UK households

    International Nuclear Information System (INIS)

    Allen, S.R.; Hammond, G.P.

    2010-01-01

    Micro-generators have the potential to reduce carbon emissions and enhance energy security by providing heat or electricity either from renewable sources, or via the more efficient use of fossil fuels. Such potential is often, however, unquantified or unclear, and hence a thermodynamic and related carbon analysis of micro-generators for UK household energy supply has been performed. Where pertinent, the thermodynamic concept of exergy is employed alongside that of energy. Analysis begins with a description of the established methods of energy supply to, and use within, typical UK households. On these foundations a grid-tied micro-wind turbine, a grid-tied solar photovoltaic array, and a solar hot-water system are analysed. Annual outputs are estimated and contextualised against the demands of representative households. The annual energy-resource and carbon savings provided by the micro-generators are determined on the basis that they (partially) displace the established supply systems. Savings are then compared with the energy-resource and carbon-emission 'debts' of the micro-generators, to assess the latter's net performance. Given appropriate installations, all three micro-generators are found to provide significant net energy and carbon benefits, confirming that all three technologies can provide net reductions in both carbon emissions and dependence on conventional energy resources.
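
    To illustrate the net-performance comparison described above, here is a toy calculation with invented placeholder figures (not results from the paper) for a micro-wind turbine's embodied carbon debt versus its annual carbon savings.

        # Hypothetical figures for a grid-tied micro-wind turbine (placeholders only).
        embodied_carbon_kgco2 = 1500.0       # carbon "debt" of manufacture and installation
        annual_output_kwh = 1200.0           # electricity generated per year
        grid_intensity_kgco2_per_kwh = 0.5   # emissions displaced per kWh of grid power
        lifetime_years = 15

        annual_saving = annual_output_kwh * grid_intensity_kgco2_per_kwh
        payback_years = embodied_carbon_kgco2 / annual_saving
        net_lifetime_saving = lifetime_years * annual_saving - embodied_carbon_kgco2

        print(f"carbon payback: {payback_years:.1f} years")
        print(f"net lifetime carbon saving: {net_lifetime_saving:.0f} kg CO2")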

  17. Discussion on the applicability of entropy generation minimization to the analyses and optimizations of thermodynamic processes

    International Nuclear Information System (INIS)

    Cheng, XueTao; Liang, XinGang

    2013-01-01

    Highlights: • The applicability of entropy generation minimization is conditional. • The concept of exergy-work conversion efficiency is defined. • The concept of exergy destruction number is introduced. • Smaller exergy destruction number leads to larger exergy-work conversion efficiency. - Abstract: This work reports the analyses of some thermodynamic systems with the concepts of entropy generation, entropy generation numbers and revised entropy generation number, as well as exergy destruction number and exergy-work conversion efficiency that are proposed in this paper. The applicability of entropy generation minimization (EGM) is conditional if the optimization objective is the output power. The EGM leads to the maximum output power when the net exergy flow rate into the system is fixed, but it may not be appropriate if the net exergy flow rate into the system is not fixed. On the other hand, smaller exergy destruction number always corresponds to larger exergy-work conversion efficiency. The refrigeration cycle with the reverse Carnot engine is also analyzed in which mechanical work is input. The result shows that the EGM leads to the largest COP if the temperature of the high temperature heat source is fixed

  18. Analyses of reliability characteristics of emergency diesel generator population using empirical Bayes methods

    International Nuclear Information System (INIS)

    Vesely, W.E.; Uryas'ev, S.P.; Samanta, P.K.

    1993-01-01

    Emergency Diesel Generators (EDGs) provide backup power to nuclear power plants in case of failure of the AC buses. The reliability of EDGs is important to assure response to loss-of-offsite-power accident scenarios, a dominant contributor to plant risk. The reliable performance of EDGs has been of concern both for regulators and plant operators. In this paper the authors present an approach and results from the analysis of failure data from a large population of EDGs. They used an empirical Bayes approach to obtain both the population distribution and the individual failure probabilities from EDG failure-to-start and load-run data over 4 years for 194 EDGs at 63 plant units.
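
    A compact sketch of the empirical Bayes idea applied to demand-failure data: synthetic counts, a beta-binomial population model and a method-of-moments fit (which, for brevity, ignores the correction for binomial sampling variance); the paper's actual estimator and data may differ.

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic failure-to-start data: demands and failures for 194 EDGs.
        n_edg = 194
        demands = rng.integers(40, 120, size=n_edg)
        true_p = rng.beta(2.0, 150.0, size=n_edg)      # hidden plant-to-plant variation
        failures = rng.binomial(demands, true_p)

        # Method-of-moments fit of a beta(a, b) population distribution to the raw rates.
        rates = failures / demands
        m, v = rates.mean(), rates.var()
        common = m * (1 - m) / v - 1
        a, b = m * common, (1 - m) * common

        # Empirical Bayes (shrinkage) estimate of each EDG's failure probability.
        posterior_mean = (a + failures) / (a + b + demands)
        print(f"fitted population distribution: beta(a={a:.2f}, b={b:.2f})")
        print("EDG #1 raw rate vs shrunk estimate:",
              round(float(rates[0]), 4), round(float(posterior_mean[0]), 4))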

  19. Higher third-generation cephalosporin prescription proportion is associated with lower probability of reducing carbapenem use: a nationwide retrospective study

    Directory of Open Access Journals (Sweden)

    Allison Muller

    2018-01-01

    Background The ongoing extended spectrum β-lactamase-producing Enterobacteriaceae (ESBL-PE) pandemic has led to increasing carbapenem use, requiring the release of guidelines for carbapenem usage in France in late 2010. We sought to determine factors associated with changes in carbapenem use in intensive care units (ICUs), medical and surgical wards between 2009 and 2013. Methods This ward-level multicentre retrospective study was based on data from French antibiotic and multidrug-resistant bacteria surveillance networks in healthcare facilities. Antibiotic use was expressed in defined daily doses per 1000 patient-days. Factors associated with the reduction in carbapenem use (yes/no) over the study period were determined from a random-effects logistic regression model (493 wards nested within 259 healthcare facilities): ward characteristics (type, size…), ward antibiotic use (initial antibiotic use [i.e., consumption of a given antibiotic in 2009], initial antibiotic prescribing profile [i.e., proportion of a given antibiotic in the overall antibiotic consumption in 2009] and reduction in the use of a given antibiotic between 2009 and 2013) and regional ESBL-PE incidence rate in acute care settings in 2011. Results Over the study period, carbapenem consumption in ICUs (n = 85), medical (n = 227) and surgical wards (n = 181) was equal to 73.4, 6.2 and 5.4 defined daily doses per 1000 patient-days, respectively. Release of the guidelines was followed by a significant decrease in carbapenem use within ICUs and medical wards, and a slowdown in use within surgical wards. The following factors were independently associated with a higher probability of reducing carbapenem use: location in Eastern France, higher initial carbapenem prescribing profile and reductions in consumption of fluoroquinolones, glycopeptides and piperacillin/tazobactam. In parallel, factors independently associated with a lower probability of reducing carbapenem use were

  20. Life cycle analyses applied to first generation bio-fuels consumed in France

    International Nuclear Information System (INIS)

    2010-01-01

    This rather voluminous publication reports detailed life-cycle analyses for the different present bio-fuel channels, also named first-generation bio-fuels: bio-ethanol, bio-diesel, pure vegetable oils, and oil. After a recall of the general principles adopted for this life-cycle analysis, it reports the modelling of the different channels (agricultural steps, bio-fuel production steps, ethyl tert-butyl ether (ETBE) steps, vehicles, animal fats and used vegetable oils, land-use change). It gives concise descriptions of the different production routes (methyl ester from different plants, ethanol from different plants). It reports and compares the results obtained in terms of performance

  1. Measurement of electromagnetic fields generated by air traffic control radar systems with spectrum analysers

    International Nuclear Information System (INIS)

    Barellini, A.; Bogi, L.; Licitra, G.; Silvi, A. M.; Zari, A.

    2009-01-01

    Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. The measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser is investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar. (authors)

  2. Analyse on changes of runoff generation and confluence of the Luohe River

    International Nuclear Information System (INIS)

    Yiming Si; Xiaowei Liu

    2004-01-01

    The change trends of water-cycle factors such as rainfall, runoff and flood events in the Luohe River basin are analysed based on hydrological data since the 1950s. The analysis shows that rainfall has decreased, but not by much, while runoff has decreased remarkably. Under the same rainfall conditions, runoff and peak discharge have dropped considerably, the runoff coefficient has become much smaller, and the frequency of flood occurrence has decreased. It is considered that environmental change caused by human activities accounts for the change in the characteristics of runoff generation and confluence in the basin. (Author)

  3. How sex- and age-disaggregated data and gender and generational analyses can improve humanitarian response.

    Science.gov (United States)

    Mazurana, Dyan; Benelli, Prisca; Walker, Peter

    2013-07-01

    Humanitarian aid remains largely driven by anecdote rather than by evidence. The contemporary humanitarian system has significant weaknesses with regard to data collection, analysis, and action at all stages of response to crises involving armed conflict or natural disaster. This paper argues that humanitarian actors can best determine and respond to vulnerabilities and needs if they use sex- and age-disaggregated data (SADD) and gender and generational analyses to help shape their assessments of crisis-affected populations. Through case studies, the paper shows how gaps in information on sex and age limit the effectiveness of humanitarian response in all phases of a crisis. The case studies serve to show how proper collection, use, and analysis of SADD enable operational agencies to deliver assistance more effectively and efficiently. The evidence suggests that the employment of SADD and gender and generational analyses assists in saving lives and livelihoods in a crisis. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  4. Analyses of pebble-bed reactors for the generation of heat for heating purposes

    International Nuclear Information System (INIS)

    Muehlensiep, J.; Fricke, U.; Inhester, K.H.; Kugeler, K.; Phlippen, P.W.; Schmidtlein, P.; Swatoch, R.; Wagner, U.

    1986-10-01

    Boundary conditions are described for the use of a nuclear power reactor for long-distance heat supply in densely populated areas. For the design of the high-temperature heat-generating reactor, plant components and possible arrangements are analyzed with consideration of safety and costs. System sizes are chosen on the basis of the analyzed parameters, the paramount design goal being to adequately retain the fission products in the coated particles of the fuel elements, even in anticipated accidents. With the help of the data record obtained, a system is designed with a cuboid-shaped core as its characteristic feature; the advantage of this core is that it quickly discharges the after-heat outwards even in the case of a hypothetical accident. Due to the core shape, it is possible to install heat-exchanging components near the core, and to place the safety rods in reflector borings, where they can be very effective. (orig./HP) [de

  5. Classical evolution and quantum generation in generalized gravity theories including string corrections and tachyons: Unified analyses

    International Nuclear Information System (INIS)

    Hwang, Jai-chan; Noh, Hyerim

    2005-01-01

    We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein's gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein's gravity and others

  6. Predictive analyses of flow-induced vibration and fretting wear in steam generator tubes

    International Nuclear Information System (INIS)

    Axisa, F.

    1989-01-01

    Maintaining the service life of PWR steam generators under highly reliable conditions requires a complex design to prevent various damaging processes, including those related to flow-induced vibration. Predictive analyses have to rely on numerical tools to compute the vibratory response of multi-supported tubes, in association with experimental data and semi-empirical relationships for quantifying flow-induced excitation mechanisms and tube damaging processes. In the presence of loose supports, tube dynamics becomes highly nonlinear in nature. To deal with such problems, CEA and FRAMATOME developed a computer program called GERBOISE. This paper provides a short description of an experimental program currently in progress at CEN Saclay to validate the numerical methods implemented in GERBOISE. According to the results obtained so far, there is reasonable agreement between experiment and numerical simulation, especially as far as averaged quantities are concerned

  7. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
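
    A generic sketch of the classification-tree setup described above, using synthetic risk-factor data and onset labels rather than the trial data or the exact model specification from the study:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(7)

        # Synthetic risk factors for 1000 participants.
        n = 1000
        X = np.column_stack([
            rng.normal(22, 3, n),    # BMI
            rng.normal(0, 1, n),     # body dissatisfaction (z-score)
            rng.normal(0, 1, n),     # dieting (z-score)
            rng.normal(0, 1, n),     # negative affect (z-score)
        ])
        # Synthetic onset outcome in which low BMI matters most and body
        # dissatisfaction amplifies it (the kind of interaction a tree can detect).
        risk = 0.03 + 0.10 * ((X[:, 0] < 19) & (X[:, 1] > 0.5))
        onset = rng.random(n) < risk

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30, random_state=0)
        tree.fit(X, onset)
        print(export_text(tree, feature_names=["bmi", "body_dissat", "dieting", "neg_affect"]))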

  8. EXPERIMENTAL ANALYSES OF SPALLATION NEUTRONS GENERATED BY 100 MEV PROTONS AT THE KYOTO UNIVERSITY CRITICAL ASSEMBLY

    Directory of Open Access Journals (Sweden)

    CHEOL HO PYEON

    2013-02-01

    Neutron spectrum analyses of spallation neutrons are conducted in the accelerator-driven system (ADS) facility at the Kyoto University Critical Assembly (KUCA). High-energy protons (100 MeV) obtained from the fixed-field alternating gradient accelerator are injected onto a tungsten target, whereby spallation neutrons are generated. For the neutronic characteristics of the spallation neutrons, the reaction rates and the continuous energy distribution of the spallation neutrons are measured by the foil activation method and by an organic liquid scintillator, respectively. Numerical calculations are executed by MCNPX with the JENDL/HE-2007 and ENDF/B-VI libraries to evaluate the reaction rates of activation foils (bismuth and indium) set at the target and the continuous energy distribution of spallation neutrons set in front of the target. For the reaction rates obtained by the foil activation method, the C/E values between the experiments and the calculations agree to within a relative difference of about 10%, except for some reactions. For the continuous energy distribution obtained by the organic liquid scintillator, spallation neutrons are observed up to 45 MeV. From these results, the neutron spectrum information on the spallation neutrons generated at the target is obtained successfully when injecting 100 MeV protons onto the tungsten target.

  9. Generation of anti-idiotype antibodies for application in clinical immunotherapy laboratory analyses.

    Science.gov (United States)

    Liu, Zhanqi; Panousis, Con; Smyth, Fiona E; Murphy, Roger; Wirth, Veronika; Cartwright, Glenn; Johns, Terrance G; Scott, Andrew M

    2003-08-01

    The chimeric monoclonal antibody ch806 specifically targets the tumor-associated mutant epidermal growth factor receptor (de 2-7EGFR or EGFRVIII) and is currently under investigation for its potential use in cancer therapy. The humanised monoclonal antibody hu3S193 specifically targets the Lewis Y epithelial antigen and is currently in Phase I clinical trials in patients with advanced breast, colon, and ovarian carcinomas. To assist the clinical evaluation of ch806 and hu3S193, laboratory assays are required to monitor their serum pharmacokinetics and quantitate any immune responses to the antibodies. Mice immunized with ch806 or hu3S193 were used to generate hybridomas producing antibodies with specific binding to ch806 or hu3S193 and competitive for antigen binding. These anti-idiotype antibodies (designated Ludwig Melbourne Hybridomas, LMH) were investigated as reagents suitable for use as positive controls for HAHA or HACA analyses and for measuring hu3S193 or ch806 in human serum. Anti-idiotypes with the ability to concurrently bind two target antibody molecules were identified, which enabled the development of highly reproducible, sensitive, specific ELISA assays for determining serum concentrations of hu3S193 and ch806 with a 3 ng/mL limit of quantitation using LMH-3 and LMH-12, respectively. BIAcore analyses determined high apparent binding affinity for both idiotypes: LMH-3 binding immobilized hu3S193, Ka = 4.76 x 10(8) M(-1); LMH-12 binding immobilised ch806, Ka = 1.74 x 10(9) M(-1). Establishment of HAHA or HACA analysis of sera samples using BIAcore was possible using LMH-3 and LMH-12 as positive controls for quantitation of immune responses to hu3S193 or ch806 in patient sera. These anti-idiotypes could also be used to study the penetrance and binding of ch806 or hu3S193 to tumor cells through immunohistochemical analysis of tumor biopsies. The generation of anti-idiotype antibodies capable of concurrently binding a target antibody on each variable

  10. Suitability Analyses of Wind Power Generation Complex in South Korea by Using Environmental & Social Criterias

    Science.gov (United States)

    Zhu, Y.; Jeon, S. W.; Seong, M.

    2017-12-01

    Wind power, one of the most economical renewable energy resources, has gained prominence owing to its strategic role in responding to environmental restrictions and strengthening energy security, as well as its potential to drive substantial industrial growth in the future. According to the fourth Fundamental Renewable Energy Plan, declared in September 2014, the government instituted a scheme to minimize the proportion of conventional RDF (Refuse Derived Fuel) by 2035, promoting solar and wind power as the core energy sources for the next generation. In South Korea especially, there is an urgent need for standards that identify environmentally optimal locations for wind power installations while preventing disasters associated with climate change. This is because most of the suitable sites for wind power complexes in South Korea lie along mountain ridges, which are highly valuable as reservoirs of bio-resources and ecosystem conservation. In this research, we focus on the analysis of suitable wind farm locations across the whole of South Korea with respect to meteorological and geological factors, using GIS techniques. Ultimately, this analysis aims to minimize the adverse effects of current wind power development on mountain ridges and to shorten the negotiation time needed for wind power expansion.
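
    A minimal sketch of the kind of GIS-style weighted-overlay suitability scoring described above; the criterion rasters, weights and exclusion mask are invented for illustration and are not the study's actual layers.

        import numpy as np

        # Hypothetical normalised criterion rasters (0 = unsuitable, 1 = ideal).
        rng = np.random.default_rng(0)
        wind_speed  = rng.random((100, 100))      # meteorological factor
        slope       = 1 - rng.random((100, 100))  # flatter terrain scores higher
        grid_access = rng.random((100, 100))

        # Environmental/social exclusion mask (e.g., protected mountain ridges).
        protected = rng.random((100, 100)) > 0.9

        weights = {"wind": 0.5, "slope": 0.3, "grid": 0.2}
        suitability = (weights["wind"] * wind_speed
                       + weights["slope"] * slope
                       + weights["grid"] * grid_access)
        suitability[protected] = 0.0  # hard constraint: exclude conservation areas

        print("share of cells with suitability > 0.7:", (suitability > 0.7).mean())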

  11. Scoping analyses for the safety injection system configuration for Korean next generation reactor

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Song, Jin Ho; Park, Jong Kyoon

    1996-01-01

    Scoping analyses for the Safety Injection System (SIS) configuration for Korean Next Generation Reactor (KNGR) are performed in this study. The KNGR SIS consists of four mechanically separated hydraulic trains. Each hydraulic train consisting of a High Pressure Safety Injection (HPSI) pump and a Safety Injection Tank (SIT) is connected to the Direct Vessel Injection (DVI) nozzle located above the elevation of cold leg and thus injects water into the upper portion of reactor vessel annulus. Also, the KNGR is going to adopt the advanced design feature of passive fluidic device which will be installed in the discharge line of SIT to allow more effective use of borated water during the transient of large break LOCA. To determine the feasible configuration and capacity of SIT and HPSI pump with the elimination of the Low Pressure Safety Injection (LPSI) pump for KNGR, licensing design basis evaluations are performed for the limiting large break LOCA. The study shows that the DVI injection with the fluidic device SIT enhances the SIS performance by allowing more effective use of borated water for an extended period of time during the large break LOCA.

  12. 78 FR 38411 - Vogtle Electric Generating Plant, Unit 4; Inspections, Tests, Analyses, and Acceptance Criteria

    Science.gov (United States)

    2013-06-26

    ... Plant, Unit 4; Inspections, Tests, Analyses, and Acceptance Criteria AGENCY: Nuclear Regulatory Commission. ACTION: Determination of inspections, tests, analyses, and acceptance criteria completion. SUMMARY: The U.S. Nuclear Regulatory Commission (NRC) staff has determined that the inspections, tests...

  13. Analyses of internal tides generation and propagation over a Gaussian ridge in laboratory and numerical experiments

    Science.gov (United States)

    Dossmann, Yvan; Paci, Alexandre; Auclair, Francis; Floor, Jochem

    2010-05-01

    test the dynamics and energetics of the numerical model, but also to advance the analysis based on combined wavelet and empirical orthogonal function. In particular, we focus on the study of the transient regime of internal wave generation near the ridge. Our analyses of the experimental fields show that, for fixed background stratification and topography, the evolution of the stratification anomaly strongly depends on the forcing frequency. The duration of the transient regime, as well as the amplitude reached in the stationary state vary significantly with the parameter ω/N (where ω is the forcing frequency, and N is the background Brunt-Väisälä frequency). We also observe that, for particular forcing frequencies, for which the ridge slope matches the critical slope of the first harmonic mode, internal waves are excited both at the fundamental and the first harmonic frequency. Associated energy transfers are finally evaluated both experimentally and numerically, enabling us to highlight the similarities and discrepancies between the laboratory experiments and the numerical simulations. References [1] Munk W. and C. Wunsch (1998): Abyssal recipes II: energetics of tidal and wind mixing Deep-Sea Res. 45, 1977-2010 [2] Tailleux R. (2009): On the energetics of stratified turbulent mixing, irreversible thermodynamics, Boussinesq models and the ocean heat engine controversy, J. Fluid Mech. 638, 339-382 [3] Knigge C., D. Etling, A. Paci and O. Eiff (2010): Laboratory experiments on mountain-induced rotors, Quarterly Journal of the Royal Meteorological Society, in press. [4] Auclair F., C. Estournel, J. Floor, C. N'Guyen and P. Marsaleix, (2009): A non-hydrostatic, energy conserving algorithm for regional ocean modelling. Under revision. [5] Wunsch, C. & R. Ferrari (2004): Vertical mixing, energy and the general circulation of the oceans. Annu. Rev. Fluid Mech., 36:281-314.

  14. The applicability of detailed process for neutron resonance absorption to neutronics analyses in LWR next generation fuels to extend burnup

    International Nuclear Information System (INIS)

    Kameyama, Takanori; Nauchi, Yasushi

    2004-01-01

    Neutronics analyses with detailed processing of neutron resonance absorption in LWR next-generation UOX and MOX fuels for extended burnup were performed based on neutron transport and burnup calculations. In the detailed processing, an ultra-fine-energy nuclear library and collision probabilities between neutrons and U and Pu nuclides (actinide nuclides) are utilized in two-dimensional geometry. In the usual simple processing (narrow resonance approximation), shielding factors and compensation equations for neutron resonance absorption are utilized. The results of the detailed and simple processing were compared to clarify where the detailed processing is needed. The two treatments produced a difference in neutron multiplication factor of 0.5% at the beginning of irradiation, while the difference became smaller as burnup increased and was not significant at high burnup. The nuclide compositions of the fuel rods for the main actinide nuclides differed little between the two treatments except for Cm isotopes, since the neutron absorption rate of 244Cm differed. The detailed processing is needed to evaluate the neutron emission rate in spent fuels. In the fuel assemblies, the distributions of rod power rates agreed within 0.5%, and the peak fuel-rod rates were almost the same for the two treatments at the beginning of irradiation, when the peak rate is largest during the irradiation. The simple processing is therefore also adequate for safety evaluation based on the peak rod power rate. The difference in local power densities in fuel pellets became larger as burnup increased, since the neutron absorption rates of 238U in the peripheral region of the pellets differed significantly between the two treatments. The detailed processing is needed to evaluate fuel behavior at high burnup. (author)

  15. Comparative analyses of two Geraniaceae transcriptomes using next-generation sequencing.

    Science.gov (United States)

    Zhang, Jin; Ruhlman, Tracey A; Mower, Jeffrey P; Jansen, Robert K

    2013-12-29

    Organelle genomes of Geraniaceae exhibit several unusual evolutionary phenomena compared to other angiosperm families including accelerated nucleotide substitution rates, widespread gene loss, reduced RNA editing, and extensive genomic rearrangements. Since most organelle-encoded proteins function in multi-subunit complexes that also contain nuclear-encoded proteins, it is likely that the atypical organellar phenomena affect the evolution of nuclear genes encoding organellar proteins. To begin to unravel the complex co-evolutionary interplay between organellar and nuclear genomes in this family, we sequenced nuclear transcriptomes of two species, Geranium maderense and Pelargonium x hortorum. Normalized cDNA libraries of G. maderense and P. x hortorum were used for transcriptome sequencing. Five assemblers (MIRA, Newbler, SOAPdenovo, SOAPdenovo-trans [SOAPtrans], Trinity) and two next-generation technologies (454 and Illumina) were compared to determine the optimal transcriptome sequencing approach. Trinity provided the highest quality assembly of Illumina data with the deepest transcriptome coverage. An analysis to determine the amount of sequencing needed for de novo assembly revealed diminishing returns of coverage and quality with data sets larger than sixty million Illumina paired end reads for both species. The G. maderense and P. x hortorum transcriptomes contained fewer transcripts encoding the PLS subclass of PPR proteins relative to other angiosperms, consistent with reduced mitochondrial RNA editing activity in Geraniaceae. In addition, transcripts for all six plastid targeted sigma factors were identified in both transcriptomes, suggesting that one of the highly divergent rpoA-like ORFs in the P. x hortorum plastid genome is functional. The findings support the use of the Illumina platform and assemblers optimized for transcriptome assembly, such as Trinity or SOAPtrans, to generate high-quality de novo transcriptomes with broad coverage. In addition

  16. Genetic Analyses of a Three Generation Family Segregating Hirschsprung Disease and Iris Heterochromia.

    Directory of Open Access Journals (Sweden)

    Long Cui

    Full Text Available We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family.

  17. Hydrologic analyses in support of the Navajo Generating Station–Kayenta Mine Complex environmental impact statement

    Science.gov (United States)

    Leake, Stanley A.; Macy, Jamie P.; Truini, Margot

    2016-06-01

    reclamation operations within the Kayenta Mine permit boundary since 1973. The KMC part of the proposed project requires approval by the Office of Surface Mining (OSM) of a significant revision of the mine’s permit to operate in accordance with the Surface Mine Control and Reclamation Act (Public Law 95-87, 91 Stat. 445 [30 U.S.C. 1201 et seq.]). The revision will identify coal resource areas that may be used to continue extracting coal at the present rate of approximately 8.2 million tons per year. The Kayenta Mine Complex uses water pumped from the D and N aquifers beneath PWCC’s leasehold to support mining and reclamation activities. Prior to 2006, water from the PWCC well field also was used to transport coal by way of a coal-slurry pipeline to the now-closed Mohave Generating Station. Water usage at the leasehold was approximately 4,100 acre-feet per year (acre-ft/yr) during the period the pipeline was in use, and declined to an average 1,255 acre-ft/yr from 2006 to 2011. The Probable Hydrologic Consequences (PHC) section of the mining and reclamation permit must be modified to project the consequences of extended water use by the mine for the duration of the KMC part of the project, including a post-mining reclamation period. Since 1971, the U.S. Geological Survey (USGS) has conducted the Black Mesa Monitoring Program, which consists of monitoring water levels and water quality in the N aquifer, compiling information on water use by PWCC and tribal communities, maintaining several stream-gaging stations, measuring discharge at selected springs, conducting special studies, and reporting findings. These data are useful in evaluating the effects on the N aquifer from PWCC and community pumping, and the effects of variable precipitation. The EIS will assess the impacts of continued pumping on the N aquifer, including changes in storage, water quality, and effects on spring and baseflow discharge, by proposed mining through 2044, and during the reclamation process to 2057

  18. Trend analyses of the emergency diesel generator problem events in Japanese and U.S. nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2011-01-01

    Up to 2009, the author and a colleague conducted trend analyses of problem events related to main generators, emergency diesel generators, breakers, motors and transformers, which are more likely to cause problems than other electric components in nuclear power plants. For emergency diesel generators, one of the electric components with a high frequency of defect occurrence, several years had passed since the last analysis. These are very important components needed to stop a nuclear reactor safely and to cool it down during a loss of external power supply, so trend analyses were conducted a second time. The trend analyses were performed on 80 problem events with emergency diesel generators that occurred in U.S. nuclear power plants in the five years from 2005 through 2009, among events reported in Licensee Event Reports (LERs: event reports submitted to the NRC by U.S. nuclear power plants) registered in the nuclear information database of the Institute of Nuclear Safety System, Inc. (INSS), as well as 40 events registered in the Nuclear Information Archives (NUCIA) that occurred in Japanese nuclear power plants in the same period. The trend analyses showed that the frequency of defect occurrence is high in both Japanese and U.S. plants during plant operation and functional tests (that is, defects can be discovered effectively in advance), so that implementation of periodic functional tests during plant operation is an important task for the future. (author)

  19. Comparison based on energy and exergy analyses of the potential cogeneration efficiencies for fuel cells and other electricity generation devices

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnical Inst., Toronto, (CA). Dept. of Mechanical Engineering]

    1990-01-01

    Comparisons of the potential cogeneration efficiencies are made, based on energy and exergy analyses, for several devices for electricity generation. The investigation considers several types of fuel cell system (Phosphoric Acid, Alkaline, Solid Polymer Electrolyte, Molten Carbonate and Solid Oxide), and several fossil-fuel and nuclear cogeneration systems based on steam power plants. In the analysis, each system is modelled as a device for which fuel and air enter, and electrical- and thermal-energy products and material and thermal-energy wastes exit. The results for all systems considered indicate that exergy analyses should be used when analysing the cogeneration potential of systems for electricity generation, because they weigh the usefulnesses of heat and electricity on equivalent bases. Energy analyses tend to present overly optimistic views of performance. These findings are particularly significant when large fractions of the heat output from a system are utilized for cogeneration. (author).
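
    The contrast drawn above between energy and exergy efficiency can be made concrete with the textbook definitions: the energy efficiency credits delivered heat at face value, while the exergy efficiency discounts it by its Carnot factor. The sketch below uses invented numbers, not the paper's systems.

        # Hypothetical cogeneration plant (all quantities in MW).
        fuel_energy = 100.0     # fuel energy input
        fuel_exergy = 104.0     # fuel exergy input (close to, but not equal to, the energy input)
        electricity = 35.0
        heat        = 45.0      # heat delivered at T_delivery
        T_delivery  = 400.0     # K
        T0          = 298.15    # dead-state (ambient) temperature, K

        energy_eff = (electricity + heat) / fuel_energy
        exergy_of_heat = heat * (1.0 - T0 / T_delivery)     # Carnot-factor weighting
        exergy_eff = (electricity + exergy_of_heat) / fuel_exergy

        print(f"energy efficiency: {energy_eff:.2%}")   # the overly optimistic view
        print(f"exergy efficiency: {exergy_eff:.2%}")   # heat and electricity on equivalent bases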

  20. A developmental study of risky decisions on the cake gambling task: age and gender analyses of probability estimation and reward evaluation.

    Science.gov (United States)

    Van Leijenhorst, Linda; Westenberg, P Michiel; Crone, Eveline A

    2008-01-01

    Decision making, or the process of choosing between competing courses of actions, is highly sensitive to age-related change, showing development throughout adolescence. In this study, we tested whether the development of decision making under risk is related to changes in risk-estimation abilities. Participants (N = 93) between ages 8-30 performed a child friendly gambling task, the Cake Gambling task, which was inspired by the Cambridge Gambling Task (Rogers et al., 1999), which has previously been shown to be sensitive to orbitofrontal cortex (OFC) damage. The task allowed comparisons of the contributions to risk perception of (1) the ability to estimate probabilities and (2) evaluate rewards. Adult performance patterns were highly similar to those found in previous reports, showing increased risk taking with increases in the probability of winning and the magnitude of potential reward. Behavioral patterns in children and adolescents did not differ from adult patterns, showing a similar ability for probability estimation and reward evaluation. These data suggest that participants 8 years and older perform like adults in a gambling task, previously shown to depend on the OFC in which all the information needed to make an advantageous decision is given on each trial and no information needs to be inferred from previous behavior. Interestingly, at all ages, females were more risk-averse than males. These results suggest that the increase in real-life risky behavior that is seen in adolescence is not a consequence of changes in risk perception abilities. The findings are discussed in relation to theories about the protracted development of the prefrontal cortex.
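
    A small sketch of the two quantities the task separates, probability estimation and reward evaluation, combined through the expected value of a gamble; the trial values are hypothetical and are not taken from the Cake Gambling task itself.

        # Hypothetical gambles varying win probability and reward magnitude.
        trials = [
            {"p_win": 0.83, "reward_win": 2, "reward_lose": -2},
            {"p_win": 0.17, "reward_win": 8, "reward_lose": -2},
        ]

        for t in trials:
            ev = t["p_win"] * t["reward_win"] + (1 - t["p_win"]) * t["reward_lose"]
            print(f"p(win) = {t['p_win']:.2f}, reward = {t['reward_win']}: expected value = {ev:+.2f}")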

  1. Analyses of steam generator collector rupture for WWER-1000 using Relap5 code

    Energy Technology Data Exchange (ETDEWEB)

    Balabanov, E.; Ivanova, A. [Energoproekt, Sofia (Bulgaria)]

    1995-12-31

    The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well-defined operator strategy for this type of accident, as well as to evaluate the possibility to diagnose the accident and to minimize the radiological impact on the environment.

  2. Analyses of steam generator collector rupture for WWER-1000 using Relap5 code

    Energy Technology Data Exchange (ETDEWEB)

    Balabanov, E; Ivanova, A [Energoproekt, Sofia (Bulgaria)]

    1996-12-31

    The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well-defined operator strategy for this type of accident, as well as to evaluate the possibility to diagnose the accident and to minimize the radiological impact on the environment.

  3. Analyses of steam generator collector rupture for WWER-1000 using Relap5 code

    International Nuclear Information System (INIS)

    Balabanov, E.; Ivanova, A.

    1995-01-01

    The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well-defined operator strategy for this type of accident, as well as to evaluate the possibility to diagnose the accident and to minimize the radiological impact on the environment.

  4. Insights from the analyses of risk-informed extension of diesel generator allowed outage time

    International Nuclear Information System (INIS)

    Lin, J.C.; He Wei

    2005-01-01

    In recent years, many U.S. nuclear plants have applied and received approval for the risk-informed extension of the Allowed Outage Time (AOT) for Emergency Diesel Generators (EDGs). These risk-informed applications need to meet the regulatory guidance on the risk criteria. This paper discusses in detail insights derived from the risk-informed analyses performed to support these applications. The risk criteria on ΔCDF/ΔLERF evaluate the increase in average risk by extending the AOT for EDGs, induced primarily by an increase in EDG maintenance unavailability due to the introduction of additional EDG preventive maintenance. By performing this preventive maintenance work on-line, the outage duration can be shortened. With proper refinement of the risk model, most plants can meet the ΔCDF/ΔLERF criteria for extending the EDG AOT from, for example, 3 days to 14 days. The key areas for model enhancements to meet these criteria include offsite/onsite power recovery, LERF modeling, etc. The most important LERF model enhancements consist of refinement of the penetrations included in the containment isolation model for the consideration of a large release, and taking credit for operator vessel depressurization during the time period between core damage and vessel failure. A recent study showed that although the frequency of loss of offsite power (LOSP) has decreased, the duration of offsite power recovery has actually increased. However, many of the events used to derive this conclusion may not be applicable to PRAs. One approach develops the offsite power non-recovery factor by first screening the LOSP events for applicability to the plant being analyzed, power operation, and LOSP initiating event, then using the remaining events data for the derivation based on the fraction of events with recovery duration longer than the time window allowed. The risk criteria on ICCDP/ICLERP examine the increase in risk from the average CDF/LERF, based on the increased maintenance
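
    The offsite power non-recovery factor described above (the fraction of applicable LOSP events whose recovery took longer than the available time window) can be sketched as follows; the event durations and the time window are placeholders, not plant data.

        # Hypothetical recovery durations (hours) for LOSP events already screened for
        # applicability to the plant, power operation, and the LOSP initiating event.
        recovery_hours = [0.5, 1.2, 2.0, 3.5, 4.0, 6.0, 8.5, 12.0]
        time_window = 4.0  # hours available in the scenario being analyzed

        non_recovery = sum(1 for t in recovery_hours if t > time_window) / len(recovery_hours)
        print(f"offsite power non-recovery probability at {time_window} h: {non_recovery:.2f}")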

  5. Multi-Objective Sensitivity Analyses for Power Generation Mix: Malaysia Case Study

    OpenAIRE

    Siti Mariam Mohd Shokri; Nofri Yenita Dahlan; Hasmaini Mohamad

    2017-01-01

    This paper presents an optimization framework to determine the long-term optimal generation mix for the Malaysia Power Sector using a Dynamic Programming (DP) technique. Several new candidate units with pre-defined MW capacities were included in the model for generation expansion planning from coal, natural gas, hydro and renewable energy (RE). Four objective cases were considered: 1) economic cost, 2) environmental, 3) reliability and 4) a multi-objective case combining the three. Results show th...

  6. Thermal and stress analyses in thermoelectric generator with tapered and rectangular pin configurations

    International Nuclear Information System (INIS)

    Yilbas, Bekir Sami; Akhtar, S.S.; Sahin, A.Z.

    2016-01-01

    Thermal stress developed in thermoelectric generators is critical for long service applications. High temperature gradients, due to a large temperature difference across the junctions, cause excessive stress levels in the device pins and electrodes at the interfaces. In the present study, a thermoelectric generator with horizontal pin configuration is considered and a thermal stress analysis of the device is presented. A ceramic wafer is considered to resemble the high temperature plate, and copper electrodes are introduced at the pin junctions to reduce the electrical resistance between the pins and the high and low temperature junction plates during the operation. A finite element code is used to simulate temperature and stress fields in the thermoelectric generator. In the simulations, convection and radiation losses from the thermoelectric pins are considered and bismuth telluride pin material with and without tapering is incorporated. It is found that von Mises stress attains high values at the interface between the hot and cold junctions and the copper electrodes. Thermal stress developed in the tapered pin configuration attains lower values than that of the rectangular pin cross-section. - Highlights: • Different cold junction temperatures improve thermoelectric generator performance. • von Mises stress remains high across copper electrodes and hot junction ceramics. • von Mises stress decreases along the pin length towards the cold junction. • Pin tapering lowers stress levels in the thermoelectric generator.

  7. Characterization of Yellow Seahorse Hippocampus kuda feeding click sound signals in a laboratory environment: an application of probability density function and power spectral density analyses

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Saran, A.K.; Kuncolienker, D.S.; Sreepada, R.A.; Haris, K.; Fernandes, W.A

    based on the assumption of combinations of normal/Gaussian distributions indicate well-fitted multimodal curves generated using MATLAB (Math Works Inc 2005) programs. Out of the twenty three clicks, four clicks (two clicks of 16 and 18 cm male...
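
    The record above describes fitting multimodal probability density functions as combinations of Gaussians and estimating power spectral density in MATLAB; a hedged Python equivalent of that workflow, with a synthetic signal standing in for the click recordings and all parameters invented, might look like this.

        import numpy as np
        from scipy.signal import welch
        from sklearn.mixture import GaussianMixture

        # Synthetic stand-in for a digitised click waveform (not real seahorse data).
        fs = 48_000                                  # sampling rate, Hz
        t = np.arange(0, 0.05, 1 / fs)
        signal = np.sin(2 * np.pi * 1500 * t) * np.exp(-200 * t) + 0.05 * np.random.randn(t.size)

        # Power spectral density via Welch's method.
        freqs, psd = welch(signal, fs=fs, nperseg=1024)
        print("peak frequency (Hz):", freqs[np.argmax(psd)])

        # Fit the amplitude distribution as a mixture of Gaussians (multimodal PDF).
        gmm = GaussianMixture(n_components=2, random_state=0).fit(signal.reshape(-1, 1))
        print("component means:", gmm.means_.ravel())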

  8. A method for analysing secondary economic effects generated by big research centres.

    CERN Document Server

    Bianchi-Streit, M.; Budde, R.; Reitz, H.; Sagnell, B.; Schmied, H.; Schorr, B.

    Research activities in the natural sciences, and especially those in the field of pure research work as opposed to applied research, are being financially supported for various reasons, probably the least of which is the hope for a quick economic return. It has, nevertheless, been realised for a number of years that benefits of one sort or another may appear in various and sometimes unexpected ways, where these benefits are not the direct consequence of the application of a research result. They are rather to be compared with the well-known "spin-off" effects obtained while pursuing the research work. An example may help to illustrate what is meant.

  9. Analyses of thermodynamic performance for the endoreversible Otto cycle with the concepts of entropy generation and entransy

    Institute of Scientific and Technical Information of China (English)

    WU; YanQiu

    2017-01-01

    In this paper, the endoreversible Otto cycle is analyzed with the entropy generation minimization and the entransy theory. The output power and the heat-work conversion efficiency are taken as the optimization objectives, and their relationships with the entropy generation rate, the entropy generation numbers, the entransy loss rate, the entransy loss coefficient, the entransy dissipation rate and the entransy variation rate associated with work are discussed. The applicability of the entropy generation minimization and the entransy theory to the analyses is also analyzed. It is found that a smaller entropy generation rate does not always lead to larger output power, and smaller entropy generation numbers do not always lead to larger heat-work conversion efficiency, either. In our calculations, both larger entransy loss rate and larger entransy variation rate associated with work correspond to larger output power, while a larger entransy loss coefficient results in larger heat-work conversion efficiency. It is also found that the concept of entransy dissipation is not always suitable for the analyses, because it was developed for heat transfer.

  10. Auto-ignition generated combustion. Pt. 2. Thermodynamic fundamentals; Verbrennungssteuerung durch Selbstzuendung. T. 2. Experimentelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Guibert, P. [Paris-6 Univ. (France). Lab. de Mecanique Physique]; Morin, C. [Paris-6 Univ. (France)]; Mokhtari, S.

    2004-02-01

    The combustion initiation by auto-ignition demonstrates benefits in NO{sub x} reduction and in process stability for both spark-ignited and compression-ignited engines. Based on the thermodynamic particularities of auto-ignition, which have been presented in the first part, the characteristics of this process are demonstrated in the second part by experimental analysis. For comparison with similar studies, the analyses have been carried out on the basis of a two-stroke, loop-scavenged, single-cylinder spark-ignition engine. (orig.) [German original] Controlling combustion by auto-ignition shows advantages with respect to reducing NO{sub x} emissions and to process stability, for both gasoline and diesel engines. Based on the thermodynamic particularities of the auto-ignition processes, which were presented in the first part, the second part provides an experimental examination of the process characteristics. For comparability with similar investigations, the experimental analysis is carried out on a two-stroke, single-cylinder, loop-scavenged gasoline engine. (orig.)

  11. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....

  12. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data.These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  13. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
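
    A compressed sketch of the recognition step described above, three-class SVM classification of shape features built from Fourier descriptors; the contours, labels and parameters are synthetic placeholders, not the MR pipeline itself.

        import numpy as np
        from sklearn.svm import SVC

        def fourier_descriptors(contour_xy, n_coeffs=8):
            """Translation- and scale-normalised shape features from a closed 2-D contour."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
            coeffs = np.fft.fft(z)
            coeffs[0] = 0.0                        # drop position information
            mags = np.abs(coeffs[1:n_coeffs + 1])
            return mags / (mags[0] + 1e-12)        # scale normalisation

        rng = np.random.default_rng(1)

        def noisy_ellipse(a, b, n=64):
            th = np.linspace(0, 2 * np.pi, n, endpoint=False)
            return np.column_stack([a * np.cos(th), b * np.sin(th)]) + rng.normal(0, 0.02, (n, 2))

        # Synthetic contours standing in for three hypothetical shape classes.
        X = [fourier_descriptors(noisy_ellipse(a, b))
             for a, b in [(1, 1)] * 20 + [(1, 0.5)] * 20 + [(1, 0.2)] * 20]
        y = [0] * 20 + [1] * 20 + [2] * 20

        clf = SVC(kernel="rbf", probability=True).fit(X, y)
        print(clf.predict([fourier_descriptors(noisy_ellipse(1, 0.5))]))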

  14. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  15. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  16. Dynamic analyses, FPGA implementation and engineering applications of multi-butterfly chaotic attractors generated from generalised Sprott C system

    Science.gov (United States)

    Lai, Qiang; Zhao, Xiao-Wen; Rajagopal, Karthikeyan; Xu, Guanghui; Akgul, Akif; Guleryuz, Emre

    2018-01-01

    This paper considers the generation of multi-butterfly chaotic attractors from a generalised Sprott C system with multiple non-hyperbolic equilibria. The system is constructed by introducing an additional variable whose derivative has a switching function to the Sprott C system. It is numerically found that the system creates two-, three-, four-, five-butterfly attractors and any other multi-butterfly attractors. First, the dynamic analyses of multi-butterfly chaotic attractors are presented. Secondly, the field programmable gate array implementation, electronic circuit realisation and random number generator are done with the multi-butterfly chaotic attractors.
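
    For orientation only, the baseline Sprott C system can be integrated as below; the added fourth variable and its sign-based switching term are a hypothetical stand-in for the paper's construction, included to show the general mechanism of extending the system, not its exact equations.

        import numpy as np
        from scipy.integrate import solve_ivp

        def extended_sprott_c(t, state, k=1.0):
            # Baseline Sprott C: dx = y*z, dy = x - y, dz = 1 - x**2.
            # The w equation and its coupling into dx are illustrative assumptions.
            x, y, z, w = state
            dx = y * z + w
            dy = x - y
            dz = 1.0 - x * x
            dw = k * np.sign(np.sin(x))   # hypothetical switching function
            return [dx, dy, dz, dw]

        sol = solve_ivp(extended_sprott_c, (0.0, 200.0), [0.1, 0.1, 0.1, 0.0], max_step=0.01)
        print("final state:", sol.y[:, -1])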

  17. Comparative Analyses on OPR1000 Steam Generator Tube Rupture Event Emergency Operational Guideline

    International Nuclear Information System (INIS)

    Lee, Sang Won; Bae, Yeon Kyoung; Kim, Hyeong Teak

    2006-01-01

    The Steam Generator Tube Rupture (SGTR) event is one of the important scenarios with respect to radiation release to the environment. When an SGTR occurs, containment integrity is not effective because of the direct bypass of containment via the ruptured steam generator to the MSSV and MSADV. To prevent this path, the Emergency Operational Guideline of OPR1000 indicates the use of Turbine Bypass Valves (TBVs) as an effective means to depressurize the main steam line and prevent the lifting of the MSSV. However, the TBVs are not operable when offsite power is not available (LOOP). In this situation, RCS cool-down is achieved by opening the MSADVs of both the intact and the ruptured SG, but this action causes a large radiation release to the environment. To minimize the radiation release, the KSNP EOG adopts an improved strategy when SGTR occurs concurrently with LOOP. However, these procedures contain some duplicated steps and branch lines that might confuse the operator during optimal recovery actions. In this paper, therefore, a comparative analysis of SGTR and SGTR with LOOP is performed and an optimized procedure is proposed.

  18. Generation and analyses of human synthetic antibody libraries and their application for protein microarrays.

    Science.gov (United States)

    Säll, Anna; Walle, Maria; Wingren, Christer; Müller, Susanne; Nyman, Tomas; Vala, Andrea; Ohlin, Mats; Borrebaeck, Carl A K; Persson, Helena

    2016-10-01

    Antibody-based proteomics offers distinct advantages in the analysis of complex samples for discovery and validation of biomarkers associated with disease. However, its large-scale implementation requires tools and technologies that allow development of suitable antibody or antibody fragments in a high-throughput manner. To address this we designed and constructed two human synthetic antibody fragment (scFv) libraries denoted HelL-11 and HelL-13. By the use of phage display technology, in total 466 unique scFv antibodies specific for 114 different antigens were generated. The specificities of these antibodies were analyzed in a variety of immunochemical assays and a subset was further evaluated for functionality in protein microarray applications. This high-throughput approach demonstrates the ability to rapidly generate a wealth of reagents not only for proteome research, but potentially also for diagnostics and therapeutics. In addition, this work provides a great example on how a synthetic approach can be used to optimize library designs. By having precise control of the diversity introduced into the antigen-binding sites, synthetic libraries offer increased understanding of how different diversity contributes to antibody binding reactivity and stability, thereby providing the key to future library optimization. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. The control of a free-piston engine generator. Part 1: Fundamental analyses

    Energy Technology Data Exchange (ETDEWEB)

    Mikalsen, R.; Roskilly, A.P. [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne, NE1 7RU, England (United Kingdom)]

    2010-04-15

    Free-piston engines are under investigation by a number of research groups due to potential fuel efficiency and exhaust emissions advantages over conventional technology. The main challenge with such engines is the control of the piston motion, and this has not yet been fully resolved for all types of free-piston engines. This paper discusses the basic features of a single piston free-piston engine generator under development at Newcastle University and investigates engine control issues using a full-cycle simulation model. Control variables and disturbances are identified, and a control strategy is proposed. It is found that the control of the free-piston engine is a challenge, but that the proposed control strategy is feasible. Engine speed control does, however, represent a challenge in the current design. (author)

  20. Geospatial analyses and system architectures for the next generation of radioactive materials risk assessment and routing

    International Nuclear Information System (INIS)

    Ganter, J.H.

    1996-01-01

    This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to these challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technologies markets are providing new technologies like commercial off-the-shelf (COTS) software, common datasets, "open" GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.

  1. Generation and analyses of human synthetic antibody libraries and their application for protein microarrays

    DEFF Research Database (Denmark)

    Säll, Anna; Walle, Maria; Wingren, Christer

    2016-01-01

    in a high-throughput manner. To address this we designed and constructed two human synthetic antibody fragment (scFv) libraries denoted HelL-11 and HelL-13. By the use of phage display technology, in total 466 unique scFv antibodies specific for 114 different antigens were generated. The specificities......Antibody-based proteomics offers distinct advantages in the analysis of complex samples for discovery and validation of biomarkers associated with disease. However, its large-scale implementation requires tools and technologies that allow development of suitable antibody or antibody fragments...... for diagnostics and therapeutics. In addition, this work provides a great example on how a synthetic approach can be used to optimize library designs. By having precise control of the diversity introduced into the antigen-binding sites, synthetic libraries offer increased understanding of how different diversity...

  2. Thermodynamic analyses of a biomass-coal co-gasification power generation system.

    Science.gov (United States)

    Yan, Linbo; Yue, Guangxi; He, Boshu

    2016-04-01

    A novel chemical looping power generation system is presented based on the biomass-coal co-gasification with steam. The effects of different key operation parameters including biomass mass fraction (Rb), steam to carbon mole ratio (Rsc), gasification temperature (Tg) and iron to fuel mole ratio (Rif) on the system performances like energy efficiency (ηe), total energy efficiency (ηte), exergy efficiency (ηex), total exergy efficiency (ηtex) and carbon capture rate (ηcc) are analyzed. A benchmark condition is set, under which ηte, ηtex and ηcc are found to be 39.9%, 37.6% and 96.0%, respectively. Furthermore, detailed energy Sankey diagram and exergy Grassmann diagram are drawn for the entire system operating under the benchmark condition. The energy and exergy efficiencies of the units composing the system are also predicted. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  4. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
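
    As a rough illustration of the Monte Carlo estimation of tail probabilities mentioned above, the sketch below compares a crude estimate of P(S_n > x) for Pareto summands with the single-big-jump asymptotic n*P(X_1 > x); all parameters are arbitrary.

        import numpy as np

        rng = np.random.default_rng(42)
        alpha, n, x = 2.5, 10, 50.0        # Pareto tail index, number of summands, threshold
        n_sim = 200_000

        # Classical Pareto with x_m = 1: survival function P(X > t) = t**(-alpha) for t >= 1.
        samples = rng.pareto(alpha, size=(n_sim, n)) + 1.0
        crude_mc = (samples.sum(axis=1) > x).mean()

        asymptotic = n * x ** (-alpha)     # subexponential approximation P(S_n > x) ~ n * P(X_1 > x)
        print(f"crude Monte Carlo estimate: {crude_mc:.2e}")
        print(f"asymptotic n*P(X1 > x):     {asymptotic:.2e}")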

  5. Energy Balance of Nuclear Power Generation. Life Cycle Analyses of Nuclear Power

    International Nuclear Information System (INIS)

    Wallner, A.; Wenisch, A.; Baumann, M.; Renner, S.

    2011-01-01

    The accident at the Japanese nuclear power plant Fukushima in March 2011 triggered a debate about phasing out nuclear energy and the safety of nuclear power plants. Several states are preparing to end nuclear power generation. At the same time the operational life time of many nuclear power plants is reaching its end. Governments and utilities now need to take a decision to replace old nuclear power plants or to use other energy sources. In particular the requirement of reducing greenhouse gas emissions (GHG) is used as an argument for a higher share of nuclear energy. To assess the contribution of nuclear power to climate protection, the complete life cycle needs to be taken into account. Some process steps are connected to high CO2 emissions due to the energy used. While the processes before and after conventional fossil-fuel power stations can contribute up to 25% of direct GHG emission, it is up to 90 % for nuclear power (Weisser 2007). This report aims to produce information about the energy balance of nuclear energy production during its life cycle. The following key issues were examined: How will the forecasted decreasing uranium ore grades influence energy intensity and greenhouse emissions and from which ore grade on will no energy be gained anymore? In which range can nuclear energy deliver excess energy and how high are greenhouse gas emissions? Which factors including ore grade have the strongest impact on excess energy? (author)

  6. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground in 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage of a 380 V bus at an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with the 380 V bus monitored, were performed again. A study using stochastic Monte Carlo simulation (SMC) was performed to obtain, at each level of DG, the sag probability curves and the probability density of sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Program for Analysis of Simultaneous Faults (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
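
    A toy Monte Carlo sketch in the spirit of the study described above: random fault positions produce sag magnitudes at a monitored bus, which are then binned into probabilities by voltage class; the radial network model, the DG term and the class limits are crude assumptions, not the Goiania system or ANAFAS.

        import numpy as np

        rng = np.random.default_rng(7)
        n_faults = 50_000

        # Crude radial model: the retained voltage at the monitored bus grows with the
        # electrical distance to the fault, plus a small assumed DG support term.
        distance = rng.uniform(0.05, 1.0, n_faults)   # per-unit electrical distance
        dg_support = 0.02                             # hypothetical effect of local DG
        retained_voltage = np.clip(distance / (distance + 0.3) + dg_support, 0.0, 1.0)

        # Voltage-sag classes in per-unit retained voltage (illustrative limits).
        bins = [0.0, 0.4, 0.7, 0.85, 0.9]
        hist, edges = np.histogram(retained_voltage, bins=bins)
        for lo, hi, count in zip(edges[:-1], edges[1:], hist):
            print(f"{lo:.2f}-{hi:.2f} p.u.: probability {count / n_faults:.3f}")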

  7. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  8. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  9. A comparative study between xerographic, computer-assisted overlay generation and animated-superimposition methods in bite mark analyses.

    Science.gov (United States)

    Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran

    2016-09-01

    This study compared the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts, and the incisal edge outlines were then transferred onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life-size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. The bite mark analyses using computer-assisted overlay generation were done by matching an overlay and the corresponding bite mark images digitally using Adobe Photoshop®. Another comparison method was superimposing the cast images on the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. A score with a range of 0-3 was given during analysis to each precision-determining criterion, with higher scores for better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital methods is discernible despite human skin being a poor recording medium for bite marks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
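
    The comparison of the three methods rests on a Kruskal-Wallis H test applied to precision scores; a minimal SciPy version of that test, with made-up score vectors, is sketched below.

        from scipy.stats import kruskal

        # Hypothetical precision scores (0-3 per criterion) for the three methods.
        xerographic       = [1, 2, 1, 2, 1, 2, 2, 1, 2, 1, 2]
        computer_overlay  = [2, 2, 3, 2, 2, 3, 2, 2, 3, 2, 2]
        animated_superimp = [3, 3, 2, 3, 3, 3, 2, 3, 3, 3, 2]

        h_stat, p_value = kruskal(xerographic, computer_overlay, animated_superimp)
        print(f"H = {h_stat:.3f}, p = {p_value:.4f}")  # p < 0.05: at least one method differs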

  10. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  11. Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.

    Science.gov (United States)

    Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin

    2017-08-01

    The objective of this study was to investigate the full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials; QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC) analysis; the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion values for the materials (DC, %) were immediately measured using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM) analyses. The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition, the data were statistically analyzed using one-way analysis of variance and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p<0.05 [...] bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Using next-generation sequencing to analyse the diet of a highly endangered land snail (Powelliphanta augusta) feeding on endemic earthworms.

    Directory of Open Access Journals (Sweden)

    Stéphane Boyer

    Predation is often difficult to observe or quantify for species that are rare, very small, aquatic or nocturnal. The assessment of such species' diet can be conducted using molecular methods that target prey DNA remaining in predators' guts and faeces. These techniques do not require high taxonomic expertise, are applicable to soft-bodied prey and allow for identification at the species level. However, for generalist predators, the presence of mixed prey DNA in guts and faeces can be a major impediment as it requires development of specific primers for each potential prey species for standard (Sanger) sequencing. Therefore, next generation sequencing methods have recently been applied to such situations. In this study, we used 454-pyrosequencing to analyse the diet of Powelliphanta augusta, a carnivorous landsnail endemic to New Zealand and critically endangered after most of its natural habitat has been lost to opencast mining. This species was suspected to feed mainly on earthworms. Although earthworm tissue was not detectable in snail faeces, earthworm DNA was still present in sufficient quantity to conduct molecular analyses. Based on faecal samples collected from 46 landsnails, our analysis provided a complete map of the earthworm-based diet of P. augusta. Predated species appear to be earthworms that live in the leaf litter or earthworms that come to the soil surface at night to feed on the leaf litter. This indicates that P. augusta may not be selective and probably predates any earthworm encountered in the leaf litter. These findings are crucial for selecting future translocation areas for this highly endangered species. The molecular diet analysis protocol used here is particularly appropriate to study the diet of generalist predators that feed on liquid or soft-bodied prey. Because it is non-harmful and non-disturbing for the studied animals, it is also applicable to any species of conservation interest.

  13. Using Next-Generation Sequencing to Analyse the Diet of a Highly Endangered Land Snail (Powelliphanta augusta) Feeding on Endemic Earthworms

    Science.gov (United States)

    Boyer, Stéphane; Wratten, Stephen D.; Holyoake, Andrew; Abdelkrim, Jawad; Cruickshank, Robert H.

    2013-01-01

    Predation is often difficult to observe or quantify for species that are rare, very small, aquatic or nocturnal. The assessment of such species’ diet can be conducted using molecular methods that target prey DNA remaining in predators’ guts and faeces. These techniques do not require high taxonomic expertise, are applicable to soft-bodied prey and allow for identification at the species level. However, for generalist predators, the presence of mixed prey DNA in guts and faeces can be a major impediment as it requires development of specific primers for each potential prey species for standard (Sanger) sequencing. Therefore, next generation sequencing methods have recently been applied to such situations. In this study, we used 454-pyrosequencing to analyse the diet of Powelliphanta augusta, a carnivorous landsnail endemic to New Zealand and critically endangered after most of its natural habitat has been lost to opencast mining. This species was suspected to feed mainly on earthworms. Although earthworm tissue was not detectable in snail faeces, earthworm DNA was still present in sufficient quantity to conduct molecular analyses. Based on faecal samples collected from 46 landsnails, our analysis provided a complete map of the earthworm-based diet of P. augusta. Predated species appear to be earthworms that live in the leaf litter or earthworms that come to the soil surface at night to feed on the leaf litter. This indicates that P. augusta may not be selective and probably predates any earthworm encountered in the leaf litter. These findings are crucial for selecting future translocation areas for this highly endangered species. The molecular diet analysis protocol used here is particularly appropriate to study the diet of generalist predators that feed on liquid or soft-bodied prey. Because it is non-harmful and non-disturbing for the studied animals, it is also applicable to any species of conservation interest. PMID:24086671

  14. Live births after simultaneous avoidance of monogenic diseases and chromosome abnormality by next-generation sequencing with linkage analyses.

    Science.gov (United States)

    Yan, Liying; Huang, Lei; Xu, Liya; Huang, Jin; Ma, Fei; Zhu, Xiaohui; Tang, Yaqiong; Liu, Mingshan; Lian, Ying; Liu, Ping; Li, Rong; Lu, Sijia; Tang, Fuchou; Qiao, Jie; Xie, X Sunney

    2015-12-29

    In vitro fertilization (IVF), preimplantation genetic diagnosis (PGD), and preimplantation genetic screening (PGS) help patients to select embryos free of monogenic diseases and aneuploidy (chromosome abnormality). Next-generation sequencing (NGS) methods, while experiencing a rapid cost reduction, have improved the precision of PGD/PGS. However, the precision of PGD has been limited by the false-positive and false-negative single-nucleotide variations (SNVs), which are not acceptable in IVF and can be circumvented by linkage analyses, such as short tandem repeats or karyomapping. It is noteworthy that existing methods of detecting SNV/copy number variation (CNV) and linkage analysis often require separate procedures for the same embryo. Here we report an NGS-based PGD/PGS procedure that can simultaneously detect a single-gene disorder and aneuploidy and is capable of linkage analysis in a cost-effective way. This method, called "mutated allele revealed by sequencing with aneuploidy and linkage analyses" (MARSALA), involves multiple annealing and looping-based amplification cycles (MALBAC) for single-cell whole-genome amplification. Aneuploidy is determined by CNVs, whereas SNVs associated with the monogenic diseases are detected by PCR amplification of the MALBAC product. The false-positive and -negative SNVs are avoided by an NGS-based linkage analysis. Two healthy babies, free of the monogenic diseases of their parents, were born after such embryo selection. The monogenic diseases originated from a single base mutation on the autosome and the X-chromosome of the disease-carrying father and mother, respectively.
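
    The aneuploidy call in such a workflow rests on copy-number variations inferred from read depth. The toy sketch below uses simulated counts and is not the MARSALA pipeline; it only shows the basic idea of binning reads per chromosome, normalising against a euploid reference, and flagging chromosomes whose estimated copy number departs from 2.

```python
import numpy as np

# Toy copy-number sketch with simulated counts (not the MARSALA pipeline): call
# whole-chromosome gains/losses from binned read counts of one amplified cell by
# normalising against a euploid reference profile.
rng = np.random.default_rng(5)
chromosomes = [f"chr{i}" for i in range(1, 23)]
bins_per_chr = 50
reference = rng.poisson(100, (len(chromosomes), bins_per_chr)) + 1    # euploid baseline
sample = rng.poisson(100, (len(chromosomes), bins_per_chr)).astype(float)
sample[20] *= 1.5                                    # simulate a trisomy on chr21 (index 20)

ratio = sample.mean(axis=1) / reference.mean(axis=1)
copy_number = 2.0 * ratio / np.median(ratio)         # normalise so most chromosomes sit near 2
for chrom, cn in zip(chromosomes, copy_number):
    if abs(cn - 2.0) > 0.5:
        print(f"{chrom}: estimated copy number {cn:.1f}")
```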

  15. Gene Mutation Profiles in Primary Diffuse Large B Cell Lymphoma of Central Nervous System: Next Generation Sequencing Analyses

    Science.gov (United States)

    Todorovic Balint, Milena; Jelicic, Jelena; Mihaljevic, Biljana; Kostic, Jelena; Stanic, Bojana; Balint, Bela; Pejanovic, Nadja; Lucic, Bojana; Tosic, Natasa; Marjanovic, Irena; Stojiljkovic, Maja; Karan-Djurasevic, Teodora; Perisic, Ognjen; Rakocevic, Goran; Popovic, Milos; Raicevic, Sava; Bila, Jelena; Antic, Darko; Andjelic, Bosko; Pavlovic, Sonja

    2016-01-01

    The existence of a potential primary central nervous system lymphoma-specific genomic signature that differs from the systemic form of diffuse large B cell lymphoma (DLBCL) has been suggested, but is still controversial. We investigated 19 patients with primary DLBCL of central nervous system (DLBCL CNS) using the TruSeq Amplicon Cancer Panel (TSACP) for 48 cancer-related genes. Next generation sequencing (NGS) analyses have revealed that over 80% of potentially protein-changing mutations were located in eight genes (CTNNB1, PIK3CA, PTEN, ATM, KRAS, PTPN11, TP53 and JAK3), pointing to the potential role of these genes in lymphomagenesis. TP53 was the only gene harboring mutations in all 19 patients. In addition, the presence of mutated TP53 and ATM genes correlated with a higher total number of mutations in other analyzed genes. Furthermore, the presence of mutated ATM correlated with poorer event-free survival (EFS) (p = 0.036). The presence of the mutated SMO gene correlated with earlier disease relapse (p = 0.023), inferior event-free survival (p = 0.011) and overall survival (OS) (p = 0.017), while mutations in the PTEN gene were associated with inferior OS (p = 0.048). Our findings suggest that the TP53 and ATM genes could be involved in the molecular pathophysiology of primary DLBCL CNS, whereas mutations in the PTEN and SMO genes could affect survival regardless of the initial treatment approach. PMID:27164089

  16. Value-added structures and coordination structures of decentralized power generation. An actor-centered and institution-centered analysis based on selected case examples; Wertschoepfungs- und Koordinationsstrukturen der dezentralen Stromerzeugung. Eine akteur- und institutionenzentrierte Analyse anhand ausgewaehlter Fallbeispiele

    Energy Technology Data Exchange (ETDEWEB)

    Brocke, Tobias

    2012-07-01

    Against the background of energy policy and climate policy decisions, decentralized power generation has gained in importance in Germany. Previous research on this topic has mostly been concerned with technical, legal, environmental and economic issues as well as potential analyses for certain forms of power generation. In contrast, this contribution deals with the organizational and governance structures of decentralized power generation at the local and regional level. In particular, it addresses the question of the extent to which decentralized power generation results in the formation of localized production linkages. In addition, it examines the importance of the institutional framework as well as the role of regulatory, political and civil-society actors affected by distributed power generation.

  17. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  18. Performance analyses of a hybrid geothermal–fossil power generation system using low-enthalpy geothermal resources

    International Nuclear Information System (INIS)

    Liu, Qiang; Shang, Linlin; Duan, Yuanyuan

    2016-01-01

    Highlights: • Geothermal energy is used to preheat the feedwater in a coal-fired power unit. • The performance of a hybrid geothermal–fossil power generation system is analyzed. • Models for both parallel and serial geothermal preheating schemes are presented. • Effects of geothermal source temperatures, distances and heat losses are analyzed. • Power increase of the hybrid system over an ORC and tipping distance are discussed. - Abstract: Low-enthalpy geothermal heat can be efficiently utilized for feedwater preheating in coal-fired power plants by replacing some of the high-grade steam that can then be used to generate more power. This study analyzes a hybrid geothermal–fossil power generation system including a supercritical 1000 MW power unit and a geothermal feedwater preheating system. This study presents models for parallel and serial geothermal preheating schemes and analyzes the thermodynamic performance of the hybrid geothermal–fossil power generation system for various geothermal resource temperatures. The models are used to analyze the effects of the temperature matching between the geothermal water and the feedwater, the heat losses and pumping power during geothermal water transport, and the resource distance and temperature on the power increase, in order to improve the power generation. The serial geothermal preheating (SGP) scheme generally generates more additional power than the parallel geothermal preheating (PGP) scheme for geothermal resource temperatures of 100–130 °C, but the SGP scheme generates slightly less additional power than the PGP scheme when the feedwater is preheated to as high a temperature as possible before entering the deaerator for geothermal resource temperatures higher than 140 °C. The additional power decreases as the geothermal source distance increases since the pipeline pumping power increases and the geothermal water temperature decreases due to heat losses. More than 50% of the power decrease is due to geothermal

  19. A Novel Numerical Algorithm for Optimal Sizing of a Photovoltaic/Wind/Diesel Generator/Battery Microgrid Using Loss of Load Probability Index

    Directory of Open Access Journals (Sweden)

    Hussein A. Kazem

    2013-01-01

    This paper presents a method for determining optimal sizes of PV array, wind turbine, diesel generator, and storage battery installed in a building integrated system. The objective of the proposed optimization is to design the system that can supply a building load demand at minimum cost and maximum availability. The mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. Moreover, the results showed that the optimum sizing ratios (the daily energy generated by the source to the daily energy demand) for the PV array, wind turbine, diesel generator, and battery for a system located in Sohar, Oman, are 0.737, 0.46, 0.22, and 0.17, respectively. A case study represented by a system consisting of a 30 kWp PV array (36%), an 18 kWp wind farm (55%), and a 5 kVA diesel generator (9%) is presented. This system is supposed to power a 200 kWh/day load demand. It is found that the generated energy share of the PV array, wind farm, and diesel generator is 36%, 55%, and 9%, respectively, while the cost of energy is 0.17 USD/kWh.
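
    A minimal sketch, with assumed component models and synthetic hourly profiles rather than the paper's data, of the loss-of-load check that underlies this kind of sizing study: simulate an hourly energy balance for a candidate PV/wind/diesel/battery configuration and report the fraction of demand that goes unserved.

```python
import numpy as np

# Assumed toy model (not the paper's algorithm): hourly energy balance for a
# PV/wind/diesel/battery system, returning a loss-of-load probability (LOLP).
rng = np.random.default_rng(0)
hours = 24 * 365
load = 200.0 / 24 + rng.normal(0, 1.0, hours).clip(-3, 3)                     # ~200 kWh/day demand, kW
pv = np.clip(np.sin(np.linspace(0, 2 * np.pi * 365, hours)), 0, None) * 30.0  # crude 30 kWp PV profile
wind = rng.weibull(2.0, hours) * 6.0                                           # crude wind farm output, kW
diesel_cap, batt_cap, soc = 5.0, 50.0, 25.0                                    # kW, kWh, initial kWh

unmet = 0.0
for h in range(hours):
    balance = pv[h] + wind[h] - load[h]
    if balance >= 0:                      # charge the battery with any surplus
        soc = min(batt_cap, soc + balance)
    else:
        deficit = -balance
        deficit -= min(diesel_cap, deficit)   # run the diesel generator first
        discharge = min(soc, deficit)         # then discharge the battery
        soc -= discharge
        unmet += deficit - discharge          # whatever is left is lost load

lolp = unmet / load.sum()                 # loss-of-load probability on an energy basis
print(f"LOLP = {lolp:.3f}")
```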

  20. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible proving that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  1. The Fundamentals of Economic Dynamics and Policy Analyses : Learning through Numerical Examples. Part Ⅳ. Overlapping Generations Model

    OpenAIRE

    Futamura, Hiroshi

    2015-01-01

    An overlapping generations model is an applied dynamic general equilibrium model in which lifecycle models are employed as the main analytical tools. At any point in time, there are overlapping generations consisting of individuals born this year, individuals born last year, individuals born two years ago, and so on. As we saw in the analysis of lifecycle models, each individual makes an optimal consumption-saving plan to maximize lifetime utility over her/his lifecycle. For example, an indi...

  2. A comparative study of first and all-author co-citation counting, and two different matrix generation approaches applied for author co-citation analyses

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg; Larsen, Birger; Ingwersen, Peter

    2009-01-01

    XML documents extracted from the IEEE collection. These data allow the construction of ad-hoc citation indexes, which enables us to carry out the hitherto largest all-author co-citation study. Four ACA are made, combining the different units of analyses with the different matrix generation approaches...

  3. Complementary investigations concerning the analyses of the deposits and underlying surfaces observed on French PWR steam generator pulled out tubes

    Energy Technology Data Exchange (ETDEWEB)

    Sala, B.; Chevalier, S. [Framatome, Technical Center, 71 - Le Creusot (France). Dept. Chemistry and Corrosion; Dupin, M. [EDF/GDL, 37 - Avoine (France); Gelpi, A. [FRAMATOME, 92 - Paris-La-Defence (France). Dept. Material and Technologies

    1998-07-01

    The objective of this study is to confirm the possible correlations that may be drawn between the nature of the deposit (alumino-silicate, carbon species, magnetite...) and the corrosion phenomenon, which can induce the formation of a non-protective, thin, brittle oxide layer enriched in chromium, and IGASCC. This paper describes analyses conducted on two tubes to complete the previous studies of six tubes: firstly, a tube sample located at the top of the tubesheet, to compare with analyses carried out on the same unit at the TSP elevation where there is corrosion; secondly, a tube sample pulled from a unit not affected by secondary side corrosion. The operating conditions of this unit (brass condenser, morpholine conditioning, river-water-cooled plant) are similar to those of the units from which the previously corroded tubes had been pulled. Finally, a synthesis is presented comparing the results obtained on these tubes with the ones already available. (authors)

  4. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  5. Energy and exergy analyses on a novel hybrid solar heating, cooling and power generation system for remote areas

    International Nuclear Information System (INIS)

    Zhai, H.; Dai, Y.J.; Wu, J.Y.; Wang, R.Z.

    2009-01-01

    In this study, a small-scale hybrid solar heating, chilling and power generation system, including a parabolic trough solar collector with cavity receiver, a helical screw expander and a silica gel-water adsorption chiller, etc., was proposed and extensively investigated. The system has the merit of running the power generation cycle at a lower temperature level, using solar energy more efficiently, and can provide both thermal energy and power for remote off-grid regions. A case study was carried out to evaluate the annual energy and exergy efficiencies of the system under the climate of the northwestern region of China. It is found that the main energy and exergy losses both take place at the parabolic trough collector, amounting to 36.2% and 70.4%, respectively. It is also found that the studied system can reach a higher solar energy conversion efficiency than a conventional solar thermal power generation system alone: the energy efficiency can be increased from 10.2% to 58.0%, and the exergy efficiency from 12.5% to 15.2%. Moreover, an economic analysis in terms of cost and payback period (PP) has been carried out. The study reveals that the PP of the proposed system is about 18 years under present energy price conditions. The sensitivity analysis shows that if the interest rate decreases to 3% or the energy price increases by 50%, the PP will be less than 10 years. (author)
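
    A minimal sketch of the first-law and second-law bookkeeping such a case study relies on; all temperatures and energy flows below are invented, and the solar exergy factor is the common Petela-style approximation rather than the authors' model.

```python
# Illustrative energy and exergy efficiencies for a solar heating/cooling/power
# system. Heat terms are weighted by Carnot factors in the exergy balance; the
# solar input exergy uses the Petela approximation. All numbers are made up.
T0 = 298.15          # dead-state (ambient) temperature, K
T_sun = 5800.0       # apparent sun temperature, K (assumption)
Q_solar = 100.0      # collected solar energy rate, kW
W = 6.0              # shaft/electric power, kW
Q_heat, T_heat = 30.0, 353.15    # delivered heating and its temperature
Q_cool, T_cool = 15.0, 280.15    # delivered cooling and its temperature

psi_solar = 1 - (4 / 3) * (T0 / T_sun) + (1 / 3) * (T0 / T_sun) ** 4  # Petela factor
ex_in = Q_solar * psi_solar
ex_out = W + Q_heat * (1 - T0 / T_heat) + Q_cool * (T0 / T_cool - 1)  # exergy of heat and cold

print(f"energy efficiency: {(W + Q_heat + Q_cool) / Q_solar:.1%}")
print(f"exergy efficiency: {ex_out / ex_in:.1%}")
```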

  6. Experimental and thermodynamical analyses of the diesel exhaust vortex generator heat exchanger for optimizing its operating condition

    International Nuclear Information System (INIS)

    Hatami, M.; Ganji, D.D.; Gorji-Bandpy, M.

    2015-01-01

    In this research, a vortex generator heat exchanger is used to recover exergy from the exhaust of an OM314 diesel engine. Twenty vortex generators with a 30° angle of attack are used to increase the heat recovery while maintaining a low back pressure in the exhaust. The experiments are prepared for five engine loads (0, 20, 40, 60 and 80% of full load), two exhaust gas fractions (50 and 100%) and four water mass flow rates (50, 40, 30 and 20 g/s). After a thermodynamic analysis of the obtained data, an optimization study based on Central Composite Design (CCD) is performed, owing to the complex effects of engine load and water mass flow rate on exergy recovery and irreversibility, to reach the best operating condition. - Highlights: • A vortex generator heat exchanger is used for diesel exhaust heat recovery. • A thermodynamic analysis is performed for experimental data. • Exergy recovery and irreversibility are calculated for different exhaust gas fractions. • An optimization study is performed using the response surface method
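
    A minimal sketch of a two-factor central composite design and a quadratic response-surface fit, in coded units for engine load and cooling-water flow and with a made-up response; it illustrates the CCD/response-surface machinery, not the authors' dataset.

```python
import itertools
import numpy as np

# Build a two-factor central composite design in coded units and fit a quadratic
# response surface to a hypothetical exergy-recovery response by least squares.
alpha = np.sqrt(2)                                   # rotatable axial distance for k = 2 factors
factorial = list(itertools.product([-1, 1], repeat=2))
axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
center = [(0.0, 0.0)] * 3
design = np.array(factorial + axial + center)        # coded (x1, x2) settings

rng = np.random.default_rng(1)
y = 5 + 1.2 * design[:, 0] + 0.8 * design[:, 1] - 0.5 * design[:, 0] ** 2 \
    - 0.3 * design[:, 1] ** 2 + rng.normal(0, 0.1, len(design))   # fake measured response

x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones(len(design)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)         # quadratic model coefficients
print("fitted quadratic coefficients:", np.round(coef, 3))
```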

  7. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  8. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  9. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  10. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  11. Static analysis: from theory to practice; Static analysis of large-scale embedded code, generation of abstract domains; Analyse statique: de la theorie a la pratique; analyse statique de code embarque de grande taille, generation de domaines abstraits

    Energy Technology Data Exchange (ETDEWEB)

    Monniaux, D.

    2009-06-15

    Software operating critical systems (aircraft, nuclear power plants) should not fail - whereas most computerised systems of daily life (personal computer, ticket vending machines, cell phone) fail from time to time. This is not a simple engineering problem: it is known, since the works of Turing and Cook, that proving that programs work correctly is intrinsically hard. In order to solve this problem, one needs methods that are, at the same time, efficient (moderate costs in time and memory), safe (all possible failures should be found), and precise (few warnings about nonexistent failures). In order to reach a satisfactory compromise between these goals, one can research fields as diverse as formal logic, numerical analysis or 'classical' algorithmics. From 2002 to 2007 I participated in the development of the Astree static analyser. This suggested to me a number of side projects, both theoretical and practical (use of formal proof techniques, analysis of numerical filters...). More recently, I became interested in modular analysis of numerical property and in the applications to program analysis of constraint solving techniques (semi-definite programming, SAT and SAT modulo theory). (author)

  12. A probable prehistoric case of meningococcal disease from San Francisco Bay: Next generation sequencing of Neisseria meningitidis from dental calculus and osteological evidence.

    Science.gov (United States)

    Eerkens, Jelmer W; Nichols, Ruth V; Murray, Gemma G R; Perez, Katherine; Murga, Engel; Kaijankoski, Phil; Rosenthal, Jeffrey S; Engbring, Laurel; Shapiro, Beth

    2018-05-25

    Next Generation Sequencing (NGS) of ancient dental calculus samples from a prehistoric site in San Francisco Bay, CA-SCL-919, reveals a wide range of potentially pathogenic bacteria. One older adult woman, in particular, had high levels of Neisseria meningitidis and low levels of Haemophilus influenzae, species that were not observed in the calculus from three other individuals. Combined with the presence of incipient endocranial lesions and pronounced meningeal grooves, we interpret this as an ancient case of meningococcal disease. This disease afflicts millions around the globe today, but little is known about its (pre)history. With additional sampling, we suggest NGS of calculus offers an exciting new window into the evolutionary history of these bacterial species and their interactions with humans. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. OTSGI--a program analysing two-phase flow instabilities in helical tubes of once-through steam generator

    International Nuclear Information System (INIS)

    Shi Shaoping; Zhou Fangde; Wang Maohua

    1998-01-01

    The authors have studied the two-phase flow instabilities in the helical tubes of a once-through steam generator. Using a linear frequency-domain analytical method, they have derived a mathematical model and designed the program. The model also accounts for the thermal dynamic characteristics of the tube wall. The program is used to calculate the stability threshold and the influence of factors such as the entrance throttling coefficient, system pressure and entrance supercooling degree. The outcomes are compared with other studies

  14. Evaluation of maintenance strategies for steam generator tubes in pressurized water reactors. 2. Cost and profitability analyses

    International Nuclear Information System (INIS)

    Isobe, Y.; Sagisaka, M.; Yoshimura, S.; Yagawa, G.

    2000-01-01

    As an application of probabilistic fracture mechanics (PFM), risk-benefit analysis was carried out to evaluate maintenance activities of steam generator (SG) tubes used in pressurized water reactors (PWRs). The analysis was conducted for SG tubes made of Inconel 600, and for Inconel 690 as well, assuming crack initiation and crack propagation laws based on the Inconel 600 data. The following results were drawn from the analysis. Improving inspection accuracy reduces the maintenance costs significantly and is preferable from the viewpoint of profitability owing to the reduction of SG tube leakage and rupture. There is a certain region of SCC properties of SG tubes in which sampling inspection is effective. (author)

  15. SARAPAN-A simulated-annealing-based tool to generate random patterned-channel-age in CANDU fuel management analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kastanya, Doddy [Safety and Licensing Department, Candesco Division of Kinectrics Inc., Toronto (Canada)

    2017-02-15

    In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.

  16. SARAPAN—A Simulated-Annealing-Based Tool to Generate Random Patterned-Channel-Age in CANDU Fuel Management Analyses

    Directory of Open Access Journals (Sweden)

    Doddy Kastanya

    2017-02-01

    In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.
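
    A toy illustration of the simulated-annealing idea described above; it stands in for neither SARAPAN nor the RFSP *INSTANTAN module. Channel ages are swapped on a small core map until a crude neighbour-averaged power proxy is as flat as possible.

```python
import numpy as np

# Toy simulated annealing for a patterned-channel-age map (not SARAPAN/RFSP):
# swap channel ages so that a smoothed "power" proxy (fresher fuel -> higher power)
# becomes as flat as possible across the core.
rng = np.random.default_rng(2)
n = 12                                    # 12 x 12 toy channel map
ages = rng.permutation(np.linspace(0.0, 1.0, n * n)).reshape(n, n)

def roughness(a):
    """Objective: variation of a neighbour-averaged power proxy, power ~ (1 - age)."""
    p = 1.0 - a
    s = (p + np.roll(p, 1, 0) + np.roll(p, -1, 0) + np.roll(p, 1, 1) + np.roll(p, -1, 1)) / 5
    return s.std()

temp, cost = 1.0, roughness(ages)
for step in range(20000):
    i1, j1, i2, j2 = rng.integers(0, n, 4)
    ages[i1, j1], ages[i2, j2] = ages[i2, j2], ages[i1, j1]      # propose a swap
    new_cost = roughness(ages)
    if new_cost < cost or rng.random() < np.exp((cost - new_cost) / temp):
        cost = new_cost                                          # accept the swap
    else:
        ages[i1, j1], ages[i2, j2] = ages[i2, j2], ages[i1, j1]  # undo it
    temp *= 0.9995                                               # cool down

print(f"final power-flatness objective: {cost:.4f}")
```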

  17. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  18. Spatial and temporal analyses of citrus sudden death as a tool to generate hypotheses concerning its etiology.

    Science.gov (United States)

    Bassanezi, Renato B; Bergamin Filho, Armando; Amorim, Lilian; Gimenes-Fernandes, Nelson; Gottwald, Tim R; Bové, Joseph M

    2003-04-01

    grafted on Rangpur lime. Based on the symptoms of CSD and on its spatial and temporal patterns, our hypothesis is that CSD may be caused by a similar but undescribed pathogen such as a virus and probably vectored by insects such as aphids by similar spatial processes to those affecting CTV.

  19. De novo assembly and next-generation sequencing to analyse full-length gene variants from codon-barcoded libraries.

    Science.gov (United States)

    Cho, Namjin; Hwang, Byungjin; Yoon, Jung-ki; Park, Sangun; Lee, Joongoo; Seo, Han Na; Lee, Jeewon; Huh, Sunghoon; Chung, Jinsoo; Bang, Duhee

    2015-09-21

    Interpreting epistatic interactions is crucial for understanding evolutionary dynamics of complex genetic systems and unveiling structure and function of genetic pathways. Although high-resolution mapping of en masse variant libraries enables molecular biologists to address genotype-phenotype relationships, long-read sequencing technology remains indispensable for assessing functional relationships between mutations that lie far apart. Here, we introduce JigsawSeq for multiplexed sequence identification of pooled gene variant libraries by combining a codon-based molecular barcoding strategy and de novo assembly of short-read data. We first validated JigsawSeq on small sub-pools and observed high precision and recall under various experimental settings. With extensive simulations, we then apply JigsawSeq to large-scale gene variant libraries to show that our method can be reliably scaled using next-generation sequencing. JigsawSeq may serve as a rapid screening tool for functional genomics and offer the opportunity to explore evolutionary trajectories of protein variants.

  20. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  1. Enhancement in the structure quality of ZnO nanorods by diluted Co dopants: Analyses via optical second harmonic generation

    International Nuclear Information System (INIS)

    Liu, Chung-Wei; Hsiao, Chih-Hung; Chang, Shoou-Jinn; Brahma, Sanjaya; Chang, Feng Ming; Wang, Peng Han; Lo, Kuang-Yao

    2015-01-01

    We report a systematic study of the effect of cobalt concentration in the growth solution on the crystallization, growth, and optical properties of hydrothermally synthesized Zn1−xCoxO [0 ≤ x ≤ 0.40, where x is the weight (wt.) % of Co in the growth solution] nanorods. A dilute Co concentration of 1 wt. % in the growth solution enhances the bulk crystal quality of the ZnO nanorods, whereas a high wt. % leads to distortion of the ZnO lattice that depresses the crystallization, growth and surface structure quality of ZnO. Although the Co concentration in the growth solution varies from 1 to 40 wt. %, the actual doping concentration is limited to 0.28 at. %, owing to the low growth temperature of 80 °C. The enhancement in the crystal quality of the ZnO nanorods at dilute Co concentration in the solution is due to strain relaxation; the strain is significantly higher for ZnO nanorods prepared without Co and with a high wt. % of Co in the growth solution. Second harmonic generation is used to investigate the net dipole distribution from these coatings, which provides detailed information about the bulk and surface structure quality of the ZnO nanorods at the same time. High-quality ZnO nanorods are fabricated by a low-temperature (80 °C) hydrothermal synthesis method, and no post-synthesis treatment is needed for further crystallization. Therefore, this method is advantageous for the growth of high-quality ZnO coatings on plastic substrates, which may lead toward its application in flexible electronics

  2. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  3. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  4. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  5. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and material degradation into account. The loads considered on the target vessel were the static stresses due to thermal expansion and the static pre-pressure of the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. The materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability over the lifetime expected in the design is very low, about 10^-11 for the safety hull, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to the high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window is adequately contained in the space between the safety hull and the mercury vessel, with mercury-leakage sensors used for detection. (author)
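
    A generic stress-strength interference sketch, with assumed distributions rather than the JSNS structural-analysis results, showing how a failure probability of the kind quoted above can be estimated: sample the imposed stress and the degraded strength and count exceedances.

```python
import numpy as np

# Generic stress-strength Monte Carlo (all distributions are assumptions):
# draw an imposed stress (static + pressure-wave dynamic part) and a degraded
# strength, and estimate P(stress > strength).
rng = np.random.default_rng(3)
n = 1_000_000

static = rng.normal(60.0, 5.0, n)                 # MPa, thermal expansion + pre-pressure
dynamic = rng.lognormal(np.log(80.0), 0.25, n)    # MPa, thermally shocked pressure waves
stress = static + dynamic

degradation = rng.uniform(0.6, 0.9, n)            # irradiation/pitting knock-down factor
strength = rng.normal(280.0, 25.0, n) * degradation   # MPa, degraded fatigue strength

p_fail = np.mean(stress > strength)
print(f"estimated failure probability: {p_fail:.2e}")
```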

  6. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start as a way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  7. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are, formally speaking, classical (Kolmogoroff) probabilities; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  8. Probability-Based Method of Generating Conflict Trajectories for an ATC System; 基于概率模型的ATC系统冲突目标生成算法

    Institute of Scientific and Technical Information of China (English)

    苏志刚; 眭聪聪; 吴仁彪

    2011-01-01

    To test the short-term conflict alerting capability of an air traffic control (ATC) system, two methods are usually used. The first is to set a higher alert threshold and use real data to test whether the system alerts when the distance between two flights falls below the threshold; however, this method is not reliable. The second is to simulate flights that will conflict, obtain their trajectories by calculation, and then feed these data to the ATC system to observe its reaction; this method is usually too simple to test whether the system can effectively pre-detect a conflict. To solve these problems, a probabilistic approach is used in this paper to simulate aircraft with a given probability of conflict. Firstly, we derived the conflict probability of turning flights from Prandini's method of conflict probability estimation for linear flight. Then, by reverse derivation, we obtained the motion parameters of two targets whose conflict probability was preset. Finally, we simulated this pair of target tracks and analysed their conflict probability. The simulation results show that the targets' probability of conflict was in line with the preset value. The trajectories generated by this algorithm are more realistic, so a more effective assessment of an ATC system's capability of short-term conflict alerting and pre-detection can be provided.
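
    A minimal sketch, not the paper's analytic derivation: estimate the conflict probability of two straight-line tracks with Gaussian position errors by Monte Carlo. A test-data generator could then adjust the encounter geometry until this estimate matches a preset conflict probability.

```python
import numpy as np

# Monte Carlo conflict probability for two straight-line tracks with Gaussian
# position errors (an assumed model, not the paper's method).
def conflict_probability(p1, v1, p2, v2, sigma=0.5, sep=5.0, t_max=600.0, n=20000):
    """P(min separation < sep NM within t_max s); positions in NM, speeds in NM/s."""
    rng = np.random.default_rng(4)
    t = np.linspace(0.0, t_max, 121)
    hits = 0
    for _ in range(n):
        e1 = rng.normal(0.0, sigma, 2)                  # per-track position error (NM)
        e2 = rng.normal(0.0, sigma, 2)
        d = (p1 + e1 - p2 - e2) + np.outer(t, v1 - v2)  # relative position over time
        if np.min(np.hypot(d[:, 0], d[:, 1])) < sep:
            hits += 1
    return hits / n

# two aircraft on crossing tracks; the nominal closest approach is about 5.7 NM
p = conflict_probability(p1=np.array([0.0, 0.0]),   v1=np.array([0.12, 0.0]),
                         p2=np.array([40.0, -32.0]), v2=np.array([0.0, 0.12]))
print(f"estimated conflict probability: {p:.3f}")
```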

  9. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  10. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
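
    A minimal flux-based (Poisson) sketch of a collision-probability estimate; the encounter-rate assumption and all numbers below are mine, not the paper's parametric model.

```python
import math

# Treat encounters with the background satellite population as a Poisson process,
# so P(collision) = 1 - exp(-flux * area * time). All values are assumptions.
flux = 1e-6          # objects per km^2 per year crossing the station's altitude band
area = 4e-3          # collision cross-section of the station, km^2 (~4000 m^2)
years = 10.0         # mission duration

expected_hits = flux * area * years
p_collision = 1.0 - math.exp(-expected_hits)
print(f"expected impacts: {expected_hits:.2e}, collision probability: {p_collision:.2e}")
```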

  11. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  12. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  13. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  14. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  15. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  16. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  18. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  19. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  20. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20% g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  1. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  2. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  3. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  4. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  5. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  9. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  10. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  11. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  12. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  13. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
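
    The "probability machine" approach summarized above can be illustrated with a small sketch that uses a random forest as the learning machine, in line with the simulations mentioned in the abstract; the synthetic data, variable names and the simple counterfactual risk-difference calculation are illustrative assumptions, not the authors' code.

        # Sketch: a random forest used as a "probability machine" for conditional
        # probability estimation and a counterfactual effect-size estimate.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 5000
        x1 = rng.integers(0, 2, n)            # binary exposure (hypothetical)
        x2 = rng.normal(size=n)               # continuous covariate (hypothetical)
        logit = -1.0 + 1.2 * x1 + 0.8 * x2    # assumed logistic data-generating model
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
        X = np.column_stack([x1, x2])

        rf = RandomForestClassifier(n_estimators=300, min_samples_leaf=25, random_state=0)
        rf.fit(X, y)
        p_hat = rf.predict_proba(X)[:, 1]     # conditional probabilities P(y=1 | x)

        # Counterfactual risk difference for x1: toggle the exposure for every subject.
        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0
        risk_diff = rf.predict_proba(X1)[:, 1].mean() - rf.predict_proba(X0)[:, 1].mean()
        print(f"mean estimated probability: {p_hat.mean():.3f}")
        print(f"counterfactual risk difference for x1: {risk_diff:.3f}")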

  14. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  15. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  16. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could also be used for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure over an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation β_E = ∫ [dβ(x)/dx] P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem thus leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analysis. The fragility could be derived either on the basis of scaling procedures or on the basis of generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
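
    As a numerical illustration of the relation β_E = ∫ [dβ(x)/dx] P(f|x) dx quoted above, the sketch below convolves a hazard curve with a lognormal fragility curve; the hazard and fragility parameters are placeholder assumptions, not values from the paper.

        # Sketch: annual frequency of failure from a hazard curve and a fragility curve.
        import numpy as np
        from scipy.stats import lognorm

        x = np.linspace(0.01, 2.0, 400)              # load level, e.g. peak ground acceleration [g]
        beta = 1e-3 * (0.1 / x) ** 2.5               # assumed hazard: annual frequency of exceedance
        p_f = lognorm(s=0.4, scale=0.6).cdf(x)       # assumed fragility: P(failure | load = x)

        dbeta_dx = np.gradient(beta, x)              # hazard-curve slope (negative, since beta decreases)
        integrand = -dbeta_dx * p_f                  # minus sign turns the slope into a non-negative density
        beta_E = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x))   # trapezoid rule
        print(f"annual frequency of failure beta_E ~ {beta_E:.2e}")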

  17. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  18. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
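
    A minimal sketch of the "pinching" idea described above, assuming a toy model that is monotone in its inputs so that output bounds can be propagated through interval endpoints; the model and the numbers are illustrative, not taken from the paper.

        # Sketch: sensitivity analysis by pinching an imprecise input to a precise value.
        def risk(a, b):
            # Toy model, monotone increasing in both inputs.
            return a * b + 0.1 * a

        def bounds(a_lo, a_hi, b_lo, b_hi):
            # For a monotone model the output bounds are attained at the corners.
            corners = [risk(a, b) for a in (a_lo, a_hi) for b in (b_lo, b_hi)]
            return min(corners), max(corners)

        base = bounds(0.1, 0.4, 1.0, 3.0)          # both inputs known only as intervals
        pinched = bounds(0.25, 0.25, 1.0, 3.0)     # input 'a' pinched to a precise value
        print("baseline output bounds:        ", base)
        print("output bounds with 'a' pinched:", pinched)
        # The reduction in width indicates how much of the output uncertainty 'a' contributes.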

  2. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  5. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  6. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  7. An investigation of the ignition probability and data analysis for the detection of relevant parameters of mechanically generated steel sparks in explosive gas/air-mixtures; Untersuchungen zur Zuendwahrscheinlichkeit und Datenanalyse zur Erfassung der Einflussgroessen mechanisch erzeugter Stahl-Schlagfunktion in explosionsfaehigen Brenngas/Luft-Gemischen

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, Thomas; Finke, Robert; Graetz, Rainer

    2010-07-01

    Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks from a material combination of unalloyed steel/unalloyed steel, with a kinetic impact energy between 3 and 277 Nm, could be determined with statistically tolerable confidence. In addition, the ability of non-oxidized particles at increased temperatures to ignite over-stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. However, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable characteristics of the particles that correlates with a distinct increase of the ignition probability.

  8. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors 0270-6474/16/367817-12$15.00/0.
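
    The inference described above can be written out directly: the sketch below computes the (log-transformed) posterior over four hypothetical latent causes from a short observation sequence using Bayes' rule; the likelihood table, prior and observations are made-up stand-ins for the task's actual observation model.

        # Sketch: posterior distribution over 4 latent causes via Bayes' rule.
        import numpy as np

        likelihood = np.array([          # assumed P(observation | cause), 3 possible observations
            [0.7, 0.2, 0.1],             # cause 0
            [0.1, 0.8, 0.1],             # cause 1
            [0.2, 0.2, 0.6],             # cause 2
            [0.4, 0.3, 0.3],             # cause 3
        ])
        prior = np.full(4, 0.25)         # uniform prior over latent causes
        observations = [0, 0, 2, 0]      # hypothetical observation sequence

        log_post = np.log(prior)
        for obs in observations:
            log_post += np.log(likelihood[:, obs])
        log_post -= np.logaddexp.reduce(log_post)     # normalize in log space
        print("posterior over latent causes:", np.exp(log_post))
        print("log posterior:", log_post)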

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  11. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  12. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generators, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
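
    The kind of sensitivity study described above can be mimicked with a toy accident-sequence expression; the sketch below sweeps one component's failure probability through a made-up fault-tree fragment (initiating event frequency times a small AND/OR combination) and is only an illustration, not the NRC's simplified PSA model.

        # Sketch: sensitivity of a toy core-damage frequency to one failure probability.
        import numpy as np

        ie_freq = 1e-2       # assumed initiating event frequency [/yr]
        p_afw = 1e-3         # auxiliary feedwater failure probability (placeholder)
        p_edg = 5e-3         # emergency diesel generator failure probability (placeholder)

        def core_damage_freq(p_hpi):
            # Toy sequence: core damage if HPI fails AND (AFW fails OR EDGs fail).
            return ie_freq * p_hpi * (p_afw + p_edg - p_afw * p_edg)

        for p_hpi in np.logspace(-4, 0, 5):    # sweep HPI failure probability from 1e-4 to 1
            print(f"P(HPI fails) = {p_hpi:.0e}  ->  core damage frequency = {core_damage_freq(p_hpi):.2e} /yr")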

  13. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and the guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based (data-based) measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples ranging from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  14. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses

    Directory of Open Access Journals (Sweden)

    Jason L. Brown

    2017-12-01

    Full Text Available SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  15. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses.

    Science.gov (United States)

    Brown, Jason L; Bennett, Joseph R; French, Connor M

    2017-01-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  16. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    Science.gov (United States)

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
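
    For readers unfamiliar with the two approaches named above, here is a compact sketch of Bland-Altman statistics and Deming regression applied to simulated paired measurements; the simulated data and the error-variance ratio of 1 are assumptions for illustration, not settings from the paper.

        # Sketch: Bland-Altman statistics and Deming regression for method comparison.
        import numpy as np

        rng = np.random.default_rng(1)
        truth = rng.uniform(0.05, 0.95, 60)                    # e.g. true variant allele fractions
        ref = truth + rng.normal(0, 0.02, 60)                  # reference assay (simulated)
        new = 0.03 + 0.95 * truth + rng.normal(0, 0.02, 60)    # new assay with constant + proportional error

        # Bland-Altman: bias and 95% limits of agreement of the differences.
        diff = new - ref
        bias = diff.mean()
        loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

        # Deming regression, assuming an error-variance ratio lambda of 1.
        lam = 1.0
        sxx, syy = ref.var(ddof=1), new.var(ddof=1)
        sxy = np.cov(ref, new, ddof=1)[0, 1]
        slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
        intercept = new.mean() - slope * ref.mean()

        print(f"Bland-Altman bias {bias:.3f}, limits of agreement {loa[0]:.3f} to {loa[1]:.3f}")
        print(f"Deming slope {slope:.3f} (proportional error), intercept {intercept:.3f} (constant error)")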

  17. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  18. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  19. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  20. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  1. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  2. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  3. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  4. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  5. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  6. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  7. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
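
    The post-processing step described in the two records above reduces to a per-cell frequency calculation over the ensemble of simulations; the sketch below performs it for a synthetic ensemble, with the grid size, number of realizations and clean-up threshold chosen arbitrarily for illustration.

        # Sketch: exceedance-probability map from an ensemble of geostatistical simulations.
        import numpy as np

        rng = np.random.default_rng(7)
        n_real, ny, nx = 200, 50, 50        # assumed ensemble size and grid dimensions
        # Stand-in for conditional simulations of contaminant concentration (lognormal field).
        simulations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_real, ny, nx))

        threshold = 35.0                    # hypothetical clean-up threshold (e.g. ppm)
        prob_exceed = (simulations > threshold).mean(axis=0)    # per-cell probability of exceedance

        print("cells with P(exceedance) > 0.5:", int((prob_exceed > 0.5).sum()))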

  8. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  9. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
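
    The one-parameter generalized logarithm and its inverse are commonly written as ln_q(x) = (x^q - 1)/q and exp_q(x) = (1 + q x)^(1/q), recovering ln and exp as q tends to 0; the sketch below implements this standard form, which is assumed here and may differ in notation from the paper's convention.

        # Sketch: one-parameter generalized logarithm/exponential (standard q-deformed form).
        import numpy as np

        def gen_log(x, q):
            # Generalized logarithm; recovers ln(x) as q -> 0.
            x = np.asarray(x, dtype=float)
            return np.log(x) if q == 0 else (x ** q - 1.0) / q

        def gen_exp(x, q):
            # Inverse of gen_log; recovers exp(x) as q -> 0.
            x = np.asarray(x, dtype=float)
            return np.exp(x) if q == 0 else (1.0 + q * x) ** (1.0 / q)

        x = np.linspace(0.5, 5.0, 10)
        for q in (0.0, 0.5, -0.5):
            assert np.allclose(gen_exp(gen_log(x, q), q), x)    # inverse pair
        print("gen_log(2, q=0.5) =", gen_log(2.0, 0.5))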

  10. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    Only fragments of this record's text are available: a line of the main text ("The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen") and pieces of the bibliography, including "Régularité des trajectoires des fonctions aléatoires gaussiennes", École d'Été de Probabilités de Saint-Flour IV-1974, pp. 1-96, Lecture Notes in Mathematics, and Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294, Springer, Berlin (1996), followed by an entry by Ledoux.

  11. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

    Farnoosh, Arash; Lantz, Frederic; Percebois, Jacques

    2014-01-01

    In Saudi Arabia, fossil fuel is the main source of power generation. Due to the huge economic and demographic growth, electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day is used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing generation mix of electricity. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to meet the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the variability of integrating the other alternative (non-fossil-fuel-based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. For instance, by decarbonizing half of its generation mix, Saudi Arabia can release around 0.5 million barrels of oil equivalent per day from 2020. Moreover, the total power generation cost can be reduced by around 28% per year from 2030 if Saudi Arabia manages to attain the most optimal generation mix structure introduced in the model (50% of power from renewables and nuclear power plants and 50% from fossil power plants). - Highlights: • We model the current and future power generation situation of Saudi Arabia. • We take into account the integration of other alternative resources. • We consider different scenarios of power generation structure for the country. • An optimal generation mix can release a considerable amount of oil for export
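
    A highly simplified version of the kind of optimization model described above can be posed as a linear program; the technology list, costs, capacities and the 50% non-fossil constraint in the sketch are illustrative placeholders, not data from the paper.

        # Sketch: least-cost generation mix with a minimum non-fossil share (toy LP).
        import numpy as np
        from scipy.optimize import linprog

        techs = ["oil", "gas", "nuclear", "solar", "wind"]
        cost = np.array([90.0, 60.0, 50.0, 45.0, 40.0])      # assumed generation cost per unit
        cap = np.array([200.0, 150.0, 80.0, 60.0, 50.0])     # assumed available energy (TWh/yr)
        demand = 350.0                                        # assumed annual demand (TWh)
        non_fossil = np.array([0, 0, 1, 1, 1], dtype=float)

        # Minimize total cost s.t. generation meets demand and non-fossil share >= 50%.
        res = linprog(cost,
                      A_ub=[0.5 - non_fossil], b_ub=[0.0],    # 0.5*sum(x) - sum(non-fossil x) <= 0
                      A_eq=[np.ones(5)], b_eq=[demand],
                      bounds=list(zip(np.zeros(5), cap)))
        for name, gen in zip(techs, res.x):
            print(f"{name:8s} {gen:6.1f} TWh/yr")
        print(f"total cost: {res.fun:,.0f} (arbitrary units)")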

  12. Data Interpretation: Using Probability

    Science.gov (United States)

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  13. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  14. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    statistics at all levels. ... P(A_i) for k < ∞ and A_1, A_2, ..., A_k ∈ F with A_i ∩ A_j = ∅ for i ≠ j. Next, it is reasonable to require that F be closed ... roll of dice, card games such as Bridge. ... ...ing data (i.e., generating random variables) according to ...

  15. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension

  16. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  17. Generation Y – A Plea for the Analysis of a "Networked Generation" for the Further Development of Library Didactics

    Directory of Open Access Journals (Sweden)

    Kim Farah Giuliani

    2017-04-01

    Full Text Available This article takes up both the scientific discourse about the Millennials (Generation Y) and the question of how to impart information literacy in libraries. Both topics are linked and discussed with respect to successful library didactics.

  18. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
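
    As a rough illustration of how such a characteristic length could be extracted from simulation output, the sketch below fits the dominant exponential decay to the large-N tail of a knotting probability. The data points are synthetic placeholders, not values from the paper.

```python
# Estimate the characteristic length N_K from the large-N tail of a knotting
# probability P_K(N), assuming the dominant factor exp(-N/N_K) described above.
# The data points are synthetic placeholders.
import numpy as np

N = np.array([400.0, 800.0, 1200.0, 1600.0, 2000.0])
P = np.array([9.0e-2, 2.4e-2, 6.5e-3, 1.8e-3, 4.9e-4])   # hypothetical P_K(N)

# In the tail, ln P_K(N) is approximately linear in N with slope -1/N_K.
slope, intercept = np.polyfit(N, np.log(P), 1)
print(f"estimated characteristic length N_K ≈ {-1.0 / slope:.0f} segments")
```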

  19. Automated Search Method for Statistical Test Probability Distribution Generation

    Institute of Scientific and Technical Information of China (English)

    周晓莹; 高建华

    2013-01-01

    A strategy based on automated search is proposed for constructing the probability distribution used in statistical testing; it comprises the design of a representation format and an evaluation function for the probability distribution. Combined with a simulated annealing algorithm, an indicator is defined to formalize the automated search process based on a Markov model. Experimental results show that the method effectively improves the accuracy of the automated search and can reduce the cost of statistical testing by supplying it with efficient test data, since it successfully finds a near-optimal probability distribution within a given time.
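
    A minimal sketch of the kind of simulated-annealing search over a discrete probability distribution described above is given below. The evaluation function is a stand-in placeholder; in the paper it would be based on the Markov usage model of the software under test.

```python
# Simulated-annealing search over a discrete probability distribution for
# statistical testing. evaluate() is a placeholder objective; a real one would
# score the distribution against a Markov usage model of the software under test.
import math
import random

random.seed(0)
K = 5  # number of input classes whose selection probabilities we tune

def evaluate(p):
    # Placeholder: prefer distributions close to a hypothetical "good" target.
    target = [0.4, 0.25, 0.15, 0.1, 0.1]
    return -sum((pi - ti) ** 2 for pi, ti in zip(p, target))

def neighbour(p):
    # Move a little probability mass between two random classes.
    q = list(p)
    i, j = random.sample(range(K), 2)
    delta = min(q[i], random.uniform(0.0, 0.05))
    q[i] -= delta
    q[j] += delta
    return q

p = [1.0 / K] * K
best, best_score = list(p), evaluate(p)
T = 1.0
for _ in range(5000):
    cand = neighbour(p)
    gain = evaluate(cand) - evaluate(p)
    if gain > 0 or random.random() < math.exp(gain / T):
        p = cand
        if evaluate(p) > best_score:
            best, best_score = list(p), evaluate(p)
    T *= 0.999  # geometric cooling schedule

print("best distribution found:", [round(x, 3) for x in best])
```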

  20. Electricity generation analyses in an oil-exporting country: Transition to non-fossil fuel based power units in Saudi Arabia

    International Nuclear Information System (INIS)

    Farnoosh, Arash; Lantz, Frederic; Percebois, Jacques

    2013-12-01

    In Saudi Arabia, fossil fuel is the main source of power generation. Due to the huge economic and demographic growth, electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day is used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing electricity generation mix. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to meet the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model by taking into account the variability of the integration of alternative (non-fossil-fuel-based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. (authors)

  1. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
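
    The problem COVAL addresses can be illustrated with a small example: given distributions for the input variables, approximate the distribution of a function of them. The sketch below uses Monte Carlo sampling rather than the numerical transformation implemented in COVAL, and the load and strength models are assumed for illustration only.

```python
# Distribution of a function of random variables, here the safety margin
# strength - load for a structure under random loads. Monte Carlo sampling is
# used as a stand-in for COVAL's numerical transformation; both models are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

load = rng.gumbel(loc=50.0, scale=8.0, size=n)        # assumed random load (kN)
strength = rng.normal(loc=90.0, scale=10.0, size=n)   # assumed strength (kN)

margin = strength - load            # the function whose distribution we want
print(f"P(failure) = P(margin < 0) ≈ {np.mean(margin < 0.0):.4f}")
print("5th/50th/95th percentiles of the margin:", np.percentile(margin, [5, 50, 95]).round(1))
```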

  2. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output
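
    A small sketch of a Wald-style SPRT for choosing between two hypothesized values of a simulation's mean output is shown below, assuming normally distributed outputs with known standard deviation; it is a generic textbook version, not one of the specific conservative or robust variants investigated in the paper.

```python
# Wald-style sequential probability ratio test (SPRT) deciding between two
# hypothesized means of a simulation output, assuming N(mu, sigma^2) outputs
# with known sigma. Boundaries use Wald's approximations for error rates alpha, beta.
import math
import random

def sprt(stream, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    lower = math.log(beta / (1 - alpha))     # accept H0 when the LLR drops below this
    upper = math.log((1 - beta) / alpha)     # accept H1 when the LLR rises above this
    llr, n = 0.0, 0
    for x in stream:
        n += 1
        # log-likelihood-ratio increment for N(mu1, sigma) versus N(mu0, sigma)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr <= lower:
            return f"accept H0: mean = {mu0}", n
        if llr >= upper:
            return f"accept H1: mean = {mu1}", n
    return "no decision", n

random.seed(1)
outputs = (random.gauss(10.5, 2.0) for _ in range(10_000))   # mock simulation outputs
decision, n_used = sprt(outputs, mu0=10.0, mu1=11.0, sigma=2.0)
print(decision, f"after {n_used} observations")
```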

  3. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  5. CONFIGURATION GENERATOR MODEL

    International Nuclear Information System (INIS)

    Alsaed, A.

    2004-01-01

    "The Disposal Criticality Analysis Methodology Topical Report" prescribes an approach to the methodology for performing postclosure criticality analyses within the monitored geologic repository at Yucca Mountain, Nevada. An essential component of the methodology is the "Configuration Generator Model for In-Package Criticality" that provides a tool to evaluate the probabilities of degraded configurations achieving a critical state. The configuration generator model is a risk-informed, performance-based process for evaluating the criticality potential of degraded configurations in the monitored geologic repository. The method uses event tree methods to define configuration classes derived from criticality scenarios and to identify configuration class characteristics (parameters, ranges, etc.). The probabilities of achieving the various configuration classes are derived in part from probability density functions for degradation parameters. The NRC has issued "Safety Evaluation Report for Disposal Criticality Analysis Methodology Topical Report, Revision 0". That report contained 28 open items that required resolution through additional documentation. Of the 28 open items, numbers 5, 6, 9, 10, 18, and 19 were concerned with a previously proposed software approach to the configuration generator methodology and, in particular, the k_eff regression analysis associated with the methodology. However, the use of a k_eff regression analysis is not part of the current configuration generator methodology and, thus, the referenced open items are no longer considered applicable and will not be further addressed

  6. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  7. Weld region corrosion during chemical cleaning of PWR [pressurized-water reactor] steam generators: Volume 2, Tests and analyses: Final report

    International Nuclear Information System (INIS)

    Barna, J.L.; Bozeka, S.A.; Jevec, J.M.

    1987-07-01

    The potential for preferential corrosion of steam generator weld regions during chemical cleaning using the generic SGOG solvents was investigated. The investigations included development and use of a corrosion assessment test facility which measured corrosion currents in a realistic model of the steam generator geometry in the vicinity of a specific weld during a simulated chemical dissolution of sludge consisting of essentially pure magnetite. A corrosion monitoring technique was developed and qualified. In this technique, free corrosion rates measured by linear polarization techniques are added to corrosion rates calculated from galvanic current measured using a zero resistance ammeter to give an estimate of the total corrosion rate for a galvanically corroding material. An analytic modeling technique was developed and proved useful in determining the size requirements for the weld region mockup used in the corrosion assessment test facility. The technique predicted galvanic corrosion rates consistent with those observed in a corrosion assessment test when polarization data used as model input were obtained on-line during the test. The test results obtained during this investigation indicated that chemical cleaning using the SGOG magnetite dissolution solvent can be performed with a small amount of corrosion of secondary side internals and pressure boundary welds. The maximum weld region corrosion measured during a typical chemical cleaning cycle to remove essentially pure magnetite sludge was about 8 mils. However, additional site-specific weld region corrosion assessment testing and qualification will be required prior to chemically cleaning steam generators at a specific plant. Recommendations for site-specific qualification of chemical cleaning processes and for use of process monitors and on-line corrosion instrumentation are included in this report

  8. A generative inference framework for analysing patterns of cultural change in sparse population data with evidence for fashion trends in LBK culture.

    Science.gov (United States)

    Kandler, Anne; Shennan, Stephen

    2015-12-06

    Cultural change can be quantified by temporal changes in frequency of different cultural artefacts and it is a central question to identify what underlying cultural transmission processes could have caused the observed frequency changes. Observed changes, however, often describe the dynamics in samples of the population of artefacts, whereas transmission processes act on the whole population. Here we develop a modelling framework aimed at addressing this inference problem. To do so, we firstly generate population structures from which the observed sample could have been drawn randomly and then determine theoretical samples at a later time t2 produced under the assumption that changes in frequencies are caused by a specific transmission process. Thereby we also account for the potential effect of time-averaging processes in the generation of the observed sample. Subsequent statistical comparisons (e.g. using Bayesian inference) of the theoretical and observed samples at t2 can establish which processes could have produced the observed frequency data. In this way, we infer underlying transmission processes directly from available data without any equilibrium assumption. We apply this framework to a dataset describing pottery from settlements of some of the first farmers in Europe (the LBK culture) and conclude that the observed frequency dynamic of different types of decorated pottery is consistent with age-dependent selection, a preference for 'young' pottery types which is potentially indicative of fashion trends. © 2015 The Author(s).
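
    A deliberately simplified sketch of the generate-and-compare logic is given below: build a candidate population consistent with the sample at t1, propagate it forward under an assumed transmission process (here unbiased copying with innovation), sample it at t2 and compare the sampled type frequencies with the observed ones. All counts and rates are placeholders, and the time-averaging step and the full Bayesian comparison of the paper are omitted.

```python
# Simplified generate-and-compare sketch for inferring cultural transmission
# processes: simulate forward under unbiased copying with innovation, then
# compare sampled type frequencies at t2 with the observed sample.
# Counts and rates are placeholders; time-averaging and Bayesian comparison omitted.
import numpy as np

rng = np.random.default_rng(7)

observed_t1 = np.array([60, 25, 10, 5])      # hypothetical counts of 4 pottery types at t1
observed_t2 = np.array([40, 35, 15, 10])     # hypothetical counts at t2
pop_size, generations, mu = 1000, 20, 0.01   # assumed population size, time span, innovation rate

# Candidate population consistent with the t1 sample, then copy forward.
pop = rng.choice(len(observed_t1), size=pop_size, p=observed_t1 / observed_t1.sum())
next_type = len(observed_t1)
for _ in range(generations):
    pop = rng.choice(pop, size=pop_size)      # unbiased copying (resampling with replacement)
    for i in np.where(rng.random(pop_size) < mu)[0]:
        pop[i] = next_type                    # innovation introduces a brand-new variant
        next_type += 1

# Sample the simulated population at t2 and compare frequencies of the original types.
sample = rng.choice(pop, size=int(observed_t2.sum()), replace=False)
sim_freq = np.array([(sample == k).sum() for k in range(len(observed_t2))]) / len(sample)
obs_freq = observed_t2 / observed_t2.sum()
print("simulated t2 frequencies:", sim_freq.round(2))
print("observed  t2 frequencies:", obs_freq.round(2))
print("L1 distance:", round(float(np.abs(sim_freq - obs_freq).sum()), 3))
```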

  9. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  10. Efficient plasma and bubble generation underwater by an optimized laser excitation and its application for liquid analyses by laser-induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    Lazic, Violeta; Jovicevic, Sonja; Fantoni, Roberta; Colao, Francesco

    2007-01-01

    Laser-induced breakdown spectroscopy (LIBS) measurements were performed on bulk water solutions by applying a double-pulse excitation from a Q-Switched (QS) Nd:YAG laser emitting at 1064 nm. In order to optimize the LIBS signal, laser pulse energies were varied by changing the QS trigger delays with respect to the flash-lamp trigger. We noted that reducing the first pulse energy from 92 mJ to 72 mJ drastically improves the signal, even though the second pulse energy was also lowered from 214 mJ to 144 mJ. With lower pulse energies, the limit of detection (LOD) for Mg in pure water was reduced by one order of magnitude (34 ppb instead of 210 ppb). In order to explain this phenomenon, we studied the dynamics of the gas bubble generated after the first laser pulse through measurements of the HeNe laser light scattered on the bubble. The influence of laser energy on underwater bubble and plasma formation and the corresponding plasma emission intensity was also studied by a photographic technique. From the results obtained, we conclude that the optimal first pulse energy should be kept close to the plasma elongation threshold, in our case about 65 mJ, where the gas bubble has its maximum lateral expansion and the secondary plasma is still well-localized. The effect of a multi-pulse sequence on the LIBS signal was also analyzed; here the pulse sequence after the first QS aperture was produced by operating the laser close to the lasing threshold, with the consequent generation of relaxation oscillations. Low-energy multi-pulses might keep the bubble expansion large prior to the probing pulse while preventing the formation of weak secondary plasmas at multiple sites, which would reduce the LIBS signal. The short interval between the pre-pulses and the probing pulse is another reason for the observed LIBS signal enhancement

  11. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
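
    As a toy illustration of integrating value and probability, the sketch below prices a simple prospect first by expected value and then with the standard Tversky-Kahneman value and probability-weighting functions; the functional forms and parameter values are illustrative defaults, not estimates from this study.

```python
# Expected value versus a cumulative-prospect-theory style evaluation of a simple
# one-outcome prospect. Functional forms are the standard Tversky-Kahneman ones;
# the parameters (gamma, alpha) are illustrative, not estimates from the study.
def weight(p, gamma=0.61):
    # inverse-S probability weighting: small probabilities are overweighted
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def value(x, alpha=0.88):
    # concave value function for gains
    return x**alpha

x, p = 200.0, 0.05        # win 200 with probability 5%, otherwise nothing
print("expected value:       ", p * x)
print("CPT-style evaluation: ", weight(p) * value(x))
```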

  12. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  13. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Directory of Open Access Journals (Sweden)

    Alexandre G. de Brevern

    2015-01-01

    Full Text Available Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  14. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Science.gov (United States)

    de Brevern, Alexandre G.; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries. PMID:26125026

  15. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies.

    Science.gov (United States)

    de Brevern, Alexandre G; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  16. Genome-wide analyses of long noncoding RNA expression profiles correlated with radioresistance in nasopharyngeal carcinoma via next-generation deep sequencing.

    Science.gov (United States)

    Li, Guo; Liu, Yong; Liu, Chao; Su, Zhongwu; Ren, Shuling; Wang, Yunyun; Deng, Tengbo; Huang, Donghai; Tian, Yongquan; Qiu, Yuanzheng

    2016-09-06

    Radioresistance is one of the major factors limiting the therapeutic efficacy and prognosis of patients with nasopharyngeal carcinoma (NPC). Accumulating evidence has suggested that aberrant expression of long noncoding RNAs (lncRNAs) contributes to cancer progression. Therefore, here we identified lncRNAs associated with radioresistance in NPC. The differential expression profiles of lncRNAs associated with NPC radioresistance were constructed by next-generation deep sequencing by comparing radioresistant NPC cells with their parental cells. LncRNA-related mRNAs were predicted and analyzed using bioinformatics algorithms compared with the mRNA profiles related to radioresistance obtained in our previous study. Several lncRNAs and associated mRNAs were validated in established NPC radioresistant cell models and NPC tissues. By comparison between radioresistant CNE-2-Rs and parental CNE-2 cells by next-generation deep sequencing, a total of 781 known lncRNAs and 2054 novel lncRNAs were annotated. The top five upregulated and downregulated known/novel lncRNAs were detected using quantitative real-time reverse transcription-polymerase chain reaction, and 7/10 known lncRNAs and 3/10 novel lncRNAs were demonstrated to have significant differential expression trends that were the same as those predicted by deep sequencing. From the prediction process, 13 pairs of lncRNAs and their associated genes were acquired, and the prediction trends of three pairs were validated in both radioresistant CNE-2-Rs and 6-10B-Rs cell lines, including lncRNA n373932 and SLITRK5, n409627 and PRSS12, and n386034 and RIMKLB. LncRNA n373932 and its related SLITRK5 showed dramatic expression changes in post-irradiation radioresistant cells and a negative expression correlation in NPC tissues (R = -0.595, p < 0.05). Our study provides an overview of the expression profiles of radioresistant lncRNAs and potentially related mRNAs, which will facilitate future investigations into the

  17. Validation of a CFD model by using 3D sonic anemometers to analyse the air velocity generated by an air-assisted sprayer equipped with two axial fans.

    Science.gov (United States)

    García-Ramos, F Javier; Malón, Hugo; Aguirre, A Javier; Boné, Antonio; Puyuelo, Javier; Vidal, Mariano

    2015-01-22

    A computational fluid dynamics (CFD) model of the air flow generated by an air-assisted sprayer equipped with two axial fans was developed and validated by practical experiments in the laboratory. The CFD model was developed by considering the total air flow supplied by the sprayer fan to be the main parameter, rather than the outlet air velocity. The model was developed for three air flows corresponding to three fan blade settings and assuming that the sprayer is stationary. Actual measurements of the air velocity near the sprayer were taken using 3D sonic anemometers. The sprayer workspace was divided into three sections, and the air velocity was measured in each section on both sides of the machine at horizontal distances of 1.5, 2.5, and 3.5 m from the machine, and at heights of 1, 2, 3, and 4 m above the ground. The coefficient of determination (R2) between the simulated and measured values was 0.859, which demonstrates a good correlation between the simulated and measured data. Considering the overall data, the air velocity values produced by the CFD model were not significantly different from the measured values.

  18. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  19. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similarly to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  1. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Science.gov (United States)

    Lücker, Joost; Laszczak, Mario; Smith, Derek; Lund, Steven T

    2009-01-01

    Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening initiation and may be further

  3. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  4. Design by theoretical and CFD analyses of a multi-blade screw pump evolving liquid lead for a Generation IV LFR

    Energy Technology Data Exchange (ETDEWEB)

    Ferrini, Marcello [GeNERG - DIME/TEC, University of Genova, via all’Opera Pia 15/a, 16145 Genova (Italy); Borreani, Walter [Ansaldo Nucleare S.p.A., Corso F.M. Perrone 25, 16152 Genova (Italy); INFN, Via Dodecaneso 33, 16146 Genova (Italy); Lomonaco, Guglielmo, E-mail: guglielmo.lomonaco@unige.it [GeNERG - DIME/TEC, University of Genova, via all’Opera Pia 15/a, 16145 Genova (Italy); INFN, Via Dodecaneso 33, 16146 Genova (Italy); Magugliani, Fabrizio [Ansaldo Nucleare S.p.A., Corso F.M. Perrone 25, 16152 Genova (Italy)

    2016-02-15

    The lead-cooled fast reactor (LFR) has both a long history and a penchant for innovation. With early work related to its use for submarine propulsion dating to the 1950s, Russian scientists pioneered the development of reactors cooled by heavy liquid metals (HLM). More recently, there has been substantial interest in both critical and subcritical reactors cooled by lead (Pb) or lead–bismuth eutectic (LBE), not only in Russia, but also in Europe, Asia, and the USA. The growing knowledge of the thermal-fluid-dynamic properties of these fluids and the choice of the LFR as one of the six reactor types selected by the Generation IV International Forum (GIF) for further research and development has fostered the exploration of new geometries and new concepts aimed at optimizing the key components that will be adopted in the Advanced Lead Fast Reactor European Demonstrator (ALFRED), the 300 MW_t pool-type reactor aimed at proving the feasibility of the design concept adopted for the European Lead-cooled Fast Reactor (ELFR). In this paper, a theoretical and computational analysis is presented of a multi-blade screw pump evolving liquid lead as the primary pump for the adopted reference conceptual design of ALFRED. The pump is at first analyzed at design operating conditions from the theoretical point of view to determine the optimal geometry according to the velocity triangles and then modeled with a 3D CFD code (ANSYS CFX). The choice of a 3D simulation is dictated by the need to perform a detailed spatial simulation taking into account the peculiar geometry of the pump as well as the boundary layers and turbulence effects of the flow, which are typically tri-dimensional. The use of liquid lead significantly impacts the fluid-dynamic design of the pump because of the key requirement to avoid any erosion effects. These effects have a major impact on the performance, reliability and lifespan of the pump. Although some erosion-related issues remain to be fully addressed, the results

  5. Design by theoretical and CFD analyses of a multi-blade screw pump evolving liquid lead for a Generation IV LFR

    International Nuclear Information System (INIS)

    Ferrini, Marcello; Borreani, Walter; Lomonaco, Guglielmo; Magugliani, Fabrizio

    2016-01-01

    The lead-cooled fast reactor (LFR) has both a long history and a penchant for innovation. With early work related to its use for submarine propulsion dating to the 1950s, Russian scientists pioneered the development of reactors cooled by heavy liquid metals (HLM). More recently, there has been substantial interest in both critical and subcritical reactors cooled by lead (Pb) or lead–bismuth eutectic (LBE), not only in Russia, but also in Europe, Asia, and the USA. The growing knowledge of the thermal-fluid-dynamic properties of these fluids and the choice of the LFR as one of the six reactor types selected by the Generation IV International Forum (GIF) for further research and development has fostered the exploration of new geometries and new concepts aimed at optimizing the key components that will be adopted in the Advanced Lead Fast Reactor European Demonstrator (ALFRED), the 300 MW_t pool-type reactor aimed at proving the feasibility of the design concept adopted for the European Lead-cooled Fast Reactor (ELFR). In this paper, a theoretical and computational analysis is presented of a multi-blade screw pump evolving liquid lead as the primary pump for the adopted reference conceptual design of ALFRED. The pump is at first analyzed at design operating conditions from the theoretical point of view to determine the optimal geometry according to the velocity triangles and then modeled with a 3D CFD code (ANSYS CFX). The choice of a 3D simulation is dictated by the need to perform a detailed spatial simulation taking into account the peculiar geometry of the pump as well as the boundary layers and turbulence effects of the flow, which are typically tri-dimensional. The use of liquid lead significantly impacts the fluid-dynamic design of the pump because of the key requirement to avoid any erosion effects. These effects have a major impact on the performance, reliability and lifespan of the pump. Although some erosion-related issues remain to be fully addressed, the results of

  6. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  7. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  8. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248, ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities. Subject RIV: BA - General Mathematics. OBOR OECD: Statistics and probability. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  9. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  10. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  11. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix; Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  12. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  13. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
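
    The "probability machine" idea can be sketched with any consistent nonparametric learner that outputs class probabilities. The example below uses scikit-learn's random forest and a synthetic dataset as stand-ins for the R packages and clinical data discussed in the paper.

```python
# Random forest used as a probability machine: read off estimated P(y = 1 | x)
# for each case instead of a hard 0/1 classification. scikit-learn and a
# synthetic dataset stand in for the R packages and clinical data of the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
proba = rf.predict_proba(X_te)[:, 1]          # individual probability estimates

print("first five estimated probabilities:", proba[:5].round(3))
```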

  14. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  15. Finite-size scaling of survival probability in branching processes

    OpenAIRE

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Alvaro

    2014-01-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We reveal the finite-size scaling law of the survival probability for a given branching process ruled by a probability distribution of the number of offspring per element whose standard deviation is finite, obtaining the exact scaling function as well as the critical exponents. Our findings prove the universal behavi...
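
    The survival probability after a finite number of generations can be estimated directly by simulation. The sketch below uses a Poisson offspring distribution with mean 1 (a critical Galton-Watson process); it only illustrates the quantity being studied, not the scaling analysis of the paper.

```python
# Monte Carlo estimate of the survival probability of a Galton-Watson branching
# process after t generations, with Poisson(1) offspring (critical case).
import numpy as np

rng = np.random.default_rng(3)

def survives(t_max, mean_offspring=1.0, cap=10_000):
    z = 1
    for _ in range(t_max):
        if z == 0:
            return False
        z = min(int(rng.poisson(mean_offspring, size=z).sum()), cap)  # cap keeps runs cheap
    return z > 0

runs = 20_000
for t in (8, 16, 32, 64):
    p = sum(survives(t) for _ in range(runs)) / runs
    print(f"P(survival after {t:2d} generations) ≈ {p:.3f}")
```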

  16. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
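
    The inflation of the failure probability under parameter uncertainty can be seen in a small Monte Carlo experiment: estimate log-normal parameters from a limited sample, set the control threshold at the nominal 99th percentile of the fitted distribution, and record how often a new draw from the true distribution exceeds it. The sample size and parameters below are illustrative.

```python
# Effect of parameter uncertainty on failure probability: the threshold is set at
# the nominal 99th percentile using parameters estimated from n observations, and
# the realized exceedance frequency under the true log-normal is compared with 1%.
import numpy as np

rng = np.random.default_rng(0)
mu_true, sigma_true = 0.0, 1.0      # true (unknown to the decisionmaker) parameters
n, trials = 30, 100_000
z99 = 2.3263                        # standard normal 99th-percentile quantile

failures = 0
for _ in range(trials):
    data = rng.lognormal(mu_true, sigma_true, size=n)
    mu_hat, sigma_hat = np.log(data).mean(), np.log(data).std(ddof=1)
    threshold = np.exp(mu_hat + z99 * sigma_hat)          # control set from the estimates
    failures += rng.lognormal(mu_true, sigma_true) > threshold

print("nominal failure probability: 0.010")
print(f"realized failure frequency:  {failures / trials:.3f}")   # typically above nominal
```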

  17. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  18. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  19. Socio-economic well-to-wheel analysis of biofuels. Scenarios for rapeseed diesel (RME) and 1st and 2nd generation bioethanol

    Energy Technology Data Exchange (ETDEWEB)

    Slentoe, E.; Moeller, F.; Winther, M.; Hjort Mikkelsen, M.

    2010-10-15

    The report examines, in an integrated form, the energy, emissions, and welfare-economic implications of introducing Danish-produced biodiesel, i.e. rapeseed diesel (RME), and first- and second-generation wheat ethanol in two scenarios with low and high rates of blending into fossil-based automotive fuels. Within this project's analytical framework and assumptions, the welfare-economic analysis shows that it would be beneficial for society to realize the biofuel scenarios to some extent at oil prices above $100 a barrel, while it would cause losses at oil prices of $65. In all cases, fossil fuel consumption and CO2eq emissions are reduced, the effect of which is priced and included in the welfare-economic analysis. The implementation of biofuels in Denmark will depend on the market price, which, as it stands now, is not favorable for biofuels. The RME currently produced in Denmark is exported to other European countries where there are state subsidies. Subsidies would also be a significant factor in Denmark for achieving biofuel blending objectives. (ln)

  20. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  2. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  3. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  5. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
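
    A minimal sketch of the textbook Merton-model default probability that this framework builds on; parameter values are illustrative and the paper's correlation analytics are not reproduced here.

    ```python
    # Minimal sketch of the textbook Merton model: default occurs if firm value V_T
    # falls below the debt face value D at horizon T, so the (physical) default
    # probability is Phi(-DD), with DD the distance to default. Parameter values
    # are illustrative only; the paper's correlation analytics are not reproduced.
    from math import log, sqrt
    from scipy.stats import norm

    def merton_default_probability(V0, D, mu, sigma, T):
        """P(V_T < D) when dV/V = mu dt + sigma dW (geometric Brownian motion)."""
        dd = (log(V0 / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        return norm.cdf(-dd)

    print(merton_default_probability(V0=100.0, D=80.0, mu=0.05, sigma=0.25, T=1.0))
    ```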

  6. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA. August 30th, 2012. ... social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as ... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  7. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  8. Life Cycle Assessments Applied to First Generation Biofuels Used in France. Final report; Analyses de Cycle de Vie appliquees aux biocarburants de premiere generation consommes en France. Rapport final

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Climate concerns become more substantial each day. Evidence of man-made climate change accumulates. Even if the consequences of such change remain, for the most part, difficult to foresee, this major environmental problem is now the subject of great attention by governments and public opinion. In this context, biofuels have attracted growing interest over the last years. This enthusiasm is essentially based on their potential to reduce non-renewable energy consumption, notably petroleum, and to reduce greenhouse gas (GHG) emissions in the transportation sector. Indeed, the transportation sector currently generates about 14% of the world's GHG emissions, at a growth rate of about 2% per year that is particularly difficult to reduce. The question of how biofuels perform on these two criteria (GHG emissions and non-renewable energy consumption) is therefore fundamental, because it largely justifies the different forms of public financial support devoted to the development of these pathways. Thus, numerous studies aim to compare biofuels to equivalent petroleum fuels (gasoline and diesel) in order to assess the GHG emission reduction potential associated with using biofuels in transportation. The Directive 2009/28/CE of April 29, 2009, dedicated to the promotion of Renewable Energies (RE Directive), sets forth a compelling objective, asking each Member State to make sure that the share of energy produced from renewable sources in all forms of transportation will be at least equal to 10% of its final energy consumption in the transportation sector by 2020. This objective is subject to production being of a sustainable nature and to second-generation biofuels being available on the market. The RE Directive introduces several criteria for qualifying biofuels' sustainability. Thus, biofuels should not be produced from land recognized as of great value in terms of biological diversity: forest undisturbed by significant human activity, zones assigned to nature conservation, meadows presenting

  9. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  10. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
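
    A simplified sketch of the resampling idea, assuming an isotropic in-plane position covariance so that the 2-D Pc has a closed form through the noncentral chi-square distribution; the numbers, and the choice to resample only the miss distance, are illustrative and not the operational method of the paper.

    ```python
    # Simplified sketch of turning a point-estimate Pc into a distribution.
    # Assumes an isotropic in-plane covariance (std sigma), for which the 2-D
    # collision probability has a closed form via the noncentral chi-square:
    #   (r/sigma)^2 ~ ncx2(df=2, nc=(miss/sigma)^2),  Pc = P(r < HBR).
    # The nominal miss distance is then resampled to reflect its own uncertainty.
    # Illustrative numbers only; not an operational conjunction-assessment tool.
    import numpy as np
    from scipy.stats import ncx2

    def pc_isotropic(miss, sigma, hbr):
        return ncx2.cdf((hbr / sigma) ** 2, df=2, nc=(miss / sigma) ** 2)

    rng = np.random.default_rng(1)
    sigma, hbr = 200.0, 20.0              # metres: position std and combined hard-body radius
    nominal_miss, miss_std = 500.0, 150.0

    pc_point = pc_isotropic(nominal_miss, sigma, hbr)
    pc_samples = pc_isotropic(np.abs(rng.normal(nominal_miss, miss_std, 10000)), sigma, hbr)

    print(f"point-estimate Pc: {pc_point:.2e}")
    print("Pc percentiles (5th, 50th, 95th):",
          [f"{q:.2e}" for q in np.percentile(pc_samples, [5, 50, 95])])
    ```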

  11. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
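
    A sketch of the basic idea, using pointwise order-statistic intervals (from the Beta distribution of uniform order statistics) with plug-in sample mean and standard deviation; the simultaneous 1-α intervals constructed in the paper are wider and are not reproduced here.

    ```python
    # Sketch of augmenting a normal probability plot with intervals. These are
    # pointwise order-statistic intervals, based on U_(i) ~ Beta(i, n-i+1) for a
    # uniform sample, mapped through the normal quantile function with plug-in
    # sample mean and standard deviation. The paper constructs *simultaneous*
    # 1-alpha intervals, which are wider; this only illustrates the basic idea.
    import numpy as np
    from scipy.stats import norm, beta

    rng = np.random.default_rng(4)
    x = np.sort(rng.normal(10.0, 2.0, 50))
    n, alpha = x.size, 0.05
    i = np.arange(1, n + 1)

    mean, sd = x.mean(), x.std(ddof=1)
    lower = mean + sd * norm.ppf(beta.ppf(alpha / 2, i, n - i + 1))
    upper = mean + sd * norm.ppf(beta.ppf(1 - alpha / 2, i, n - i + 1))

    outside = np.sum((x < lower) | (x > upper))
    print(f"order statistics outside their pointwise {100*(1-alpha):.0f}% intervals: {outside} of {n}")
    ```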

  12. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  13. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
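
    A small sketch of the Gaussian exceedance estimate described above, with illustrative (not the article's) RMS displacement and criterion values.

    ```python
    # Sketch of the Gaussian exceedance estimate described above: if the relative
    # displacement of the isolated mass is zero-mean Gaussian with RMS value
    # sigma_x, the chance of exceeding a vibration criterion amplitude x_crit is
    # 2 * (1 - Phi(x_crit / sigma_x)). Numbers are illustrative, not the article's.
    from scipy.stats import norm

    def prob_exceeding(x_crit_um, sigma_x_um):
        return 2.0 * (1.0 - norm.cdf(x_crit_um / sigma_x_um))

    for sigma in (0.05, 0.10, 0.15):                 # RMS displacement, micrometres
        print(sigma, f"{prob_exceeding(x_crit_um=0.3, sigma_x_um=sigma):.4f}")
    ```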

  14. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  15. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  16. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system, there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
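
    A minimal example of one of the alternative formalisms mentioned above: Dempster's rule of combination over a three-element frame of discernment. The mass assignments are invented for illustration and do not come from any actual elicitation.

    ```python
    # Minimal example of one alternative formalism mentioned above: Dempster's rule
    # of combination for two basic mass assignments over the frame {A, B, C}. The
    # mass values are invented for illustration; they do not come from elicitation.
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two basic mass assignments (dicts mapping frozensets to mass)."""
        combined, conflict = {}, 0.0
        for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
        return {s: w / (1.0 - conflict) for s, w in combined.items()}, conflict

    A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
    m1 = {A: 0.6, A | B: 0.3, A | B | C: 0.1}   # expert 1
    m2 = {B: 0.5, A | C: 0.4, A | B | C: 0.1}   # expert 2

    combined, k = dempster_combine(m1, m2)
    print("conflict mass:", k)
    for focal, mass in sorted(combined.items(), key=lambda kv: -kv[1]):
        print(set(focal), round(mass, 3))
    ```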

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  19. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  20. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  1. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions with a coefficient of variation not equal to unity, mathematical relations for approximating distributions on the basis of the first two moments are derived using multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by a hypoexponential distribution, which makes it possible to generate random variables whose coefficient of variation takes any value in the range (0; 1), as opposed to the Erlang distribution, which admits only discrete values of the coefficient of variation.
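
    A sketch of this kind of two-moment fit: a two-phase hypoexponential (sum of two independent exponentials) matched to a target mean and coefficient of variation c in [1/√2, 1); smaller c would require more phases. The formulas below are a standard moment match, not necessarily the paper's exact dependences.

    ```python
    # Sketch of a two-moment fit in the spirit of the abstract: a two-phase
    # hypoexponential (sum of two independent exponentials) matched to a target
    # mean m and coefficient of variation c. A two-phase fit covers c in
    # [1/sqrt(2), 1); smaller c requires more phases (Erlang-like chains).
    # This is an illustration, not necessarily the paper's exact formulas.
    import numpy as np

    def hypoexp2_rates(m, c):
        """Return (lam1, lam2) so that Exp(lam1)+Exp(lam2) has mean m and CV c."""
        if not (0.5 <= c**2 < 1.0):
            raise ValueError("two-phase fit requires 1/sqrt(2) <= c < 1")
        disc = np.sqrt(2.0 * c**2 - 1.0)
        a, b = 0.5 * m * (1.0 + disc), 0.5 * m * (1.0 - disc)   # the two phase means
        return 1.0 / a, 1.0 / b

    m, c = 10.0, 0.8
    lam1, lam2 = hypoexp2_rates(m, c)

    rng = np.random.default_rng(2)
    x = rng.exponential(1.0 / lam1, 100000) + rng.exponential(1.0 / lam2, 100000)
    print("target mean/CV :", m, c)
    print("sampled mean/CV:", round(x.mean(), 3), round(x.std() / x.mean(), 3))
    ```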

  2. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
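
    A simple Bayesian-style sketch of letting a forecast decay as no onset appears: with an initial probability P0 and a cumulative delay-time distribution F(t), the updated probability after t hours without an event is P0(1−F(t))/(1−P0·F(t)). The exponential delay model below is a placeholder, not the NOAA-based delay-time fit used by the authors.

    ```python
    # Simple sketch of decaying an SEP event probability as time passes with no
    # onset. If P0 is the initial forecast and F(t) the cumulative distribution of
    # flare-to-onset delay times, then after t hours with no event observed,
    #   P(event eventually | none yet) = P0 * (1 - F(t)) / (1 - P0 * F(t)).
    # The exponential delay model (mean 12 h) is a placeholder, not the empirical
    # delay-time distribution derived in the paper.
    from math import exp

    def dynamic_sep_probability(p0, t_hours, mean_delay_hours=12.0):
        F = 1.0 - exp(-t_hours / mean_delay_hours)      # placeholder delay CDF
        return p0 * (1.0 - F) / (1.0 - p0 * F)

    p0 = 0.40
    for t in (0, 6, 12, 24, 48):
        print(f"t = {t:2d} h  ->  P = {dynamic_sep_probability(p0, t):.3f}")
    ```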

  3. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  4. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  5. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  6. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  7. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It builds on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  8. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  9. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  10. Ruin probabilities for a regenerative Poisson gap generated risk process

    DEFF Research Database (Denmark)

    Asmussen, Søren; Biard, Romain

    A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G. Otherwise, the claim size distribution is F...

  11. Demand and choice probability generating functions for perturbed consumers

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2011-01-01

    This paper considers demand systems for utility-maximizing consumers equipped with additive linearly perturbed utility of the form U(x)+m⋅x and faced with general budget constraints x ∈ B. Given compact budget sets, the paper provides necessary as well as sufficient conditions for a demand genera...

  12. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  13. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    International Nuclear Information System (INIS)

    Krajnc, B.; Mavko, B.

    1996-01-01

    Severe accident sequences resulting in loss of the core geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences to public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or from generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Because there is large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner because of the large uncertainty of the results, which is a consequence of a lack of detailed knowledge. This paper discusses the approach used in many IPEs, which assures that the probability assigned to a certain question in the event tree represents the probability that the event will or will not happen, and that this probability also includes its uncertainty, which is mainly a result of lack of knowledge. (author)

  14. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience

  15. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  16. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the normative theory, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  17. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  18. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  19. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  20. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  1. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be
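
    A toy calculation of the quantity defined above, spacings between midpoints of successive flowing intervals, using hypothetical borehole interval depths and a log-normal fit as one possible distributional summary; neither the depths nor the distributional choice comes from the analysis itself.

    ```python
    # Toy calculation of flowing-interval spacing as defined above: spacings are
    # measured between the midpoints of successive flowing intervals identified in
    # a borehole flow survey. The interval depths below are hypothetical, and the
    # log-normal fit is just one plausible distributional summary, not the
    # analysis's actual data or result.
    import numpy as np

    # (top, bottom) depths in metres of flowing intervals in one hypothetical borehole
    intervals = [(412.0, 418.5), (431.0, 433.0), (469.5, 474.0), (520.0, 521.5), (586.0, 590.0)]

    midpoints = np.array([(top + bottom) / 2.0 for top, bottom in intervals])
    spacings = np.diff(midpoints)
    print("midpoint spacings (m):", spacings)

    log_s = np.log(spacings)
    print("log-normal fit: mu = %.2f, sigma = %.2f" % (log_s.mean(), log_s.std(ddof=1)))
    ```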

  2. Radiation risk of tissue late effects, a net consequence of probabilities of various cellular responses

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    1991-01-01

    Late effects from the exposure to low doses of ionizing radiation are hardly or not at all observed in man, mainly due to the low values of risk coefficients that preclude statistical analyses of data from populations exposed to doses of less than 0.2 Gy. In order to arrive at an assessment of potential risk from radiation exposure in the low dose range, the microdosimetry approach is essential. In the low dose range, ionizing radiation generates particle tracks, mainly electrons, which are distributed rather heterogeneously within the exposed tissue. Taking the individual cell as the elemental unit of life, observations and calculations of cellular responses to being hit by energy deposition events of the low-LET type are analysed. It emerges that besides the probability of a hit cell sustaining a detrimental effect with the consequence of malignant transformation, there are probabilities of various adaptive responses that equip the hit cell with a benefit. On the one hand, an improvement of cellular radical detoxification was observed in mouse bone marrow cells; another adaptive response, pertaining to improved DNA repair, was reported for human lymphocytes. The improved radical detoxification in mouse bone marrow cells lasts for a period of 5-10 hours, and improved DNA repair in human lymphocytes was seen for some 60 hours following acute irradiation. It is speculated that improved radical detoxification and improved DNA repair may reduce the probability of spontaneous carcinogenesis. Thus it is proposed to weigh the probability of detriment for a hit cell within a multicellular system against the probability of benefit through adaptive responses in other hit cells in the same system per radiation exposure. In doing this, the net effect of low doses of low-LET radiation in tissue, with individual cells being hit by energy deposition events, could be zero or even beneficial. (orig./MG)

  3. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  4. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  5. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that random Lexis fluctuations of probabilities, such as the probability of decay or detection, cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and to more complex experiments are given, as well as to distinguishing between source and background in the presence of overdispersion. Monte Carlo verifications are provided
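
    A small simulation illustrating the mechanism: when the detection probability fluctuates between measurements (here Beta-distributed around a fixed mean), the counts become overdispersed relative to the fixed-probability case. The paper derives the corresponding generating and distribution functions analytically; the parameters below are illustrative.

    ```python
    # Small simulation of the mechanism described above: if the detection
    # probability fluctuates from measurement to measurement (here Beta-distributed
    # around the same mean), the resulting counts are overdispersed relative to the
    # fixed-probability case. This illustrates the effect only; the paper treats it
    # analytically via generating functions.
    import numpy as np

    rng = np.random.default_rng(3)
    n_decays, p_mean, n_meas = 10000, 0.05, 200000

    fixed_counts = rng.binomial(n_decays, p_mean, n_meas)
    p_fluct = rng.beta(5.0, 95.0, n_meas)                 # mean 0.05, Lexis-type scatter
    fluct_counts = rng.binomial(n_decays, p_fluct)

    for name, c in (("fixed p", fixed_counts), ("fluctuating p", fluct_counts)):
        print(f"{name:14s} mean = {c.mean():7.1f}  variance/mean = {c.var() / c.mean():5.2f}")
    ```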

  6. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  7. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  8. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Science.gov (United States)

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision-making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision-making tasks is incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm for how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. But in practice, isolated information items, e.g. some vaguely known probabilities (fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  9. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Directory of Open Access Journals (Sweden)

    Karel Doubravsky

    Full Text Available Complex decision-making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision-making tasks is incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm for how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. But in practice, isolated information items, e.g. some vaguely known probabilities (fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.
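
    A toy illustration of the quoted heuristic, a longer decision tree sub-path is less probable, used to fill in missing branch probabilities at a single chance node; the fuzzy-linear-programming reconciliation step of the paper is not reproduced, and the branch names and lengths are hypothetical.

    ```python
    # Toy illustration of the heuristic quoted above: when branch probabilities at a
    # chance node are unknown, give each missing branch a weight that decreases with
    # the length of the sub-path hanging below it, then normalise together with any
    # branches whose probabilities are already (approximately) known. The fuzzy
    # linear-programming reconciliation used in the paper is not reproduced here.
    def fill_missing_probabilities(branches, known):
        """branches: {name: sub_path_length}; known: {name: probability or None}."""
        remaining = 1.0 - sum(p for p in known.values() if p is not None)
        weights = {b: 1.0 / branches[b] for b, p in known.items() if p is None}
        total = sum(weights.values())
        return {b: (known[b] if known[b] is not None else remaining * weights[b] / total)
                for b in branches}

    # Hypothetical chance node with three lotteries; only the first has an elicited
    # (fuzzy, here defuzzified) probability of about 0.5.
    branches = {"L1": 1, "L2": 2, "L3": 4}          # sub-path lengths
    known = {"L1": 0.5, "L2": None, "L3": None}
    print(fill_missing_probabilities(branches, known))
    ```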

  10. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  11. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  13. Functional Analyses of a Novel Splice Variant in the CHD7 Gene, Found by Next Generation Sequencing, Confirm Its Pathogenicity in a Spanish Patient and Diagnose Him with CHARGE Syndrome

    Directory of Open Access Journals (Sweden)

    Olatz Villate

    2018-01-01

    Full Text Available Mutations in CHD7 have been shown to be a major cause of CHARGE syndrome, which presents many symptoms and features common to other syndromes, making its diagnosis difficult. Next generation sequencing (NGS) of a panel of intellectual disability related genes was performed in an adult patient without molecular diagnosis. A splice donor variant in CHD7 (c.5665 + 1G > T) was identified. To study its potential pathogenicity, exons and flanking intronic sequences were amplified from patient DNA and cloned into the pSAD® splicing vector. HeLa cells were transfected with this construct and a wild-type minigene, and functional analyses were performed. The construct with the c.5665 + 1G > T variant produced an aberrant transcript with an insert of 63 nucleotides of intron 28 creating a premature termination codon (TAG) 25 nucleotides downstream. This would lead to the insertion of 8 new amino acids and therefore a truncated 1896 amino acid protein. As a result of this, the patient was diagnosed with CHARGE syndrome. Functional analyses underline their usefulness for studying the pathogenicity of variants found by NGS and therefore their application to accurately diagnose patients.

  14. Functional Analyses of a Novel Splice Variant in the CHD7 Gene, Found by Next Generation Sequencing, Confirm Its Pathogenicity in a Spanish Patient and Diagnose Him with CHARGE Syndrome.

    Science.gov (United States)

    Villate, Olatz; Ibarluzea, Nekane; Fraile-Bethencourt, Eugenia; Valenzuela, Alberto; Velasco, Eladio A; Grozeva, Detelina; Raymond, F L; Botella, María P; Tejada, María-Isabel

    2018-01-01

    Mutations in CHD7 have been shown to be a major cause of CHARGE syndrome, which presents many symptoms and features common to other syndromes, making its diagnosis difficult. Next generation sequencing (NGS) of a panel of intellectual disability related genes was performed in an adult patient without molecular diagnosis. A splice donor variant in CHD7 (c.5665 + 1G > T) was identified. To study its potential pathogenicity, exons and flanking intronic sequences were amplified from patient DNA and cloned into the pSAD ® splicing vector. HeLa cells were transfected with this construct and a wild-type minigene, and functional analyses were performed. The construct with the c.5665 + 1G > T variant produced an aberrant transcript with an insert of 63 nucleotides of intron 28 creating a premature termination codon (TAG) 25 nucleotides downstream. This would lead to the insertion of 8 new amino acids and therefore a truncated 1896 amino acid protein. As a result of this, the patient was diagnosed with CHARGE syndrome. Functional analyses underline their usefulness for studying the pathogenicity of variants found by NGS and therefore their application to accurately diagnose patients.

  15. Does charge transfer correlate with ignition probability?

    International Nuclear Information System (INIS)

    Holdstock, Paul

    2008-01-01

    Flammable or explosive atmospheres exist in many industrial environments. The risk of ignition caused by electrostatic discharges is very real and there has been extensive study of the incendiary nature of sparks and brush discharges. It is clear that in order to ignite a gas, an amount of energy needs to be delivered to a certain volume of gas within a comparatively short time. It is difficult to measure the energy released in an electrostatic discharge directly, but it is possible to approximate the energy in a spark generated from a well defined electrical circuit. The spark energy required to ignite a gas, vapour or dust cloud can be determined by passing such sparks through them. There is a relationship between energy and charge in a capacitive circuit and so it is possible to predict whether or not a spark discharge will cause an ignition by measuring the charge transferred in the spark. Brush discharges are in many ways less well defined than sparks. Nevertheless, some work has been done that has established a relationship between charge transferred in brush discharges and the probability of igniting a flammable atmosphere. The question posed by this paper concerns whether such a relationship holds true in all circumstances and if there is a universal correlation between charge transfer and ignition probability. Data is presented on discharges from textile materials that go some way to answering this question.

  16. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  17. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  18. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  19. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  20. Joint probabilities reproducing three EPR experiments on two qubits

    NARCIS (Netherlands)

    Roy, S. M.; Atkinson, D.; Auberson, G.; Mahoux, G.; Singh, V.

    2007-01-01

    An eight-parameter family of the most general non-negative quadruple probabilities is constructed for EPR-Bohm-Aharonov experiments when only three pairs of analyser settings are used. It is a simultaneous representation of three different Bohr-incompatible experimental configurations involving

  1. Combination of probabilities in looking for cosmic ray sources

    International Nuclear Information System (INIS)

    Goodman, M.

    1991-08-01

    The use of small chance probabilities as evidence for sources of cosmic rays is examined, with particular emphasis upon issues involved when combining results from two experiments, two analyses, or two independent tests of the same data. Examples are given in which different methods of combining results should be used

  2. Calculational framework for safety analyses of non-reactor nuclear facilities

    International Nuclear Information System (INIS)

    Coleman, J.R.

    1994-01-01

    A calculational framework for the consequence analysis of non-reactor nuclear facilities is presented. The analysis framework starts with accident scenarios, which are developed through a traditional hazard analysis, and continues with a probabilistic framework for the consequence analysis. The framework encourages the use of response continua derived from engineering judgment and traditional deterministic engineering analyses. The general approach consists of dividing the overall problem into a series of interrelated analysis cells and then devising Markov-chain-like probability transition matrices for each of the cells. An advantage of this division of the problem is that intermediate outputs (as probability state vectors) are generated at each calculational interface. The series of analyses, when combined, yields the risk analysis output. The analysis approach is illustrated through application to two non-reactor nuclear analyses: the Ulysses Space Mission, and a hydrogen burn in the Hanford waste storage tanks.
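
    The chained-cell idea above can be sketched in a few lines: each analysis cell is a probability transition matrix, and the probability state vector produced at one calculational interface feeds the next cell. The states, matrices and numbers below are illustrative assumptions, not values from the Ulysses or Hanford analyses.

      import numpy as np

      # Hypothetical three-state discretization (none / partial / full release)
      # for each analysis cell; all numbers are illustrative only.
      T_impact = np.array([[0.90, 0.08, 0.02],    # cell 1: mechanical response
                           [0.30, 0.50, 0.20],
                           [0.05, 0.25, 0.70]])
      T_release = np.array([[0.99, 0.01, 0.00],   # cell 2: confinement/release
                            [0.60, 0.30, 0.10],
                            [0.20, 0.40, 0.40]])

      # Initial probability state vector from the hazard analysis (illustrative).
      p0 = np.array([0.95, 0.04, 0.01])

      # The intermediate output at each calculational interface is itself a
      # probability state vector; chaining the cells yields the risk output.
      p1 = p0 @ T_impact
      p2 = p1 @ T_release
      print("after cell 1:", p1)
      print("after cell 2 (risk output):", p2)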

  3. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, using a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  4. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
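
    The post-processing step described above (turning many equally likely realizations into exceedance-probability and expected-value maps) amounts to simple ensemble statistics. The following sketch uses synthetic lognormal fields and an assumed action level in place of the Fernald simulations.

      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-in for a stack of equally likely geostatistical realizations
      # (n_realizations x ny x nx); a real study would load simulated grids.
      realizations = rng.lognormal(mean=1.0, sigma=0.8, size=(500, 50, 50))

      threshold = 5.0  # assumed action level in the units of the simulated field

      # Probability of exceeding the action level at each grid cell is the
      # fraction of realizations above the threshold; the ensemble mean gives
      # the expected magnitude of contamination.
      prob_exceed = (realizations > threshold).mean(axis=0)
      expected_level = realizations.mean(axis=0)
      print("maximum exceedance probability on the grid:", prob_exceed.max())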

  5. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(−(−ln p)^α) with 0 < α < 1, which satisfies w(0) = 0, w(1/e) = 1/e and w(1) = 1, and which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
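
    A minimal sketch of the one-parameter Prelec weighting function quoted above; the value α = 0.65 is an illustrative choice, not an estimate from the study.

      import numpy as np

      def prelec_weight(p, alpha=0.65):
          """One-parameter Prelec (1998) weighting function w(p) = exp(-(-ln p)**alpha)."""
          p = np.asarray(p, dtype=float)
          inner = -np.log(np.clip(p, 1e-300, 1.0))
          return np.where(p > 0.0, np.exp(-inner**alpha), 0.0)

      probs = np.array([0.0, 0.01, 1 / np.e, 0.5, 0.9, 1.0])
      print(prelec_weight(probs))  # w(0) = 0, w(1/e) = 1/e, w(1) = 1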

  6. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems to be smaller for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be fitted well by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of preparing in advance the runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
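
    The paper reports that the derived distribution of peak discharge is fitted well by a gamma distribution. The sketch below fits a gamma law to synthetic peak-discharge values and reads off an upper quantile; the data, the scipy-based fit and the 99th-percentile design choice are assumptions of this illustration, not part of the GABS model itself.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)

      # Stand-in for peak-discharge values produced by the coupled model
      # (synthetic, illustrative numbers in m^3/s).
      peak_q = rng.gamma(shape=3.0, scale=8.0, size=500)

      # Fit a gamma distribution and read off an upper quantile as a design value.
      a, loc, scale = stats.gamma.fit(peak_q, floc=0.0)
      q99 = stats.gamma.ppf(0.99, a, loc=loc, scale=scale)
      print(f"fitted shape = {a:.2f}, scale = {scale:.2f}, 99th-percentile Q = {q99:.1f}")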

  7. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  8. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  9. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  10. Gas prices: realities and probabilities

    International Nuclear Information System (INIS)

    Broadfoot, M.

    2000-01-01

    An assessment of price trends suggests a continuing rise in 2001, with some easing of upward price movement in 2002 and 2003. Storage levels as of Nov. 1, 2000 are expected to be at 2.77 Tcf, but if the winter of 2000/2001 proves to be more severe than usual, inventory levels could sink as low as 500 Bcf by April 1, 2001. With increasing demand for natural gas for non-utility electric power generation, the major challenge will be to achieve significant supply growth, which means increased developmental drilling and inventory draw-downs, as well as more exploratory drilling in deepwater and frontier regions. Absence of a significant supply response by next summer will affect both demand growth and price levels, and the increased demand for electric generation in the summer will create a flatter consumption profile, erasing the traditional summer/winter spread in consumption and further intensifying price volatility. Managing price fluctuations is the second biggest challenge (after potential supply problems) facing the industry.

  11. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  12. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    The inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and in the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman, that the behavior of quantum particles cannot be described by classical probability theory, with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of the selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
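
    The violation of the law of total probability mentioned above can be demonstrated numerically with two illustrative Gaussian slit amplitudes; the wave functions and slit geometry below are assumptions of this sketch, not taken from the paper.

      import numpy as np

      x = np.linspace(-10.0, 10.0, 2001)

      def slit_amplitude(x, centre, k=3.0, width=1.5):
          # Illustrative Gaussian amplitude associated with one open slit.
          return np.exp(-(x - centre) ** 2 / (2 * width ** 2)) * np.exp(1j * k * x)

      psi1 = slit_amplitude(x, -2.0)
      psi2 = slit_amplitude(x, +2.0)

      def normalize(density):
          return density / np.trapz(density, x)

      p1 = normalize(np.abs(psi1) ** 2)            # only slit 1 open
      p2 = normalize(np.abs(psi2) ** 2)            # only slit 2 open
      p12 = normalize(np.abs(psi1 + psi2) ** 2)    # both slits open

      # The law of total probability with P(slit 1) = P(slit 2) = 1/2 fails
      # because of the interference term contained in p12.
      p_total = 0.5 * p1 + 0.5 * p2
      print("max |p12 - p_total| =", np.max(np.abs(p12 - p_total)))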

  13. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  14. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  15. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  16. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  17. Heart sounds analysis using probability assessment.

    Science.gov (United States)

    Plesinger, F; Viscor, I; Halamek, J; Jurco, J; Jurak, P

    2017-07-31

    This paper describes a method for automated discrimination of heart sound recordings according to the PhysioNet Challenge 2016. The goal was to decide whether a recording refers to normal or abnormal heart sounds, or whether it is not possible to decide (i.e. 'unsure' recordings). Heart sounds S1 and S2 are detected using amplitude envelopes in the band 15-90 Hz. The averaged shape of the S1/S2 pair is computed from amplitude envelopes in five different bands (15-90 Hz; 55-150 Hz; 100-250 Hz; 200-450 Hz; 400-800 Hz). A total of 53 features are extracted from the data. The largest group of features is extracted from the statistical properties of the averaged shapes; other features are extracted from the symmetry of the averaged shapes, and the last group of features is independent of S1 and S2 detection. The generated features are processed using logical rules and probability assessment, a prototype of a new machine-learning method. The method was trained using 3155 records and tested on 1277 hidden records. It resulted in a training score of 0.903 (sensitivity 0.869, specificity 0.937) and a testing score of 0.841 (sensitivity 0.770, specificity 0.913). The revised method led to a test score of 0.853 in the follow-up phase of the challenge. The presented solution achieved 7th place out of 48 competing entries in the PhysioNet Challenge 2016 (official phase). In addition, the PROBAfind software for probability assessment was introduced.
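
    The first processing step described above, a band-limited amplitude envelope (e.g. 15-90 Hz for S1/S2 detection), might be sketched as follows; the Butterworth/Hilbert implementation, filter order and sampling rate are assumptions rather than the authors' exact code.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def band_envelope(signal, fs, low, high, order=4):
          # Band-pass the recording and take the Hilbert amplitude envelope.
          b, a = butter(order, [low, high], btype="bandpass", fs=fs)
          return np.abs(hilbert(filtfilt(b, a, signal)))

      fs = 2000                              # Hz, assumed sampling rate
      t = np.arange(0, 2.0, 1 / fs)
      pcg = np.random.randn(t.size)          # stand-in for a phonocardiogram
      env_15_90 = band_envelope(pcg, fs, 15, 90)   # envelope used for S1/S2 search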

  18. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  19. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  20. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  1. Theoretical analysis on the probability of initiating persistent fission chain

    International Nuclear Information System (INIS)

    Liu Jianjun; Wang Zhe; Zhang Ben'ai

    2005-01-01

    For a finite multiplying system of fissile material in the presence of a weak neutron source, the authors analyse problems concerning the probability of initiating a persistent fission chain by means of the stochastic theory of neutron multiplication. In the theoretical treatment, the conventional point reactor conception model is developed into an improved form with position (x) and velocity (v) dependence. The estimated results, including an approximate value of the probability mentioned above and its distribution, are given by means of the diffusion approximation and compared with those of the previous point reactor conception model. They are basically consistent; however, the present model can provide details of the distribution. (authors)

  2. Decision-making in probability and statistics Chilean curriculum

    DEFF Research Database (Denmark)

    Elicer, Raimundo

    2018-01-01

    Probability and statistics have become prominent subjects in school mathematics curricula. As an exemplary case, I investigate the role of decision making in the justification for probability and statistics in the current Chilean upper secondary mathematics curriculum. To address this concern, I draw upon Fairclough's model for Critical Discourse Analysis to analyse selected texts as examples of discourse practices. The texts are interconnected with politically driven ideas of stochastics "for all", the notion of statistical literacy coined by statisticians' communities, schooling

  3. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies in the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278). To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
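
    The five dependence levels are usually applied through simple conditional-probability formulas. The sketch below uses the dependence equations commonly attributed to THERP (NUREG/CR-1278), quoted from memory and therefore to be checked against the handbook before any real use; the base HEP is an illustrative value.

      def conditional_hep(p, level):
          # Conditional HEP of the dependent action given failure of the prior one.
          formulas = {
              "ZD": lambda q: q,                  # zero dependence
              "LD": lambda q: (1 + 19 * q) / 20,  # low dependence
              "MD": lambda q: (1 + 6 * q) / 7,    # moderate dependence
              "HD": lambda q: (1 + q) / 2,        # high dependence
              "CD": lambda q: 1.0,                # complete dependence
          }
          return formulas[level](p)

      base_hep = 1e-3  # illustrative independent HEP
      for level in ("ZD", "LD", "MD", "HD", "CD"):
          print(level, conditional_hep(base_hep, level))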

  4. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  5. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons and a display for the different characteristics of the electron beam

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  7. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability

  8. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  9. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  10. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
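
    The dominance definition above can be probed with a crude Monte Carlo check over the feasible probability box; unlike the exact method of the paper, sampling can only refute dominance or make it plausible. The system structure, the interval bounds and the multiplicative model of an "improvement" are assumptions of this sketch.

      import numpy as np

      rng = np.random.default_rng(2)

      # Interval-valued failure probabilities [lower, upper] for three basic events.
      lower = np.array([0.01, 0.02, 0.05])
      upper = np.array([0.03, 0.06, 0.10])

      def system_unreliability(q):
          # Example coherent structure: event 1 in series with a parallel pair (2, 3).
          return 1 - (1 - q[0]) * (1 - q[1] * q[2])

      # An improvement portfolio is modelled as a multiplicative reduction of the
      # failure probabilities of the events it addresses.
      portfolio_a = np.array([0.5, 1.0, 0.5])
      portfolio_b = np.array([1.0, 0.5, 0.5])

      def improvement(portfolio, q):
          return system_unreliability(q) - system_unreliability(portfolio * q)

      samples = rng.uniform(lower, upper, size=(20000, 3))
      diff = np.array([improvement(portfolio_a, q) - improvement(portfolio_b, q)
                       for q in samples])
      print("A never worse than B on the samples:", bool((diff >= -1e-12).all()))
      print("A strictly better for some probabilities:", bool((diff > 1e-12).any()))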

  11. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Research on forecasting is effectively limited to forecasts that are expressed with clarity; which is to say that the forecasted event must be sufficiently well-defined so that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, then quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real world forecasts, and consequently the accuracy of the methods used to generate real world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
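
    Once vague forecasts have been mapped to numeric probabilities, the quantitative measures mentioned above (scoring rules, calibration) are straightforward to compute. A minimal sketch with made-up forecasts and outcomes:

      import numpy as np

      # Inferred forecast probabilities and binary outcomes (illustrative values).
      p = np.array([0.9, 0.7, 0.7, 0.3, 0.1, 0.6, 0.8, 0.2])
      y = np.array([1,   1,   0,   0,   0,   1,   1,   0])

      print("Brier score:", np.mean((p - y) ** 2))   # lower is better

      # Crude calibration table: observed frequency within forecast-probability bins.
      bins = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
      idx = np.digitize(p, bins, right=True)
      for b in np.unique(idx):
          mask = idx == b
          print(f"bin ({bins[b - 1]:.1f}, {bins[b]:.1f}]: "
                f"mean forecast {p[mask].mean():.2f}, observed {y[mask].mean():.2f}")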

  12. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  13. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
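
    A minimal example of the book's approach, replacing an analytic derivation with a random-number simulation (here for the textbook question of rolling a sum of seven with two dice):

      import random

      def estimate_p_sum_seven(trials=1_000_000):
          # Fraction of simulated rolls of two dice whose faces sum to seven.
          hits = sum(1 for _ in range(trials)
                     if random.randint(1, 6) + random.randint(1, 6) == 7)
          return hits / trials

      print(estimate_p_sum_seven())  # exact answer is 1/6, about 0.1667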

  14. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or a suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
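
    The superposition matrix T can be illustrated on a toy state space: each class member contributes a deterministic 0/1 transition matrix, and T is their average. The sketch below draws random Boolean update rules instead of enforcing the Strong Inhibition rule, so it only demonstrates the construction, not the paper's specific ensemble.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(3)
      n = 3                                     # toy number of Boolean nodes
      states = list(product((0, 1), repeat=n))
      index = {s: i for i, s in enumerate(states)}

      def random_member():
          # A random synchronous Boolean update rule standing in for a class member.
          table = {s: tuple(int(v) for v in rng.integers(0, 2, size=n)) for s in states}
          return lambda s: table[s]

      def transition_matrix(step):
          M = np.zeros((len(states), len(states)))
          for s in states:
              M[index[s], index[step(s)]] = 1.0
          return M

      # T is the transition-by-transition superposition (average) of the
      # deterministic dynamics of every sampled class member.
      members = [random_member() for _ in range(200)]
      T = sum(transition_matrix(m) for m in members) / len(members)

      # diag(T)[s] is the fraction of members with s as a fixed point, so the
      # trace estimates the expected number of point attractors over the class.
      print("expected number of point attractors:", np.trace(T))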

  15. Clear-Sky Probability for the August 21, 2017, Total Solar Eclipse Using the NREL National Solar Radiation Database

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Roberts, Billy J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kutchenreiter, Mark C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wilcox, Steve [Solar Resource Solutions, LLC, Lakewood, CO (United States); Stoffel, Tom [Solar Resource Solutions, LLC, Lakewood, CO (United States)

    2017-07-21

    The National Renewable Energy Laboratory (NREL) and collaborators have created a clear-sky probability analysis to help guide viewers of the August 21, 2017, total solar eclipse, the first continent-spanning eclipse in nearly 100 years in the United States. Using cloud and solar data from NREL's National Solar Radiation Database (NSRDB), the analysis provides cloudless sky probabilities specific to the date and time of the eclipse. Although this paper is not intended to be an eclipse weather forecast, the detailed maps can help guide eclipse enthusiasts to likely optimal viewing locations. Additionally, high-resolution data are presented for the centerline of the path of totality, representing the likelihood for cloudless skies and atmospheric clarity. The NSRDB provides industry, academia, and other stakeholders with high-resolution solar irradiance data to support feasibility analyses for photovoltaic and concentrating solar power generation projects.

  16. Evaluating probability measures related to subsurface flow and transport

    International Nuclear Information System (INIS)

    Cawlfield, J.D.

    1991-01-01

    Probabilistic modeling approaches are being used increasingly in order to carry out quantified risk analysis and to evaluate the uncertainty existing in subsurface flow and transport analyses. The work presented in this paper addresses three issues: comparison of common probabilistic modeling techniques, recent results regarding the sensitivity of probability measures to likely changes in the uncertain variables for transport in porous media, and a discussion of some questions regarding fundamental modeling philosophy within a probabilistic framework. Recent results indicate that uncertainty regarding average flow velocity controls the probabilistic outcome, while uncertainty in the dispersivity and diffusion coefficient does not seem very important. Uncertainty of reaction terms is important only at early times in the transport process. Questions are posed regarding (1) the inclusion of macrodispersion in a probabilistic analysis, (2) statistics of flow velocity and (3) the notion of an ultimate probability measure for subsurface flow analyses

  17. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    Science.gov (United States)

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406

  18. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  19. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  20. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  1. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows the use of the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
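
    A small sketch of the maximum entropy assignment under a single linear constraint: with outcomes of assumed "energies" and a fixed mean, the maximum-entropy probabilities take the exponential (Boltzmann-like) form, and the Lagrange multiplier can be found numerically. The outcome values and the constraint are assumptions of this illustration.

      import numpy as np
      from scipy.optimize import brentq

      E = np.array([0.0, 1.0, 2.0, 3.0])   # assumed discrete outcome "energies"
      target_mean = 1.2                    # assumed constraint on the average

      def mean_value(beta):
          w = np.exp(-beta * E)
          return (w / w.sum()) @ E

      # Maximum entropy subject to a fixed mean gives p_i proportional to
      # exp(-beta * E_i); solve for the multiplier beta meeting the constraint.
      beta = brentq(lambda b: mean_value(b) - target_mean, -50.0, 50.0)
      p = np.exp(-beta * E)
      p /= p.sum()
      print("beta =", beta, "probabilities =", p)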

  2. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  3. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  4. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  5. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
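
    For orientation, a plain (unweighted) concordance estimate for right-censored data with a discrete risk score might look as follows; this is the ordinary c-index with tied risks counted as 1/2, not the modified estimators developed in the paper, and the data are illustrative.

      import numpy as np

      def c_index(time, event, risk):
          # Pairs are comparable when the shorter observed time is an event;
          # concordant when that subject also carries the higher risk score.
          time, event, risk = map(np.asarray, (time, event, risk))
          num = den = 0.0
          for i in range(len(time)):
              for j in range(len(time)):
                  if event[i] == 1 and time[i] < time[j]:
                      den += 1
                      if risk[i] > risk[j]:
                          num += 1
                      elif risk[i] == risk[j]:
                          num += 0.5          # discrete scores produce many ties
          return num / den

      time = [5, 8, 12, 3, 9, 15, 2, 11]      # observed times (illustrative)
      event = [1, 1, 0, 1, 0, 1, 1, 0]        # 1 = event, 0 = censored
      risk = [3, 2, 1, 3, 2, 1, 3, 2]         # discrete risk groups, 3 = highest
      print("c-index:", c_index(time, event, risk))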

  6. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  7. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    Science.gov (United States)

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  8. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  9. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. Determining the exact shape of the channels at their boundaries is a difficult and laborious process. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the channel boundaries. In the flat region of the profile, the alternate memory locations hold either zero counts or the full-scale counts. At the boundaries, all memory locations hold counts, and the resulting pattern is a direct display of the channel boundaries. (orig.)

  10. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of accident causes. The risk can be expressed as the probabilities of meltdown, radioactive releases, or harmful effects on the environment. Following the risk policies for chemical installations, as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication ''How to deal with risks'', probabilistic risk analyses are required for nuclear power plants

  11. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling design can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
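
    The abstract does not state the exact formula used; however, a standard finite-population sample-size formula with the reported inputs (N = 1179, 95% confidence, d = 0.05, p = 0.5) reproduces the reported n of roughly 290, as the short sketch below illustrates.

        import math

        def sample_size(N, z=1.96, d=0.05, p=0.5):
            """Finite-population sample size: n = N z^2 p(1-p) / (d^2 (N-1) + z^2 p(1-p))."""
            num = N * z**2 * p * (1 - p)
            den = d**2 * (N - 1) + z**2 * p * (1 - p)
            return math.ceil(num / den)

        print(sample_size(N=1179))   # -> 290, matching the sample size reported above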

  12. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y / (e^y - 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
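
    As a companion to the analytical result, a short Monte Carlo sketch can estimate the survival probability of a Galton-Watson process after a fixed number of generations. The Poisson offspring distribution and the parameter values below are illustrative assumptions, not taken from the paper.

        import numpy as np

        def survival_probability(mean_offspring, generations, runs=20000, seed=1):
            """Fraction of Galton-Watson trees (Poisson offspring) still alive after `generations`."""
            rng = np.random.default_rng(seed)
            alive = 0
            for _ in range(runs):
                z = 1                                    # start from a single ancestor
                for _ in range(generations):
                    if z == 0:
                        break
                    z = rng.poisson(mean_offspring, size=z).sum()
                alive += z > 0
            return alive / runs

        # Near the critical point (mean offspring = 1) the survival probability decays
        # with the number of generations, which is the finite-size regime studied above.
        for m in (0.95, 1.00, 1.05):
            print(m, survival_probability(m, generations=50))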

  13. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  14. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  15. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  16. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  17. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method

  18. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    This paper re-discusses the problems of the so-called "law of nonconservation of parity" and "accelerating expansion of the universe", and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of the accelerating expansion of the partial universe.

  19. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  20. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
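
    As a concrete illustration of the kind of calculation such courses cover (not taken from the article itself), the probability of drawing two aces from a standard deck without replacement can be computed both by sequential conditioning and via the hypergeometric distribution; the two routes agree.

        from math import comb

        # Sequential conditioning: P(A1 and A2) = P(A1) * P(A2 | A1)
        p_sequential = (4 / 52) * (3 / 51)

        # Hypergeometric: choose 2 of the 4 aces and 0 of the 48 non-aces
        p_hypergeometric = comb(4, 2) * comb(48, 0) / comb(52, 2)

        print(p_sequential, p_hypergeometric)   # both ~0.004525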

  1. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  2. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  3. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  4. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  5. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (Telecommunication, revised 2010-10-01), Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation: (a) All calculations shall be...

  6. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  7. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  8. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  9. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  10. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  11. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  12. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
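
    A minimal sketch of the binomial reasoning behind a 29-flaw point-estimate demonstration, assuming (for illustration) that every flaw must be detected to pass: the probability of passing is p^29 for a procedure with true POD p, and a pass supports the claim POD >= 0.90 at roughly 95% confidence because 0.90^29 is about 0.047. The acceptance rule and parameter values below are assumptions; the handbook allows other configurations.

        from math import comb

        def prob_pass(pod, n=29, max_misses=0):
            """Probability of passing a demonstration that allows at most `max_misses` misses."""
            return sum(comb(n, k) * (1 - pod)**k * pod**(n - k) for k in range(max_misses + 1))

        # Chance of passing when the true POD is only 0.90 (the "false pass" risk):
        print(round(prob_pass(0.90), 4))    # ~0.0471, i.e. roughly a 95% confidence statement
        # Chance that a genuinely good procedure (POD = 0.98) passes the demonstration:
        print(round(prob_pass(0.98), 4))    # ~0.5566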

  13. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  14. Analysis of the probability of channel satisfactory state in P2P live ...

    African Journals Online (AJOL)

    In this paper a model based on the user behaviour of P2P live streaming systems was developed in order to analyse one of the key QoS parameters of such systems, i.e. the probability of the channel-satisfactory state; the impact of upload bandwidths and channels' popularity on the probability of the channel-satisfactory state was also ...

  15. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  16. Using probability of drug use as independent variable in a register-based pharmacoepidemiological cause-effect study-An application of the reverse waiting time distribution

    DEFF Research Database (Denmark)

    Hallas, Jesper; Pottegård, Anton; Støvring, Henrik

    2017-01-01

    generated adjusted ORs in the upper range (4.37-4.75) while at the same time having the narrowest confidence intervals (ratio between upper and lower confidence limit, 1.46-1.50). Some ORs generated by conventional measures were higher than the probabilistic ORs, but only when the assumed period of intake......BACKGROUND: In register-based pharmacoepidemiological studies, each day of follow-up is usually categorized either as exposed or unexposed. However, there is an underlying continuous probability of exposure, and by insisting on a dichotomy, researchers unwillingly force a nondifferential...... misclassification into their analyses. We have recently developed a model whereby probability of exposure can be modeled, and we tested this on an empirical case of nonsteroidal anti-inflammatory drug (NSAID)-induced upper gastrointestinal bleeding (UGIB). METHODS: We used a case-control data set, consisting...

  17. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  18. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  19. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parametrized probabilities?”

  20. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, i.e., dual maps to random variables) have a very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  1. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analysing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the effectiveness of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients with respect to application failure probability can be satisfied. In an optical grid, when an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
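
    A minimal sketch of one simplified way to quantify application failure probability from task-level failure probabilities, assuming independent task failures and that the application fails if any task fails; with a backup, a task fails only if both its primary and backup resources fail. This illustrates the kind of calculation implied, not the paper's algorithm, and the numeric values are assumptions.

        from math import prod

        def app_failure_probability(task_fail_probs):
            """Application fails if any task fails (independent tasks): 1 - prod(1 - p_i)."""
            return 1.0 - prod(1.0 - p for p in task_fail_probs)

        # Per-task failure probabilities for a small DAG (illustrative values).
        primary = [0.02, 0.01, 0.03, 0.015]
        backup  = [0.02, 0.01, 0.03, 0.015]

        no_backup   = app_failure_probability(primary)
        with_backup = app_failure_probability([p * b for p, b in zip(primary, backup)])
        print(round(no_backup, 4), round(with_backup, 6))   # the backup strategy lowers the risk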

  2. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
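
    The following is a minimal sketch (not the authors' models) of the Monte-Carlo idea: perturb a measured index with random error, reclassify, and take the fraction of runs that land outside the class assigned to the error-free value. The class boundaries, error size and index values are illustrative assumptions.

        import numpy as np

        def misclassification_probability(index_value, class_bounds, sd, runs=100_000, seed=2):
            """Probability that a normally perturbed index falls outside its error-free class."""
            rng = np.random.default_rng(seed)
            true_class = np.digitize(index_value, class_bounds)
            noisy = rng.normal(index_value, sd, size=runs)
            return np.mean(np.digitize(noisy, class_bounds) != true_class)

        # Five status classes on a 0-1 index scale (hypothetical boundaries).
        bounds = [0.2, 0.4, 0.6, 0.8]
        print(misclassification_probability(0.62, bounds, sd=0.05))   # close to a boundary
        print(misclassification_probability(0.50, bounds, sd=0.05))   # mid-class, low risk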

  3. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
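
    The coin-tossing claim can be checked in a few lines: if the bias p is uncertain with any non-degenerate distribution, the probability of heads on the next toss is E[p], and after observing one head it becomes E[p^2]/E[p], which is never smaller. The discrete prior below is an arbitrary illustration, not taken from the paper.

        import numpy as np

        # A slightly uncertain belief about the coin's bias p (weights sum to 1).
        p = np.array([0.45, 0.50, 0.55])
        w = np.array([0.10, 0.80, 0.10])

        p_head = np.sum(w * p)                          # mean of the "definitive number"
        p_head_given_head = np.sum(w * p**2) / p_head   # Bayes update after one head

        print(p_head, p_head_given_head)   # 0.5 < 0.501: seeing a head raises P(heads)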

  4. Third generation nuclear plants

    Science.gov (United States)

    Barré, Bertrand

    2012-05-01

    After the Chernobyl accident, a new generation of Light Water Reactors has been designed and is being built. Third generation nuclear plants are equipped with dedicated systems to insure that if the worst accident were to occur, i.e. total core meltdown, no matter how low the probability of such occurrence, radioactive releases in the environment would be minimal. This article describes the EPR, representative of this "Generation III" and a few of its competitors on the world market.

  5. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products

  6. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  8. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  9. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  10. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  11. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
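
    A minimal sketch of the Gauss-Hermite idea for the complex probability (Faddeeva) function w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt, valid for Im(z) > 0: the integral is replaced by the quadrature sum (i/pi) * sum of w_k/(z - x_k). The node count and test point are arbitrary, and the comparison against scipy.special.wofz is only a sanity check assuming SciPy is available.

        import numpy as np
        from numpy.polynomial.hermite import hermgauss
        from scipy.special import wofz   # reference implementation of w(z)

        def w_gauss_hermite(z, n=64):
            """Approximate the complex probability function for Im(z) > 0."""
            x, wts = hermgauss(n)                     # nodes/weights for weight exp(-t^2)
            return 1j / np.pi * np.sum(wts / (z - x))

        z = 1.5 + 1.0j
        print(w_gauss_hermite(z))
        print(wofz(z))   # agreement is good away from the real axis; it degrades near it,
                         # which is one of the shortcomings discussed above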

  12. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  13. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate......This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  14. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  15. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  16. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.

  17. Rare gases transition probabilities for plasma diagnostics

    International Nuclear Information System (INIS)

    Katsonis, K.; Siskos, A.; Ndiaye, A.; Clark, R.E.H.; Cornille, M.; Abdallah, J. Jr

    2005-01-01

    Emission spectroscopy is a powerful optical diagnostics tool which has been largely used in studying and monitoring various industrial, laboratory and natural plasmas. As these plasmas are rarely in Local Thermodynamic Equilibrium (LTE), a prerequisite of a satisfactory evaluation of the plasma electron density n_e and temperature T_e is the existence of a detailed Collisional-Radiative (C-R) model taking into account the main physical processes influencing the plasma state and the dynamics of its main constituents. The theoretical spectra which such a model generates match the experimental ones whenever the experimental values of n_e and T_e are introduced. In practice, in validating such models, discrepancies are observed which often are due to the atomic data included in the C-R model. In generating theoretical spectra pertaining to each atom (ion) multiplet, the most sensitive atomic data are the relevant transition probabilities A_j→i and electron collision excitation cross sections σ_i→j. We note that the latter are actually poorly known, especially for low ionization stages and near the excitation threshold. We address here the evaluation of the former, especially of the A_j→i of the Ar 2+ ion responsible for the Ar III spectra and of those of the Xe 2+ ion, which are evaluated in an analogous way. Extensive studies of the Ar III and Xe III spectra exist, but the present status of the A_j→i cannot be considered sufficient for the generation of the theoretical spectra even of the most prominent visible lines coming from the Ar III multiplets 4s - 4p, 5p (corresponding to the well known 'red' and 'blue' lines of Ar I), 4p - 4d, 5d and 3p - 4s, 5s (resonant) and the analogous Xe III multiplets (which have principal quantum numbers increased by two). Due to the gap observed in the Grotrian diagrams, the resonant lines which, together with the important metastable ones, belong to the 3p - 4s, 5s multiplets (5p - 6s, 7s for Xe III), give spectra in the UV region. On

  18. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-01-01

    In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package

  19. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
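
    The article works in a spreadsheet; an equivalent sketch in Python shows the same idea of using a random number generator to simulate binomial outcomes and comparing the empirical frequencies with the exact binomial probabilities. The parameter values are arbitrary.

        import random
        from math import comb
        from collections import Counter

        n, p, trials = 10, 0.3, 20_000
        random.seed(42)

        # Simulate: count successes in n Bernoulli(p) attempts, repeated many times.
        counts = Counter(sum(random.random() < p for _ in range(n)) for _ in range(trials))

        for k in range(n + 1):
            exact = comb(n, k) * p**k * (1 - p)**(n - k)
            print(k, round(counts[k] / trials, 4), round(exact, 4))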

  20. The probability of traffic accidents associated with the transport of radioactive wastes

    International Nuclear Information System (INIS)

    James, I.A.

    1986-01-01

    This report evaluates the probability of a container impact during transit between generating and disposal sites. Probabilities per route mile are combined with the characteristics of the transport systems described in previous reports, to allow a comparison of different disposal options to be made. (author)

  1. Exact capture probability analysis of GSC receivers over Rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2010-01-01

    For third generation systems and ultrawideband systems, RAKE receivers have been introduced because of their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of the hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance in terms of lost power. Therefore, to have a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, which is defined as the ratio of the captured power (essentially the combined paths' power) to the total available power. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
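
    The closed-form expressions are not reproduced here; the sketch below only estimates the capture probability by Monte Carlo for GSC over i.i.d. Rayleigh fading, where per-path SNRs are exponential and the receiver combines the Lc strongest of L resolvable paths. The parameter values and threshold are assumptions.

        import numpy as np

        def capture_probability(L=8, Lc=3, threshold=0.8, runs=200_000, seed=3):
            """P(captured power / total power >= threshold) for GSC(Lc, L), i.i.d. Rayleigh paths."""
            rng = np.random.default_rng(seed)
            snr = rng.exponential(1.0, size=(runs, L))      # path SNRs ~ Exp(1)
            top = np.sort(snr, axis=1)[:, -Lc:]             # the Lc strongest paths
            ratio = top.sum(axis=1) / snr.sum(axis=1)       # captured fraction of total power
            return ratio.mean(), np.mean(ratio >= threshold)

        mean_ratio, prob = capture_probability()
        print("average captured fraction:", round(mean_ratio, 3))
        print("P(fraction >= 0.8):", round(prob, 3))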

  2. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
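
    A minimal sketch of the modelling step, not the authors' actual covariates or data: fit a logistic regression to historical incident labels and building attributes, then use the predicted probabilities as the values for a probability map. The feature names and the synthetic data are assumptions, and scikit-learn is assumed to be available.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n = 2000

        # Hypothetical building attributes: floor area (m2), age (years), heating-type flag.
        X = np.column_stack([
            rng.uniform(50, 2000, n),
            rng.uniform(0, 120, n),
            rng.integers(0, 2, n),
        ])
        # Synthetic "had a fire" labels with a known dependence on the attributes.
        logit = -4.0 + 0.0008 * X[:, 0] + 0.01 * X[:, 1] + 0.7 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(X, y)
        new_buildings = np.array([[300, 80, 1], [1500, 10, 0]])
        print(model.predict_proba(new_buildings)[:, 1])   # fire probabilities to map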

  3. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
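
    Two textbook relations connected to the record, given as a sketch rather than the paper's exact method: the encounter probability of a T-year event within a structure lifetime of L years, and the approximate expected maximum of N Rayleigh-distributed individual wave heights for a given significant wave height Hs. All numerical values are illustrative assumptions.

        import math

        def encounter_probability(return_period_years: float, lifetime_years: float) -> float:
            """P(at least one exceedance of the T-year event during an L-year lifetime)."""
            return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

        def expected_max_wave(significant_height: float, n_waves: int) -> float:
            """Approximate expected maximum of N Rayleigh-distributed wave heights."""
            return significant_height * math.sqrt(math.log(n_waves) / 2.0)

        print(encounter_probability(100, 50))   # ~0.39 for a 100-year event over a 50-year lifetime
        print(expected_max_wave(6.0, 1000))     # metres, for Hs = 6 m and N = 1000 waves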

  4. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  5. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  6. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr

  7. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.

  8. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  9. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  10. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  11. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  12. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile
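
    A small worked example that turns the per-mile rate quoted in the record into an expected annual number of accidents; the annual shipment mileage is an assumed illustrative figure, not a value from the report.

        rate_per_mile = 5e-7      # spent fuel accidents per shipment-mile (from the record)
        annual_miles = 1e6        # assumed total shipment-miles per year (illustration only)

        expected_accidents_per_year = rate_per_mile * annual_miles
        print(expected_accidents_per_year)        # 0.5 expected accidents per year
        print(1 / expected_accidents_per_year)    # roughly 2 years between accidents on average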

  13. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  14. On the average capacity and bit error probability of wireless communication systems

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2011-01-01

    Analysis of the average binary error probabilities and average capacity of wireless communications systems over generalized fading channels have been considered separately in the past. This paper introduces a novel moment generating function

  15. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  16. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory; mathematically rich but self-contained text, at a gentle pace; review of calculus and linear algebra in an appendix; mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance; numerous section exercises, summaries, historical notes, and Further Readings for reinforcement

  17. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  18. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  19. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in fact, it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  20. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.

  1. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
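
    A sketch of fitting a truncated exponential distribution to slip-like data with scipy, in the spirit of the two records above; the synthetic "slip" sample and the truncation level stand in for real SRCMOD slip values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        raw = rng.exponential(scale=1.0, size=5000)   # synthetic slip values (metres)
        slip = raw[raw < 4.0]                         # impose an upper physical limit at 4 m

        # truncexpon is parameterized by b = truncation point / scale; fix the location at zero
        b, loc, scale = stats.truncexpon.fit(slip, floc=0)
        print(f"fitted truncation ~ {b * scale:.2f} m, fitted scale ~ {scale:.2f} m")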

  2. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that the memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  3. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for estimating forest fire burn probability on the basis of the Poisson distribution. The λ parameter is taken to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; thus, λ was calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot spot map were the input data for the statistical analysis. The major result of the study is the generation of a database on forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. The daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low; a possible explanation of this phenomenon is provided.
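
    A minimal sketch of the Poisson step described above: given a mean daily number of fires λ for a danger-index class and season, the daily probability of at least one fire is 1 - exp(-λ). The λ values below are illustrative assumptions.

        import math

        def prob_at_least_one_fire(lam: float) -> float:
            """Daily probability of at least one fire for a Poisson mean of lam fires per day."""
            return 1.0 - math.exp(-lam)

        for lam in (0.05, 0.2, 1.0):                  # assumed mean daily fire counts
            print(lam, round(prob_at_least_one_fire(lam), 3))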

  4. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results are illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  5. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  6. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required for other techniques. In the first scheme, the prior is obtained through a true frequency-based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique

  7. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability-high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report

  8. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability-high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  9. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. Findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
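
    A hedged sketch of the survival-analysis view of probability of default (PD) described above, using the lifelines package: the Kaplan-Meier estimator gives a 12-month PD as 1 - S(12), and the log-rank test compares two cohorts (for example shorter- versus longer-term loans). The data below are synthetic, not P2P loan records.

        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(7)
        t_short = rng.exponential(60, 500)        # months to default/censoring, shorter-term cohort
        t_long = rng.exponential(35, 500)         # longer-term cohort, assumed riskier here
        e_short = rng.binomial(1, 0.6, 500)       # 1 = default observed, 0 = censored
        e_long = rng.binomial(1, 0.6, 500)

        kmf = KaplanMeierFitter().fit(t_long, event_observed=e_long)
        pd_12m = 1.0 - kmf.predict(12.0)          # 12-month probability of default
        print(f"12-month PD (longer-term cohort): {pd_12m:.3f}")

        result = logrank_test(t_short, t_long, event_observed_A=e_short, event_observed_B=e_long)
        print(f"log-rank p-value: {result.p_value:.4f}")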

  10. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break
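
    An arithmetic illustration of the four-factor structure described above; the individual probabilities are placeholder values, not the SRS estimates.

        p_igscc    = 1e-2   # weld heat-affected zone contains IGSCC (assumed value)
        p_miss_ut  = 1e-1   # cracking escapes detection during UT examination (assumed)
        p_no_leak  = 1e-1   # crack does not grow through-wall and reveal itself by leakage (assumed)
        p_unstable = 1e-2   # crack grows to instability before the next UT exam (assumed)

        pipe_break_factor = p_igscc * p_miss_ut * p_no_leak * p_unstable
        print(pipe_break_factor)   # product of the four conditional probabilities, per weld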

  11. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, both the reliability of the welding process (that no critical defects will be born) and the non-destructive testing (NDT) process (all critical defects will be detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. Also the possibility of human errors in the NDT process was taken into account in a simple manner. At this moment there is very little representative data to determine the reliability of welding and also the data on NDT is not well suited for the needs of this study. Therefore calculations presented here are based on expert judgements and on several assumptions that have not been verified yet. The Bayesian probability model shows the importance of the uncertainty in the estimation of the reliability parameters. The effect of uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters compared to the binomial probability distribution in case of known parameter values. In order to reduce the uncertainty, more information is needed from both the reliability of the welding and NDT processes. It would also be important to analyse the role of human factors in these processes since their role is not reflected in typical test data which is used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)

  12. Evaluation of burst probability for tubes by Weibull distributions

    International Nuclear Information System (INIS)

    Kao, S.

    1975-10-01

    The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method called the ''density-gram'' is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities
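
    A sketch of fitting a Weibull distribution to burst-pressure data and reading off a left-tail probability, in the spirit of the record; the synthetic sample and the design limit are illustrative assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        burst_pressure = rng.weibull(5.0, size=200) * 40.0 + 20.0   # synthetic burst pressures (MPa)

        # Fit a three-parameter Weibull (shape c, location, scale)
        c, loc, scale = stats.weibull_min.fit(burst_pressure)

        design_limit = 30.0                                         # assumed design pressure (MPa)
        left_tail = stats.weibull_min.cdf(design_limit, c, loc=loc, scale=scale)
        print(f"estimated P(burst pressure < {design_limit} MPa) = {left_tail:.4f}")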

  13. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.

  14. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into the appropriate subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined
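
    A minimal sketch of the STADIC idea in Python rather than FORTRAN: sample from the input distributions, combine the samples with a user-supplied function, then summarize the resulting distribution. The input distributions and the combining rule are arbitrary illustrations.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000

        a = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # first input distribution (assumed)
        b = rng.normal(loc=2.0, scale=0.3, size=n)       # second input distribution (assumed)

        combined = a * b                                 # user-defined combination rule
        mean, std = combined.mean(), combined.std(ddof=1)
        lo, hi = np.percentile(combined, [5, 95])        # 90% confidence limits
        print(f"mean={mean:.3f}, std={std:.3f}, 5th pct={lo:.3f}, 95th pct={hi:.3f}")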

  15. Variate generation for probabilistic fracture mechanics and fitness-for-service studies

    International Nuclear Information System (INIS)

    Walker, J.R.

    1987-01-01

    Atomic Energy of Canada Limited is conducting studies in Probabilistic Fracture Mechanics. These studies are being conducted as part of a fitness-for-service programme in support of CANDU reactors. The Monte Carlo analyses, which form part of the Probabilistic Fracture Mechanics studies, require that variates can be sampled from probability density functions. Accurate pseudo-random numbers are necessary for accurate variate generation. This report details the principles of variate generation, and describes the production and testing of pseudo-random numbers. A new algorithm has been produced for the correct performance of the lattice test for the independence of pseudo-random numbers. Two new pseudo-random number generators have been produced. These generators have excellent randomness properties and can be made fully machine-independent. Versions, in FORTRAN, for VAX and CDC computers are given. Accurate and efficient algorithms for the generation of variates from the specialized probability density functions of Probabilistic Fracture Mechanics are given. 38 refs
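
    A toy illustration of the themes in the record: a linear congruential generator producing uniform variates, plus a crude look at consecutive pairs, which is the kind of structure that lattice-style tests probe. The multiplier and increment are the widely published Numerical Recipes constants, used here purely for illustration.

        def lcg(seed: int, n: int, a: int = 1664525, c: int = 1013904223, m: int = 2**32):
            """Yield n pseudo-uniform variates in [0, 1) from a linear congruential generator."""
            x = seed
            for _ in range(n):
                x = (a * x + c) % m
                yield x / m

        u = list(lcg(seed=12345, n=10_000))
        pairs = list(zip(u[:-1], u[1:]))   # consecutive pairs; a poor LCG would place these on few lines
        mean = sum(u) / len(u)
        print(f"sample mean {mean:.4f} (ideal 0.5), first pair {pairs[0]}")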

  16. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  17. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  18. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  19. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly...... report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...... to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...

  20. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arise in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability...

  1. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  2. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs

  3. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent; the operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.

  4. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  6. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  7. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  8. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development of two parallel ways: on one hand, the theory of geometric probability was formed with minor attention paid to other applications than those concerning spatial chance games. On the other hand, practical rules of the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without the knowledge of this branch. A special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas, but remained almost unnoticed both by mathematicians and practicians.

  9. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)

  10. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  11. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  12. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star graphs is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
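
    A minimal version of the Monte Carlo estimate described above can be sketched as follows, assuming standard Birth-death Moran dynamics (a reproducer is chosen proportionally to fitness and its offspring replaces a uniformly chosen neighbour). The clique-based graph families from the paper are not reconstructed here; the sanity check uses the complete graph, where a closed-form answer is available.

```python
import random

def moran_fixation_probability(adj, r=1.1, trials=2000, seed=0):
    """Monte Carlo estimate of the fixation probability of a single mutant
    (fitness r) under Birth-death Moran dynamics on a graph given as an
    adjacency list {node: [neighbours]}."""
    rng = random.Random(seed)
    nodes = list(adj)
    fixations = 0
    for _ in range(trials):
        mutants = {rng.choice(nodes)}           # one mutant at a random vertex
        while 0 < len(mutants) < len(nodes):
            # choose a reproducer with probability proportional to fitness
            weights = [r if v in mutants else 1.0 for v in nodes]
            parent = rng.choices(nodes, weights=weights, k=1)[0]
            child = rng.choice(adj[parent])     # offspring replaces a neighbour
            if parent in mutants:
                mutants.add(child)
            else:
                mutants.discard(child)
        fixations += len(mutants) == len(nodes)
    return fixations / trials

# Sanity check on the complete graph K_6, where the closed form is
# (1 - 1/r) / (1 - 1/r**N).
N, r = 6, 1.2
K = {i: [j for j in range(N) if j != i] for i in range(N)}
print("simulated:", moran_fixation_probability(K, r=r, trials=5000))
print("exact:    ", (1 - 1 / r) / (1 - 1 / r**N))
```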

  13. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the workings of probability.

  14. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St...

  15. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
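
    The paper's exact power-series formulae are not reproduced here. As a loosely related illustration, the sketch below checks the familiar first-order relation P(X_t > x) ≈ t · ν((x, ∞)) as t → 0 for a compound Poisson subordinator with unit-mean exponential jumps, a case where the tail probability has a simple closed form; all parameter choices are assumptions made for the example.

```python
import math

def tail_exact_cp_exp(x, lam, t, terms=60):
    """P(X_t > x) for a compound Poisson subordinator with jump rate lam and
    Exp(1) jumps: sum over the Poisson number of jumps of Gamma(n, 1) tails."""
    total = 0.0
    for n in range(1, terms):
        pois = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
        gamma_tail = math.exp(-x) * sum(x**k / math.factorial(k) for k in range(n))
        total += pois * gamma_tail
    return total

x, lam = 2.0, 1.0
for t in (0.5, 0.1, 0.02):
    leading = lam * t * math.exp(-x)   # t times the Levy measure of (x, inf)
    print(t, tail_exact_cp_exp(x, lam, t), leading)
```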

  16. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    Full text: I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance and, hence, for the origin of probabilities in physics. While the derivation of Born's rule for probabilities (p_k = |ψ_k|²) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance, i.e., without invoking Born's rule. (author)
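
    For readers unfamiliar with the notation, the statement being derived can be written in the standard Schmidt-decomposed form below; the labels are conventional choices, not taken from the abstract itself.

```latex
% A system S entangled with an environment E, written in Schmidt form;
% Born's rule assigns outcome k the squared amplitude.
\[
  |\psi_{SE}\rangle \,=\, \sum_k \psi_k\, |s_k\rangle \otimes |e_k\rangle ,
  \qquad
  p_k \,=\, |\psi_k|^2 .
\]
```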

  17. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    ... of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based ...
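
    A bare-bones version of the idea (cluster past interactions and read the risk probability off the nearest cluster's outcome frequency) might look like the sketch below. It substitutes a plain k-means step for the paper's hybrid neural networks, and every feature, label and number in it is an assumption made up for illustration.

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k=2, iters=50, seed=0):
    """Plain k-means on small feature vectors (pure Python, for illustration)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda c: dist2(p, centroids[c]))].append(p)
        centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

def risk_probability(history, outcomes, new_interaction, k=2):
    """Estimate the risk of a new interaction as the fraction of bad outcomes
    observed in the historical cluster nearest to it."""
    centroids = kmeans(history, k=k)
    nearest = lambda p: min(range(len(centroids)), key=lambda c: dist2(p, centroids[c]))
    target = nearest(new_interaction)
    in_cluster = [o for p, o in zip(history, outcomes) if nearest(p) == target]
    return sum(in_cluster) / len(in_cluster) if in_cluster else 0.5

# Toy features: (value of the transaction, familiarity of the peer); 1 = bad outcome.
history = [(0.9, 0.1), (0.8, 0.2), (0.85, 0.15), (0.1, 0.9), (0.2, 0.8), (0.15, 0.85)]
outcomes = [1, 1, 0, 0, 0, 0]
print(risk_probability(history, outcomes, (0.88, 0.12)))  # risk of a high-value, unfamiliar interaction
```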

  18. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to challenge ...
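
    One classical resolution of the Unfair Subway puzzle is a skewed timetable: trains run equally often in both directions, but the uptown train always arrives shortly after the downtown one, so only a narrow window of arrival times sends Marvin uptown. The simulation below assumes one such timetable (10-minute headways, 1-minute offset) purely for illustration; the schedule is not quoted from the book.

```python
import random

def simulate_subway(days=100_000, headway=10.0, offset=1.0, seed=0):
    """Marvin arrives at a uniformly random time; downtown trains run every
    `headway` minutes and uptown trains `offset` minutes after each of them.
    He boards whichever train comes first."""
    rng = random.Random(seed)
    uptown = 0
    for _ in range(days):
        t = rng.uniform(0, headway)   # arrival time within one downtown-to-downtown cycle
        next_down = headway
        next_up = offset if t < offset else offset + headway
        uptown += next_up < next_down
    return uptown / days

print(simulate_subway())   # about 0.1, i.e. dinner with his mother roughly 2 days in 20
```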

  19. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
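
    The paper's analytical path-probability formulae are not reproduced here. As a loosely related numerical illustration of the kind of process it treats, the sketch below simulates a CTRW with heavy-tailed (Pareto) waiting times and unit jumps and prints the mean-squared displacement, which grows sublinearly in time when the tail exponent alpha is below 1; every parameter is an assumption chosen for the example.

```python
import random

def ctrw_positions(t_max, alpha=0.6, n_walkers=20_000, seed=0):
    """Positions at time t_max of continuous time random walkers with
    Pareto(alpha) waiting times (heavy tailed, infinite mean for alpha < 1)
    and symmetric unit jumps."""
    rng = random.Random(seed)
    positions = []
    for _ in range(n_walkers):
        t, x = 0.0, 0
        while True:
            t += rng.paretovariate(alpha)   # waiting time before the next jump
            if t > t_max:
                break
            x += rng.choice((-1, 1))        # symmetric unit jump
        positions.append(x)
    return positions

for t in (10.0, 100.0, 1000.0):
    pos = ctrw_positions(t)
    msd = sum(x * x for x in pos) / len(pos)
    print(f"t = {t:6.0f}   MSD ~ {msd:.2f}")   # grows roughly like t**alpha
```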

  20. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462