WorldWideScience

Sample records for clinical pre-test probability

  1. A methodological proposal to research patients’ demands and pre-test probabilities using paper forms in primary care settings

    Directory of Open Access Journals (Sweden)

    Gustavo Diniz Ferreira Gusso

    2013-04-01

    Full Text Available Objective: The purpose of this study is to present a methodology for assessing patients’ demands and calculating pre-test probabilities using paper forms in Primary Care. Method: Most developing countries do not use Electronic Health Records (EHR) in primary care settings. This makes it difficult to access information regarding what occurs within the health center working process. Basically, there are two methodologies to assess patients’ demands and the problems or diagnoses stated by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the ‘reason for the appointment’ with the ‘problem registered’ by doctors. Paper forms were developed taking this concept as central. All appointments were classified by the International Classification of Primary Care (ICPC). Discussion: Even in paper form, confrontation between ‘reason for the appointment’ and ‘problem registered’ is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can be easily reproduced in any health center and enables a better understanding of the population profile. The prevalence of many illnesses and diseases is not known in each setting, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers adequate technology for primary health care workers that has the potential to transform each health center into a research-led practice, contributing directly to patient care.
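
    To make the calculation concrete, here is a minimal Python sketch of the pre-test probability estimate the abstract describes: the proportion of appointments in which a given ‘reason for the appointment’ ended with a given registered problem. The ICPC codes and counts below are invented for illustration.

```python
from collections import Counter

# Hypothetical tally transcribed from paper forms: each record pairs the
# ICPC-coded reason for the appointment with the problem the doctor registered.
encounters = [
    ("R05 cough", "R74 acute URTI"), ("R05 cough", "R74 acute URTI"),
    ("R05 cough", "R78 acute bronchitis"), ("R05 cough", "R96 asthma"),
    ("R05 cough", "R74 acute URTI"),
]

reason = "R05 cough"
problems = Counter(problem for r, problem in encounters if r == reason)
total = sum(problems.values())

# Pre-test probability of each registered problem, conditional on the reason.
for problem, n in problems.most_common():
    print(f"P({problem} | {reason}) = {n}/{total} = {n / total:.2f}")
```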

  2. Accuracy of dual-source CT coronary angiography: first experience in a high pre-test probability population without heart rate control

    Energy Technology Data Exchange (ETDEWEB)

    Scheffel, Hans; Alkadhi, Hatem; Desbiolles, Lotus; Frauenfelder, Thomas; Schertler, Thomas; Husmann, Lars; Marincek, Borut; Leschka, Sebastian [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Plass, Andre; Vachenauer, Robert; Grunenfelder, Juerg; Genoni, Michele [Clinic for Cardiovascular Surgery, Zurich (Switzerland); Gaemperli, Oliver; Schepis, Tiziano [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); University of Zurich, Center for Integrative Human Physiology, Zurich (Switzerland)

    2006-12-15

    The aim of this study was to assess the diagnostic accuracy of dual-source computed tomography (DSCT) for evaluation of coronary artery disease (CAD) in a population with extensive coronary calcifications without heart rate control. Thirty patients (24 male, 6 female, mean age 63.1 ± 11.3 years) with a high pre-test probability of CAD underwent DSCT coronary angiography and invasive coronary angiography (ICA) within 14 ± 9 days. No beta-blockers were administered prior to the scan. Two readers independently assessed image quality of all coronary segments with a diameter ≥1.5 mm using a four-point score (1: excellent to 4: not assessable) and qualitatively assessed significant stenoses as narrowing of the luminal diameter >50%. Causes of false-positive (FP) and false-negative (FN) ratings were assigned to calcifications or motion artifacts. ICA was considered the standard of reference. Mean body mass index was 28.3 ± 3.9 kg/m² (range 22.4-36.3 kg/m²), mean heart rate during CT was 70.3 ± 14.2 bpm (range 47-102 bpm), and mean Agatston score was 821 ± 904 (range 0-3,110). Image quality was diagnostic (scores 1-3) in 98.6% (414/420) of segments (mean image quality score 1.68 ± 0.75); six segments in three patients were considered not assessable (1.4%). DSCT correctly identified 54 of 56 significant coronary stenoses. Severe calcifications accounted for false ratings in nine segments (eight FP/one FN) and motion artifacts in two segments (one FP/one FN). Overall sensitivity, specificity, positive and negative predictive value for evaluating CAD were 96.4, 97.5, 85.7, and 99.4%, respectively. First experience indicates that DSCT coronary angiography provides high diagnostic accuracy for assessment of CAD in a high pre-test probability population with extensive coronary calcifications and without heart rate control. (orig.)
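
    The reported accuracy figures can be reproduced from the counts in the abstract; a short Python check follows. The true-negative count of 351 is inferred here so that the quoted specificity comes out, since the abstract does not state it directly.

```python
# Diagnostic accuracy recomputed from the counts in the abstract: 54 of 56
# stenoses detected (TP=54, FN=2) and nine false-positive segments (FP=9).
# TN=351 is an inferred value chosen to reproduce the quoted specificity.
TP, FP, FN, TN = 54, 9, 2, 351

sensitivity = TP / (TP + FN)   # 54/56   -> 96.4%
specificity = TN / (TN + FP)   # 351/360 -> 97.5%
ppv = TP / (TP + FP)           # 54/63   -> 85.7%
npv = TN / (TN + FN)           # 351/353 -> 99.4%

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {100 * value:.1f}%")
```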

  3. Pre-Test Assessment

    Science.gov (United States)

    Berry, Thomas

    2008-01-01

    Pre-tests are a non-graded assessment tool used to determine pre-existing subject knowledge. Typically pre-tests are administered prior to a course to determine knowledge baseline, but here they are used to test students prior to topical material coverage throughout the course. While counterintuitive, the pre-tests cover material the student is…

  4. Probability, clinical decision making and hypothesis testing

    Directory of Open Access Journals (Sweden)

    A Banerjee

    2009-01-01

    Full Text Available Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.
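
    A minimal simulation of the paper's central point: a P value is P(data at least this extreme | H0), not P(H0 | data). The 80% prior on H0, the 0.5 SD effect size, and the sample size below are invented for illustration.

```python
import random
import statistics

# Among "significant" results (p < 0.05), the fraction where H0 is actually
# true is generally not 5%. Assumptions (invented for illustration): H0 is
# true in 80% of experiments, the true effect is 0.5 SD otherwise, n = 30.
random.seed(1)
sig_h0 = sig_total = 0
for _ in range(20000):
    h0_true = random.random() < 0.80
    effect = 0.0 if h0_true else 0.5
    xs = [random.gauss(effect, 1.0) for _ in range(30)]
    z = statistics.fmean(xs) / (statistics.stdev(xs) / 30 ** 0.5)
    if abs(z) > 1.96:  # two-sided "p < 0.05" (normal approximation)
        sig_total += 1
        sig_h0 += h0_true
print(f"P(H0 true | significant) ≈ {sig_h0 / sig_total:.2f}")  # well above 0.05
```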

  5. A short note on probability in clinical medicine.

    Science.gov (United States)

    Upshur, Ross E G

    2013-06-01

    Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine.

  6. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If...

  7. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T. van der; Dulmen, S. van

    2014-01-01

    Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be

  8. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T.T. van der; Dulmen, S. van

    2014-01-01

    AIMS: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. BACKGROUND: Continuing professional education may be

  9. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie

    2008-01-01

    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low...
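
    A hedged worked example of the combined strategy named in the title: updating the quoted 8% pre-test probability with a negative D-dimer result via the likelihood ratio. The sensitivity and specificity used are illustrative assumptions, not values from this study.

```python
# Post-test probability of DVT after a negative D-dimer, starting from the
# 8% preoperative prevalence quoted in the abstract. The sensitivity and
# specificity below are illustrative assumptions, not values from the study.
pretest = 0.08
sens, spec = 0.96, 0.40           # assumed D-dimer performance
lr_negative = (1 - sens) / spec   # likelihood ratio of a negative test

pre_odds = pretest / (1 - pretest)
post_odds = pre_odds * lr_negative
post_prob = post_odds / (1 + post_odds)
print(f"post-test probability ≈ {post_prob:.1%}")   # ≈ 0.9%
```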

  10. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps", it indicates a high likelihood, usually a positive inference or judgment based on the present situation.

  11. Clinical features of probable severe acute respiratory syndrome in Beijing

    Institute of Scientific and Technical Information of China (English)

    Hai-Ying Lu; Xiao-Yuan Xu; Yu Lei; Yang-Feng Wu; Bo-Wen Chen; Feng Xiao; Gao-Qiang Xie; De-Min Han

    2005-01-01

    AIM: To summarize the clinical features of probable severe acute respiratory syndrome (SARS) in Beijing. METHODS: Retrospective study of 801 patients admitted to hospitals in Beijing between March and June 2003 with a diagnosis of probable SARS, moderate type. The clinical manifestations, laboratory and radiographic data obtained from the 801 cases were analyzed. RESULTS: One to three days after the onset of SARS, the major clinical symptoms were fever (in 88.14% of patients), fatigue, headache, myalgia, arthralgia (25-36%), etc. The counts of WBC (in 22.56% of patients), lymphocytes (70.25%) and CD3-, CD4-, CD8-positive T cells (70%) decreased. From days 4-7, the nonspecific symptoms weakened; however, the rates of lower respiratory tract symptoms, such as cough (24.18%), sputum production (14.26%), chest distress (21.04%) and shortness of breath (9.23%), increased, as did the rate of abnormalities on chest radiograph or CT. The counts of WBC, lymphocytes and CD3-, CD4-, CD8-positive T cells reached their nadir. From days 8 to 16, the patients presented progressive cough (29.96%), sputum production (13.09%), chest distress (29.96%) and shortness of breath (35.34%). All patients had infiltrates on chest radiograph or CT, some even with multiple infiltrates. Two weeks later, patients' respiratory symptoms started to alleviate, the infiltrates in the lung began to resolve gradually, and the counts of WBC, lymphocytes and CD3-, CD4-, CD8-positive T cells returned to normal. CONCLUSION: The data reported here provide evidence that the course of SARS can be divided into four stages, namely the initial stage, progressive stage, fastigium and convalescent stage.

  12. Some uses of predictive probability of success in clinical drug development

    Directory of Open Access Journals (Sweden)

    Mauro Gasparini

    2013-03-01

    Full Text Available Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations imposed in the world of pharmaceutical development. Within a single trial, predictive probability of success can be identified with expected power, i.e. the evaluation of the success probability of the trial. Success means, for example, obtaining a significant result of a standard superiority test. Across trials, predictive probability of success can be the probability of a successful completion of an entire part of clinical development, for example a successful phase III development in the presence of phase II data. Calculations of predictive probability of success in the presence of normal data with known variance will be illustrated, both for within-trial and across-trial predictions.
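
    A Monte Carlo sketch of the within-trial case described above, where predictive probability of success is expected power: frequentist power is averaged over a prior on the true treatment effect. The design values and the prior are invented for illustration, not taken from the paper.

```python
import random
import statistics
from math import sqrt
from statistics import NormalDist

# Expected power for a one-sided two-arm z-test with known variance:
# frequentist power is averaged over a prior on the true effect delta.
nd = NormalDist()
n_per_arm, sigma, alpha = 100, 1.0, 0.025
prior_mean, prior_sd = 0.25, 0.15          # e.g. a posterior from phase II data
z_alpha = nd.inv_cdf(1 - alpha)

random.seed(2)
powers = []
for _ in range(50000):
    delta = random.gauss(prior_mean, prior_sd)      # draw a "true" effect
    ncp = delta / (sigma * sqrt(2 / n_per_arm))     # noncentrality of the z-test
    powers.append(1 - nd.cdf(z_alpha - ncp))        # power given this delta
print(f"predictive probability of success ≈ {statistics.fmean(powers):.2f}")
```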

  13. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t
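
    A toy Python version of the idea: a naive Bayes posterior over causal pathogens given a few case features, assuming conditional independence given the pathogen. The pathogens, features, and probabilities are placeholders, not the networks fitted in the study.

```python
import math

# Toy naive Bayes posterior over causal mastitis pathogens given two observed
# case features. All names and probabilities are invented for illustration.
priors = {"E. coli": 0.30, "S. aureus": 0.25, "Strep. uberis": 0.45}
likelihoods = {  # P(feature observed | pathogen)
    "E. coli":       {"severe_signs": 0.60, "high_scc_history": 0.20},
    "S. aureus":     {"severe_signs": 0.10, "high_scc_history": 0.70},
    "Strep. uberis": {"severe_signs": 0.25, "high_scc_history": 0.40},
}
observed = ["severe_signs", "high_scc_history"]

unnorm = {p: priors[p] * math.prod(likelihoods[p][f] for f in observed)
          for p in priors}
z = sum(unnorm.values())
for pathogen, weight in sorted(unnorm.items(), key=lambda kv: -kv[1]):
    print(f"P({pathogen} | evidence) = {weight / z:.2f}")
```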

  14. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn [RaySearch Laboratories, Sveavägen 44, Stockholm SE-111 34 (Sweden); Forsgren, Anders [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, Stockholm SE-100 44 (Sweden)

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.

  15. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years (range 0-15 years). One-third of patients experienced a transient improvement in motor symptoms with use of levodopa. Median survival from disease onset was 6.9 years (range 1-16 years, 95% CI 6.3-7.5) with no apparent variation according to gender or subtype. Conclusions: Our nationwide approach

  16. Some uses of predictive probability of success in clinical drug development

    OpenAIRE

    Mauro Gasparini; Lilla Di Scala; Frank Bretz; Amy Racine-Poon

    2013-01-01

    Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations impose...

  17. Cognitive Laboratory Experiences : On Pre-testing Computerised Questionnaires

    NARCIS (Netherlands)

    Snijkers, G.J.M.E.

    2002-01-01

    In the literature on questionnaire design and survey methodology, pre-testing is mentioned as a way to evaluate questionnaires (i.e. investigate whether they work as intended) and control for measurement errors (i.e. assess data quality). As the American Statistical Association puts it (ASA, 1999, p

  18. Topological characteristics of brainstem lesions in clinically definite and clinically probable cases of multiple sclerosis: An MRI-study

    Energy Technology Data Exchange (ETDEWEB)

    Brainin, M.; Omasits, M.; Reisner, T.; Neuhold, A.; Wicke, L.

    1987-11-01

    Disseminated lesions in the white matter of the cerebral hemispheres and confluent lesions at the borders of the lateral ventricles, as seen on MRI, are both considered acceptable paraclinical evidence for the diagnosis of multiple sclerosis. Similar changes are, however, also found in vascular diseases of the brain. We therefore aimed at identifying those additional traits in the infratentorial region which, in our experience, are not frequently found in cerebrovascular pathology. We evaluated MR brain scans of 68 patients and found pontine lesions in 71% of cases with a clinically definite diagnosis (17 out of 24) and in 33% of cases with a probable diagnosis (14 out of 43). Lesions in the medulla oblongata were present in 50% and 16%, respectively, and in the midbrain in 25% and 7%, respectively. With rare exceptions, all brainstem lesions were contiguous with the cisternal or ventricular cerebrospinal fluid spaces. In keeping with post-mortem reports, the morphological spectrum ranged from large confluent patches to solitary, well-delineated paramedian lesions or discrete linings of the cerebrospinal fluid border zones, and the lesions were most clearly depicted on horizontal and sagittal T2-weighted SE sequences. If there is a predilection for the outer or inner surfaces of the brainstem, such lesions can be considered an additional typical feature of multiple sclerosis and can be more reliably weighted as paraclinical evidence for a definite diagnosis.

  19. Clinical radiobiology of glioblastoma multiforme. Estimation of tumor control probability from various radiotherapy fractionation schemes

    Energy Technology Data Exchange (ETDEWEB)

    Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)

    2014-10-15

    The aim of this study was to estimate a radiobiological set of parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (b), and repopulation doubling time (Td) in a first phase, and subsequently the number of clonogens (N) and the kick-off time for accelerated proliferation (Tk). The results were used to formulate a hypothesis for a schedule expected to significantly improve local control. The 95% confidence intervals (CI95%) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI95% are α = 0.12 Gy⁻¹ (0.10-0.14), b = 0.015 Gy⁻² (0.013-0.020), α/b = 8 Gy (5.0-10.8), Td = 15.4 days (13.2-19.5), N = 1 × 10⁴ (1.2 × 10³ - 1 × 10⁵), and Tk = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (Dprolif), starting after Tk, was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/b ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate that treatment effectiveness is substantially independent of overall treatment duration and can be improved by increasing the total dose without increasing the dose per fraction. (orig.)
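
    A small sketch showing how the reported point estimates plug into a standard Poisson/linear-quadratic TCP model with repopulation after Tk. The 60 Gy in 30 fractions over 40 days schedule is a conventional example, not one analyzed in the paper, and the exact TCP formulation the authors used may differ.

```python
from math import exp, log

# Poisson TCP under the linear-quadratic model with repopulation after Tk,
# using the point estimates reported above.
alpha, b = 0.12, 0.015           # Gy^-1, Gy^-2
Td, Tk, N = 15.4, 37.0, 1e4      # doubling time and kick-off time (days), clonogens

def tcp(n_fx, d, T):
    """Poisson tumor control probability for n_fx fractions of d Gy in T days."""
    log_sf = -n_fx * (alpha * d + b * d * d)      # LQ log cell survival
    log_sf += log(2) * max(0.0, T - Tk) / Td      # repopulation after Tk
    return exp(-N * exp(log_sf))

print(f"TCP(30 x 2 Gy in 40 days) ≈ {tcp(30, 2.0, 40.0):.2f}")
```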

  20. Choreographer Pre-Testing Code Analysis and Operational Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, David J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Harrison, Christopher B. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Perr, C. W. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Hurd, Steven A [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2014-07-01

    Choreographer is a "moving target defense system", designed to protect against attacks aimed at IP addresses without corresponding domain name system (DNS) lookups. It coordinates actions between a DNS server and a Network Address Translation (NAT) device to regularly change which publicly available IP addresses' traffic will be routed to the protected device versus routed to a honeypot. More details about how Choreographer operates can be found in Section 2: Introducing Choreographer. Operational considerations for the successful deployment of Choreographer can be found in Section 3. The Testing & Evaluation (T&E) for Choreographer involved three phases: Pre-testing, Code Analysis, and Operational Testing. Pre-testing, described in Section 4, involved installing and configuring an instance of Choreographer and verifying it would operate as expected for a simple use case. Our findings were that it was simple and straightforward to prepare a system for a Choreographer installation as well as configure Choreographer to work in a representative environment. Code Analysis, described in Section 5, consisted of running a static code analyzer (HP Fortify) and conducting dynamic analysis tests using the Valgrind instrumentation framework. Choreographer performed well, such that only a few errors that might possibly be problematic in a given operating situation were identified. Operational Testing, described in Section 6, involved operating Choreographer in a representative environment created through Emulytics™. Depending upon the amount of server resources dedicated to Choreographer vis-à-vis the amount of client traffic handled, Choreographer had varying degrees of operational success. In an environment with a poorly resourced Choreographer server and as few as 50-100 clients, Choreographer failed to properly route traffic over half the time. Yet, with a well-resourced server, Choreographer handled over 1000 clients without misrouting. Choreographer

  1. On pre-test sensitisation and peer assessment to enhance learning gain in science education

    NARCIS (Netherlands)

    Bos, Antonius Bernardus Hendrikus

    2009-01-01

    *The main part of this thesis focuses on designing, optimising, and studying the embedding of two types of interventions: pre-testing and peer assessment, both supported by or combined with ICT-tools. * Pre-test sensitisation is used intentionally to boost the learning gain of the main intervention,

  2. A Clinical model to identify patients with high-risk coronary artery disease

    NARCIS (Netherlands)

    Y. Yang (Yelin); L. Chen (Li); Y. Yam (Yeung); S. Achenbach (Stephan); M. Al-Mallah (Mouaz); D.S. Berman (Daniel); M.J. Budoff (Matthew); F. Cademartiri (Filippo); T.Q. Callister (Tracy); H.-J. Chang (Hyuk-Jae); V.Y. Cheng (Victor); K. Chinnaiyan (Kavitha); R.C. Cury (Ricardo); A. Delago (Augustin); A. Dunning (Allison); G.M. Feuchtner (Gudrun); M. Hadamitzky (Martin); J. Hausleiter (Jörg); R.P. Karlsberg (Ronald); P.A. Kaufmann (Philipp); Y.-J. Kim (Yong-Jin); J. Leipsic (Jonathon); T.M. LaBounty (Troy); F.Y. Lin (Fay); E. Maffei (Erica); G.L. Raff (Gilbert); L.J. Shaw (Leslee); T.C. Villines (Todd); J.K. Min (James K.); B.J.W. Chow (Benjamin)

    2015-01-01

    Objectives: This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Background: Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify th

  3. Inverse probability weighting to estimate causal effect of a singular phase in a multiphase randomized clinical trial for multiple myeloma

    Directory of Open Access Journals (Sweden)

    Annalisa Pezzi

    2016-11-01

    Full Text Available Abstract Background: The randomization procedure in randomized controlled trials (RCTs) permits an unbiased estimation of causal effects. However, in clinical practice, differential compliance between arms may cause a strong violation of randomization balance and a biased treatment effect among those who comply. We evaluated the effect of the consolidation phase on disease-free survival of patients with multiple myeloma in an RCT designed for another purpose, adjusting for potential selection bias due to different compliance with previous treatment phases. Methods: We computed two propensity scores (PS) to model two different selection processes: the first to undergo autologous stem cell transplantation, the second to begin consolidation therapy. Combined stabilized inverse probability treatment weights were then introduced in the Cox model to estimate the causal effect of consolidation therapy, mimicking an ad hoc RCT protocol. Results: We found that the effect of consolidation therapy was restricted to the first 18 months of the phase (HR: 0.40, robust 95% CI: 0.17-0.96), after which it disappeared. Conclusions: PS-based methods could be a complementary approach within an RCT context to evaluate the effect of the last phase of a complex therapeutic strategy, adjusting for potential selection bias caused by different compliance with the previous phases of the therapeutic scheme, in order to simulate an ad hoc randomization procedure. Trial registration: ClinicalTrials.gov NCT01134484, May 28, 2010 (retrospectively registered); EudraCT 2005-003723-39, December 17, 2008 (retrospectively registered).
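
    A sketch of combined stabilized inverse probability treatment weights for two sequential selection steps, as the abstract describes. The data, covariates, and logistic propensity models are simulated stand-ins (scikit-learn is assumed available); the resulting weights would then enter a weighted Cox model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Combined stabilized IPT weights for two sequential selection steps
# (ASCT, then consolidation). All data below are simulated stand-ins.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))                       # baseline covariates
asct = rng.binomial(1, 0.6, n)                    # underwent transplantation
consol = rng.binomial(1, 0.5, n) * asct           # consolidation only after ASCT

def stabilized_weights(treated, covariates):
    ps = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]
    marginal = treated.mean()                     # stabilization numerator
    return np.where(treated == 1, marginal / ps, (1 - marginal) / (1 - ps))

# Product of the two stabilized weights; these would enter a weighted Cox
# model (e.g. lifelines' CoxPHFitter supports a weights column).
w = stabilized_weights(asct, x) * stabilized_weights(consol, x)
print(w[:5].round(2))
```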

  4. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints.
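
    For orientation, a minimal Lyman-Kutcher-Burman NTCP computation from a differential DVH, with a clinical risk factor represented as a shift in D50. The parameter values are illustrative placeholders, and modeling a clinical factor as a D50 shift is one simple choice, not necessarily the paper's formulation.

```python
from math import erf, sqrt

# Minimal Lyman-Kutcher-Burman NTCP from a differential DVH.
# Parameter values are illustrative placeholders, not the study's fits.
def lkb_ntcp(doses, volumes, D50=80.0, m=0.15, n=0.10):
    """doses: bin doses in Gy; volumes: fractional organ volumes per bin."""
    geud = sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (geud - D50) / (m * D50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))       # probit model (normal CDF)

dvh_doses, dvh_volumes = [60.0, 70.0, 75.0], [0.5, 0.3, 0.2]
print(f"baseline NTCP:        {lkb_ntcp(dvh_doses, dvh_volumes):.3f}")
print(f"with clinical factor: {lkb_ntcp(dvh_doses, dvh_volumes, D50=72.0):.3f}")
```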

  5. MSA-C is the predominant clinical phenotype of MSA in Japan: analysis of 142 patients with probable MSA.

    Science.gov (United States)

    Yabe, Ichiro; Soma, Hiroyuki; Takei, Asako; Fujiki, Naoto; Yanagihara, Tetsuro; Sasaki, Hidenao

    2006-11-15

    We investigated the clinical features and mode of disease progression in 142 patients with probable multiple system atrophy (MSA) according to the Consensus Criteria. The subjects included 84 men and 58 women with a mean age at onset of 58.2 ± 7.1 years (range: 38-79 years). Cerebellar signs were detected in 87.3% of these patients at the time of initial examination, and were found in 95.1% of them at latest follow-up. MSA-C was diagnosed in 83.8% of the patients at their first examination. Parkinsonism was initially detected in 28.9% of the patients, increasing to 51.4% at the latest follow-up. Among all of the subjects, only 16.2% were classified as having MSA-P on initial examination. At the latest follow-up, parkinsonian features had become predominant over cerebellar features in 24.6% of the 65 patients with MSA-C who were followed for more than 3 years. Although parkinsonism usually masked the signs of cerebellar involvement in MSA-C patients, none of the patients with MSA-P at an early stage showed predominance of cerebellar features at the latest follow-up. Parkinsonism is the predominant feature of MSA among Western patients, even at an early stage, but this study showed that cerebellar deficits are the main feature in Japanese patients. This difference of disease manifestations between ethnic groups suggests that genetic factors may influence the clinical phenotype of MSA.

  6. Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students

    Science.gov (United States)

    Montecinos, Alicia M.

    2014-01-01

    A partially unusual behaviour was found among 14 sophomore students of civil engineering who took a pre-test for a free fall laboratory session, in the context of a general mechanics course. An analysis was made of the consistency between the students' mathematical models and physics models. In all cases, the students presented evidence favoring a correct free…

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
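
    A Monte Carlo sketch of a finite-horizon ruin probability in the classical compound Poisson (Cramér-Lundberg) model the description mentions: premiums accrue at a constant rate, claims arrive as a Poisson process with exponential sizes. All parameter values are illustrative.

```python
import random

# Monte Carlo estimate of the finite-horizon ruin probability in the classical
# Cramér-Lundberg model: premium rate c, Poisson(lam) claim arrivals,
# exponential claim sizes. All parameter values are illustrative.
def ruin_probability(u0=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                     horizon=100.0, n_paths=20000, seed=3):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, reserve = 0.0, u0
        while True:
            dt = rng.expovariate(lam)                     # time to next claim
            if t + dt > horizon:
                break
            t += dt
            reserve += c * dt                             # premiums since last claim
            reserve -= rng.expovariate(1.0 / mean_claim)  # claim payout
            if reserve < 0:                               # ruin only occurs at claims
                ruined += 1
                break
    return ruined / n_paths

print(f"finite-horizon ruin probability ≈ {ruin_probability():.3f}")
```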

  8. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  9. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  10. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  11. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions:
    CP1. μ(U | U) = 1 if U ∈ F′.
    CP2. μ(V1 ∪ V2 | U) = μ(V1 | U) + μ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F.
    CP3. μ(V | U) = μ(V | X) × μ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F.
    Note that it follows from CP1 and CP2 that μ(· | U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine μ. Moreover, μ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
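
    As a concrete check, these axioms hold for the usual counting-measure conditional probability on a finite space. A small Python verification, taking F to be all subsets and F′ the nonempty subsets (the choice of W is arbitrary):

```python
from itertools import combinations
from fractions import Fraction

# Finite sanity check of CP1-CP3 for mu(V | U) = |V ∩ U| / |U| (uniform
# counting measure) on W = {1,2,3,4}.
W = frozenset({1, 2, 3, 4})

def subsets(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def mu(V, U):
    return Fraction(len(V & U), len(U))

F = subsets(W)
Fp = [U for U in F if U]   # conditioning events must be nonempty

assert all(mu(U, U) == 1 for U in Fp)                                  # CP1
assert all(mu(V1 | V2, U) == mu(V1, U) + mu(V2, U)                     # CP2
           for U in Fp for V1 in F for V2 in F if not (V1 & V2))
assert all(mu(V, U) == mu(V, X) * mu(X, U)                             # CP3
           for U in Fp for X in Fp for V in F if V <= X <= U)
print("CP1-CP3 hold for the uniform counting measure")
```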

  12. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  13. Pre-test CFD Calculations for a Bypass Flow Standard Problem

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson

    2011-11-01

    The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variances in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the level and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data of isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamic (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful in examining the nature of the flow and to see if there are any problems associated with the flow and its measurement. The apparatus is designed to be able to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.

  14. Pre-test analysis of ATLAS SBO with RCP seal leakage scenario using MARS code

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Quang Huy; Lee, Sang Young; Oh, Seung Jong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    This study presents a pre-test calculation for the Advanced Thermal-hydraulic Test Loop for Accident Simulation (ATLAS) SBO experiment with an RCP seal leakage scenario. Initially, turbine-driven auxiliary feedwater pumps are used. Then, the outside cooling water injection method is used for long-term cooling. The analysis results would be useful for conducting the experiment to verify the APR 1400 extended SBO optimum mitigation strategy using outside cooling water injection in the future. The pre-test calculation for the ATLAS extended SBO with RCP seal leakage and outside cooling water injection scenario is performed. After the Fukushima nuclear accident, the capability of coping with an extended station blackout (SBO) has become important. Many NPPs are applying the FLEX approach as the main coping strategy for extended SBO scenarios. In FLEX strategies, outside cooling water injection into the reactor cooling system (RCS) and steam generators (SGs) is considered an effective method to remove residual heat and maintain the inventory of the systems during the accident. It is worthwhile to examine the soundness of the outside cooling water injection method for extended SBO mitigation by both calculation and experimental demonstration. From the calculation results, outside cooling water injection into the RCS and SGs is verified as an effective method during an extended SBO when RCS and SG depressurization is sufficiently performed.

  15. Early-onset drusen in a girl with bloom syndrome: probable clinical importance of an ocular manifestation.

    Science.gov (United States)

    Aslan, Deniz; Oztürk, Gülyüz; Kaya, Zühre; Bideci, Aysun; Ozdoğan, Sibel; Ozdek, Sengül; Gürsel, Türkiz

    2004-04-01

    Ophthalmic examination of a girl admitted with the complaint of growth failure revealed retinal hard drusen. It was surprising to observe drusen in a child because they represent an age-related degenerative change in normal individuals. After further evaluation, she was diagnosed to have Bloom syndrome, a premature aging syndrome. To the authors' knowledge, this is the first case of Bloom syndrome associated with drusen. It is probable that not only aging but also other fundamental cell processes, especially uncontrolled cell proliferation, might be similarly affected and might follow a more rapid course in this inherited condition presenting with drusen. The authors suggest paying extra attention to drusen during the ophthalmic assessment in the diagnosis of all Bloom syndrome patients; it may be prudent to watch more carefully for the development of cancer in patients with drusen than those without drusen.

  16. Probability of Extraprostatic Disease According to the Percentage of Positive Biopsy Cores in Clinically Localized Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Thiago N. Valette

    2015-06-01

    Full Text Available ABSTRACT Objective: Prediction of extraprostatic disease in clinically localized prostate cancer is relevant for treatment planning of the disease. The purpose of this study was to explore the usefulness of the percentage of positive biopsy cores to predict the chance of extraprostatic cancer. Materials and Methods: We evaluated 1787 patients with localized prostate cancer submitted to radical prostatectomy. The percentage of positive cores in prostate biopsy was correlated with the pathologic outcome of the surgical specimen. In the final analysis, a correlation was made between categorical ranges of positive cores (10% intervals) and the risk of extraprostatic extension and/or bladder neck invasion, seminal vesicle involvement or metastasis to iliac lymph nodes. Student's t test was used for statistical analysis. Results: For each 10% of positive cores we observed a progressively higher prevalence of extraprostatic disease. The risk of cancer beyond the prostate capsule for ... Conclusion: The percentage of positive cores in prostate biopsy can predict the risk of cancer outside the prostate. Our study shows that the percentage of positive prostate biopsy fragments helps predict the chance of extraprostatic cancer and may have a relevant role in the patient's management.

  17. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    Science.gov (United States)

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. There is the challenge that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact sparse data should have on decisions. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participation in a real-world health-screening project, and also by deployment in an industrial platform called the BioIngine, a cognitive computing platform for health management.

  18. A Teaching Method on Basic Chemistry for Freshman : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2003-01-01

    This report deals with a teaching method on basic chemistry for freshmen. This teaching method contains guidance and instruction on how to understand basic chemistry. A pre-test and a post-test were put into practice each time. Each test was returned to the students in class in the following weeks.

  19. A Teaching Method on Basic Chemistry for Freshman (II) : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2004-01-01

    This report deals with a review of a teaching method on basic chemistry for freshmen in the first semester. We tried to review this teaching method with pre-test and post-test by means of official and private questionnaires. Several hints and thoughts on teaching skills were obtained from this analysis.

  20. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A

    2007-09-01

    Full Text Available Abstract Background: Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods: Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results: In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion and 55 completed the video arms. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion: The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable than for patients

  1. Strong association between serological status and probability of progression to clinical visceral leishmaniasis in prospective cohort studies in India and Nepal.

    Directory of Open Access Journals (Sweden)

    Epco Hasker

    Full Text Available INTRODUCTION: Asymptomatic persons infected with the parasites causing visceral leishmaniasis (VL usually outnumber clinically apparent cases by a ratio of 4-10 to 1. We assessed the risk of progression from infection to disease as a function of DAT and rK39 serological titers. METHODS: We used available data on four cohorts from villages in India and Nepal that are highly endemic for Leishmania donovani. In each cohort two serosurveys had been conducted. Based on results of initial surveys, subjects were classified as seronegative, moderately seropositive or strongly seropositive using both DAT and rK39. Based on the combination of first and second survey results we identified seroconvertors for both markers. Seroconvertors were subdivided in high and low titer convertors. Subjects were followed up for at least one year following the second survey. Incident VL cases were recorded and verified. RESULTS: We assessed a total of 32,529 enrolled subjects, for a total follow-up time of 72,169 person years. Altogether 235 incident VL cases were documented. The probability of progression to disease was strongly associated with initial serostatus and with seroconversion; this was particularly the case for those with high titers and most prominently among seroconvertors. For high titer DAT convertors the hazard ratio reached as high as 97.4 when compared to non-convertors. The strengths of the associations varied between cohorts and between markers but similar trends were observed between the four cohorts and the two markers. DISCUSSION: There is a strongly increased risk of progressing to disease among DAT and/or rK39 seropositives with high titers. The options for prophylactic treatment for this group merit further investigation, as it could be of clinical benefit if it prevents progression to disease. Prophylactic treatment might also have a public health benefit if it can be corroborated that these asymptomatically infected individuals are infectious

  2. Pre-test estimates of temperature decline for the LANL Fenton Hill Long-Term Flow Test

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, B.A. [Los Alamos National Lab., NM (United States); Kruger, P. [Stanford Univ., CA (United States). Stanford Geothermal Program

    1992-06-01

    Pre-test predictions for the Long-Term Flow Test (LTFT) of the experimental Hot Dry Rock (HDR) reservoir at Fenton Hill were made using two models. Both models depend on estimates of the "effective" reservoir volume accessed by the fluid and the mean fracture spacing (MFS) of major joints for fluid flow. The effective reservoir volume was estimated using a variety of techniques, and the range of values for the MFS was set through experience in modeling the thermal cooldown of other experimental HDR reservoirs. The two pre-test predictions for cooldown to 210°C (a value taken to compare the models) from an initial temperature of 240°C are 6.1 and 10.7 years. Assuming that a minimum of 10°C is required to provide an unequivocal indication of thermal cooldown, both models predict that the reservoir will not exhibit observable cooldown for at least two years.

  3. HIV pre-test information, discussion or counselling? A review of guidance relevant to the WHO European Region.

    Science.gov (United States)

    Bell, Stephen A; Delpech, Valerie; Raben, Dorthe; Casabona, Jordi; Tsereteli, Nino; de Wit, John

    2016-02-01

    In the context of a shift from exceptionalism to normalisation, this study examines recommendations/evidence in current pan-European/global guidelines regarding pre-test HIV testing and counselling practices in health care settings. It also reviews new research not yet included in guidelines. There is consensus that verbal informed consent must be gained prior to testing, individually, in private, confidentially, in the presence of a health care provider. All guidelines recommend pre-test information/discussion delivered verbally or via other methods (information sheet). There is agreement about a minimum standard of information to be provided before a test, but guidelines differ regarding discussion about issues encouraging patients to think about implications of the result. There is heavy reliance on expert consultation in guideline development. Referenced scientific evidence is often more than ten years old and based on US/UK research. Eight new papers are reviewed. Current HIV testing and counselling guidelines have inconsistencies regarding the extent and type of information that is recommended during pre-test discussions. The lack of new research underscores a need for new evidence from a range of European settings to support the process of expert consultation in guideline development.

  4. Comparison of different coupling CFD–STH approaches for pre-test analysis of a TALL-3D experiment

    Energy Technology Data Exchange (ETDEWEB)

    Papukchiev, Angel, E-mail: angel.papukchiev@grs.de [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany); Jeltsov, Marti; Kööp, Kaspar; Kudinov, Pavel [KTH Royal Institute of Technology, Stockholm (Sweden); Lerchl, Georg [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany)

    2015-08-15

    Highlights: • Thermal-hydraulic system codes and CFD tools are coupled. • Pre-test calculations for the TALL-3D facility are performed. • Complex flow and heat transfer phenomena are modeled. • Comparative analyses have been performed. - Abstract: The system thermal-hydraulic (STH) code ATHLET was coupled with the commercial 3D computational fluid dynamics (CFD) software package ANSYS CFX to improve ATHLET's simulation capabilities for flows with pronounced 3D phenomena such as flow mixing and thermal stratification. Within the FP7 European project THINS (Thermal Hydraulics of Innovative Nuclear Systems), validation activities for coupled thermal-hydraulic codes are being carried out. The TALL-3D experimental facility, operated by KTH Royal Institute of Technology in Stockholm, is designed for thermal-hydraulic experiments with lead-bismuth eutectic (LBE) coolant at natural and forced circulation conditions. GRS carried out pre-test simulations with ATHLET–ANSYS CFX for the TALL-3D experiment T01, while KTH scientists performed these analyses with the coupled code RELAP5/STAR CCM+. In experiment T01 the main circulation pump is stopped, which leads to an interesting thermal-hydraulic transient with local 3D phenomena. In this paper, the TALL-3D behavior during T01 is analyzed and the results of the coupled pre-test calculations performed by GRS (ATHLET–ANSYS CFX) and KTH (RELAP5/STAR CCM+) are directly compared.

  5. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  6. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
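
    One classic example of the phenomenon: the probability that a random permutation of n items has no fixed point (a derangement) tends to 1/e as n grows. A quick simulation:

```python
import random
from math import e

# Estimate the probability that a random permutation of n items has no
# fixed point (a derangement); it tends to 1/e as n grows.
def derangement_frequency(n=10, trials=100000, seed=4):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        p = list(range(n))
        rng.shuffle(p)
        hits += all(p[i] != i for i in range(n))
    return hits / trials

print(f"simulated: {derangement_frequency():.4f}    1/e = {1 / e:.4f}")
```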

  7. Pre-test habituation improves the reliability of a handheld test of mechanical nociceptive threshold in dairy cows

    DEFF Research Database (Denmark)

    Raundal, P. M.; Andersen, P. H.; Toft, Nils;

    2015-01-01

    Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool, where the applied stimulation force was monitored continuously relative to a pre-encoded target force. The effect on MNT of two pre-testing habituation procedures was examined in two different experiments comprising a total of 88 sound Holstein dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to a lowered intra-individual coefficient of variation of MNT (P ... force during stimulations (P

  8. Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02

    Science.gov (United States)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2011-01-01

This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, a NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with design features similar to those of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during testing. A summary of predicted results for each of the five load sequences is presented herein.

  9. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
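As a concrete point of reference, the Brier score is perhaps the most familiar scoring rule of the kind the abstract alludes to. The snippet below is an illustrative sketch only, not the loss-function/martingale machinery the paper itself develops:

```python
# Brier score: mean squared difference between forecast probabilities
# and the observed 0/1 outcomes. Lower is better.
def brier_score(forecasts, outcomes):
    return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)

# Three forecasts and what actually happened (1 = event occurred).
print(brier_score([0.9, 0.2, 0.7], [1, 0, 0]))
```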

  10. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments.

  11. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition…

  12. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  13. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  14. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  15. Effects of clinical mastitis caused by gram-positive and gram-negative bacteria and other organisms on the probability of conception in New York State Holstein dairy cows.

    Science.gov (United States)

    Hertl, J A; Gröhn, Y T; Leach, J D G; Bar, D; Bennett, G J; González, R N; Rauch, B J; Welcome, F L; Tauer, L W; Schukken, Y H

    2010-04-01

    The objective of this study was to estimate the effects of different types of clinical mastitis (CM) on the probability of conception in New York State Holstein cows. Data were available on 55,372 artificial inseminations (AI) in 23,695 lactations from 14,148 cows in 7 herds. We used generalized linear mixed models to model whether or not a cow conceived after a particular AI. Independent variables included AI number (first, second, third, fourth), parity, season when AI occurred, farm, type of CM (due to gram-positive bacteria, gram-negative bacteria, or other organisms) in the 6 wk before and after an AI, and occurrence of other diseases. Older cows were less likely to conceive. Inseminations occurring in the summer were least likely to be successful. Retained placenta decreased the probability of conception. Conception was also less likely with each successive AI. The probability of conception associated with the first AI was 0.29. The probability of conception decreased to 0.26, 0.25, and 0.24 for the second, third, and fourth AI, respectively. Clinical mastitis occurring any time between 14 d before until 35 d after an AI was associated with a lower probability of conception; the greatest effect was an 80% reduction associated with gram-negative CM occurring in the week after AI. In general, CM due to gram-negative bacteria had a more detrimental effect on probability of conception than did CM caused by gram-positive bacteria or other organisms. Furthermore, CM had more effect on probability of conception immediately around the time of AI. Additional information about CM (i.e., its timing with respect to AI, and whether the causative agent is gram-positive or gram-negative bacteria, or other organisms) is valuable to dairy personnel in determining why some cows are unable to conceive in a timely manner. These findings are also beneficial for the management of mastitic cows (especially those with gram-negative CM) when mastitis occurs close to AI.
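A small arithmetic sketch of how the reported numbers combine. Whether the "80% reduction" acts on the probability scale or the odds scale is not stated in the record, so both readings are shown under that caveat:

```python
# Baseline first-AI conception probability is taken from the abstract; the 80%
# figure is the reported worst-case effect (gram-negative CM in the week after AI).
baseline_p = 0.29
reduction = 0.80

# Reading 1: the reduction applies to the probability itself.
p_prob_scale = baseline_p * (1 - reduction)

# Reading 2: the reduction applies to the odds (natural for logistic models).
odds = baseline_p / (1 - baseline_p)
odds_reduced = odds * (1 - reduction)
p_odds_scale = odds_reduced / (1 + odds_reduced)

print(f"probability-scale reading: {p_prob_scale:.3f}")
print(f"odds-scale reading:        {p_odds_scale:.3f}")
```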

  16. Elaboration of a clinical and paraclinical score to estimate the probability of herpes simplex virus encephalitis in patients with febrile, acute neurologic impairment.

    Science.gov (United States)

    Gennai, S; Rallo, A; Keil, D; Seigneurin, A; Germi, R; Epaulard, O

    2016-06-01

Herpes simplex virus (HSV) encephalitis is associated with a high risk of mortality and sequelae, and early diagnosis and treatment in the emergency department are necessary. However, most patients present with non-specific febrile, acute neurologic impairment; this may lead clinicians to overlook the diagnosis of HSV encephalitis. We aimed to identify which data collected in the first hours in a medical setting were associated with the diagnosis of HSV encephalitis. We conducted a multicenter retrospective case-control study in four French public hospitals from 2007 to 2013. The cases were the adult patients who received a confirmed diagnosis of HSV encephalitis. The controls were all the patients who attended the emergency department of Grenoble hospital with a febrile acute neurologic impairment, without HSV detection by polymerase chain reaction (PCR) in the cerebrospinal fluid (CSF), in 2012 and 2013. A multivariable logistic model was elaborated to estimate factors significantly associated with HSV encephalitis. Finally, an HSV probability score was derived from the logistic model. We identified 36 cases and 103 controls. Factors independently associated with HSV encephalitis were the absence of past neurological history (odds ratio [OR] 6.25 [95 % confidence interval (CI): 2.22-16.7]), the occurrence of seizure (OR 8.09 [95 % CI: 2.73-23.94]), a systolic blood pressure ≥140 mmHg (OR 5.11 [95 % CI: 1.77-14.77]), and a C-reactive protein … . The HSV probability score was calculated by summing the values attributed to each independent factor. HSV encephalitis diagnosis may benefit from the use of this score, based upon easily accessible data. However, clinical suspicion and probabilistic treatment must remain the rule.
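A minimal sketch of how such an additive score can be derived from the reported odds ratios. The point values and the rounding recipe (round ln(OR) to an integer weight) are assumptions for illustration; the record truncates the actual scoring table:

```python
# Turn logistic-regression odds ratios into hypothetical integer point values,
# then score a patient by summing the points of the findings that are present.
import math

odds_ratios = {                      # ORs reported in the abstract
    "no_past_neuro_history": 6.25,
    "seizure": 8.09,
    "sbp_ge_140": 5.11,
}

points = {k: round(math.log(v)) for k, v in odds_ratios.items()}  # ln(OR) -> points

def hsv_score(findings: dict) -> int:
    """Sum the points of the findings flagged True."""
    return sum(points[k] for k, present in findings.items() if present)

print(points)
print(hsv_score({"no_past_neuro_history": True, "seizure": True, "sbp_ge_140": False}))
```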

  17. Demographic, clinical and treatment related predictors for event-free probability following low-dose radiotherapy for painful heel spurs - a retrospective multicenter study of 502 patients

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, Ralph [Dept. of Radiotherapy, St. Josefs-Hospital. Wiesbaden (Germany); Micke, Oliver [Dept. of Radiotherapy, Muenster Univ. Hospital (Germany); Reichl, Berthold [Dept. of Radiotherapy, Weiden Hospital (DE)] (and others)

    2007-03-15

A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. The overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p < 0.001); >58/≤58 years (236/266), 81.3%/47.9% (p = 0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p = 0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p = 0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p = 0.009); without/with prior treatment (121/381), 83.1%/54.9% (p = 0.023); men/women (165/337), 61.2%/61.5% (p = 0.059). The multivariate Cox regression analysis, with inclusion of the number of treatment series, age, photon energy, pain history, single dose and prior treatments, revealed patients with only one treatment series (p < 0.001), an age >58 years (p = 0.011) and therapy with high-voltage photons (p = 0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment for painful heel spurs.
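The event-free probabilities above are the kind of quantity a survival-curve estimator produces. A hand-rolled Kaplan-Meier sketch with made-up follow-up data (the study's per-patient data are not in the record):

```python
# Minimal Kaplan-Meier estimator: events[i] = 1 marks a pain event,
# 0 marks a censored observation. Assumes no tied event times.
def kaplan_meier(times, events):
    """Return (time, event-free probability) pairs."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

times  = [3, 8, 12, 26, 40, 55, 70, 96]   # months of follow-up (hypothetical)
events = [1, 0, 1, 0, 0, 1, 0, 0]         # 1 = event, 0 = censored (hypothetical)
print(kaplan_meier(times, events))
```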

  18. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability - frequentist, propensity, classical, Bayesian, and objective Bayesian - and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability…

  19. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.

  20. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  1. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively.

  2. PROBABILITY AND STATISTICS.

    Science.gov (United States)

STATISTICAL ANALYSIS (REPORTS), PROBABILITY (REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  3. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var…

4. Pre-Test and Work Plan as an Effective Learning Strategy for the Advanced Engineering Materials Practicum in the Department of Mechanical Engineering Education, FT UNY

    Directory of Open Access Journals (Sweden)

    Nurdjito Nurdjito

    2013-09-01

Full Text Available To find the most effective learning strategy for the practicum in the materials laboratory of the Department of Mechanical Engineering Education, Faculty of Engineering, Yogyakarta State University (YSU), a study was conducted to determine the effect of applying a pre-test and work plan on the learning activities and achievement of students in the laboratory. This action research used the purposive random sampling technique; the pre-test and work plan were applied as the treatment. The data were collected through a test of the students' achievement scores and analyzed using a t-test in SPSS. The results indicated that the application of the pre-test and work plan in addition to the standard module was more effective than normative learning using the module alone (t = 3.055, p = 0.003 < 0.05). The implementation of the pre-test and work plan in addition to the standard module improved the students' motivation, independence, and readiness to learn, as well as the cooperation among students, and therefore also their achievement. The mastery of competencies increased significantly, as shown by the increase of the mode from 66 to 85 and of the mean from 73.12 to 79.32 in the experimental group.
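A minimal sketch of the reported comparison, assuming an independent-samples t-test as the phrase "t-test with SPSS" suggests; the score lists below are hypothetical:

```python
# Compare achievement scores of the treatment group (pre-test + work plan)
# against the module-only group with an independent-samples t-test.
from scipy import stats

treatment = [85, 78, 90, 82, 88, 79, 84]   # hypothetical scores
control   = [73, 70, 76, 71, 75, 69, 74]   # hypothetical scores

t, p = stats.ttest_ind(treatment, control)
print(f"t = {t:.3f}, p = {p:.3f}")
```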

  5. Pre-test metyrapone impairs memory recall in fear conditioning tasks: lack of interaction with β-adrenergic activity

    Science.gov (United States)

    Careaga, Mariella B. L.; Tiba, Paula A.; Ota, Simone M.; Suchecki, Deborah

    2015-01-01

Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs), adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval, in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation as well as their suppressive effects on retrieval are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval, and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling in this effect. Metyrapone (a GC synthesis inhibitor; 75 mg/kg), administered 90 min before the first test of contextual or tone fear conditioning (TFC), negatively affected animals' performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggested that the treatment impaired fear memory retrieval during the first evaluation. The administration immediately after the first test did not affect the animals' performances in contextual fear conditioning (CFC), suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, metyrapone effects were independent of β-adrenergic signaling, since concurrent administration with propranolol (2 mg/kg), a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration led to negative effects on fear memory retrieval and this action was independent of β-adrenergic signaling. PMID:25784866

  6. Grouped to Achieve: Are There Benefits to Assigning Students to Heterogeneous Cooperative Learning Groups Based on Pre-Test Scores?

    Science.gov (United States)

    Werth, Arman Karl

Cooperative learning has been one of the most widely used instructional practices around the world since the early 1980s. Small learning groups have existed since the beginning of the human race, and they have grown in variance and complexity over time. Classrooms are getting more diverse every year, and instructors need a way to take advantage of this diversity to improve learning. The purpose of this study was to see whether heterogeneous cooperative learning groups based on student achievement could be used as a differentiated instructional strategy to increase students' ability to demonstrate knowledge of science concepts and ability to do engineering design. This study included two groups made up of two different middle school science classrooms of 25-30 students. These students were given an engineering design problem to solve within cooperative learning groups. One class was put into heterogeneous cooperative learning groups based on students' pre-test scores; the other class was grouped by random assignment. The study measured the difference between each class's pre-post gains, students' responses to a group interaction form, and interview questions addressing their perceptions of the makeup of their groups. The findings were that there was no significant difference in learning gains between the treatment and comparison groups. There was a significant difference between the treatment and comparison groups in student perceptions of their group's ability to stay on task and manage their time efficiently. Both the comparison and treatment groups had a positive perception of the composition of their cooperative learning groups.

  7. Pre-Test Assessment of the Use Envelope of the Normal Force of a Wind Tunnel Strain-Gage Balance

    Science.gov (United States)

    Ulbrich, N.

    2016-01-01

    The relationship between the aerodynamic lift force generated by a wind tunnel model, the model weight, and the measured normal force of a strain-gage balance is investigated to better understand the expected use envelope of the normal force during a wind tunnel test. First, the fundamental relationship between normal force, model weight, lift curve slope, model reference area, dynamic pressure, and angle of attack is derived. Then, based on this fundamental relationship, the use envelope of a balance is examined for four typical wind tunnel test cases. The first case looks at the use envelope of the normal force during the test of a light wind tunnel model at high subsonic Mach numbers. The second case examines the use envelope of the normal force during the test of a heavy wind tunnel model in an atmospheric low-speed facility. The third case reviews the use envelope of the normal force during the test of a floor-mounted semi-span model. The fourth case discusses the normal force characteristics during the test of a rotated full-span model. The wind tunnel model's lift-to-weight ratio is introduced as a new parameter that may be used for a quick pre-test assessment of the use envelope of the normal force of a balance. The parameter is derived as a function of the lift coefficient, the dimensionless dynamic pressure, and the dimensionless model weight. Lower and upper bounds of the use envelope of a balance are defined using the model's lift-to-weight ratio. Finally, data from a pressurized wind tunnel is used to illustrate both application and interpretation of the model's lift-to-weight ratio.
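A hedged sketch of the quick assessment described above, assuming lift is modeled as L = C_L · q · S, so that the lift-to-weight ratio is C_L · q · S / W; all numbers are illustrative, not the paper's:

```python
# Lift-to-weight ratio as a quick pre-test check of the normal-force envelope.
def lift_to_weight(c_lift: float, q_pa: float, s_m2: float, weight_n: float) -> float:
    """Ratio of aerodynamic lift (C_L * q * S) to model weight."""
    return c_lift * q_pa * s_m2 / weight_n

# Light model at high subsonic dynamic pressure vs. heavy model in a
# low-speed atmospheric facility (both cases hypothetical).
print(lift_to_weight(c_lift=0.5, q_pa=20_000.0, s_m2=0.3, weight_n=1_500.0))
print(lift_to_weight(c_lift=0.5, q_pa=1_500.0,  s_m2=0.3, weight_n=4_000.0))
```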

  8. Pre-test metyrapone impairs memory recall in fear conditioning tasks: lack of interaction with β-adrenergic activity

    Directory of Open Access Journals (Sweden)

    Mariella B.L. Careaga

    2015-03-01

Full Text Available Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs), adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval, in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation as well as their suppressive effects on retrieval are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval, and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling in this effect. Metyrapone (a GC synthesis inhibitor), administered 90 min before the first test of contextual or auditory fear conditioning, negatively affected animals' performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggested that the treatment impaired fear memory retrieval during the first evaluation. The administration immediately after the first test did not affect the animals' performances in contextual fear conditioning, suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, metyrapone effects were independent of β-adrenergic signaling, since concurrent administration with propranolol, a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration led to negative effects on fear memory retrieval and this action was independent of β-adrenergic signaling.

  9. Pre-Test Assessment of the Upper Bound of the Drag Coefficient Repeatability of a Wind Tunnel Model

    Science.gov (United States)

    Ulbrich, N.; L'Esperance, A.

    2017-01-01

A new method is presented that computes a pre-test estimate of the upper bound of the drag coefficient repeatability of a wind tunnel model. This upper bound is a conservative estimate of the precision error of the drag coefficient. For clarity, precision error contributions associated with the measurement of the dynamic pressure are analyzed separately from those that are associated with the measurement of the aerodynamic loads. The upper bound is computed by using information about the model, the tunnel conditions, and the balance in combination with an estimate of the expected output variations as input. The model information consists of the reference area and an assumed angle of attack. The tunnel conditions are described by the Mach number and the total pressure or unit Reynolds number. The balance inputs are the partial derivatives of the axial and normal force with respect to all balance outputs. Finally, an empirical output variation of 1.0 microV/V is used to relate both random instrumentation and angle measurement errors to the precision error of the drag coefficient. Results of the analysis are reported by plotting the upper bound of the precision error versus the tunnel conditions. The analysis shows that the influence of the dynamic pressure measurement error on the precision error of the drag coefficient is often small when compared with the influence of errors that are associated with the load measurements. Consequently, the sensitivities of the axial and normal force gages of the balance have a significant influence on the overall magnitude of the drag coefficient's precision error. Therefore, results of the error analysis can be used for balance selection purposes, as the drag prediction characteristics of balances of similar size and capacities can objectively be compared. Data from two wind tunnel models and three balances are used to illustrate the assessment of the precision error of the drag coefficient.
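A hedged sketch of the load-measurement part of the analysis: propagate an assumed 1.0 microV/V output variation through the balance's axial- and normal-force sensitivities to a drag-coefficient precision error. The sensitivities, the tunnel conditions, and the root-sum-square combination are illustrative assumptions, not the paper's exact equations:

```python
# Propagate a 1 microV/V balance output variation to an upper-bound dCD,
# using drag from body-axis loads: D = A*cos(alpha) + N*sin(alpha).
import math

dA_per_output = 2.5    # N per microV/V: axial-force sensitivity (assumed)
dN_per_output = 40.0   # N per microV/V: normal-force sensitivity (assumed)
alpha = math.radians(2.0)
q, S = 30_000.0, 0.35  # dynamic pressure [Pa], reference area [m^2] (assumed)
dv = 1.0               # empirical output variation, microV/V (from the abstract)

dD = math.hypot(dA_per_output * dv * math.cos(alpha),
                dN_per_output * dv * math.sin(alpha))
print(f"upper-bound precision error dCD ~ {dD / (q * S):.6f}")
```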

  10. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities.

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.

  13. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities.

  14. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular…

  15. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this…

  16. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics and…

  17. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  18. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  19. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields.

  20. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  1. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  2. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  3. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  4. Evaluation of a reproductive health awareness program for adolescence in urban Tanzania-A quasi-experimental pre-test post-test research

    Directory of Open Access Journals (Sweden)

    Iida Mariko

    2011-06-01

Full Text Available Abstract Background Sub-Saharan Africa is among the regions where 10% of girls become mothers by the age of 16 years. The United Republic of Tanzania, located in Sub-Saharan Africa, is one country where teenage pregnancy is a problem facing adolescent girls. Adolescent pregnancy has been identified as one of the reasons for girls dropping out from school. This study's purpose was to evaluate a reproductive health awareness program for the improvement of reproductive health for adolescents in urban Tanzania. Methods A quasi-experimental pre-test and post-test research design was conducted to evaluate adolescents' knowledge, attitude, and behavior about reproductive health before and after the program. Data were collected from students aged 11 to 16, at Ilala Municipal, Dar es Salaam, Tanzania. An anonymous 23-item questionnaire provided the data. The program was conducted using a picture drama, reproductive health materials and group discussion. Results In total, 313 questionnaires were distributed and 305 (97.4%) were usable for the final analysis. The mean age was 12.5 years for girls and 13.2 years for boys. A large minority of both girls (26.8%) and boys (41.4%) had experienced sex, and among the girls who had experienced sex, 51.2% reported that it was by force. The girls' mean score in the knowledge pre-test was 5.9, and 6.8 in the post-test, a significant increase (t = 7.9, p = 0.000). The mean behavior pre-test score was 25.8 and the post-test score was 26.6, also a significant increase (t = 3.0, p = 0.003). The boys' mean score in the knowledge pre-test was 6.4 and 7.0 in the post-test, a significant increase (t = 4.5, p = 0.000). The mean behavior pre-test score was 25.6 and 26.4 in the post-test, a significant increase (t = 2.4, p = 0.019). However, the pre-test and post-test attitude scores showed no statistically significant difference for either girls or boys. Conclusions Teenagers have sexual experiences including…

  5. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
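A minimal sketch of the task posed to the subjects, not of the authors' change-point model: outcomes from a stepwise nonstationary Bernoulli process, with a naive sliding-window estimate of the hidden parameter for contrast:

```python
# Generate a Bernoulli sequence whose hidden p jumps partway through,
# then estimate the current p from a sliding window of recent outcomes.
import random

random.seed(0)
p_segments = [(0.25, 400), (0.75, 400)]   # (hidden p, segment length) - illustrative
outcomes = [int(random.random() < p) for p, n in p_segments for _ in range(n)]

window = 50
for t in (100, 390, 450, 790):
    est = sum(outcomes[max(0, t - window):t]) / min(window, t)
    print(f"trial {t}: estimated p = {est:.2f}")
```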

  6. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  7. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly…

  8. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantum…

  9. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving…

  10. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including…

  11. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
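A hedged sketch of the basic problem, not of the paper's exponentially weighted estimators: an empirical within-sample tail estimate, plus a simple exponential-tail fit for extrapolating beyond the data:

```python
# Estimate P(X > t) empirically, then extrapolate using an exponential fit
# to the excesses over a high threshold (the upper 10% of the sample).
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(scale=2.0, size=5_000)   # synthetic data

t = 8.0
empirical = np.mean(x > t)                   # within-sample tail estimate

tail = np.sort(x)[-500:]                     # upper 10% of the sample
u = tail[0]                                  # threshold
rate = 1.0 / np.mean(tail - u)               # exponential fit to the excesses
extrapolated = (len(tail) / len(x)) * np.exp(-rate * (t - u))

print(f"empirical: {empirical:.4f}, extrapolated: {extrapolated:.4f}")
```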

  12. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  13. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  14. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
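For the best-known special case this is easy to check concretely: the multinomial logit has CPGF G(v) = log Σᵢ exp(vᵢ), and its gradient is the softmax vector of choice probabilities. The logit example is an assumption for illustration; the paper treats the general RUM case. A minimal numerical check:

```python
# Verify numerically that the gradient of the logit CPGF (log-sum-exp)
# equals the softmax choice probabilities.
import math

def cpgf(v):
    """Logit choice-probability generating function: log sum_i exp(v_i)."""
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probs(v, eps=1e-6):
    """Numerical gradient of the CPGF = choice probabilities."""
    g = cpgf(v)
    return [(cpgf(v[:i] + [x + eps] + v[i + 1:]) - g) / eps
            for i, x in enumerate(v)]

v = [1.0, 2.0, 0.5]
print(choice_probs(v))   # matches softmax(v)
```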

  15. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  16. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.

  17. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  18. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex $S_n$ of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs $(\mu,\nu)$ of fuzzy sets $\mu,\nu\in[0,1]^X$ such that $\mu(x)+\nu(x)\le 1$ for all $x\in X$, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where $(\mu_1,\nu_1)\le(\mu_2,\nu_2)$ whenever $\mu_1\le\mu_2$ and $\nu_2\le\nu_1$) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by $I=[0,1]$ (objects of ID are subobjects of powers $I^X$), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category $S_nD$ cogenerated by $S_n=\{(x_1,x_2,\ldots,x_n)\in I^n;\ \sum_{i=1}^n x_i\le 1\}$ carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within $S_nD$.

  19. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended…

  20. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
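A minimal numerical sketch of the smearing idea. The inverse-gamma mixing law is an illustrative choice, one known to produce Student-t-like heavy tails; it is not the family derived in the paper:

```python
# Mix zero-mean Gaussians over a random variance v and compare the tail
# of the resulting distribution with a plain Gaussian.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
v = 1.0 / rng.gamma(shape=2.0, scale=1.0, size=n)   # inverse-gamma variances
x = rng.normal(0.0, np.sqrt(v))                     # Gaussian superposition over v
g = rng.normal(0.0, 1.0, size=n)                    # plain Gaussian, for contrast

print("P(|X| > 4):", np.mean(np.abs(x) > 4))        # noticeably heavier tail
print("P(|G| > 4):", np.mean(np.abs(g) > 4))
```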

  1. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  2. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  3. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  4. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  5. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  6. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability when the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability at a given probability of false alarm and with the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system implementing subspace-based detectors will be more effective than one implementing set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  9. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  10. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  11. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-cL^{d+1}), where L is the side length of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  12. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals "unbelievable." This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  13. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales; and key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  14. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  15. Probabilities for Solar Siblings

    CERN Document Server

    Valtonen, M; Bobylev, V V; Myllari, A

    2015-01-01

    We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  16. The Influences of Pre-testing Reviews and Delays on Differential-associative Processing versus a Condition in which Students Chose their Learning Strategy.

    Science.gov (United States)

    Hannon, Brenda

    2013-10-01

    Recent studies show that a new strategy called differential-associative processing is effective for learning related concepts. However, our knowledge about differential-associative processing is still limited. Therefore, the goals of the present study are to assess the duration of knowledge that is acquired from using differential-associative processing, to determine whether the efficacy of differential-associative processing changes with the addition of a 10-minute pre-testing review, and to compare differential-associative processing to two conditions in which students select their own learning strategy. The results revealed that differential-associative processing was a better strategy for learning related concepts than either of the two comparison conditions. They also revealed that a 10-minute pre-testing review had a positive additive influence on differential-associative processing. Finally, although the knowledge acquired from using differential-associative processing declined with an increase in delay between learning and testing, this decline was equivalent to the decline observed in both comparison conditions.

  17. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
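
    A toy simulation in the spirit of the model (hypothetical parameters; the four event frequencies are sampled independently here purely to illustrate the expectations): each frequency estimate is individually biased by read noise, yet the addition-law combination cancels the noise on average.

      import numpy as np

      rng = np.random.default_rng(2)
      # "Probability theory plus noise": estimates are sample frequencies in
      # which each observation is misread with probability d, so a single
      # estimate has expectation (1 - 2d) * p + d.
      d, n, trials = 0.15, 50, 20_000
      p = {"A": 0.6, "B": 0.5, "AandB": 0.3}
      p["AorB"] = p["A"] + p["B"] - p["AandB"]

      def noisy_freq(prob):
          hits = rng.random((trials, n)) < prob   # true event indicators
          flips = rng.random((trials, n)) < d     # random misreadings
          return (hits ^ flips).mean(axis=1)      # noisy frequency estimates

      est = {k: noisy_freq(v) for k, v in p.items()}
      print(est["AandB"].mean(), (1 - 2 * d) * p["AandB"] + d)  # biased alone
      # The identity P(A) + P(B) - P(A and B) - P(A or B) = 0 cancels the d terms:
      print((est["A"] + est["B"] - est["AandB"] - est["AorB"]).mean())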

  18. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundation theory for probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  19. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  20. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and the fuzzy resolution theorem, a feasibility condition for a fuzzy probability number set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closure of fuzzy probability operations, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. The entire mathematical description of the RVFP is closed under fuzzy probability operations; as a result, it lays the foundation for perfecting fuzzy probability operation methods.

  1. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-04-01

    Full Text Available In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  2. Probability state modeling theory.

    Science.gov (United States)

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  3. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    Science.gov (United States)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through the failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes cognitive bias leading a radiologist to an incorrect diagnosis despite having correctly recognized the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen serving both as a checklist and for input. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
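
    A minimal sketch of the update step such a tool performs (hypothetical diagnoses, features, and numbers, not the actual basal ganglia network): each entered imaging feature multiplies the current diagnosis probabilities by its conditional probability and renormalizes, turning pre-test into post-test probabilities.

      # Naive-Bayes style update over candidate diagnoses; all values are
      # illustrative assumptions, not clinically derived.
      priors = {"toxoplasmosis": 0.2, "lymphoma": 0.3, "glioma": 0.5}
      likelihood = {  # P(feature present | diagnosis), assumed independent
          "ring_enhancement": {"toxoplasmosis": 0.9, "lymphoma": 0.6, "glioma": 0.4},
          "restricted_diffusion": {"toxoplasmosis": 0.2, "lymphoma": 0.8, "glioma": 0.3},
      }

      def update(post, feature, present=True):
          for dx in post:
              p = likelihood[feature][dx]
              post[dx] *= p if present else (1.0 - p)
          total = sum(post.values())
          return {dx: v / total for dx, v in post.items()}

      post = dict(priors)
      for feat in ("ring_enhancement", "restricted_diffusion"):
          post = update(post, feat, present=True)
          print(feat, {dx: round(v, 3) for dx, v in post.items()})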

  4. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  5. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  6. Pre-test analysis of an integral effect test facility for thermal hydraulic similarities of 6 inches cold leg break and DVI line break using MARS-1D

    Energy Technology Data Exchange (ETDEWEB)

    Ah, D. J.; Park, H. S.; Choi, K. Y.; Kwon, T. S.; Baek, W. P. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    Pre-test analyses of a small-break loss-of-coolant accident (SBLOCA) and a DVI line break accident have been performed for the integral effect test loop of the Korea Atomic Energy Research Institute (KAERI-ITL), the construction of which will start soon. The KAERI-ITL is being designed with a full-height and 1/310 volume scale based on the design features of the APR1400 (Korean Next Generation Reactor). Based on the same control logics and accident scenarios, the similarity between the KAERI-ITL and the prototype plant, APR1400, is evaluated using the MARS code. It is found that the KAERI-ITL and APR1400 have similar thermal hydraulic responses during the transient under identical accident scenarios. It is also verified that the volume scaling law, applied to the design of the KAERI-ITL, gives reasonable results that preserve the similarity between APR1400 and KAERI-ITL.

  7. Experiences with Pre-test Training for Computer Certificate Examinations

    Institute of Scientific and Technical Information of China (English)

    顾敏

    2013-01-01

    The focus of training for the computer high-tech examination is that teachers must be familiar with the format of the test, thoroughly understand the question types in the question bank, and provide students with good pre-test tutoring; combined with stimulating students' initiative to learn, the exam pass rate will be greatly improved.

  8. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

    Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10¹²/L, leukocytes 2.9×10⁹/L, platelets 59×10⁹/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  9. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  10. Hidden Variables or Positive Probabilities?

    CERN Document Server

    Rothman, T; Rothman, Tony

    2001-01-01

    Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...

  11. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac...

  14. Landau-Zener Probability Reviewed

    CERN Document Server

    Valencia, C

    2008-01-01

    We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.

  15. Understanding Students' Beliefs about Probability.

    Science.gov (United States)

    Konold, Clifford

    The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…

  16. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  17. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  18. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  19. Linear Positivity and Virtual Probability

    CERN Document Server

    Hartle, J B

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: 1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and 2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...

  20. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of some techniques from martingale theory.
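
    The paper's model randomizes the premium rate, but the flavor of such results can be checked numerically in the classical fixed-premium special case, where the exponential-claim ruin probability is explicit (a Monte Carlo sketch under assumed parameters):

      import numpy as np

      rng = np.random.default_rng(3)
      # Classical Cramer-Lundberg model with exponential claims of mean mu:
      # psi(u) = rho * exp(-(1 - rho) * u / mu), where rho = lam * mu / c.
      lam, mu, c, u = 1.0, 1.0, 1.5, 2.0
      horizon, trials = 200.0, 10_000   # finite horizon slightly undercounts
      ruined = 0
      for _ in range(trials):
          t, claims = 0.0, 0.0
          while t < horizon:
              t += rng.exponential(1 / lam)   # next claim arrival time
              claims += rng.exponential(mu)   # claim size
              if u + c * t - claims < 0:      # ruin can only occur at claims
                  ruined += 1
                  break
      rho = lam * mu / c
      print(ruined / trials, rho * np.exp(-(1 - rho) * u / mu))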

  1. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  2. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  3. Probability Ranking in Vector Spaces

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.

  4. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  5. Local Causality, Probability and Explanation

    CERN Document Server

    Healey, Richard A

    2016-01-01

    In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.

  6. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  7. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko, [No Value; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  8. The probabilities of unique events.

    Directory of Open Access Journals (Sweden)

    Sangeet S Khemlani

    Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
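
    A one-line illustration of the "split the difference" prediction (hypothetical numbers, not the paper's program): the predicted intuitive estimate can even exceed the smaller conjunct, producing a conjunction fallacy.

      # P(A and B) can never exceed min(P(A), P(B)); an estimate near the
      # midpoint of the two conjuncts therefore violates the probability calculus.
      p_a, p_b = 0.9, 0.4
      print("product if independent:", p_a * p_b)       # 0.36
      print("split-the-difference:", (p_a + p_b) / 2)   # 0.65 > min = 0.4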

  9. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

    The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a "side-scene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, depending on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  10. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  11. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  12. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Three lectures on free probability

    OpenAIRE

    2012-01-01

    These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.

  14. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be... determine their new intermediate probabilities. (g) Multiply each applicant's probability pursuant...

  15. Field-Based Video Pre-Test Counseling, Oral Testing, and Telephonic Post-Test Counseling: Implementation of an HIV Field Testing Package among High-Risk Indian Men

    Science.gov (United States)

    Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A.

    2012-01-01

    In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling…

  16. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  17. Cluster Membership Probability: Polarimetric Approach

    CERN Document Server

    Medhi, Biman J

    2013-01-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
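
    A sketch of the underlying construction (pointwise intervals only; the paper's contribution is the simultaneous 1-α calibration, which needs a further adjustment): under normality, the probability integral transform of the i-th order statistic is Beta(i, n+1-i), which gives an interval for each plotted point.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      # Pointwise bands for a normal probability plot: if X is normal, then
      # Phi(Z(i)) ~ Beta(i, n + 1 - i) for the i-th standardized order
      # statistic, so Beta quantiles mapped through Phi^{-1} bound each point.
      n, alpha = 50, 0.05
      x = np.sort(rng.normal(10.0, 2.0, n))
      z = (x - x.mean()) / x.std(ddof=1)  # standardizing with estimated mu,
                                          # sigma is itself an approximation
      i = np.arange(1, n + 1)
      lo = stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n + 1 - i))
      hi = stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n + 1 - i))
      inside = (z >= lo) & (z <= hi)
      print(f"{inside.mean():.0%} of points inside their pointwise interval")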

  19. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass being still lower than the vibration criteria. This problem is being solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are being developed, thus the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
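
    The final probability check reduces to evaluating a Gaussian tail (a sketch with assumed numbers, since the article's damping and natural-frequency parametrization is not reproduced here):

      from scipy import stats

      # Probability that the relative displacement exceeds the vibration
      # criterion, assuming a zero-mean Gaussian response; sigma and the
      # criterion (micrometers) are illustrative assumptions.
      sigma, criterion = 0.08, 0.20
      p_exceed = 2 * stats.norm.sf(criterion / sigma)  # two-sided exceedance
      print(p_exceed)  # compare against the 0.04 target quoted above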

  20. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  1. Detonation probabilities of high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  2. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  3. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  4. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  5. Pre-Test CFD for the Design and Execution of the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.

    2014-01-01

    With the increasing costs of physics experiments and simultaneous increase in availability and maturity of computational tools it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to

  6. Diagnostic accuracy of MRI in adults with suspect brachial plexus lesions: A multicentre retrospective study with surgical findings and clinical follow-up as reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Tagliafico, Alberto, E-mail: alberto.tagliafico@unige.it [Institute of Anatomy, Department of Experimental Medicine, University of Genoa, Largo Rosanna Benzi 8, 16132 Genoa (Italy); Succio, Giulia; Serafini, Giovanni [Department of Radiology, Santa Corona Hospital, Pietra Ligure, Italy via XXV Aprile, 38- Pietra Ligure, 17027 Savona (Italy); Martinoli, Carlo [Radiology Department, DISC, Università di Genova, Largo Rosanna Benzi 8, 16138 Genova (Italy)

    2012-10-15

    Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated; surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided into five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test probability (the prevalence), negative predictive value (NPV), pre- and post-test odds, likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity were 0.810 and 0.914, respectively (95% CI: 0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient basis is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well.
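
    The post-test quantities reported here follow from the pre-test probability and the likelihood ratios via Bayes' theorem in odds form. A minimal sketch using the study's summary numbers (standard formulas; the rounding is mine):

    def post_test_probability(pre_p, sens, spec, positive=True):
        """Update a pre-test probability with a test result using
        likelihood ratios (Bayes' theorem in odds form)."""
        lr = sens / (1 - spec) if positive else (1 - sens) / spec
        pre_odds = pre_p / (1 - pre_p)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)

    sens, spec, prevalence = 0.810, 0.914, 0.331   # summary values reported above
    print(post_test_probability(prevalence, sens, spec, positive=True))       # ~0.82, the reported PPV
    print(1 - post_test_probability(prevalence, sens, spec, positive=False))  # ~0.91, close to the reported NPV

    Plugging in the reported sensitivity, specificity and prevalence reproduces the paper's PPV (0.823) and, as one minus the negative post-test probability, its NPV (0.905), so the listed statistics are internally consistent.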

  7. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (about 100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
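
    For small alphabets the average number of guesses can be computed exactly by brute force, which is what the paper's approximations replace at realistic sizes. A minimal sketch under a first-order (independent-letter) model; the toy alphabet and letter probabilities are made up:

    import math
    from itertools import product

    letter_p = {"a": 0.5, "b": 0.3, "c": 0.2}   # toy letter probabilities
    word_len = 4

    # First-order model: a word's probability is the product of its letter probabilities
    probs = sorted((math.prod(letter_p[c] for c in w)
                    for w in product(letter_p, repeat=word_len)), reverse=True)

    # Guessing in decreasing order of probability: the average number of
    # guesses is the probability-weighted rank of the true word
    avg_guesses = sum(rank * p for rank, p in enumerate(probs, start=1))
    print(avg_guesses, "guesses on average out of", len(probs), "words")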

  8. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  9. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  10. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality, the probability of the empty box is always the highest. This fact is in contradistinction to sparse systems, in which the number of balls is smaller than the number of boxes (e.g., the energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...
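
    The dense-system claim is easy to check by direct counting: if every configuration of P indistinguishable balls in L distinguishable boxes is equally likely, stars-and-bars counting gives the occupancy distribution of any one box, and that distribution is maximal at zero. A quick sketch (the numbers are illustrative):

    from math import comb

    def occupancy_pmf(P, L):
        """P indistinguishable balls in L distinguishable boxes, all
        configurations equally likely: P(a given box holds k balls)."""
        total = comb(P + L - 1, L - 1)   # stars-and-bars count of configurations
        return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

    pmf = occupancy_pmf(P=100, L=5)      # dense: P >> L
    print(pmf[0] > pmf[100 // 5])        # True: empty beats the average occupancy
    print(pmf[:3])                       # monotonically decreasing "long tail"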

  11. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication, we investigate the properties of regular and absorbing fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
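
    One concrete reading of an uncertain transition matrix is an interval (an alpha-cut of each fuzzy number) per entry, with the restriction that any realized matrix must still be row-stochastic. A minimal Monte Carlo sketch of the resulting bounds on a two-step transition probability; the intervals and the sampling scheme are my own illustration, not the paper's construction:

    import random

    def sample_stochastic_row(bounds):
        """Rejection-sample a probability row lying inside per-entry
        intervals; rows must sum to 1 (the 'restricted' constraint)."""
        for _ in range(10_000):
            row = [random.uniform(lo, hi) for lo, hi in bounds]
            s = sum(row)
            row = [x / s for x in row]
            if all(lo <= x <= hi for x, (lo, hi) in zip(row, bounds)):
                return row
        raise RuntimeError("intervals admit no stochastic row")

    # Two-state chain with interval-valued transition probabilities
    bounds = [[(0.6, 0.8), (0.2, 0.4)],
              [(0.3, 0.5), (0.5, 0.7)]]

    # Monte Carlo bounds on the two-step probability of staying in state 0
    vals = []
    for _ in range(2000):
        P = [sample_stochastic_row(r) for r in bounds]
        vals.append(P[0][0] * P[0][0] + P[0][1] * P[1][0])
    print(min(vals), max(vals))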

  12. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  13. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  14. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh, Mamdani, and Lukasiewicz distributions, are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, using some properties of the probability distributions of fuzzy systems, it is demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. In addition, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. For the construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI and triple I methods may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degenerate case of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimized reasoning, it holds considerable promise for application.
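
    The "probability significance" of COG defuzzification is direct: normalizing the membership function turns it into a density, and the COG output is that density's mean, which is the mean-square-optimal point estimate. A minimal numeric sketch (the fuzzy set below is illustrative):

    import numpy as np

    def cog_defuzzify(xs, membership):
        """Center-of-gravity defuzzification: the mean of the membership
        function viewed as an (unnormalized) probability density."""
        w = membership / membership.sum()    # normalize to a distribution
        return float((xs * w).sum())

    xs = np.linspace(0.0, 10.0, 501)
    mu = np.clip(1.0 - np.abs(xs - 6.0) / 2.0, 0.0, None)  # triangular fuzzy set centred at 6
    print(cog_defuzzify(xs, mu))                           # ~6.0, the implied distribution's mean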

  15. Understanding Y haplotype matching probability.

    Science.gov (United States)

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, compared with autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important, while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method, or any sensible evidential calculation, is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence; it is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of ...

  16. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  17. Cluster pre-existence probability

    Energy Technology Data Exchange (ETDEWEB)

    Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)

    2011-10-15

    The pre-existence probability of the fragments for the complete binary spectrum of different systems such as {sup 56}Ni, {sup 116}Ba, {sup 226}Ra and {sup 256}Fm is calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, the pre-existence probability is calculated even for negative Q-value systems. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)

  18. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  19. Sm Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Sneden, C; Cowan, J J

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).

  20. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  1. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability, as has been observed.

  2. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...[3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... (P.N. Johnson-Laird, Sangeet S. Khemlani, and Geoffrey P. Goodwin)

  3. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  4. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  5. Objective probability and quantum fuzziness

    CERN Document Server

    Mohrhoff, U

    2007-01-01

    This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...

  6. Empirical and Computational Tsunami Probability

    Science.gov (United States)

    Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.

    2008-12-01

    A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
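
    As a sketch of the computational route described here, the pieces compose simply: a size distribution (for example a tapered power law) gives the fraction of events above a runup threshold, and a Poisson occurrence model turns a rate into an exceedance probability over a time horizon. All parameters below are illustrative, and the plain Poisson assumption ignores the temporal clustering the abstract mentions:

    import math

    def tapered_pareto_sf(x, beta, x0, xc):
        """Survival function of a tapered Pareto size distribution:
        power-law decay with an exponential taper at corner size xc."""
        return (x0 / x) ** beta * math.exp((x0 - x) / xc)

    def exceedance_probability(rate, frac_above, horizon_years):
        """Poisson occurrence: P(at least one event above threshold in T years)."""
        return 1.0 - math.exp(-rate * frac_above * horizon_years)

    frac = tapered_pareto_sf(x=3.0, beta=1.0, x0=0.5, xc=10.0)   # events with runup > 3 m
    print(exceedance_probability(rate=0.2, frac_above=frac, horizon_years=50))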

  7. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design in support of ...

  8. PROBABILITY MODEL OF GUNTHER GENERATOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is a sequence of independent, uniformly distributed binary random variables. This gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and the input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.

  9. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  10. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  11. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  12. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  13. Estimation of the mediastinal involvement probability in non-small cell lung cancer: a statistical definition of the clinical target volume for 3-dimensional conformal radiotherapy?

    Energy Technology Data Exchange (ETDEWEB)

    Giraud, P.; Dubray, B.; Helfre, S.; Dauphinot, C.; Rosenwald, J.C.; Cosset, J.M. [Institut Curie, Dept. d' Oncologie-Radiotherapie, 75 - Paris (France); Rycke, Y. de [Institut Curie, Dept. de Biostatistiques, 75 - Paris (France); Minet, P. [Centre Hospitalier Universitaire, Service de Radiotherapie, Liege (Belgium); Danhier, S. [Hopital Europeen Georges-Pompidou, Service de Radiotherapie, 75 - Paris (France)

    2001-12-01

    Purpose. - Conformal irradiation of non-small cell lung carcinoma (NSCLC) is largely based on a precise definition of the nodal clinical target volume (CTVn). Reducing the number of nodal stations to be irradiated would make tumor dose escalation more achievable. The aim of this work was to design a mathematical tool, based on documented data, that predicts the risk of metastatic involvement for each nodal station. Methods and material. - From the large surgical series published in the literature, we identified the main pre-treatment parameters that modify the risk of nodal invasion. The probability of involvement for the 17 nodal stations described by the American Thoracic Society (ATS) was computed from all these publications and then weighted according to the French epidemiological data. Starting from the primary location of the tumor as the main characteristic, we built a probabilistic tree for each nodal station representing the risk distribution as a function of each tumor feature. From the statistical point of view, we used the inversion of probability trees method described by Weinstein and Feinberg. Results. - Taking into account all the different parameters of the pre-treatment staging relative to each level of the ATS map brings up to 20,000 different combinations. The first parameters chosen in the tree were, depending on the tumor location, the histological classification, the metastatic stage, the nodal stage weighted as a function of the sensitivity and specificity of the diagnostic examination used (PET scan, CT scan), and the tumoral stage. Software is proposed to compute the predicted probability of involvement of each nodal station for any given clinical presentation. Conclusion. - To better define the CTVn in NSCLC 3DRT, we propose software that evaluates the mediastinal nodal involvement risk from easily accessible individual pre-treatment parameters. (authors)

  14. Hf Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...

  15. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  16. The neurologic examination in patients with probable Alzheimer's disease.

    Science.gov (United States)

    Huff, F J; Boller, F; Lucchelli, F; Querriera, R; Beyer, J; Belle, S

    1987-09-01

    Abnormal findings on a standardized neurologic examination were compared between patients with a clinical diagnosis of probable Alzheimer's disease (AD) and healthy control subjects. Aside from mental status findings, the most useful examination findings for differentiating AD from control subjects were the presence of release signs, olfactory deficit, impaired stereognosis or graphesthesia, gait disorder, tremor, and abnormalities on cerebellar testing. These abnormalities probably reflect the different areas of the central nervous system that are affected pathologically in AD. In the clinical diagnosis of AD, particular attention should be given to these aspects of the neurologic examination.

  17. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    The Author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The Author points out some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  18. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  19. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.

  20. Probability landscapes for integrative genomics

    Directory of Open Access Journals (Sweden)

    Benecke Arndt

    2008-05-01

    Background: The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s)-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results: We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion: Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and for the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a ...

  1. Dutch translation and cross-cultural adaptation of the PROMIS® physical function item bank and cognitive pre-test in Dutch arthritis patients

    NARCIS (Netherlands)

    Oude Voshaar, Martijn A.H.; ten Klooster, Peter M.; Taal, Erik; Krishnan, Eswar; van de Laar, Mart A.F.J.

    2012-01-01

    INTRODUCTION: Patient-reported physical function is an established outcome domain in clinical studies in rheumatology. To overcome the limitations of the current generation of questionnaires, the Patient-Reported Outcomes Measurement Information System (PROMIS®) project in the USA has developed cal

  2. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  3. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  4. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance-channel mass asymmetry, η (or charge product, ZpZt), as well as with the fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections ...

  5. Pre-Test and Post-Test Applications to Shape the Education of Phlebotomists in A Quality Management Program: An Experience in A Training Hospital

    Directory of Open Access Journals (Sweden)

    Aykal Güzin

    2016-09-01

    Background: After the introduction of modern laboratory instruments and information systems, the preanalytical phase is the new field of battle. Errors in the preanalytical phase account for approximately half of total errors in the clinical laboratory. The objective of this study was to share the experience of an education program that was believed to be successful in decreasing the number of rejected samples received from the Emergency Department (ED).

  6. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) with 0 < α < 1 (so that w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
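
    For reference, the Prelec form is a one-liner: with 0 < α < 1 it overweights small probabilities and underweights large ones, and it fixes w(1/e) = 1/e for every α. A minimal sketch (α = 0.65 is an arbitrary illustrative value):

    import math

    def prelec_w(p, alpha=0.65):
        """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha)."""
        if p == 0.0:
            return 0.0
        return math.exp(-(-math.log(p)) ** alpha)

    for p in (0.01, 1 / math.e, 0.5, 0.99):
        print(f"w({p:.3f}) = {prelec_w(p):.3f}")   # inverse-S: w(0.01) > 0.01, w(0.99) < 0.99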

  7. The Black Hole Formation Probability

    CERN Document Server

    Clausen, Drew; Ott, Christian D

    2014-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...

  8. Failure-probability driven dose painting

    Energy Technology Data Exchange (ETDEWEB)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: For patients treated at our center, five tumor subvolumes, extending outward from the center of the tumor (the PET-positive volume), are delineated. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose-response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose-response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%-91%), as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  9. Joint Probability Models of Radiology Images and Clinical Annotations

    Science.gov (United States)

    Arnold, Corey Wells

    2009-01-01

    Radiology data, in the form of images and reports, is growing at a high rate due to the introduction of new imaging modalities, new uses of existing modalities, and the growing importance of objective image information in the diagnosis and treatment of patients. This increase has resulted in an enormous set of image data that is richly annotated…

  10. Clinical utility of acoustic radiation force impulse imaging for identification of malignant liver lesions: a meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ying, Li; Lin, Xiao; Xie, Zuo-Liu; Tang, Fei-Yun; Hu, Yuan-Ping [First Affiliated Hospital of Wenzhou Medical College, Department of Ultrasonography, Wenzhou (China); Shi, Ke-Qing [First Affiliated Hospital of Wenzhou Medical College, Department of Infection and Liver Diseases, Institution of Hepatology, Wenzhou (China)

    2012-12-15

    To assess the performance of acoustic radiation force impulse (ARFI) imaging for identification of malignant liver lesions using meta-analysis. PubMed, the Cochrane Library, the ISI Web of Knowledge and the China National Knowledge Infrastructure were searched. Studies published in English or Chinese relating to the accuracy of ARFI imaging for identification of malignant liver lesions were collected. A hierarchical summary receiver operating characteristic (HSROC) curve was used to examine the ARFI imaging accuracy. The clinical utility of ARFI imaging for identification of malignant liver lesions was evaluated by Fagan plot analysis. A total of eight studies, which included 590 liver lesions, were analysed. The summary sensitivity and specificity for identification of malignant liver lesions were 0.86 (95% confidence interval (CI) 0.74-0.93) and 0.89 (95% CI 0.81-0.94), respectively. The HSROC was 0.94 (95% CI 0.91-0.96). For ARFI imaging results above the cut-off value for malignant liver lesions (a "positive" result), the corresponding post-test probability of malignancy (at a pre-test probability of 50%) was 89%; for a "negative" result, the post-test probability was 13%. ARFI imaging has a high accuracy in the classification of liver lesions. (orig.)

  11. Conditional probability modulates visual search efficiency.

    Science.gov (United States)

    Cort, Bryan; Anderson, Britt

    2013-01-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability (the likelihood of a particular color given a particular combination of two cues) varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  12. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability (the likelihood of a particular color given a particular combination of two cues) varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  13. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
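
    The classroom computation reduces to proportions: each side's probability is its share of the spinner's circumference (or weight), which a quick simulation confirms. A minimal sketch with made-up sector sizes:

    import random
    from collections import Counter

    sectors = {"red": 180, "blue": 120, "green": 60}        # sector widths in degrees
    theoretical = {s: w / 360 for s, w in sectors.items()}  # probability = share of the circle

    spins = random.choices(list(sectors), weights=sectors.values(), k=100_000)
    empirical = {s: n / len(spins) for s, n in Counter(spins).items()}
    print(theoretical)
    print(empirical)   # close to theoretical for large k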

  14. Pre-test analysis of an integral effect test facility for thermal-hydraulic similarities of 6 inches coldleg break and DVI injection line break using MARS-1D

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae Soon; Choi, Ki Yong; Park, Hyun Sik; Euh, Dong Jin; Baek, Won Pil [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    A pre-test analysis of a small-break loss-of-coolant accident (SBLOCA, DVI line break) has been performed for the integral effect test loop of the Korea Atomic Energy Research Institute (KAERI-ITL), the construction of which will be started soon. The KAERI-ITL is a full-height, 1/310 volume-scaled test facility based on the design features of the APR1400 (Korean Next Generation Reactor). This paper briefly introduces the basic design features of the KAERI-ITL and presents the results of a pre-test analysis for a postulated cold leg SBLOCA and DVI line break. Based on the same control logics and accident scenarios, the similarity between the KAERI-ITL and the prototype plant, APR1400, is evaluated by using the MARS code, which is a multi-dimensional best-estimate thermal hydraulic code being developed by KAERI. It is found that the KAERI-ITL and APR1400 have similar thermal hydraulic responses for the analyzed SBLOCA and DVI line break scenario. It is also verified that the volume scaling law applied to the design of the KAERI-ITL gives reasonable results in keeping similarity with the APR1400. 11 refs., 19 figs., 3 tabs. (Author)

  15. Subcutaneous mucormycosis caused by Rhizopus oryzae: probable nosocomial acquired infection

    Directory of Open Access Journals (Sweden)

    Flávio de Queiroz Telles Filho

    1985-08-01

    The Authors present a case of subcutaneous mucormycosis occurring in a patient with clinical and biochemical evidence of diabetic ketoacidosis. The clinical, mycological and histopathological features are described, emphasizing the relevance of a rapid diagnosis in order to establish early treatment. The clinical forms of mucormycosis and the main associated conditions are briefly reviewed, as well as the conditions most likely to lead to enhanced susceptibility to infection in the diabetic patient in ketoacidosis. The recovery of Rhizopus oryzae from the air of the patient's room suggests a nosocomial infection acquired through contamination of the venous puncture site by airborne spores.

  16. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. … probabilities are indeed best characterized as probability distributions with non-zero variance.

  17. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  18. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  19. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  20. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  1. Clinical applicability of D-dimer assay in the diagnosis of pulmonary embolism reduces with aging

    Directory of Open Access Journals (Sweden)

    Luca Masotti

    2007-12-01

    Full Text Available Although modern algorithms have been proposed for the diagnosis of pulmonary embolism (PE), it remains underestimated and often missed in clinical practice, especially in elderly patients, resulting in high morbidity and mortality when not treated early and correctly. One of the main controversial issues is the role and applicability of D-dimer in the diagnostic work-up of geriatric patients. Most recent guidelines for young-adult patients suggest performing the D-dimer assay by ELISA or immunoturbidimetric methods only in patients with non-high pre-test clinical probability (PTP); in these patients a negative D-dimer can safely rule out the diagnosis of PE. This strategy is also safe in elderly patients; however, the percentage of patients with non-high PTP and negative D-dimer decreases progressively with age, limiting its clinical applicability. The authors, starting from two case reports, update the diagnostic management of PE, underlining the limitations of the D-dimer assay in elderly patients.

  2. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  4. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  5. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
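
    The integral-equation machinery of the paper is not reproduced here, but a brute-force Monte Carlo estimate of a first-passage probability, of the kind such approximations are usually checked against, is easy to sketch; all parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-degree-of-freedom linear oscillator driven by Gaussian white noise:
#   x'' + 2*zeta*w0*x' + w0^2 * x = sigma * w(t)
# Parameter values are illustrative, not from the paper.
w0, zeta, sigma = 2 * np.pi, 0.05, 5.0
barrier, T, dt = 1.2, 10.0, 1e-3
n_paths = 5_000
n_steps = int(T / dt)

x = np.zeros(n_paths)
v = np.zeros(n_paths)
crossed = np.zeros(n_paths, dtype=bool)

# Euler-Maruyama time stepping; a path "fails" on first exit from |x| < barrier.
for _ in range(n_steps):
    a = -2 * zeta * w0 * v - w0**2 * x
    v = v + a * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    x = x + v * dt
    crossed |= np.abs(x) > barrier

print(f"estimated first-passage probability in [0, {T}]: {crossed.mean():.4f}")
```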

  6. On the computability of conditional probability

    CERN Document Server

    Ackerman, Nathanael L; Roy, Daniel M

    2010-01-01

    We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...

  7. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
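
    Step (v) of the procedure reduces to a simple raster operation. A minimal sketch, with random arrays standing in for the real release, impact, and zonal rasters:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy rasters standing in for the real inputs (values in [0, 1]); in the paper
# these come from a statistical release model, the angle-of-reach CDF, and
# empirical zonal-probability curves, respectively.
shape = (200, 200)
p_release = rng.uniform(0.0, 0.2, shape)   # pixel-based release probability
p_impact = rng.uniform(0.0, 1.0, shape)    # impact probability (angle of reach)
p_zonal = rng.uniform(0.0, 0.6, shape)     # zonal release probability

# Step (v): integrated spatial landslide probability = the maximum of the
# release probability and (impact probability * zonal release probability).
p_integrated = np.maximum(p_release, p_impact * p_zonal)

print("mean integrated probability:", p_integrated.mean())
```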

  8. Bell Could Become the Copernicus of Probability

    Science.gov (United States)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
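
    The CHSH example mentioned at the end can be made concrete in a few lines: for the singlet state the correlation of measurements along angles a and b is E(a, b) = -cos(a - b), and the standard angle choices give |S| = 2*sqrt(2), beyond the classical (local, Kolmogorovian) bound of 2. A textbook illustration, not code from the paper:

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation of spin measurements along angles a and b."""
    return -np.cos(a - b)

# Standard CHSH angle choices (in radians).
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4

S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(f"S = {S:.4f}")  # magnitude ~ 2.828, i.e. 2*sqrt(2)
print("classical bound:", 2.0, "  quantum (Tsirelson) bound:", 2 * np.sqrt(2))
```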

  9. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
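
    As a point of reference (the paper's modified estimators are not reproduced here), a plain Harrell-type c-index, with ties in the risk score counted as 1/2, the relevant case for discrete risk groups, can be sketched as follows; the data are synthetic.

```python
import numpy as np

def c_index(time, event, risk):
    """Harrell's c-index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter observed
    time had an event; it is concordant when that subject also has the higher
    risk score.  Tied risk scores count 1/2, which matters for discrete scores.
    """
    n_conc, n_comp = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue
        for j in range(n):
            if time[i] < time[j]:          # i failed first, j still at risk
                n_comp += 1
                if risk[i] > risk[j]:
                    n_conc += 1.0
                elif risk[i] == risk[j]:
                    n_conc += 0.5
    return n_conc / n_comp

# Toy data: three discrete risk groups (0 = low, 1 = mid, 2 = high).
rng = np.random.default_rng(1)
group = rng.integers(0, 3, 300)
time = rng.exponential(1.0 / (0.5 + group))   # higher group -> shorter survival
cens = rng.exponential(2.0, 300)
event = time <= cens
obs = np.minimum(time, cens)

print(f"c-index: {c_index(obs, event, group):.3f}")
```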

  10. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  11. UT Biomedical Informatics Lab (BMIL probability wheel

    Directory of Open Access Journals (Sweden)

    Sheng-Cheng Huang

    2016-01-01

    Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  12. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  13. Total variation denoising of probability measures using iterated function systems with probabilities

    Science.gov (United States)

    La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.

    2017-01-01

    In this paper we present a total variation denoising problem for probability measures using the set of fixed point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.

  14. Bayesian Probabilities and the Histories Algebra

    OpenAIRE

    Marlow, Thomas

    2006-01-01

    We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.

  15. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  16. Non-Boolean probabilities and quantum measurement

    Energy Technology Data Exchange (ETDEWEB)

    Niestegge, Gerd

    2001-08-03

    A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)

  17. Data analysis recipes: Probability calculus for inference

    CERN Document Server

    Hogg, David W

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.

  18. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade of pain. Pretest likelihood of disease was estimated, and all patients underwent myocardial perfusion scintigraphy (MPS) followed by CA an average of 78 days later. For analysis, the investigators focused on the approximate groups of patients with more severe disease, ie, typical angina (n=187), Canadian...

  19. Probabilities are single-case, or nothing

    CERN Document Server

    Appleby, D M

    2004-01-01

    Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.

  20. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  1. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of the stationary transition probability of Markov processes on a general measurable space (E, ε), such as continuity, maximum probability, zero points, and positive probability set standardization, and obtain a series of important results, including a Continuity Theorem, a Representation Theorem, and a Levy Theorem. These results are very useful for studying the stationary tri-point transition probability on a general measurable space (E, ε). Our main tools, such as Egoroff's Theorem, the Vitali-Hahn-Saks Theorem, and the theory of atomic sets and well-posedness of measures, are also of independent interest.

  2. When Index Term Probability Violates the Classical Probability Axioms Quantum Probability can be a Necessary Theory for Information Retrieval

    CERN Document Server

    Melucci, Massimo

    2012-01-01

    Probabilistic models require the notion of event space for defining a probability measure. An event space has a probability measure which obeys the Kolmogorov axioms. However, the probabilities observed from distinct sources, such as that of relevance of documents, may not admit a single event space, thus causing some issues. In this article, some results are introduced for ensuring whether the observed probabilities of relevance of documents admit a single event space. Moreover, an alternative framework of probability is introduced, thus challenging the use of classical probability for ranking documents. Some reflections are made on the convenience of extending classical probabilistic retrieval toward a more general framework which encompasses these issues.

  3. Analytical Study of Thermonuclear Reaction Probability Integrals

    CERN Document Server

    Chaudhry, M A; Mathai, A M

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
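
    The closed forms in terms of extended gamma functions are not reproduced here, but the underlying integral is easy to evaluate numerically for a constant S(E); units and parameter values below are arbitrary.

```python
import numpy as np
from scipy.integrate import quad

# Non-resonant thermonuclear reaction probability integral
#   I = integral_0^inf S(E) * exp(-E/kT - b/sqrt(E)) dE
# with a constant S-factor.  kT and b are in arbitrary consistent units; this
# is only a direct numerical evaluation, not the paper's analytic results.
S0 = 1.0   # constant astrophysical S-factor
kT = 1.0   # thermal energy
b = 5.0    # Gamow penetration constant (depends on charges and masses)

integrand = lambda E: S0 * np.exp(-E / kT - b / np.sqrt(E))
I, err = quad(integrand, 0.0, np.inf)
print(f"I = {I:.6e}  (quad error estimate {err:.1e})")

# The integrand peaks at the Gamow energy E0 = (b*kT/2)**(2/3).
print(f"Gamow peak E0 = {(b * kT / 2) ** (2 / 3):.3f}")
```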

  4. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  5. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
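
    One way to see the equivalence of the usual approaches is to compute the same without-replacement probability twice, once by hypergeometric counting and once as a product of conditional probabilities; a small sketch with made-up urn contents:

```python
from math import comb

# P(exactly k red in n draws without replacement) from an urn with
# R red and B blue balls, computed two ways.
R, B, n, k = 5, 7, 4, 2

# 1) Hypergeometric counting.
p_hyper = comb(R, k) * comb(B, n - k) / comb(R + B, n)

# 2) Sequential conditional probabilities for one ordering (RRBB here),
#    multiplied by the number of orderings of k reds among n draws.
p_seq = 1.0
red_left, blue_left, total_left = R, B, R + B
for draw in range(n):
    if draw < k:                     # draw a red
        p_seq *= red_left / total_left
        red_left -= 1
    else:                            # draw a blue
        p_seq *= blue_left / total_left
        blue_left -= 1
    total_left -= 1
p_seq *= comb(n, k)                  # every ordering is equally likely

print(f"hypergeometric: {p_hyper:.6f}   sequential: {p_seq:.6f}")
```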

  6. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  7. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  8. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
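
    A toy life-table computation shows the idea: from death probabilities q_x one builds the survivorship column l_x and obtains life expectancy as a sum of survival probabilities. The numbers below are deliberately coarse and purely illustrative.

```python
# A toy life table: q[x] is the probability of dying between exact ages
# x and x+1.  Real tables have one entry per year of age; these numbers
# are illustrative only.
q = [0.005] * 40 + [0.02] * 30 + [0.08] * 20 + [1.0]   # ages 0..90

radix = 100_000                 # conventional starting cohort size
l = [radix]                     # l[x]: survivors to exact age x
for qx in q:
    l.append(l[-1] * (1 - qx))

def curtate_expectancy(x):
    """Expected whole years of life remaining at age x: sum_{y>x} l[y] / l[x]."""
    return sum(l[x + 1:]) / l[x]

for age in (0, 40, 70):
    # adding ~0.5 approximates the complete expectation of life
    print(f"e_{age} ~ {curtate_expectancy(age) + 0.5:.1f} years")
```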

  9. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
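
    The recurrence-relation results of the paper are not reproduced here, but the contrast between the average transmission <T> and the "typical" value exp<ln T> can be demonstrated by composing identical lossless slabs with random gap phases via the standard Fabry-Perot (Airy) formulas; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# One lossless symmetric slab: amplitude reflection r = i*sqrt(R) and
# transmission t = sqrt(1 - R), a standard unitary parameterization.
R_slab = 0.3
r_s = 1j * np.sqrt(R_slab)
t_s = np.sqrt(1.0 - R_slab)

def stack_transmission(n_slabs, rng):
    """Intensity transmission of n identical slabs separated by gaps with
    uniformly random propagation phases (Airy / Fabry-Perot composition)."""
    t_c, r_c, rp_c = t_s, r_s, r_s          # composite t, left r, right r
    for _ in range(n_slabs - 1):
        phi = rng.uniform(0.0, 2.0 * np.pi)
        ph, ph2 = np.exp(1j * phi), np.exp(2j * phi)
        denom = 1.0 - rp_c * r_s * ph2      # multiple-bounce factor
        t_new = t_c * t_s * ph / denom
        r_new = r_c + t_c**2 * r_s * ph2 / denom
        rp_c = r_s + t_s**2 * rp_c * ph2 / denom
        t_c, r_c = t_new, r_new
    return abs(t_c) ** 2

T = np.array([stack_transmission(10, rng) for _ in range(20_000)])
print(f"<T>         = {T.mean():.4f}")
print(f"exp(<ln T>) = {np.exp(np.log(T).mean()):.4f}   (the usual 'typical' value)")
```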

  10. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability

  11. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  12. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  13. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  14. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  15. Transition probability spaces in loop quantum gravity

    CERN Document Server

    Guo, Xiao-Kan

    2016-01-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is achieved by first checking such structures in covariant quantum mechanics, and then passing to spin foam models via the general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the Hilbert space of the canonical theory and the relevant quantum logical structure. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize property transitions and causality in this categorical context in connection with presheaves on quantaloids and respectively causal categories. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  16. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  17. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  18. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  19. Survival probability in patients with liver trauma.

    Science.gov (United States)

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
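
    The patient records are of course not available, but the kind of logistic model described can be sketched on synthetic data; the predictors and coefficients below are illustrative stand-ins, not the study's estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Synthetic stand-ins for the predictors the study reports as significant:
# injury grade (1-5), multi-organ involvement (0/1), days hospitalized,
# and treatment mode (0 = conservative, 1 = intervention).
n = 173
grade = rng.integers(1, 6, n)
multi = rng.integers(0, 2, n)
days = rng.poisson(10, n)
interv = rng.integers(0, 2, n)

# Assumed latent model, used only to generate example outcomes.
logit = 4.0 - 0.9 * grade - 0.8 * multi + 0.02 * days - 0.4 * interv
survived = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([grade, multi, days, interv])
model = LogisticRegression().fit(X, survived)

# Predicted survival probability for a hypothetical new patient:
# grade 3, multi-organ trauma, 12 days in hospital, conservative treatment.
p = model.predict_proba([[3, 1, 12, 0]])[0, 1]
print(f"predicted survival probability: {p:.2f}")
```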

  20. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
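
    One of the identified problems is easy to reproduce by simulation: two groups with the same latent effect but different residual scales yield different linear probability model coefficients. A minimal sketch, with all numbers illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def lpm_slope(scale, n=200_000):
    """OLS slope of a linear probability model when the binary outcome is
    generated by a latent-variable model with residual scale `scale`."""
    x = rng.normal(size=n)
    y_star = 1.0 * x + scale * rng.logistic(size=n)   # identical latent effect
    y = (y_star > 0).astype(float)
    X = np.column_stack([np.ones(n), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return beta[1]

# Same true latent effect (1.0) in both groups; only the residual scale differs,
# yet the fitted LPM coefficients diverge.
print(f"group A (scale 1.0): LPM slope = {lpm_slope(1.0):.3f}")
print(f"group B (scale 2.0): LPM slope = {lpm_slope(2.0):.3f}")
```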

  1. Basic Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems--as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability theory. These key concepts are all presented in the first

  2. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  3. Tomographic probability representation for quantum fermion fields

    CERN Document Server

    Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D

    2009-01-01

    Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.

  4. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  5. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  6. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype examples. Absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.

  7. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  8. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate … multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  9. Inclusion probability with dropout: an operational formula.

    Science.gov (United States)

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
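
    The dropout-adjusted formula derived in the paper is not reproduced here, but the classical no-dropout computation it generalizes (per-locus PI as the squared sum of visible-allele frequencies, multiplied across loci) is only a few lines; the allele frequencies below are invented.

```python
# Classical (no-dropout) probability of inclusion:
# at one locus, PI = (sum of population frequencies of the alleles visible
# in the mixture)^2; across loci, the cumulative PI is the product.
# Frequencies are illustrative, not from any real database.
visible_allele_freqs = {
    "D3S1358": [0.25, 0.22, 0.14],
    "vWA":     [0.28, 0.19],
    "FGA":     [0.21, 0.17, 0.09, 0.05],
}

cpi = 1.0
for locus, freqs in visible_allele_freqs.items():
    pi = sum(freqs) ** 2
    cpi *= pi
    print(f"{locus:8s} PI = {pi:.4f}")

print(f"cumulative PI over {len(visible_allele_freqs)} loci = {cpi:.6f}")
```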

  10. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.

  11. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698) * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  12. Teaching Elementary Probability Through its History.

    Science.gov (United States)

    Kunoff, Sharon; Pines, Sylvia

    1986-01-01

    Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)

  13. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  14. Comparison of the clinical value of three pre-test probability scores in the prediction of pulmonary embolism in patients admitted to the department of cardiology

    Institute of Scientific and Technical Information of China (English)

    鲁锦国; 陈静; 陈鑫; 蒋萍; 唐成; 苏晞

    2012-01-01

    Objective: To compare the predictive value of three clinical assessment scores for pulmonary embolism (PE) in patients with suspected PE in the department of cardiology. Methods: Patients hospitalized in our department of cardiology with suspected PE between January and October 2010 were scored with the Wells, Geneva, and revised Geneva scores and underwent 64-slice CT pulmonary angiography (CTPA). The predictive value of the three scores was compared by the area under the receiver operating characteristic (ROC) curve. Results: Of 175 patients, 33 (18.9%) were diagnosed with PE by CTPA. The Wells and Geneva scores were significantly higher in PE patients than in non-PE patients (P<0.05), whereas the revised Geneva score did not differ significantly between the two groups (P>0.05). Among patients classified by the Wells score as low, intermediate, and high probability, PE was confirmed in 9.3% (10/108), 32.3% (21/65), and 100% (2/2), respectively; by the Geneva score, in 15.8% (21/133), 23.1% (9/39), and 100% (3/3); and by the revised Geneva score, in 14.8% (16/108), 24.2% (16/66), and 100% (1/1). The areas under the ROC curve for the prediction of PE were 0.77±0.06 (Wells), 0.63±0.06 (Geneva), and 0.61±0.05 (revised Geneva); the Wells score had the largest area, and the difference was statistically significant (P<0.05). Conclusion: Of the three scores, the Wells score has the highest predictive value for PE in the department of cardiology and can serve as a basic clinical screening method for PE.
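
    The AUC comparison at the heart of the study can be sketched with standard tools; the scores below are synthetic stand-ins loosely calibrated to the reported discrimination, not the actual patient data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)

# Synthetic stand-ins: 175 patients, ~18.9% PE prevalence, and three scores
# whose discriminating power roughly mirrors the reported AUCs.
n = 175
pe = rng.random(n) < 0.189

def noisy_score(signal_strength):
    """Higher signal_strength means better separation of PE from non-PE."""
    return signal_strength * pe + rng.normal(0, 1, n)

scores = {
    "Wells": noisy_score(1.1),
    "Geneva": noisy_score(0.5),
    "revised Geneva": noisy_score(0.4),
}

for name, s in scores.items():
    print(f"{name:15s} AUC = {roc_auc_score(pe, s):.2f}")
```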

  15. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  16. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  17. De Finetti's contribution to probability and statistics

    OpenAIRE

    Cifarelli, Donato Michele; Regazzini, Eugenio

    1996-01-01

    This paper summarizes the scientific activity of de Finetti in probability and statistics. It falls into three sections: Section 1 includes an essential biography of de Finetti and a survey of the basic features of the scientific milieu in which he took the first steps of his scientific career; Section 2 concerns de Finetti's work in probability: (a) foundations, (b) processes with independent increments, (c) sequences of exchangeable random variables, and (d) contributions which fall within ...

  18. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  19. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model with a linear model used to model the claim process. The time is discretized stochastically using the times when claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds on the ruin probability for an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
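
    The paper's analytic bounds for the linear time-series claim model are not reproduced here; as a baseline, a Monte Carlo estimate of the finite-horizon ruin probability for the classical compound-Poisson surplus process looks like this (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(13)

# Classical Cramer-Lundberg surplus process
#   U(t) = u + c*t - (sum of claims up to t),
# simulated at claim instants with i.i.d. exponential claims.  This is only
# a point of comparison; the paper's linear time-series claim model differs.
u, c = 10.0, 1.2             # initial capital, premium rate
lam, mean_claim = 1.0, 1.0   # Poisson claim rate, mean claim size
horizon = 100.0
n_paths = 10_000

ruined = 0
for _ in range(n_paths):
    t, surplus = 0.0, u
    while True:
        w = rng.exponential(1.0 / lam)   # waiting time to next claim
        t += w
        if t > horizon:
            break
        surplus += c * w - rng.exponential(mean_claim)
        if surplus < 0:
            ruined += 1
            break

print(f"estimated ruin probability for t <= {horizon}: {ruined / n_paths:.4f}")
```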

  20. Survival probability for open spherical billiards

    Science.gov (United States)

    Dettmann, Carl P.; Rahman, Mohammed R.

    2014-12-01

    We study the survival probability for long times in an open spherical billiard, extending previous work on the circular billiard. We provide details of calculations regarding two billiard configurations, specifically a sphere with a circular hole and a sphere with a square hole. The constant terms of the long-time survival probability expansions have been derived analytically. Terms that vanish in the long time limit are investigated analytically and numerically, leading to connections with the Riemann hypothesis.

  1. Data analysis recipes: Probability calculus for inference

    OpenAIRE

    Hogg, David W.

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...

  2. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer…

  3. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
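
    A hedged sketch of the zero-inflated binomial likelihood the abstract describes, with a crude grid-search MLE. The detection histories and visit count are invented, and a real analysis would place a mixing distribution on p rather than the constant used here.

      import math
      from itertools import product

      # Site i is occupied with probability psi; if occupied, y_i detections in
      # J visits are Binomial(J, p). An all-zero history can come from either
      # absence or non-detection, hence the zero inflation.
      def occupancy_loglik(psi, p, histories, J):
          ll = 0.0
          for y in histories:
              binom = math.comb(J, y) * p**y * (1 - p)**(J - y)
              ll += math.log(psi * binom + (1 - psi) * (1.0 if y == 0 else 0.0))
          return ll

      # Toy detection histories: detections out of J = 5 visits at 8 sites.
      histories, J = [0, 0, 1, 0, 3, 0, 2, 0], 5

      # Crude grid search for the maximum-likelihood (psi, p).
      grid = [i / 50 for i in range(1, 50)]
      psi_hat, p_hat = max(product(grid, grid),
                           key=lambda t: occupancy_loglik(t[0], t[1], histories, J))
      print(f"psi_hat = {psi_hat:.2f}, p_hat = {p_hat:.2f}")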

  4. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  5. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.
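
    A numeric Simpson's-paradox example of the kind a BK-Plot is designed to display (the classic kidney-stone-style numbers below are illustrative, not from the paper): treatment looks worse in the aggregate yet better within each stratum of a binary confounder.

      strata = {
          "mild":   {"treated": (81, 87),   "control": (234, 270)},  # (successes, n)
          "severe": {"treated": (192, 263), "control": (55, 80)},
      }

      # Within each stratum the treated arm does better...
      for s, arms in strata.items():
          rt = arms["treated"][0] / arms["treated"][1]
          rc = arms["control"][0] / arms["control"][1]
          print(f"{s:>6}: treated {rt:.2f} vs control {rc:.2f}")

      # ...but aggregating over strata reverses the ordering.
      for arm in ("treated", "control"):
          k = sum(strata[s][arm][0] for s in strata)
          n = sum(strata[s][arm][1] for s in strata)
          print(f"overall {arm}: {k/n:.2f}")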

  6. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
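
    A hedged sketch of the counting idea just described: the number of small events since the last large one is converted into a probability through a Weibull law. The shape and scale values below are hypothetical placeholders, not fitted parameters from the study.

      import math

      def large_event_probability(n_small, k=1.4, s=100.0):
          """Weibull CDF evaluated at the small-event count (k, s hypothetical)."""
          return 1.0 - math.exp(-((n_small / s) ** k))

      for n in (25, 50, 100, 200):
          print(f"{n:>3} small events -> P(large) ~ {large_event_probability(n):.2f}")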

  7. Approximation of Failure Probability Using Conditional Sampling

    Science.gov (United States)

    Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
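
    An illustrative sketch of the idea, not the authors' algorithm: when a set B of analytically known probability contains the failure set, P(fail) = P(B) * P(fail | B), and the expensive failure check only runs on the samples that land in B. The toy failure condition and bounding set below are invented.

      import math, random

      random.seed(2)

      def phi(z):                                # standard normal CDF
          return 0.5 * (1 + math.erf(z / math.sqrt(2)))

      fails = lambda x, y: x + y > 4.0           # "expensive" failure condition
      in_B  = lambda x, y: x + y > 3.0           # bounding set containing the failure set

      p_B = 1 - phi(3.0 / math.sqrt(2))          # analytic: x + y ~ N(0, 2)

      # Conditional sampling: draw from B by rejection, estimate P(fail | B).
      n_cond, got, fail_count = 2000, 0, 0
      while got < n_cond:
          x, y = random.gauss(0, 1), random.gauss(0, 1)
          if in_B(x, y):                         # cheap geometric test
              got += 1
              fail_count += fails(x, y)          # expensive model run, only inside B
      p_fail = p_B * fail_count / n_cond
      print(f"P(B) = {p_B:.5f}, P(fail|B) ~ {fail_count/n_cond:.3f}, P(fail) ~ {p_fail:.6f}")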

  8. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  9. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  10. A Thermodynamical Approach for Probability Estimation

    CERN Document Server

    Isozaki, Takashi

    2012-01-01

    The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers over-fitting when insufficient data is available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, where energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small size data.

  11. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.

  12. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  13. Probability, Arrow of Time and Decoherence

    CERN Document Server

    Bacciagaluppi, G

    2007-01-01

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  14. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  15. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
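
    For the probit special case, the correlation in question follows from the standard latent-variable identity sketched below; the paper's method is more general and includes significance tests, and the values here are hypothetical.

      import math

      # With y* = b*x + e, e ~ N(0, 1), the correlation between y* and x is
      # corr(y*, x) = b * sd(x) / sqrt(b^2 * var(x) + 1).
      def latent_corr(b, sd_x):
          return b * sd_x / math.sqrt(b**2 * sd_x**2 + 1)

      print(f"{latent_corr(b=0.8, sd_x=1.5):.3f}")   # ~0.77 for these toy inputs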

  16. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility, which were established by different authors at different points in time on the basis of some well-known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and suggests a new principle, since none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.

  17. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
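
    In the spirit of the book, a small runnable example: scikit-learn cross-validation exposing the bias/variance trade-off of polynomial degree on synthetic data (any recent NumPy/scikit-learn should work; the data-generating function is invented).

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      x = rng.uniform(-1, 1, size=(80, 1))
      y = np.sin(3 * x[:, 0]) + rng.normal(0, 0.2, size=80)

      # Degree 1 underfits (bias), degree 9 overfits (variance); CV shows both.
      for degree in (1, 3, 9):
          model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
          score = cross_val_score(model, x, y, cv=5, scoring="r2").mean()
          print(f"degree {degree}: mean CV R^2 = {score:.3f}")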

  18. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  19. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.

  20. Explosion probability of unexploded ordnance: expert beliefs.

    Science.gov (United States)

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  1. Probability to retrieve testicular spermatozoa in azoospermic patients

    Institute of Scientific and Technical Information of China (English)

    H.-J. Glander; L.-C. Horn; W. Dorschner; U. Paasch; J. Kratzsch

    2000-01-01

    Aim: The degree of probability to retrieve spermatozoa from testicular tissue for intracytoplasmic sperm injection into oocytes is of interest for counselling of infertility patients. We investigated the relation of sperm retrieval to clinical data and histological pattern in testicular biopsies from azoospermic patients. Methods: In 264 testicular biopsies from 142 azoospermic patients, the testicular tissue was shredded to separate the spermatozoa, histological semi-thin sections of which were then evaluated using the Johnsen score. Results: The retrieval of spermatozoa correlated significantly (P < 0.05) with serum FSH > 18 U/L, testicular volume < 5 mL, mean Johnsen score < 5, and maximum Johnsen score < 7.

  2. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  3. Exact probability distribution functions for Parrondo's games

    Science.gov (United States)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and obtained a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
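
    A simulation sketch of the capital-dependent game with the standard parameterization (bias eps); the paper computes exact distributions analytically rather than by simulation. Two individually losing games combine into a winning periodic sequence.

      import random

      eps = 0.005

      def play(choose_game, rounds=100_000, seed=3):
          random.seed(seed)
          capital = 0
          for t in range(rounds):
              if choose_game(t, capital) == "A":
                  p = 0.5 - eps                  # game A: slightly unfair coin
              else:                              # game B depends on capital mod 3
                  p = (0.1 - eps) if capital % 3 == 0 else (0.75 - eps)
              capital += 1 if random.random() < p else -1
          return capital

      print("A only :", play(lambda t, c: "A"))                      # loses
      print("B only :", play(lambda t, c: "B"))                      # loses
      print("AABB.. :", play(lambda t, c: "A" if t % 4 < 2 else "B"))  # wins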

  4. Atomic transition probabilities of Nd I

    Science.gov (United States)

    Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.

    2011-12-01

    Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
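
    The conversion step described above rests on the standard relation between a branching fraction and an upper-level radiative lifetime, A_ul = BF_ul / tau_u; a one-line sketch with invented numbers:

      # Emission branching fraction BF_ul and upper-level lifetime tau_u give
      # the absolute transition probability A_ul = BF_ul / tau_u.
      BF_ul, tau_u = 0.21, 1.7e-6          # branching fraction; lifetime in seconds
      A_ul = BF_ul / tau_u                 # transition probability in s^-1
      print(f"A_ul = {A_ul:.3e} s^-1")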

  5. Survival probability in diffractive dijet photoproduction

    CERN Document Server

    Klasen, M

    2009-01-01

    We confront the latest H1 and ZEUS data on diffractive dijet photoproduction with next-to-leading order QCD predictions in order to determine whether a rapidity gap survival probability of less than one is supported by the data. We find evidence for this hypothesis when assuming global factorization breaking for both the direct and resolved photon contributions, in which case the survival probability would have to be E_T^jet-dependent, and for the resolved or in addition the related direct initial-state singular contribution only, where it would be independent of E_T^jet.

  6. Conditional Probabilities and Collapse in Quantum Measurements

    Science.gov (United States)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.

  7. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  8. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  9. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  10. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with their own, possibly different distribution functions. The probability that these order statistics fall in disjoint, ordered intervals, and that a certain number of the smallest statistics come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.

  11. Concepts of probability in radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    Bernhard Weninger

    2011-12-01

    Full Text Available In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.

  12. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
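
    The classic resolution of "The Unfair Subway" is a schedule asymmetry: trains run equally often in both directions, but the uptown train is scheduled one minute after the downtown one in each ten-minute cycle, so only arrivals in that one-minute window catch it first. A short simulation (offsets illustrative) reproduces the 2-in-20 figure:

      import random

      random.seed(4)
      trials, uptown = 100_000, 0
      for _ in range(trials):
          u = random.uniform(0, 10)        # arrival within a 10-minute cycle
          # Downtown trains at minutes 0, 10, ...; uptown trains at minutes 1, 11, ...
          next_down = 10.0                 # next downtown departure after u > 0
          next_up = 1.0 if u < 1 else 11.0
          uptown += next_up < next_down    # Marvin boards whichever comes first
      print(f"fraction of dinners with mother (uptown): {uptown/trials:.3f}")  # ~0.10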

  13. Probability, statistics, and decision for civil engineers

    CERN Document Server

    Benjamin, Jack R

    2014-01-01

    Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and

  14. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  15. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st…

  16. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.

  17. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  18. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposi...

  19. Probability of boundary conditions in quantum cosmology

    Science.gov (United States)

    Suenobu, Hiroshi; Nambu, Yasusada

    2017-02-01

    One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.

  20. Phonotactic Probability Effects in Children Who Stutter

    Science.gov (United States)

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  1. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the

  2. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  3. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  4. Interstitial lung disease probably caused by imipramine.

    Science.gov (United States)

    Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H

    2014-01-01

    Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects related to ILD and drugs for better management of ILD.

  5. PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY

    OpenAIRE

    Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci

    2011-01-01

    The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.
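
    One of the standard calculations such designs feed into is the sample size for estimating a prevalence under simple random sampling; a minimal sketch with illustrative inputs (not values from the paper):

      import math

      # n = z^2 * p * (1 - p) / e^2 for estimating a prevalence p within +/- e.
      def prevalence_sample_size(p_expected, error, z=1.96):
          return math.ceil(z**2 * p_expected * (1 - p_expected) / error**2)

      # Expected prevalence 20%, desired precision +/- 5%, 95% confidence:
      print(prevalence_sample_size(0.20, 0.05))   # 246 animals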

  6. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    …cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as Twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning…

  7. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
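
    A toy contrast between rote maximum-likelihood CPT estimation and a smoothed (Dirichlet-style) estimate, the kind of weakness the passage says can be improved on; the paper goes further by plugging in general learning algorithms such as neural networks. Data and states below are invented.

      from collections import Counter

      # Observed (parent_state, child_state) cases for one CPT.
      data = [("on", "ok")] * 9 + [("on", "fail")] * 1 + [("off", "ok")] * 2

      def cpt(parent, data, alpha=0.0, states=("ok", "fail")):
          """P(child | parent); alpha = 0 is rote ML, alpha > 0 adds smoothing."""
          counts = Counter(c for p, c in data if p == parent)
          total = sum(counts.values()) + alpha * len(states)
          return {s: (counts[s] + alpha) / total for s in states}

      print("rote ML, parent=off  :", cpt("off", data))            # P(fail|off) = 0
      print("smoothed, parent=off :", cpt("off", data, alpha=1.0)) # pulled off zero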

  8. Entanglement Mapping VS. Quantum Conditional Probability Operator

    Science.gov (United States)

    Chruściński, Dariusz; Kossakowski, Andrzej; Matsuoka, Takashi; Ohya, Masanori

    2011-01-01

    The relation between two methods that construct the density operator on a composite system is shown. One of them is called an entanglement mapping and the other a quantum conditional probability operator. On the basis of this relation we discuss quantum correlation by means of some types of quantum entropy.

  9. Probable Bright Supernova discovered by PSST

    Science.gov (United States)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  10. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J,

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  11. Updating piping probabilities with survived historical loads

    NARCIS (Netherlands)

    Schweckendiek, T.; Kanning, W.

    2009-01-01

    Piping, also called under-seepage, is an internal erosion mechanism, which can cause the failure of dikes or other flood defence structures. The uncertainty in the resistance of a flood defence against piping is usually large, causing high probabilities of failure for this mechanism. A considerable
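
    A one-variable Bayesian caricature of the updating mechanism: surviving a historical load implies the uncertain piping resistance exceeded it, which truncates the prior and lowers the failure probability for a future load. The distributions and loads below are hypothetical, not from the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      resistance = rng.lognormal(mean=1.0, sigma=0.5, size=200_000)   # prior samples

      h_survived = 3.0     # historical water load the dike survived
      future_load = 3.5    # design load of interest

      prior_pf = np.mean(resistance < future_load)
      posterior = resistance[resistance > h_survived]   # Bayes update = truncation here
      post_pf = np.mean(posterior < future_load)
      print(f"P(failure) prior: {prior_pf:.3f}, after survived load: {post_pf:.3f}")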

  12. Assessing Schematic Knowledge of Introductory Probability Theory

    Science.gov (United States)

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  13. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…
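
    A minimal sketch of the classic counterexample behind the article's point: with two fair coin tosses, the three events below are pairwise independent, yet the multiplication rule fails for all three jointly (a standard example, not necessarily the one used in the paper).

      from itertools import product

      omega = list(product("HT", repeat=2))          # 4 equally likely outcomes
      E1 = {w for w in omega if w[0] == "H"}         # first toss heads
      E2 = {w for w in omega if w[1] == "H"}         # second toss heads
      E3 = {w for w in omega if w[0] == w[1]}        # tosses agree

      P = lambda A: len(A) / len(omega)
      print(P(E1 & E2) == P(E1) * P(E2))              # True
      print(P(E1 & E3) == P(E1) * P(E3))              # True
      print(P(E2 & E3) == P(E2) * P(E3))              # True
      print(P(E1 & E2 & E3) == P(E1) * P(E2) * P(E3)) # False: 1/4 != 1/8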

  14. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  15. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (No Danish abstract available.) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  16. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  17. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  18. Reduced reward-related probability learning in schizophrenia patients

    Directory of Open Access Journals (Sweden)

    Yılmaz A

    2012-01-01

    Full Text Available Alpaslan Yilmaz1,2, Fatma Simsek2, Ali Saffet Gonul2,3; 1Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; 2Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; 3Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale but that it was not related to antipsychotic dose. In conclusion, the present study showed that the schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation

  19. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1e^(–ω1t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
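
    A sketch of the construction, assuming the relation quoted above: estimate P0(t) from (here synthetic) extinction times, plot –ln(1 – P0(t)) against t, and recover ω1 and c1 from the slope and intercept of the linear part. The toy model below is a stand-in for real simulation output.

      import numpy as np

      rng = np.random.default_rng(6)

      # Synthetic model: 30% of founder populations fail early; established
      # ones go extinct at a slow exponential rate.
      n = 5000
      early_fail = rng.random(n) < 0.3
      ext_time = np.where(early_fail, rng.uniform(0, 3, n), rng.exponential(200, n))

      t_grid = np.arange(5, 101, 5)
      p0 = np.array([(ext_time <= t).mean() for t in t_grid])   # P0(t): extinct by t
      y = -np.log(1 - p0)                                       # the "Wissel plot" ordinate

      # Linear fit on the established phase: slope = omega1, intercept = -ln(c1).
      slope, intercept = np.polyfit(t_grid, y, 1)
      c1 = np.exp(-intercept)
      print(f"omega1 ~ {slope:.4f}, c1 ~ {c1:.2f}  (establishment probability)")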

  20. Correlation between hypertension and clinical probable Parkinson disease: Cohort analysis of 4 335 people in Linxian County with nutritional intervention

    Institute of Scientific and Technical Information of China (English)

    范金虎; 张亚黎; 刘颖; 孙秀娣; 乔友林

    2006-01-01

    BACKGROUND: Linxian County, China, is one of the areas with the highest incidence of esophageal cancer and gastric cardia cancer in the world, and nutritional deficiency is widespread in the local population. In recent years, many studies around the world have suggested that the causes of Parkinson disease (PD) are related to genes, age, environment, diet, nutrition, and smoking. A growing number of studies have linked primary hypertension to vascular Parkinsonism (VP), with long-term hypertension predisposing to VP. OBJECTIVE: To investigate the relationship between hypertension and clinical probable Parkinson disease (PPD) in the nutrition-deficient population of Linxian County and to provide a theoretical basis for early prevention and treatment of PD. DESIGN: Cross-sectional study. PARTICIPANTS: A total of 4 335 subjects aged over 55 years were selected; they had taken part in the nutritional intervention study of Linxian County and first entered the cohort study in 1985. METHODS: A prospective cohort study was conducted. ① Case screening: a PD questionnaire (used in Gebai County, USA) combined with a general neurological examination was adopted. ② Diagnosis of PD: the clinical diagnostic criteria of the UK Parkinson Disease Society Brain Bank were taken as the criteria for screening PD. Subjects with PD symptoms were further evaluated for clinical PPD and clinical possible PD. Diagnostic criteria for clinical PPD: subjects were diagnosed as having clinical PPD if they presented any two of the four cardinal features (resting tremor, hypermyotonia, bradykinesia, and impairment of postural reflexes) or presented any one of the following features (resting tremor, hypermyotonia, and bradykinesia). Diagnostic criteria for clinical possible PD: subjects were diagnosed as having clinical possible PD when they presented any one of the following four

  1. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  2. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  3. Probability of Boundary Conditions in Quantum Cosmology

    CERN Document Server

    Suenobu, Hiroshi

    2016-01-01

    One of the main interests in quantum cosmology is to determine which type of boundary conditions for the wave function of the universe can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation numerically and evaluate probabilities for an observable representing evolution of the classical universe, especially the number of e-foldings of the inflation. To express boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with constant scalar field potential. These exact solutions include wave functions with well known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify them by introducing two real parameters which discriminate boundary conditions and estimate the values of these parameters that result in observationally preferable predictions. We obtain the probability for these parameters under the requirement of the sufficient e-foldings of the inflation.

  4. Volcano shapes, entropies, and eruption probabilities

    Science.gov (United States)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
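
    The discrete calculation reduces to the Gibbs-Shannon formula S = -sum_i p_i ln p_i applied to a binned profile; a minimal sketch with an invented edifice profile, compared against the flat-field maximum:

      import numpy as np

      heights = np.array([1, 3, 7, 12, 18, 12, 7, 3, 1], dtype=float)  # binned profile
      p = heights / heights.sum()            # treat the normalized profile as a pmf
      S = -np.sum(p * np.log(p))             # Gibbs-Shannon entropy (nats)

      uniform = np.full_like(p, 1 / p.size)  # flat field/zone: maximum uncertainty
      S_max = -np.sum(uniform * np.log(uniform))
      print(f"S = {S:.3f} nats, S_max = ln({p.size}) = {S_max:.3f} nats")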

  5. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions… and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate… computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain…

  6. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.
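
    One standard construction from this field, sketched here for orientation (Lloyd's algorithm on an empirical sample; our choice of illustration, not necessarily the book's presentation):

        import random

        def lloyd_quantizer(samples, n_levels, iterations=50):
            """Lloyd's algorithm: alternately assign samples to the nearest
            codepoint and move each codepoint to the mean of its cell."""
            codebook = sorted(random.sample(samples, n_levels))
            for _ in range(iterations):
                cells = [[] for _ in range(n_levels)]
                for x in samples:
                    i = min(range(n_levels), key=lambda j: abs(x - codebook[j]))
                    cells[i].append(x)
                codebook = [sum(c) / len(c) if c else codebook[i]
                            for i, c in enumerate(cells)]
            return codebook

        data = [random.gauss(0.0, 1.0) for _ in range(5000)]
        print(lloyd_quantizer(data, n_levels=4))  # roughly symmetric about 0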

  7. Quantum probabilities: an information-theoretic interpretation

    CERN Document Server

    Bub, Jeffrey

    2010-01-01

    This Chapter develops a realist information-theoretic interpretation of the nonclassical features of quantum probabilities. On this view, what is fundamental in the transition from classical to quantum physics is the recognition that \\emph{information in the physical sense has new structural features}, just as the transition from classical to relativistic physics rests on the recognition that space-time is structurally different than we thought. Hilbert space, the event space of quantum systems, is interpreted as a kinematic (i.e., pre-dynamic) framework for an indeterministic physics, in the sense that the geometric structure of Hilbert space imposes objective probabilistic or information-theoretic constraints on correlations between events, just as the geometric structure of Minkowski space in special relativity imposes spatio-temporal kinematic constraints on events. The interpretation of quantum probabilities is more subjectivist in spirit than other discussions in this book (e.g., the chapter by Timpson)...

  8. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector $p = (p_1, \dots, p_d)$. An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
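
    A sketch of one unbiased construction (normalizing iid Exponential(1) variates, which yields the uniform, i.e. Dirichlet(1,…,1), distribution on the probability simplex; whether this coincides with the paper's "normalization method" is our assumption):

        import random

        def random_probability_vector(d):
            """Draw p = (p_1, ..., p_d) uniformly from the probability simplex
            by normalizing iid Exponential(1) variates (Dirichlet(1,...,1))."""
            e = [random.expovariate(1.0) for _ in range(d)]
            s = sum(e)
            return [x / s for x in e]

        p = random_probability_vector(5)
        print(p, sum(p))  # components are non-negative and sum to 1.0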

  9. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector $m_i = (m_{i1}, m_{i2}, \dots, m_{i10})$ denote the credit ratings of the ten companies in the i-th quarter. The vector $m_{i+1}$ in the next quarter is modelled to be dependent on the vector $m_i$ via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability $P_{kl}(i,j)$ for getting $m_{i+1,j} = l$ given that $m_{i,j} = k$ is then computed from the conditional distribution. It is found that the variation of the transition probability $P_{kl}(i,j)$ as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
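
    Setting aside the power-normal mixture machinery, the simplest empirical counterpart of such a transition probability is a normalized count of observed k → l moves; a sketch with an invented rating path:

        from collections import Counter

        def empirical_transition_matrix(ratings, n_states):
            """Estimate P_kl as (# of observed k -> l transitions) / (# of visits to k)."""
            counts = Counter(zip(ratings, ratings[1:]))
            matrix = [[0.0] * n_states for _ in range(n_states)]
            for k in range(n_states):
                total = sum(counts[(k, l)] for l in range(n_states))
                if total:
                    for l in range(n_states):
                        matrix[k][l] = counts[(k, l)] / total
            return matrix

        quarterly = [2, 2, 3, 3, 2, 1, 1, 2, 2, 3]  # invented rating path, states 0..3
        for row in empirical_transition_matrix(quarterly, 4):
            print(row)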

  10. Earthquake probabilities: theoretical assessments and reality

    Science.gov (United States)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  11. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
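
    For reference, the last topic in that list, Bayes' theorem, in its standard form: $P(H_i \mid E) = P(E \mid H_i)\,P(H_i) / \sum_j P(E \mid H_j)\,P(H_j)$ — the rule that connects the conditional-probability and distribution material above to the updating of uncertainties by data.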

  12. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  13. Interpreting Prediction Market Prices as Probabilities

    OpenAIRE

    Wolfers, Justin; Zitzewitz, Eric

    2006-01-01

    While most empirical analysis of prediction markets treats prices of binary options as predictions of the probability of future events, Manski (2004) has recently argued that there is little existing theory supporting this practice. We provide relevant analytic foundations, describing sufficient conditions under which prediction market prices correspond with mean beliefs. Beyond these specific sufficient conditions, we show that for a broad class of models prediction market prices are usuall...

  14. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  15. The Origin of Probability and Entropy

    Science.gov (United States)

    Knuth, Kevin H.

    2008-11-01

    Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.

  16. Non-signalling Theories and Generalized Probability

    Science.gov (United States)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification of using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, this allows a straightforward generalization to more complicated systems.

  17. Probable Cause: A Decision Making Framework.

    Science.gov (United States)

    1984-08-01

    draw upon several approaches to the study of causality; specifically, work in attribution theory (Heider, 1958; Kelley, 1973), methodology (Cook... psychology (Michotte, 1946; Piaget, 1974). From our perspective, much of the difficulty in assessing causality is due to the fact that judgments of... that violate probability and statistical theory. We therefore consider such cases because they highlight the various characteristics of each system

  18. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    The cumulative-binomial computer program, CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) are used independently of one another. Reliabilities and availabilities of k-out-of-n systems can be analyzed. The programs are intended for statisticians and users of statistical procedures, test planners, designers, and numerical analysts, for calculations of reliability and availability. Program written in C.
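
    The CUMBIN source itself is not reproduced here; a minimal sketch of the computation it is described as performing (cumulative binomial tail, applied to k-out-of-n reliability):

        from math import comb

        def k_out_of_n_reliability(k, n, p):
            """Probability that at least k of n independent components work,
            each with success probability p (cumulative binomial tail)."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        print(k_out_of_n_reliability(k=2, n=3, p=0.9))  # 0.972 for a 2-out-of-3 system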

  19. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  20. A quantum probability model of causal reasoning.

    Science.gov (United States)

    Trueblood, Jennifer S; Busemeyer, Jerome R

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  1. Probability of metastable states in Yukawa clusters

    Science.gov (United States)

    Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael

    2008-11-01

    Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground- and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3]. [1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008). [2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005). [3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008).

  2. A Quantum Probability Model of Causal Reasoning

    Science.gov (United States)

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  3. Bacteria survival probability in bactericidal filter paper.

    Science.gov (United States)

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive.

  4. Probability of Default and Default Correlations

    Directory of Open Access Journals (Sweden)

    Weiping Li

    2016-07-01

    Full Text Available We consider a system where the asset values of firms are correlated with the default thresholds. We first evaluate the probability of default of a single firm under the correlated-assets assumption. This extends Merton's probability of default of a single firm under the independent asset values assumption. At any time, the distance-to-default for a single firm is derived in the system, and this distance-to-default should provide a different measure for credit rating that takes the correlated asset values into consideration. Then we derive a closed formula for the joint default probability and a general closed formula for the default correlation via the correlated multivariate process of the first-passage-time default correlation model. Our structural model encodes the sensitivities of default correlations with respect to the underlying correlation among firms' asset values. Based on our results, we propose a credit risk management approach distinct from the commonly used risk measurement methods that take default correlations into consideration.
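
    For context, the single-firm Merton baseline that the paper extends (a sketch of the classical uncorrelated case, not the paper's correlated model):

        from math import log, sqrt
        from statistics import NormalDist

        def merton_default_probability(V, D, mu, sigma, T):
            """Merton model: default occurs if the asset value V_T falls below debt D.
            Distance-to-default DD = (ln(V/D) + (mu - sigma^2/2) T) / (sigma sqrt(T));
            PD = N(-DD)."""
            dd = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
            return dd, NormalDist().cdf(-dd)

        dd, pd = merton_default_probability(V=120.0, D=100.0, mu=0.05, sigma=0.25, T=1.0)
        print(f"distance-to-default = {dd:.2f}, PD = {pd:.3%}")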

  5. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  6. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are identified using statistical and econometric models. The main approach applies probit and logit models to data from loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner, and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year, and decreases for people born at the end of the year.
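
    A sketch of the kind of binary model described (the field names and the generating rule below are invented stand-ins, not the paper's data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic stand-in for the loan data set:
        # columns = [sum of contract, remoteness of borrower (km), month of birth]
        rng = np.random.default_rng(0)
        X = np.column_stack([
            rng.uniform(1_000, 50_000, 500),
            rng.uniform(0, 300, 500),
            rng.integers(1, 13, 500),
        ])
        # invented generating rule, loosely mimicking the reported findings
        logit = 0.00004 * X[:, 0] + 0.004 * X[:, 1] - 0.08 * X[:, 2] - 1.0
        y = (rng.random(500) < 1 / (1 + np.exp(-logit))).astype(int)

        model = LogisticRegression().fit(X, y)
        print(model.coef_)                       # sign/size of each factor's influence
        print(model.predict_proba(X[:1])[:, 1])  # P(returning the loan) for one contract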

  7. Social network and social support measures from the Pró-Saúde Study: pre-tests and pilot study

    Directory of Open Access Journals (Sweden)

    Dóra Chor

    2001-08-01

    Full Text Available We describe the methodological steps in the selection of questions on social networks and social support for a cohort study of 4,030 employees of a public university in Rio de Janeiro. First, group discussions with volunteers were conducted to explore the adequacy of the related concepts. Next, the questions in the Medical Outcomes Study questionnaire were submitted to standard forward- and back-translation procedures. The questions were subsequently evaluated through five stages of pre-tests and a pilot study. No question had a proportion of non-response greater than 5%. Pearson correlation coefficients between questions were distant from both zero and unity; the correlation between each item and its dimension score was higher than 0.80 in almost all cases. Finally, Cronbach Alpha coefficients were above 0.70 within each dimension. The results suggest that social networks and support will be adequately measured, permitting the investigation of their associations with health-related outcomes in a population group in Brazil.
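
    The reliability figure quoted above is Cronbach's alpha, α = k/(k−1)·(1 − Σ var(item)/var(total)); a sketch with invented item scores:

        def cronbach_alpha(items):
            """items: list of equal-length lists, one per question.
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
            def var(xs):
                m = sum(xs) / len(xs)
                return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
            k = len(items)
            totals = [sum(vals) for vals in zip(*items)]
            return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

        # three invented 5-point items answered by six respondents
        print(cronbach_alpha([[4, 5, 3, 4, 2, 5],
                              [4, 4, 3, 5, 2, 4],
                              [5, 5, 2, 4, 3, 4]]))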

  8. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
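
    A textbook rule of thumb connecting the two probabilities discussed here (an approximation for illustration, not the paper's model): a credit spread s with recovery rate R implies a risk-neutral default intensity of roughly λ = s/(1 − R).

        from math import exp

        def risk_neutral_pd(spread, recovery, horizon_years=1.0):
            """Approximate risk-neutral default probability implied by a yield spread:
            intensity lambda = spread / (1 - recovery), PD(T) = 1 - exp(-lambda * T)."""
            lam = spread / (1.0 - recovery)
            return 1.0 - exp(-lam * horizon_years)

        # a 200 bp spread with 40% recovery implies ~3.3% one-year risk-neutral PD,
        # typically well above historical (actual) default rates for the same rating
        print(risk_neutral_pd(spread=0.02, recovery=0.40))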

  9. A new estimator of the discovery probability.

    Science.gov (United States)

    Favaro, Stefano; Lijoi, Antonio; Prünster, Igor

    2012-12-01

    Species sampling problems have a long history in ecological and biological studies and a number of issues, including the evaluation of species richness, the design of sampling experiments, and the estimation of rare species variety, are to be addressed. Such inferential problems have recently emerged also in genomic applications, however, exhibiting some peculiar features that make them more challenging: specifically, one has to deal with very large populations (genomic libraries) containing a huge number of distinct species (genes) and only a small portion of the library has been sampled (sequenced). These aspects motivate the Bayesian nonparametric approach we undertake, since it allows to achieve the degree of flexibility typically needed in this framework. Based on an observed sample of size n, focus will be on prediction of a key aspect of the outcome from an additional sample of size m, namely, the so-called discovery probability. In particular, conditionally on an observed basic sample of size n, we derive a novel estimator of the probability of detecting, at the (n+m+1)th observation, species that have been observed with any given frequency in the enlarged sample of size n+m. Such an estimator admits a closed-form expression that can be exactly evaluated. The result we obtain allows us to quantify both the rate at which rare species are detected and the achieved sample coverage of abundant species, as m increases. Natural applications are represented by the estimation of the probability of discovering rare genes within genomic libraries and the results are illustrated by means of two expressed sequence tags datasets.
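
    The classical frequentist baseline for this quantity is the Good-Turing estimator (our illustration; the paper's Bayesian nonparametric estimator is a refinement of this idea, not reproduced here):

        from collections import Counter

        def good_turing_new_species(sample):
            """Good-Turing estimate of the discovery probability: the chance that
            the (n+1)-th observation is a previously unseen species is ~ n_1 / n,
            where n_1 is the number of species seen exactly once."""
            freqs = Counter(sample)
            n1 = sum(1 for c in freqs.values() if c == 1)
            return n1 / len(sample)

        genes = ["a", "b", "a", "c", "d", "d", "e", "a"]  # toy sequenced library
        print(good_turing_new_species(genes))  # 3 singletons / 8 reads = 0.375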

  10. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    Science.gov (United States)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.

  11. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro

  12. Normativity And Probable Reasoning: Hume On Induction

    OpenAIRE

    Tejedor, Chon

    2011-01-01

    In this article I examine the debate between the epistemic and descriptivist interpreters of Hume's discussion of induction and probable reasoning. Epistemic interpreters take Hume to be concerned principally with questions about the epistemic authority and justification of our inductive principles and beliefs. Descriptivist interpreters, by contrast, suggest that what Hume aims at is to explain how our beliefs are produced, not to rule on whether...

  13. Elemental mercury poisoning probably causes cortical myoclonus.

    Science.gov (United States)

    Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B

    2007-10-15

    Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies confirmed that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury even if they have a progressive illness, as it is a potentially reversible condition, as in our patient.

  14. Atomic transition probabilities of Gd I

    Science.gov (United States)

    Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.

    2011-05-01

    Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of first- and second-spectrum lines is also described and demonstrated in this work.
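
    The conversion mentioned here is compact enough to state: branching fractions become absolute transition probabilities through the upper-level radiative lifetime, $A_{ul} = \mathrm{BF}_{ul}/\tau_u$, since the branching fractions out of a level sum to one and the level's total decay rate is $1/\tau_u$.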

  15. Atomic transition probabilities of Er I

    Science.gov (United States)

    Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.

    2010-12-01

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.

  16. Intermediate Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner--developing sp

  17. Probability of inflation in loop quantum cosmology

    Science.gov (United States)

    Ashtekar, Abhay; Sloan, David

    2011-12-01

    Inflationary models of the early universe provide a natural mechanism for the formation of large scale structure. This success brings to the forefront the question of naturalness: Does a sufficiently long slow roll inflation occur generically or does it require a careful fine tuning of initial parameters? In recent years there has been considerable controversy on this issue (Hollands and Wald in Gen Relativ Gravit, 34:2043, 2002; Kofman et al. in J High Energy Phys 10:057, 2002); (Gibbons and Turok in Phys Rev D 77:063516, 2008). In particular, for a quadratic potential, Kofman et al. (J High Energy Phys 10:057, 2002) have argued that the probability of inflation with at least 65 e-foldings is close to one, while Gibbons and Turok (Phys Rev D 77:063516, 2008) have argued that this probability is suppressed by a factor of ~10⁻⁸⁵. We first clarify that such dramatically different predictions can arise because the required measure on the space of solutions is intrinsically ambiguous in general relativity. We then show that this ambiguity can be naturally resolved in loop quantum cosmology (LQC) because the big bang is replaced by a big bounce and the bounce surface can be used to introduce the structure necessary to specify a satisfactory measure. The second goal of the paper is to present a detailed analysis of the inflationary dynamics of LQC using analytical and numerical methods. By combining this information with the measure on the space of solutions, we address a sharper question than those investigated in Kofman et al. (J High Energy Phys 10:057, 2002), Gibbons and Turok (Phys Rev D 77:063516, 2008), Ashtekar and Sloan (Phys Lett B 694:108, 2010): What is the probability of a sufficiently long slow roll inflation which is compatible with the seven year WMAP data? We show that the probability is very close to 1. The material is so organized that cosmologists who may be more interested in the inflationary dynamics in LQC than in the subtleties associated with

  18. Numerical Ultimate Ruin Probabilities under Interest Force

    Directory of Open Access Journals (Sweden)

    Juma Kasozi

    2005-01-01

    Full Text Available This work addresses the issue of ruin of an insurer whose portfolio is exposed to insurance risk arising from the classical surplus process. The availability of a positive interest rate in the financial world forces the insurer to invest into a risk-free asset. We derive a linear Volterra integral equation of the second kind and apply an order-four block-by-block method in conjunction with the Simpson rule to solve the Volterra equation for ultimate ruin. This probability is arrived at by taking a linear combination of two solutions to the Volterra integral equation. The several numerical examples given show that our results are excellent and reliable.
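
    A crude Monte Carlo cross-check of such ruin computations (a sketch of the classical surplus process with constant interest force; the parameters are invented, and premium income is approximated as earning no interest within an inter-claim interval):

        import math
        import random

        def ruin_probability(u, c, lam, mean_claim, delta, horizon, n_paths=20_000):
            """Monte Carlo estimate of the finite-time ruin probability: initial
            surplus u, premium rate c, Poisson(lam) claims with Exponential sizes
            of mean mean_claim, and interest force delta on the surplus."""
            ruined = 0
            for _ in range(n_paths):
                surplus, t = u, 0.0
                while True:
                    wait = random.expovariate(lam)
                    t += wait
                    if t > horizon:
                        break
                    surplus = surplus * math.exp(delta * wait) + c * wait
                    surplus -= random.expovariate(1.0 / mean_claim)
                    if surplus < 0:
                        ruined += 1
                        break
            return ruined / n_paths

        print(ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                               delta=0.05, horizon=100.0))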

  19. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  20. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  1. Acceleration Detection of Large (Probably Prime) Numbers

    Directory of Open Access Journals (Sweden)

    Dragan Vidakovic

    2013-02-01

    Full Text Available In order to avoid unnecessary applications of the Miller-Rabin algorithm to the number in question, we resort to trial division by a few initial prime numbers, since such division takes less time. How far we should go with such division is the question we try to answer in this paper. In theory, the matter is fully resolved; in practice, however, this is of little use. We therefore present a solution that is probably irrelevant to theorists, but very useful to people who have spent many nights producing large (probably prime) numbers using their own software.
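
    A sketch of the combined test the abstract describes (cheap trial division by a few initial primes, then Miller-Rabin with random bases; the cut-off list below is an arbitrary illustrative choice):

        import random

        SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

        def is_probable_prime(n, rounds=40):
            """Trial division by a few initial primes first (cheap), then
            Miller-Rabin with random bases (costlier but strong)."""
            if n < 2:
                return False
            for p in SMALL_PRIMES:
                if n % p == 0:
                    return n == p
            d, s = n - 1, 0
            while d % 2 == 0:
                d //= 2
                s += 1
            for _ in range(rounds):
                a = random.randrange(2, n - 1)
                x = pow(a, d, n)
                if x in (1, n - 1):
                    continue
                for _ in range(s - 1):
                    x = pow(x, 2, n)
                    if x == n - 1:
                        break
                else:
                    return False  # a composite witness was found
            return True  # probably prime

        print(is_probable_prime(2**89 - 1))  # True: a Mersenne prime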

  2. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  3. Random iteration with place dependent probabilities

    CERN Document Server

    Kapica, R

    2011-01-01

    Markov chains arising from random iteration of functions $S_{\theta}:X\to X$, $\theta \in \Theta$, where $X$ is a Polish space and $\Theta$ is an arbitrary set of indices, are considered. At $x\in X$, $\theta$ is sampled from a distribution $\theta_x$ on $\Theta$, and the $\theta_x$ are different for different $x$. Exponential convergence to a unique invariant measure is proved. This result is applied to the case of random affine transformations on ${\mathbb R}^d$, giving the existence of exponentially attractive perpetuities with place dependent probabilities.

  4. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.

  5. Necessity of Exact Calculation for Transition Probability

    Institute of Scientific and Technical Information of China (English)

    LIU Fu-Sui; CHEN Wan-Fang

    2003-01-01

    This paper shows that exact calculation of the transition probability can make some systems deviate seriously from the Fermi golden rule. This paper also shows that the corresponding exact calculation of the hopping rate induced by phonons for the deuteron in the Pd-D system with many-body electron screening, proposed by Ichimaru, can explain the experimental fact observed in the Pd-D system, and predicts that perfection and low dimension of the Pd lattice are very important for the phonon-induced hopping rate enhancement in the Pd-D system.

  6. Optimal Reinsurance with Heterogeneous Reference Probabilities

    Directory of Open Access Journals (Sweden)

    Tim J. Boonen

    2016-07-01

    Full Text Available This paper studies the problem of optimal reinsurance contract design. We let the insurer use dual utility, and the premium is an extended Wang’s premium principle. The novel contribution is that we allow for heterogeneity in the beliefs regarding the underlying probability distribution. We characterize layer-reinsurance as an optimal reinsurance contract. Moreover, we characterize layer-reinsurance as optimal contracts when the insurer faces costs of holding regulatory capital. We illustrate this in cases where both firms use the Value-at-Risk or the conditional Value-at-Risk.

  7. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  8. Probable Mechanisms of Needling Therapies for Myofascial Pain Control

    Directory of Open Access Journals (Sweden)

    Li-Wei Chou

    2012-01-01

    Full Text Available Myofascial pain syndrome (MPS) has been defined as a regional pain syndrome characterized clinically by muscle pain caused by myofascial trigger points (MTrPs). An MTrP is defined as a hyperirritable spot in a palpable taut band of skeletal muscle fibers. Appropriate treatment of MTrPs can effectively relieve the clinical pain of MPS. Needling therapies, such as MTrP injection, dry needling, or acupuncture (AcP), can effectively eliminate pain immediately. AcP is probably the first reported technique for treating MPS patients with dry needling, based on Traditional Chinese Medicine (TCM) theory. Possible mechanisms of AcP analgesia have been studied and published in recent decades. The analgesic effect of AcP is hypothesized to be related to the immune, hormonal, and nervous systems. Compared to the slow-acting hormonal system, the nervous system acts in a faster manner. Given these complexities, AcP analgesia cannot be explained by any single mechanism. There are several principles for the selection of acupoints based on the TCM principles: the "Ah-Shi" point, proximal or remote acupoints on the meridian, and extra-meridian acupoints. Correlations between acupoints and MTrPs are discussed. Some clinical and animal studies of remote AcP for MTrPs and the possible mechanisms of the remote effectiveness are reviewed and discussed.

  9. Calculation of fractional electron capture probabilities

    CERN Document Server

    Schoenfeld, E

    1998-01-01

    A 'Table of Radionuclides' is being prepared which will supersede the 'Table de Radionucleides' formerly issued by the LMRI/LPRI (France). In this effort it is desirable to have a uniform basis for calculating theoretical values of fractional electron capture probabilities. A table has been compiled which allows one to calculate conveniently and quickly the fractional probabilities P_K, P_L, P_M, P_N and P_O, their ratios and the assigned uncertainties for allowed and non-unique first forbidden electron capture transitions of known transition energy for radionuclides with atomic numbers from Z=3 to 102. These results have been applied to a total of 28 transitions of 14 radionuclides (⁷Be, ²²Na, ⁵¹Cr, ⁵⁴Mn, ⁵⁵Fe, ⁶⁸Ge, ⁶⁸Ga, ⁷⁵Se, ¹⁰⁹Cd, ¹²⁵I, ¹³⁹Ce, ¹⁶⁹Yb, ¹⁹⁷Hg, ²⁰²Tl). The values are in reasonable agreement with measure...

  10. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  11. Probability-consistent spectrum and code spectrum

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by Code for Seismic Design of Buildings (GB50011-2001). Sometimes, there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for the key projects would be lower than that for the general industry and civil buildings. In the paper, the relation between PCS and CDS is discussed by using the ideal simple potential seismic source. The results show that in the most areas influenced mainly by the potential sources of the epicentral earthquakes and the regional earthquakes, PCS is generally lower than CDS in the long periods. We point out that the long-period response spectra of the code should be further studied and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than CDS.

  12. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10⁻¹²³ and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  13. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
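
    A generic sketch of the recommended permutation test (not the authors' exact pipeline; the features and labels below are synthetic stand-ins for dose-volume predictors and toxicity outcomes):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 8))                          # stand-in predictors
        y = (X[:, 0] + rng.normal(size=120) > 0).astype(int)   # stand-in outcomes

        def cv_auc(X, y):
            # LASSO-style (L1-penalized) logistic model scored by cross-validated AUC
            pred = cross_val_predict(
                LogisticRegression(penalty="l1", solver="liblinear"),
                X, y, cv=5, method="predict_proba")[:, 1]
            return roc_auc_score(y, pred)

        observed = cv_auc(X, y)
        null = [cv_auc(X, rng.permutation(y)) for _ in range(200)]
        p_value = (1 + sum(a >= observed for a in null)) / (1 + len(null))
        print(f"AUC = {observed:.2f}, permutation p = {p_value:.3f}")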

  14. Probability distributions with summary graph structure

    CERN Document Server

    Wermuth, Nanny

    2010-01-01

    A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class are multivariate regression chain graphs. They describe the independences of stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that then result after possible marginalising or conditioning, we use summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences which remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to de...

  15. Exact feature probabilities in images with occlusion

    CERN Document Server

    Pitkow, Xaq

    2010-01-01

    To understand the computations of our visual system, it is important to understand also the natural environment it evolved to interpret. Unfortunately, existing models of the visual environment are either unrealistic or too complex for mathematical description. Here we describe a naturalistic image model and present a mathematical solution for the statistical relationships between the image features and model variables. The world described by this model is composed of independent, opaque, textured objects which occlude each other. This simple structure allows us to calculate the joint probability distribution of image values sampled at multiple arbitrarily located points, without approximation. This result can be converted into probabilistic relationships between observable image features as well as between the unobservable properties that caused these features, including object boundaries and relative depth. Using these results we explain the causes of a wide range of natural scene properties, including high...

  16. System Geometries and Transit/Eclipse Probabilities

    Directory of Open Access Journals (Sweden)

    Howard A.

    2011-02-01

    Full Text Available Transiting exoplanets provide access to data to study the mass-radius relation and internal structure of extrasolar planets. Long-period transiting planets allow insight into planetary environments similar to the Solar System where, in contrast to hot Jupiters, planets are not constantly exposed to the intense radiation of their parent stars. Observations of secondary eclipses additionally permit studies of exoplanet temperatures and large-scale exo-atmospheric properties. We show how transit and eclipse probabilities are related to planet-star system geometries, particularly for long-period, eccentric orbits. The resulting target selection and observational strategies represent the principal ingredients of our photometric survey of known radial-velocity planets with the aim of detecting transit signatures (TERMS.
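
    The geometry described here has a compact closed form; the sketch below is our own rendering of the standard eccentric-orbit transit probability (see e.g. Barnes 2007), not code from the TERMS survey itself:

```python
# Geometric transit probability for an eccentric orbit:
# P = (R* + Rp)/a * (1 + e*sin(omega)) / (1 - e^2).
import math

R_SUN = 6.957e8   # m
AU = 1.496e11     # m

def transit_probability(a_au, e, omega_deg, r_star_rsun=1.0, r_planet_rsun=0.0):
    """omega is the argument of periastron in degrees."""
    a = a_au * AU
    r = (r_star_rsun + r_planet_rsun) * R_SUN
    return r / a * (1.0 + e * math.sin(math.radians(omega_deg))) / (1.0 - e**2)

# A hot Jupiter at 0.05 AU transits ~9% of the time; a 1 AU planet ~0.5%,
# which is why long-period targets need careful selection strategies.
print(transit_probability(0.05, 0.0, 90.0))   # ~0.093
print(transit_probability(1.00, 0.0, 90.0))   # ~0.0046
```

    Replacing (1 + e sin ω) with (1 − e sin ω) gives the corresponding secondary-eclipse probability.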

  17. Haavelmo's Probability Approach and the Cointegrated VAR

    DEFF Research Database (Denmark)

    Juselius, Katarina

    Some key econometric concepts and problems addressed by Trygve Haavelmo and Ragnar Frisch are discussed within the general framework of a cointegrated VAR. The focus is on problems typical of time-series data, such as multicollinearity, spurious correlation and regression results, time-dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically, the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2) the plausibility of the multivariate normality assumption underlying the VAR, (3) cointegration as a solution to the problem of spurious correlation and multicollinearity when data contain deterministic and stochastic trends, (4) the existence of a universe, (5) the association between Frisch's con...

  18. Quantum probabilities for inflation from holography

    Energy Technology Data Exchange (ETDEWEB)

    Hartle, James B. [Department of Physics, University of California, Santa Barbara, 93106 (United States); Hawking, S.W. [DAMTP, CMS, Wilberforce Road, Cambridge, CB3 0WA (United Kingdom); Hertog, Thomas, E-mail: hartle@physics.ucsb.edu, E-mail: S.W.Hawking@damtp.cam.ac.uk, E-mail: Thomas.Hertog@fys.kuleuven.be [Institute for Theoretical Physics, KU Leuven, Leuven, 3001 (Belgium)

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  19. Priority probability deceleration deadline-aware TCP

    Institute of Scientific and Technical Information of China (English)

    Jin Ye; Jing Lin; Jiawei Huang

    2015-01-01

    In modern data centers, because of the deadline-agnostic congestion control in the transmission control protocol (TCP), many deadline-sensitive flows cannot finish before their deadlines. Therefore, providing a higher deadline meeting ratio becomes a critical challenge in the typical online data intensive (OLDI) applications of data center networks (DCNs). However, a problem named priority synchronization is identified in this paper, which badly decreases the deadline meeting ratio. To solve this problem, we propose a priority probability deceleration (P2D) deadline-aware TCP. By using novel probabilistic deceleration, P2D prevents the priority synchronization problem. Simulation results show that P2D increases the deadline meeting ratio by 20% compared with D2TCP.

  20. Objective Lightning Probability Forecast Tool Phase II

    Science.gov (United States)

    Lambert, Winnie

    2007-01-01

    This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: the period of record was increased from 15 to 17 years, the method of calculating the flow regime of the day was changed, a new optimal-layer relative humidity was calculated, a new smoothing technique was used for the daily climatology, and a new valid area was used. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  1. Transits Probabilities Around Hypervelocity and Runaway Stars

    CERN Document Server

    Fragione, Giacomo

    2016-01-01

    In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. In this paper, we explore the possibility of detecting planetary transits around hypervelocity and runaway stars, which should host very compact systems as a consequence of their turbulent origin. We find that the probability of a multi-planetary transit is $10^{-3} \lesssim P \lesssim 10^{-1}$. We therefore need to observe $\sim 10$-$1000$ high-velocity stars to spot a transit. We predict that the European Gaia satellite, along with TESS, could spot such transits.

  2. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth;

    2003-01-01

    Ubiquitous computing environments are highly dynamic, with new unforeseen circumstances and constantly changing environments, which introduces new risks that cannot be assessed through traditional means of risk analysis. Mobile entities in a ubiquitous computing environment require the ability to perform an autonomous assessment of the risk incurred by a specific interaction with another entity in a given context. This assessment will allow a mobile entity to decide whether sufficient evidence exists to mitigate the risk and allow the interaction to proceed. Such evidence might include records ... Traditional techniques from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  3. On the probability of dinosaur fleas.

    Science.gov (United States)

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-11

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders, is not supported by currently available data.

  4. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    ... not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted to Gumbel distributions, and these fits are found to represent the measured data with good accuracy. The pressure distributions found have been used in a calibration of partial factors, which should achieve a certain theoretical target reliability index. For a target annual reliability index of 4.3, the Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results. The increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger...

  5. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of probabilities of collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.

  6. Probable warfarin interaction with menthol cough drops.

    Science.gov (United States)

    Coderre, Karen; Faria, Claudio; Dyer, Earl

    2010-01-01

    Warfarin is a widely used and effective oral anticoagulant; however, the agent has an extensive drug and food interaction profile. We describe a 46-year-old African-American man who was receiving warfarin for a venous thromboembolism and experienced a decrease in his international normalized ratio (INR). No corresponding reduction had been made in his warfarin dosage, and no changes had been made in his concomitant drug therapy or diet. The patient's INR fell from a therapeutic value of 2.6 (target range 2-3) to 1.6 while receiving a weekly warfarin dose of 50 mg. His INR remained stable at 1.6 for 3 weeks despite incremental increases in his warfarin dose. The patient reported that he had been taking 8-10 menthol cough drops/day due to dry conditions at his workplace during the time period that the INR decreased. Five days after discontinuing the cough drops, his INR increased from 1.6 to 2.9. Over the subsequent 5 weeks, his INR was stabilized at a much lower weekly warfarin dose of 40 mg. Use of the Naranjo adverse drug reaction probability scale indicated that the decreased INR was probably related to the concomitant use of menthol cough drops during warfarin therapy. The mechanism for this interaction may be related to the potential for menthol to affect the cytochrome P450 system as an inducer and inhibitor of certain isoenzymes that would potentially interfere with the metabolism of warfarin. To our knowledge, this is the second case report of an interaction between warfarin and menthol. Patients receiving warfarin should be closely monitored, as they may choose to take over-the-counter products without considering the potential implications, and counseled about a possible interaction with menthol cough drops.

  7. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e., signal probabilities) accurately. The transfer properties of logic probabilities are studied first; these are useful for calculating the logic probability of a circuit with random independent inputs. The orthogonal algorithm is then described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal", so that the logic probability can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
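
    A toy rendering of the idea, assumed rather than taken from the paper: once a function is rewritten as a sum of orthogonal (mutually disjoint) product terms, its signal probability is the plain sum of term probabilities. The input probabilities below are invented for illustration.

```python
# f = a OR b.  The terms {a=1} and {b=1} overlap, so their probabilities must
# NOT be summed directly.  An orthogonal form is f = a + a'b.
from itertools import product

p = {"a": 0.6, "b": 0.3}          # P(input = 1), inputs independent

def term_prob(term):
    """A term maps each used variable to its required value (1 or 0)."""
    out = 1.0
    for var, val in term.items():
        out *= p[var] if val == 1 else (1.0 - p[var])
    return out

orthogonal_terms = [{"a": 1}, {"a": 0, "b": 1}]      # a + a'b, disjoint terms
p_orth = sum(term_prob(t) for t in orthogonal_terms)

# Brute-force check by enumerating all input assignments.
p_exact = sum(term_prob(dict(zip(p, bits)))
              for bits in product([0, 1], repeat=len(p))
              if bits[0] == 1 or bits[1] == 1)       # f(a, b) = a OR b

print(p_orth, p_exact)   # both 0.72 = 0.6 + 0.4*0.3
```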

  8. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  9. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    Science.gov (United States)

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
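
    For intuition, a small simulation (our own, not from the study) of why matching is nonoptimal on a two-choice task whose better option pays off with probability 0.7:

```python
# Probability matching picks option A on ~70% of trials; maximizing always
# picks A.  Matching accuracy is p^2 + (1-p)^2 = 0.58 < p = 0.70.
import random

random.seed(1)
p, trials = 0.7, 100_000

# Matching: the choice itself is random with P(choose A) = p.
match = sum((random.random() < p) == (random.random() < p)
            for _ in range(trials)) / trials
# Maximizing: always choose the more likely option.
maximize = sum(random.random() < p for _ in range(trials)) / trials

print(f"matching ~ {match:.3f} (theory 0.58), maximizing ~ {maximize:.3f} (theory 0.70)")
```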

  10. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren

    2013-01-01

    To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.

  11. Ignorance is not bliss: Statistical power is not probability of trial success.

    Science.gov (United States)

    Zierhut, M L; Bycott, P; Gibbs, M A; Smith, B P; Vicini, P

    2016-04-01

    The purpose of this commentary is to place probability of trial success, or assurance, in the context of decision making in drug development, and to illustrate its properties in an intuitive manner for the readers of Clinical Pharmacology and Therapeutics. The hope is that this will stimulate a dialog on how assurance should be incorporated into a quantitative decision approach for clinical development and trial design that uses all available information.

  12. Non-pharmacological strategies on pain relief during labor: pre-testing of an instrument

    Directory of Open Access Journals (Sweden)

    Rejane Marie Barbosa Davim

    2007-12-01

    Full Text Available This descriptive study aimed to evaluate the effectiveness of Non-Pharmacological Strategies (NFS) on the pain relief of parturients, as part of a research instrument to be utilized in a Doctoral Dissertation. In order to evaluate the NFS, the Analogous Visual Scale (AVS) was used on 30 parturients attended at the Humanized Labor Unit of a school-maternity hospital in Natal, RN, Brazil. Of the six NFS (respiratory exercises, muscular relaxation, lumbosacral massage, shower washing, ambulation and pelvic swing), two were excluded post-test (ambulation and pelvic swing) for not being accepted by the parturients. The remaining NFS (respiratory exercises, muscular relaxation, lumbosacral massage, and shower washing), which reached satisfactory acceptance and applicability rates, were found to be effective in relieving the pain of these parturients, and were thus deemed adequate for use in the Doctoral Dissertation data collection process.

  13. Probability of rupture of multiple fault segments

    Science.gov (United States)

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting both to geologic slip rates and to an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies that fits a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  14. Atomic Transition Probabilities for Neutral Cerium

    Science.gov (United States)

    Chisholm, John; Nitz, D.; Sobeck, J.; Den Hartog, E. A.; Wood, M. P.; Lawler, J. E.

    2010-01-01

    Among the rare earth species, the spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are some of the most complex. Like other rare earth species, Ce has many lines in the visible which are suitable for elemental abundance studies. Recent work on Ce II transition probabilities [1] is now being augmented with similar work on Ce I for future studies using such lines from astrophysical sources. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2500 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. This work was supported by the National Science Foundation's REU program and the Department of Defense's ASSURE program through NSF Award AST-0453442 and NSF Grant CTS0613277. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).

  15. Transit probabilities around hypervelocity and runaway stars

    Science.gov (United States)

    Fragione, G.; Ginsburg, I.

    2017-04-01

    In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. Along with transits, the Doppler technique remains an invaluable tool for discovering planets. The next generation of spectrographs, such as G-CLEF, promise precision radial velocity measurements. In this paper, we explore the possibility of detecting planets around hypervelocity and runaway stars, which should host very compact systems as a consequence of their turbulent origin. We find that the probability of a multiplanetary transit is 10^{-3} ≲ P ≲ 10^{-1}. We therefore need to observe ∼10-1000 high-velocity stars to spot a transit. However, even if transits are rare around runaway and hypervelocity stars, the chances of detecting such planets using radial velocity surveys are high. We predict that the European Gaia satellite, along with TESS and the new-generation spectrographs G-CLEF and ESPRESSO, will spot planetary systems orbiting high-velocity stars.

  16. Grain Exchange Probabilities Within a Gravel Bed

    Science.gov (United States)

    Haschenburger, J.

    2008-12-01

    Sediment transfers in gravel-bed rivers involve the vertical exchange of sediments during floods. These exchanges regulate the virtual velocity of sediment and bed material texture. This study describes general tendencies in the vertical exchange of gravels within the substrate that result from multiple floods. Empirical observations come from Carnation Creek, a small gravel-bed river with large woody debris located on the west coast of Vancouver Island, British Columbia. Frequent floods and the relatively limited armor layer facilitate streambed activity and relatively high bedload transport rates, typically under partial sediment transport conditions. Over 2000 magnetically tagged stones, ranging in size from 16 to 180 mm, were deployed on the bed surface between 1991 and 1992. These tracers have been recovered 10 times over 12 flood seasons to quantify their vertical position in the streambed. For analysis, the bed is divided into layers based on armor layer thickness. Once tracers are well mixed within the streambed, grains in the surface layer are most likely to be mixed into the subsurface, while subsurface grains are most likely to persist within the subsurface. Fractional exchange probabilities approach size independence when the most active depth of the substrate is considered. Overall these results highlight vertical mixing as an important process in the dispersion of gravels.

  17. Logic and probability in quantum mechanics

    CERN Document Server

    1976-01-01

    During the academic years 1972-1973 and 1973-1974, an intensive seminar on the foundations of quantum mechanics met at Stanford on a regular basis. The extensive exploration of ideas in the seminar led to the organization of a double issue of Synthese concerned with the foundations of quantum mechanics, especially with the role of logic and probability in quantum mechanics. About half of the articles in the volume grew out of this seminar. The remaining articles have been solicited explicitly from individuals who are actively working in the foundations of quantum mechanics. Seventeen of the twenty-one articles appeared in Volume 29 of Synthese. Four additional articles and a bibliography on the history and philosophy of quantum mechanics have been added to the present volume. In particular, the articles by Bub, Demopoulos, and Lande, as well as the second article by Zanotti and myself, appear for the first time in the present volume. In preparing the articles for publication I am much indebted to ...

  18. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  19. Essays on probability elicitation scoring rules

    Science.gov (United States)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting the advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
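
    A sketch of the two perspectives just described, with our own invented densities: a classical scoring rule contrasts the elicited density f with the observation θ*, while the proposed extension contrasts f with g, the true distribution of a random Θ, via an entropy-like (KL) divergence.

```python
# Classical log score vs. an adherence-of-f-to-g measure on a discretized grid.
import numpy as np

grid = np.linspace(0.0, 10.0, 1001)
dx = grid[1] - grid[0]

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

f = normal_pdf(grid, 5.0, 1.0)   # expert's elicited density for Theta
g = normal_pdf(grid, 5.5, 1.5)   # assumed true distribution of a random Theta
theta_star = 6.2                 # a single realized observation

log_score = -np.log(np.interp(theta_star, grid, f))   # classical SR vs. d(theta)
kl_g_f = np.sum(g * np.log(g / f)) * dx               # adherence of f to g
print(f"log score at theta*: {log_score:.3f}   KL(g || f): {kl_g_f:.3f}")
```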

  20. XI Symposium on Probability and Stochastic Processes

    CERN Document Server

    Pardo, Juan; Rivero, Víctor; Bravo, Gerónimo

    2015-01-01

    This volume features lecture notes and a collection of contributed articles from the XI Symposium on Probability and Stochastic Processes, held at CIMAT Mexico in September 2013. Since the symposium was part of the activities organized in Mexico to celebrate the International Year of Statistics, the program included topics from the interface between statistics and stochastic processes. The book starts with notes from the mini-course given by Louigi Addario-Berry with an accessible description of some features of the multiplicative coalescent and its connection with random graphs and minimum spanning trees. It includes a number of exercises and a section on unanswered questions. Further contributions provide the reader with a broad perspective on the state-of-the art of active areas of research. Contributions by: Louigi Addario-Berry Octavio Arizmendi Fabrice Baudoin Jochen Blath Loïc Chaumont J. Armando Domínguez-Molina Bjarki Eldon Shui Feng Tulio Gaxiola Adrián González Casanova Evgueni Gordienko Daniel...

  1. Do aftershock probabilities decay with time?

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
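
    A numerical companion to the thought experiment (ours; K is an invented productivity constant): with rate K/t and a window of ±10% of t, the expected count is ∫ K/t dt over [0.9t, 1.1t] = K ln(11/9), independent of t.

```python
# The 1/t decay is exactly canceled by the window growing in proportion to t.
import math

K = 100.0                                      # illustrative productivity constant
for t in (1.0, 7.0, 30.0, 365.0, 36500.0):     # day, week, month, year, century (days)
    expected = K * (math.log(1.1 * t) - math.log(0.9 * t))
    print(f"t = {t:>8.0f} d: expected aftershocks in t +/- 10% = {expected:.3f}")
# Every line prints K*ln(11/9) ~ 20.067.
```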

  2. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
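
    A compact illustration of the model class (ours), assuming exact enumeration on a toy system rather than the approximate inference methods the paper studies: fit a pairwise maximum-entropy model P(s) ∝ exp(Σ h_i s_i + Σ_{i<j} J_ij s_i s_j) by matching means and pairwise correlations.

```python
# Gradient ascent on the (concave) log-likelihood; feasible only for small n.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n = 4
states = np.array(list(product([0, 1], repeat=n)), dtype=float)   # 16 x 4

# Synthetic "data" moments from an arbitrary target distribution.
target = rng.dirichlet(np.ones(len(states)))
mean_data = target @ states
corr_data = states.T @ (target[:, None] * states)

h = np.zeros(n)
J = np.zeros((n, n))
for step in range(5000):
    logp = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(logp - logp.max())
    p /= p.sum()
    mean_model = p @ states
    corr_model = states.T @ (p[:, None] * states)
    h += 0.1 * (mean_data - mean_model)        # dlogL/dh_i = <s_i>_data - <s_i>_model
    grad_J = corr_data - corr_model            # dlogL/dJ_ij for pairwise terms
    np.fill_diagonal(grad_J, 0.0)              # diagonal is redundant with h
    J += 0.1 * grad_J

print("max moment mismatch:", np.abs(corr_data - corr_model).max())
```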

  3. [Osteomyelitis: a probable, uncommon etiology agent].

    Science.gov (United States)

    Cuoco, F; Borzani, I; Torcoletti, M; Beltrami, V; Petaccia, A; Corona, F

    2015-06-01

    The relation of infectious agents to arthritis is an area of great interest to the rheumatologist. Septic arthritis of bacterial origin accounts for approximately 6.5% of all childhood arthritides. Septic arthritis usually results from haematogenous spread from a focus of infection elsewhere in the body, but can also arise by direct extension of an infection from overlying soft tissues or bone, or by traumatic invasion of the joint. As a result, if a focus of underlying osteomyelitis breaks through the metaphysis, it may enter the joint and result in septic arthritis. Systemic signs of illness are fever, severe bone pain, and tenderness with or without local swelling. A wide range of microorganisms can cause septic arthritis in children; Staphylococcus aureus and non-group A and B streptococci are most common overall. However, different organisms are more common at certain ages and in certain circumstances. Kingella kingae is an emerging pathogen in young children under 4 years of age. The clinical presentation of invasive K. kingae infection is often subtle and may be associated with only mild to moderate biologic inflammatory responses. Affected children often have few signs and symptoms of osteoarticular infection. Early MRI is useful in differentiating K. kingae from Gram-positive cocci in osteoarticular infections: cartilaginous involvement and modest soft tissue and bone reaction suggest K. kingae. It is very important to include K. kingae in the differential diagnosis of osteoarticular infections in young children. We report an unusual case of osteomyelitis in which the clinical manifestations and MRI are suggestive of K. kingae infection.

  4. Implications of conflicting definitions of probability to health risk communication: a case study of familial cancer and genetic counselling.

    Science.gov (United States)

    O'Doherty, Kieran C

    2007-02-01

    The question of what probability actually is has long been debated in philosophy and statistics. Although the concept of probability is fundamental to many applications in the health sciences, these debates are generally not well known to health professionals. This paper begins with an outline of some of the different interpretations of probability. Examples are provided of how each interpretation manifests in clinical practice. The discipline of genetic counselling (familial cancer) is used to ground the discussion. In the second part of the paper, some of the implications that different interpretations of probability may have in practice are examined. The main purpose of the paper is to draw attention to the fact that there is much contention as to the nature of the concept of probability. In practice, this creates the potential for ambiguity and confusion. This paper constitutes a call for deeper engagement with the ways in which probability and risk are understood in health research and practice.

  5. MATHEMATICAL EXPECTATION ABOUT DISCRETE RANDOM VARIABLE WITH INTERVAL PROBABILITY OR FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The characteristics of, and algorithms for, DRVIPs (discrete random variables with interval probabilities) and DRVFPs of the second kind (discrete random variables with crisp events and fuzzy probabilities) are researched. Using the fuzzy resolution theorem, finding the mathematical expectation of a DRVFP can be translated into finding the mathematical expectations of a series of DRVIPs. Finding the mathematical expectation of a DRVIP is, in turn, a typical linear programming problem. A very practical calculating formula for the mathematical expectation of a DRVIP was obtained using Dantzig's simplex method. The example indicates that the result obtained with this formula agrees completely with the result obtained by the linear programming method, while the process using the formula is simpler.
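
    The linear-programming view lends itself to a direct sketch; the numbers below are invented, and scipy's general-purpose linprog stands in for the simplex-derived formula of the paper:

```python
# Bounds on E[X] when each outcome probability is only known to lie in an
# interval and the probabilities must sum to one.
from scipy.optimize import linprog

values = [10.0, 20.0, 30.0]    # outcomes of the discrete random variable
lo = [0.2, 0.3, 0.1]           # lower interval bounds on the probabilities
hi = [0.5, 0.6, 0.4]           # upper interval bounds

bounds = list(zip(lo, hi))
A_eq, b_eq = [[1.0, 1.0, 1.0]], [1.0]          # probabilities sum to 1

emin = linprog(c=values, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
emax = linprog(c=[-v for v in values], A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(f"E[X] lies in [{emin.fun:.2f}, {-emax.fun:.2f}]")   # [16.00, 22.00]
```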

  6. Probably good diagrams for learning: representational epistemic recodification of probability theory.

    Science.gov (United States)

    Cheng, Peter C-H

    2011-07-01

    The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.

  7. BETASCAN: probable beta-amyloids identified by pairwise probabilistic analysis.

    Directory of Open Access Journals (Sweden)

    Allen W Bryan

    2009-03-01

    Full Text Available Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s, there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant. The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid

  8. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
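
    For context, a worked example (ours) of the probability-weighting component of cumulative prospect theory whose parameters the study fits, using the Tversky-Kahneman (1992) one-parameter form:

```python
# w(p) = p^g / (p^g + (1-p)^g)^(1/g): overweights small p, underweights large p.
def tk_weight(p, gamma):
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> w(p) = {tk_weight(p, gamma=0.61):.3f}")  # gamma from TK 1992
# A less linear w(p), i.e. a smaller gamma, is what one would expect when a
# probability format hampers the psychophysical response to probability.
```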

  9. Polarization Mode Dispersion Probability Distribution for Arbitrary Mode Coupling

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The probability distribution of the differential group delay for arbitrary mode coupling is simulated with a Monte Carlo method. By fitting the simulation results, we obtain the probability distribution function for arbitrary mode coupling.

  10. Probability-summation model of multiple laser-exposure effects.

    Science.gov (United States)

    Menendez, A R; Cheney, F E; Zuclich, J A; Crump, P

    1993-11-01

    A probability-summation model is introduced to provide quantitative criteria for discriminating independent from interactive effects of multiple laser exposures on biological tissue. Data that differ statistically from predictions of the probability-summation model indicate the action of sensitizing (synergistic/positive) or desensitizing (hardening/negative) biophysical interactions. Interactions are indicated when response probabilities vary with changes in the spatial or temporal separation of exposures. In the absence of interactions, probability-summation parsimoniously accounts for "cumulative" effects. Data analyzed using the probability-summation model show instances of both sensitization and desensitization of retinal tissue by laser exposures. Other results are shown to be consistent with probability-summation. The relevance of the probability-summation model to previous laser-bioeffects studies, models, and safety standards is discussed and an appeal is made for improved empirical estimates of response probabilities for single exposures.
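
    The baseline model has a one-line closed form; a minimal sketch (ours) under the independence assumption:

```python
# If n exposures act independently, the combined response probability is
# 1 - prod(1 - p_i).  Data above this prediction suggest sensitization;
# data below it suggest desensitization ("hardening").
def probability_summation(per_exposure_probs):
    q = 1.0
    for p in per_exposure_probs:
        q *= (1.0 - p)
    return 1.0 - q

# Three identical exposures, each with a 10% response probability:
print(probability_summation([0.1, 0.1, 0.1]))   # 0.271, not 0.3
```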

  11. 28 CFR 2.101 - Probable cause hearing and determination.

    Science.gov (United States)

    2010-07-01

    ... who have given information upon which revocation may be based) at a postponed probable cause hearing... attendance, unless good cause is found for not allowing confrontation. Whenever a probable cause hearing...

  12. Clinical features and related factors in Parkinson's disease patients with probable rapid eye movement sleep behavior disorder

    Institute of Scientific and Technical Information of China (English)

    扈杨; 左丽君; 余舒扬; 曹辰杰; 陈泽颉; 王方; 张巍

    2013-01-01

    Objective: To explore the clinical features and associated factors of probable rapid eye movement sleep behavior disorder (P-RBD) and its influence on quality of life in patients with Parkinson's disease (PD). Methods: 102 PD patients who visited the department of neurology, Beijing Tiantan Hospital, from April 2012 to January 2013 were consecutively recruited and evaluated with the rapid eye movement sleep behavior disorder screening questionnaire (RBDSQ), scales of motor symptoms (MS) and non-motor symptoms (NMS), and the Parkinson's disease quality of life questionnaire-39 (PDQL-39). Results: (1) 30 of the 102 PD patients (29.41%) had P-RBD (RBDSQ ≥ 6 points), with a mean RBDSQ score of 8.23±1.89; the remaining 72 patients (70.59%) did not have P-RBD, with a mean RBDSQ score of 2.21±1.3. (2) The P-RBD and NP-RBD groups did not differ in gender, age, education level, age of onset, side of onset or clinical phenotype, except for disease duration [3.50 (1.13-6.75) vs 2.00 (1.00-3.00) years] (P=0.022). (3) There was a ... [... and (4.49±3.38) points, (9.22±5.68) and (6.06±4.14) points, (41.42±9.97) and (34.81±9.46) points, (9.87±3.09) and (8.01±4.13) points] (P<0.05). (4) The two groups did not differ significantly on the Mini-Mental State Examination (MMSE), the Montreal Cognitive Assessment (MoCA), the Modified Apathy Evaluation Scale (MAES) or the Restless Legs Syndrome Rating Scale (RLSRS) (P>0.05). (5) Disease duration, H-Y stage, the number of NMS, and the UPDRS I, HAMD, HAMA, SCOPA-AUT, PQSI, ESS and FS-14 scores all correlated significantly with the RBDSQ score (r=0.256, 0.311, 0.324, 0.306, 0.275, 0.287, 0.409, 0.352, 0.26 and 0.243, respectively; P<0.05). (6) The RBDSQ score correlated negatively with the PDQL-39 score (r=-0.203, P<0.05). Conclusion: P-RBD is common in PD patients; the P-RBD group had a longer disease duration, more severe disease and more NMS. P-RBD correlated significantly with several NMS, including mood, overall sleep quality, excessive daytime sleepiness and autonomic dysfunction, and severely affects the quality of life of PD patients.

  13. [Arthritis and clinical history].

    Science.gov (United States)

    Silva, Lígia; Sampaio, Luzia; Pinto, José; Ventura, Francisco S

    2011-01-01

    Faced with a patient with arthritis, clinical good sense says that the most probable diagnoses are the most prevalent ones. Nevertheless, we have to exclude a multiplicity of other aetiologies, less frequent, but with greater implications for the therapeutic conduct. Infections by Brucella and by Borrelia are rare causes of chronic arthritis, yet they are diagnoses to consider, even when the clinical manifestations are not the most typical, as endemic areas still exist in Portugal. Here we report two clinical cases of patients with arthritis lasting more than one year, subjected to ineffective examinations and treatments. Only the clinical history could bring to light the clinical-epidemiological data suggestive of brucellosis and Lyme disease, namely the professional contact with infected animals and the history of probable erythema migrans, which pointed toward the correct diagnoses. With directed therapy, there was complete resolution of the inflammatory symptoms.

  14. 21 CFR 1316.10 - Administrative probable cause.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  15. A consistent set of infinite-order probabilities

    NARCIS (Netherlands)

    Atkinson, David; Peijnenburg, Jeanne

    2013-01-01

    Some philosophers have claimed that it is meaningless or paradoxical to consider the probability of a probability. Others have however argued that second-order probabilities do not pose any particular problem. We side with the latter group. On condition that the relevant distinctions are taken into

  16. Probability Constructs in Preschool Education and How they Are Taught

    Science.gov (United States)

    Antonopoulos, Konstantinos; Zacharos, Konstantinos

    2013-01-01

    The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…

  17. 14 CFR 417.224 - Probability of failure analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle...

  18. Analysis of 2-d ultrasound cardiac strain imaging using joint probability density functions.

    Science.gov (United States)

    Ma, Chi; Varghese, Tomy

    2014-06-01

    Ultrasound frame rates play a key role in accurate cardiac deformation tracking. Insufficient frame rates lead to an increase in signal de-correlation artifacts, resulting in erroneous displacement and strain estimation. Joint probability density distributions generated from estimated axial strain and its associated signal-to-noise ratio provide a useful approach to assess the minimum frame rate requirements. Previous reports have demonstrated that bi-modal distributions in the joint probability density indicate inaccurate strain estimation over a cardiac cycle. In this study, we utilize similar analysis to evaluate a 2-D multi-level displacement tracking and strain estimation algorithm for cardiac strain imaging. The effect of different frame rates and final kernel dimensions, and a comparison of radio frequency and envelope-based processing, are evaluated using echo signals derived from a 3-D finite element cardiac model and five healthy volunteers. Cardiac simulation model analysis demonstrates that the minimum frame rate required to obtain accurate joint probability distributions for the signal-to-noise ratio and strain, for a final kernel dimension of 1 λ by 3 A-lines, was around 42 Hz for radio frequency signals. On the other hand, even a frame rate of 250 Hz with envelope signals did not replicate the ideal joint probability distribution. For the volunteer study, clinical data were acquired only at a 34 Hz frame rate, which appears to be sufficient for radio frequency analysis. We also show that an increase in the final kernel dimensions significantly affects the strain probability distribution and joint probability density function generated, with a smaller effect on the variation in the accumulated mean strain estimated over a cardiac cycle. Our results demonstrate that radio frequency frame rates currently achievable on clinical cardiac ultrasound systems are sufficient for accurate analysis of the strain probability distribution, when a multi-level 2-D

  19. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed.

  20. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — Probability calibration by the minimum and maximum probability scores in one-class Bayes learning for anomaly detection. Guichong Li, Nathalie Japkowicz, Ian Hoffman,...

  1. Pattern formation, logistics, and maximum path probability

    Science.gov (United States)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  2. Total probabilities of ensemble runoff forecasts

    Science.gov (United States)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time, but still give a spatially and temporally consistent output. However, their method is computationally complex for our large number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
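
    As a rough sketch of the EMOS-style post-processing cited above (our own toy version on synthetic data, not EFAS code): fit a Gaussian predictive distribution whose mean and variance are affine in the ensemble mean and variance.

```python
# EMOS-style calibration: predictive N(a + b*ensmean, c + d*ensvar), with
# coefficients fit by maximum likelihood on past forecast/observation pairs.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
truth = rng.gamma(2.0, 50.0, size=500)                     # synthetic "observations"
ens = truth[:, None] * 0.8 + rng.normal(0, 20, (500, 11))  # biased, underdispersed ensemble
m, v = ens.mean(axis=1), ens.var(axis=1)

def nll(params):
    a, b, c, d = params
    sigma = np.sqrt(np.maximum(c + d * v, 1e-6))           # keep variance positive
    return -norm.logpdf(truth, loc=a + b * m, scale=sigma).sum()

fit = minimize(nll, x0=[0.0, 1.0, 100.0, 1.0], method="Nelder-Mead")
print("calibrated coefficients (a, b, c, d):", fit.x)
# The post-processed forecast for a new case is then N(a + b*mean, c + d*var).
```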

  3. Ruin Probabilities of a Surplus Process Described by PDMPs

    Institute of Scientific and Technical Information of China (English)

    Jing-min He; Rong Wu; Hua-yue Zhang

    2008-01-01

    In this paper we mainly study the ruin probability of a surplus process described by a piecewise deterministic Markov process (PDMP). An integro-differential equation for the ruin probability is derived. Under a certain assumption, it can be transformed into the ruin probability of a risk process whose premiums depend on the current reserves. Using the same argument as that in Asmussen and Nielsen [2], the ruin probability and its upper bounds are obtained. Finally, we give an analytic expression for the ruin probability and its upper bounds when the claim size is exponentially distributed.
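
    For the special case of the classical Cramér-Lundberg surplus process with exponential claims, the analytic expression can be checked by simulation; a sketch (ours, simplified to a finite horizon with an early-exit heuristic, and not specific to the PDMP setting of the paper):

```python
# Monte Carlo ruin probability vs. the closed form
# psi(u) = exp(-theta*u / (mu*(1+theta))) / (1+theta) for exponential claims.
import math
import numpy as np

rng = np.random.default_rng(42)
u, lam, mu, theta = 5.0, 1.0, 1.0, 0.2     # initial reserve, claim rate, mean claim, loading
c = (1 + theta) * lam * mu                 # premium rate
n_paths, horizon, safe = 4000, 500.0, 60.0

ruined = 0
for _ in range(n_paths):
    t, reserve = 0.0, u
    while t < horizon and reserve < safe:  # stop early once ruin is very unlikely
        w = rng.exponential(1.0 / lam)     # waiting time to the next claim
        t += w
        reserve += c * w - rng.exponential(mu)
        if reserve < 0.0:
            ruined += 1
            break

psi = math.exp(-theta * u / (mu * (1 + theta))) / (1 + theta)
print(f"Monte Carlo: {ruined / n_paths:.4f}  analytic psi(u): {psi:.4f}")  # ~0.36
```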

  4. Calculation Model and Simulation of Warship Damage Probability

    Institute of Scientific and Technical Information of China (English)

    TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping

    2008-01-01

    The combat efficiency of mine obstacles is the focus of the present research. Based on the main factors through which a mine obstacle affects the target warship's damage probability, such as the features of maneuverable mines, the success rate of mine-laying, the hit probability, and mine reliability and actuation probability, a calculation model of the target warship's mine-encounter probability is put forward, under the conditions that the route selection of target warships follows a uniform distribution and the course of target warships follows a normal distribution. A damage probability model of maneuverable mines against target warships is then set up, and simulation confirmed the model's practicality.
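
    A Monte Carlo sketch of this kind of encounter model may clarify the two distributional assumptions (uniform route selection, normal course). The geometry, parameter values, and the single lumped kill-chain probability below are ours, not the paper's model.

        # Illustrative Monte Carlo for a ship transiting a mined channel: route
        # offset is uniform, course error is normal, and the per-mine kill chain
        # (reliability x actuation x damage) is folded into one probability.
        import numpy as np

        rng = np.random.default_rng(1)
        W, L = 1000.0, 2000.0        # channel width / mined-strip length (m)
        N_MINES, R_ACT = 20, 30.0    # mines laid; actuation radius (m)
        P_KILL = 0.6                 # kill-chain probability per encountered mine
        SIGMA = np.deg2rad(5.0)      # std. dev. of course error (rad)

        def damage_probability(n_runs=20_000):
            total = 0.0
            for _ in range(n_runs):
                x0 = rng.uniform(0, W)                    # uniform route selection
                h = rng.normal(0.0, SIGMA)                # normally distributed course
                mines = rng.uniform([0, 0], [W, L], size=(N_MINES, 2))
                # perpendicular distance from each mine to the straight track
                d = np.abs((mines[:, 0] - x0) * np.cos(h) - mines[:, 1] * np.sin(h))
                n_enc = int((d < R_ACT).sum())
                total += 1.0 - (1.0 - P_KILL) ** n_enc    # P(damaged | this transit)
            return total / n_runs

        print(f"estimated damage probability: {damage_probability():.3f}")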

  5. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise.

  6. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention."

  7. Nonlinear neurobiological probability weighting functions for aversive outcomes.

    Science.gov (United States)

    Berns, Gregory S; Capra, C Monica; Chappelow, Jonathan; Moore, Sara; Noussair, Charles

    2008-02-15

    While mainstream economic models assume that individuals treat probabilities objectively, many people tend to overestimate the likelihood of improbable events and underestimate the likelihood of probable events. However, a biological account for why probabilities would be treated this way does not yet exist. While undergoing fMRI, we presented individuals with a series of lotteries, defined by the voltage of an impending cutaneous electric shock and the probability with which the shock would be received. During the prospect phase, neural activity that tracked the probability of the expected outcome was observed in a circumscribed network of brain regions that included the anterior cingulate, visual, parietal, and temporal cortices. Most of these regions displayed responses to probabilities consistent with nonlinear probability weighting. The neural responses to passive lotteries predicted 79% of subsequent decisions when individuals were offered choices between different lotteries, and exceeded that predicted by behavior alone near the indifference point.
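
    The inverse-S pattern described here, overweighting rare outcomes and underweighting likely ones, is often captured with a one-parameter weighting function. Prelec's (1998) form is a common choice, shown below as an illustration; it is not necessarily the function fitted to these fMRI data, and the alpha value is arbitrary.

        # Prelec (1998) one-parameter probability weighting function: for alpha < 1
        # it overweights small probabilities and underweights large ones.
        import math

        def prelec_w(p, alpha=0.65):
            """w(p) = exp(-(-ln p)^alpha); w(0)=0, w(1)=1, fixed point at p = 1/e."""
            if p <= 0.0: return 0.0
            if p >= 1.0: return 1.0
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 0.99):
            print(f"p = {p:.2f}  ->  w(p) = {prelec_w(p):.3f}")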

  8. Inverse probability weighting for covariate adjustment in randomized studies.

    Science.gov (United States)

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment-benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions that enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed so that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. An application of the proposed method to a real data example is also given.
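
    The core mechanics of an inverse-probability-weighted estimate can be sketched briefly. This shows the generic IPW idea on simulated trial data, not the authors' exact two-stage procedure; the data-generating model and all names are ours.

        # Generic inverse-probability-weighted treatment-effect estimate for a
        # randomized trial; a sketch of the idea, not the procedure of the paper.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 2000
        x = rng.standard_normal((n, 3))            # baseline covariates
        z = rng.integers(0, 2, n)                  # 1:1 randomized treatment
        y = 1.0 * z + x @ [0.8, -0.5, 0.3] + rng.standard_normal(n)  # true effect = 1

        # Stage 1 (outcome-free): model treatment given covariates. The true
        # propensity is 0.5, but weighting by the *estimated* one gains precision.
        e = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]
        w = z / e + (1 - z) / (1 - e)

        # Stage 2: weighted difference in means (Horvitz-Thompson style).
        effect = (np.sum(w * z * y) / np.sum(w * z)
                  - np.sum(w * (1 - z) * y) / np.sum(w * (1 - z)))
        print(f"IPW estimate of treatment effect: {effect:.3f}")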

  9. Surprisingly Rational: Evidence that people follow probability theory when judging probabilities, and that biases in judgment are due to noise

    CERN Document Server

    Costello, Fintan

    2012-01-01

    The systematic biases and errors seen in people's probability judgments are typically taken as evidence that people do not reason about probability using the rules of probability theory. We show the contrary: that these biases are a consequence of people correctly following probability theory, but with random variation or noise affecting the reasoning process. Taking P_E(A) to represent a person's estimate for the probability of some event A, this random variation account predicts that on average P_E(A) + P_E(B) - P_E(A or B) - P_E(A and B) = 0 for all pairs of events A, B, just as required by probability theory. Analysing data from an experiment asking people to estimate such probabilities for a number of pairs A, B, we find striking confirmation of this prediction.
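
    The prediction is easy to check numerically under one concrete reading of the noise account: each estimate is a proportion of memory samples, each read incorrectly with probability d. Each estimate then has expected value (1-2d)P(X) + d, and the d terms cancel in the combination above. The joint distribution and parameter values below are invented for illustration.

        # Simulation: noisy sampled estimates of P(A), P(B), P(A or B), P(A and B).
        # Their combination averages to zero despite each estimate being biased.
        import numpy as np

        rng = np.random.default_rng(3)
        d, n_samples, n_trials = 0.15, 50, 20_000
        pA, pB_given_A, pB_given_notA = 0.6, 0.5, 0.3   # arbitrary joint distribution

        def noisy_estimate(truth):
            flips = rng.random(truth.size) < d           # each read errs with prob d
            return np.mean(truth ^ flips)

        total = 0.0
        for _ in range(n_trials):
            a = rng.random(n_samples) < pA
            b = np.where(a, rng.random(n_samples) < pB_given_A,
                            rng.random(n_samples) < pB_given_notA)
            total += (noisy_estimate(a) + noisy_estimate(b)
                      - noisy_estimate(a | b) - noisy_estimate(a & b))
        print(f"mean of P_E(A)+P_E(B)-P_E(A or B)-P_E(A and B): {total / n_trials:.4f}")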

  10. Principles of failure probability assessment (PoF)

    Energy Technology Data Exchange (ETDEWEB)

    Giribone, R.; Pocachard, M.

    2003-07-01

    The aim of this paper is to offer some methodological guidance for assessing the probability of failure (PoF) of an item subject to a continuous degradation mechanism and a systematic inspection program. It is to be emphasised that this is not the final result of the probability assessment: given a failure of a certain type, there is no one-to-one correspondence between the probability of occurrence of the failure and the probability of a harm of a given intensity. Between the two, various accidental scenarios must be considered, each occurring with a given probability. The final outcome is the product of these two probabilities. (orig.)
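
    A toy numerical version of that last point, with entirely made-up numbers:

        # Made-up numbers illustrating the paper's closing point: the probability of
        # a harm of given intensity is PoF times scenario-dependent probabilities.
        pof = 1e-3                               # probability of the failure itself
        scenarios = {                            # P(scenario | failure), P(harm | scenario)
            "leak ignites immediately": (0.05, 0.9),
            "leak disperses, late ignition": (0.15, 0.4),
            "leak disperses, no ignition": (0.80, 0.0),
        }
        p_harm = pof * sum(p_s * p_h for p_s, p_h in scenarios.values())
        print(f"P(harm) = {p_harm:.2e}")         # 1e-3 * (0.045 + 0.06) = 1.05e-04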

  11. Revising probability estimates: Why increasing likelihood means increasing impact.

    Science.gov (United States)

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision: moving upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events despite the revised event probabilities being the same. Our research sheds light on how revising the probabilities for future events changes how people manage those uncertain events.

  12. Probability distribution fitting of schedule overruns in construction projects

    OpenAIRE

    P E D Love; C-P Sing; WANG, X; Edwards, D.J.; H Odeyinka

    2013-01-01

    The probability of schedule overruns for construction and engineering projects can be ascertained using a 'best fit' probability distribution derived from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data, including the Kolmogorov–...
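
    The fit-and-test workflow the abstract describes can be sketched with scipy; the candidate distributions, goodness-of-fit criterion, and synthetic data below are ours, not the study's.

        # Sketch of a 'best fit' workflow: fit candidate distributions to overrun
        # data and rank them by a Kolmogorov-Smirnov statistic. Data are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        overruns = stats.lognorm.rvs(s=0.8, scale=12, size=276, random_state=rng)  # % overrun

        for dist in (stats.lognorm, stats.gamma, stats.weibull_min, stats.norm):
            params = dist.fit(overruns)
            ks = stats.kstest(overruns, dist.cdf, args=params)
            print(f"{dist.name:12s}  KS = {ks.statistic:.3f}  p = {ks.pvalue:.3f}")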

  13. Probability and social science: methodological relationships between the two approaches?

    OpenAIRE

    Courgeau, Daniel

    2012-01-01

    This work examines in depth the methodological relationships that probability and statistics have maintained with the social sciences. It covers both the history of thought and current methods. First, it examines in detail the history of the different paradigms and axioms for probability, from their emergence in the seventeenth century up to the most recent developments of the three major concepts: objective, subjective and logicist probability. It shows the statistical inference they perm...

  14. Exciton-Dependent Pre-formation Probability of Composite Particles

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jing-Shang; WANG Ji-Min; DUAN Jun-Feng

    2007-01-01

    In the Iwamoto-Harada model the whole phase space is filled with fermions. When the momentum distributions of the exciton states are taken into account, the pre-formation probability of light composite particles can be improved, and an exciton state-dependent pre-formation probability has been proposed. The calculated results indicate that considering the momentum distribution enhances the pre-formation probability of the [1, m] configuration and strongly suppresses that of the [l > 1, m] configurations.

  15. Can Personality Type Explain Heterogeneity in Probability Distortions?

    OpenAIRE

    C. Monica Capra; Bing Jiang; Jan Engelmann; Gregory Berns

    2012-01-01

    There are two regularities we have learned from experimental studies of choice under risk. The first is that the majority of people weigh objective probabilities non-linearly. The second regularity, although less commonly acknowledged, is that there is a large amount of heterogeneity in how people distort probabilities. Despite this, little effort has been made to identify the source of heterogeneity. In this paper, we explore the possibility that the probability distortions are linked to the...

  16. Upper Bounds for Ruin Probability with Stochastic Investment Return

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    Risk models with stochastic investment return are widely used in practice, as well as in more challenging research fields. Risk theory is mainly concerned with the ruin probability, and a tight bound on the ruin probability is the most useful for practical purposes. This paper presents a discrete-time risk model with stochastic investment return. Conditional expectation properties and martingale inequalities are used to obtain both exponential and non-exponential upper bounds for the ruin probability.
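
    The classical benchmark that such exponential bounds generalize is Lundberg's inequality psi(u) <= exp(-R*u), where the adjustment coefficient R solves lam*(M_X(R) - 1) = c*R. The sketch below computes R for the no-investment baseline, not the stochastic-return bounds derived in the paper.

        # Classical Lundberg bound (no investment return), exponential claims.
        import math
        from scipy.optimize import brentq

        lam, c, beta = 1.0, 1.2, 1.0   # claim rate, premium rate, Exp(beta) claims

        def lundberg(r):               # root (besides r = 0) is the adjustment coefficient
            return lam * (beta / (beta - r) - 1.0) - c * r

        R = brentq(lundberg, 1e-6, beta - 1e-6)
        print(f"R = {R:.4f}")          # analytic value here: beta - lam/c = 1/6
        for u in (1, 5, 10):
            print(u, f"psi(u) <= {math.exp(-R * u):.4f}")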

  17. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
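
    The two-pass logic is easy to reproduce in a toy signal-detection setting; the sketch below compares a posterior-probability-matching observer with a deterministic maximum-a-posteriori observer in a simplified yes/no task of our own design, not the authors' experiments.

        # Two-pass consistency: probability-matching vs. MAP observer in a yes/no
        # Gaussian detection task with equal priors and sensitivity d'.
        import numpy as np

        rng = np.random.default_rng(5)
        d_prime, n = 1.0, 100_000
        signal = rng.integers(0, 2, n)                   # the repeated stimulus sequence

        def posterior(x):
            # P(signal | x) for unit-variance Gaussians centred at 0 and d'
            return 1.0 / (1.0 + np.exp(-d_prime * (x - d_prime / 2)))

        x1 = rng.standard_normal(n) + d_prime * signal   # pass 1 internal responses
        x2 = rng.standard_normal(n) + d_prime * signal   # pass 2: same stimuli, fresh noise

        match1 = rng.random(n) < posterior(x1)           # matching: respond w.p. = posterior
        match2 = rng.random(n) < posterior(x2)
        map1, map2 = x1 > d_prime / 2, x2 > d_prime / 2  # MAP: deterministic criterion

        print("two-pass agreement, matching observer:", np.mean(match1 == match2))
        print("two-pass agreement, MAP observer:     ", np.mean(map1 == map2))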

  18. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  19. Recovery of Sparse Probability Measures via Convex Programming

    OpenAIRE

    Pilanci, Mert; El Ghaoui, Laurent; Chandrasekaran, Venkat

    2012-01-01

    We consider the problem of cardinality-penalized optimization of a convex function over the probability simplex with additional convex constraints. The classical ℓ_1 regularizer fails to promote sparsity on the probability simplex, since the ℓ_1 norm on the probability simplex is trivially constant. We propose a direct relaxation of the minimum cardinality problem and show that it can be efficiently solved using convex programming. As a first application we consider recovering a spa...

  20. Evaluation for Success Probability of Chaff Centroid Jamming

    Institute of Scientific and Technical Information of China (English)

    GAO Dong-hua; SHI Xiu-hua

    2008-01-01

    Since chaff centroid jamming can introduce guidance error into an anti-ship missile's seeker and decrease its hit probability, a new quantitative analysis method and a mathematical model are proposed in this paper to evaluate the jamming success probability. Using this method, the optimal decision scheme for chaff centroid jamming in different threat situations can be found, and the success probability of this scheme can be calculated quantitatively. Thus, the operating rules of centroid jamming and the tactical approach for increasing the success probability can be determined.

  1. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    Full Text Available The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which through combinatorial calculation have opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability which went over the same grounds – sometimes without any historical awareness – as the debate which arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  2. Probability representation of kinetic equation for open quantum system

    CERN Document Server

    Man'ko, V I; Shchukin, E V

    2003-01-01

    The tomographic probability distribution is used to describe the kinetic equations for open quantum systems. The damped oscillator is studied, and the evolution of the purity parameter for different damping regimes is considered.

  3. Probability, statistics and queueing theory, with computer science applications

    CERN Document Server

    Allen, Arnold O

    1978-01-01

    Probability, Statistics, and Queueing Theory: With Computer Science Applications focuses on the use of statistics and queueing theory for the design and analysis of data communication systems, emphasizing how the theorems and theory can be used to solve practical computer science problems. This book is divided into three parts. The first part discusses the basic concept of probability, probability distributions commonly used in applied probability, and important concept of a stochastic process. Part II covers the discipline of queueing theory, while Part III deals with statistical inference. T

  4. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  5. PROBABILITY INEQUALITIES FOR SUMS OF INDEPENDENT UNBOUNDED RANDOM VARIABLES

    Institute of Scientific and Technical Information of China (English)

    张涤新; 王志诚

    2001-01-01

    The tail probability inequalities for the sum of independent unbounded random variables on a probability space (Ω, T, P) were studied and a new method was proposed to treat the sum of independent unbounded random variables by truncating the original probability space (Ω, T, P). The probability exponential inequalities for sums of independent unbounded random variables were given. As applications of the results, some interesting examples were given. The examples show that the method proposed in the paper and the results of the paper are quite useful in the study of the large sample properties of the sums of independent unbounded random variables.

  6. Predicting Ebola Infection and Survival Using qRT-PCR and Basic Clinical Labs: Comparing Ebola Virus Survival, Clinical Features, and Laboratory Values in 3 Non Human Primate Models

    Science.gov (United States)

    2016-12-22

    sample collection and yielded a panel of 46 routine clinical lab values [21]. Viral RNA Viral burden was measured on sera collected at pre...tested predictor. The measurement of the area under the ROC curve, known as ROC AUC, transfers the performance curve into a value range between 0.5 and... present our findings in a systematic format concentrating on laboratory values that we think reflect EBOV disease pathogenesis and that are easily

  7. Methods for fitting a parametric probability distribution to most probable number data.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2012-07-01

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganisms per milliliter or whether the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
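
    For context, the MPN point estimate for a single sample comes from a small likelihood problem over the dilution series: with concentration c and inoculum volume v, a tube is positive with probability 1 - exp(-c*v) under standard Poisson assumptions. The sketch below (volumes and counts invented) computes that single-sample estimate; the paper's methods address the further step of fitting a distribution across many such samples.

        # Maximum-likelihood MPN estimate for one dilution series (3 tubes per level).
        import numpy as np
        from scipy.optimize import minimize_scalar

        volumes = np.array([10.0, 1.0, 0.1])      # mL of sample per tube at each dilution
        n_tubes = np.array([3, 3, 3])
        positives = np.array([3, 2, 0])           # observed positive tubes

        def neg_log_lik(log_c):
            p = 1.0 - np.exp(-np.exp(log_c) * volumes)    # P(tube positive)
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(positives * np.log(p) + (n_tubes - positives) * np.log(1 - p))

        res = minimize_scalar(neg_log_lik, bounds=(-10, 10), method="bounded")
        print(f"MPN estimate: {np.exp(res.x):.2f} organisms/mL")   # ~0.9 for the 3-2-0 pattern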

  8. A Quantum Theoretical Explanation for Probability Judgment Errors

    Science.gov (United States)

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  9. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    Science.gov (United States)

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…

  10. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response…

  11. Subjective probabilities for state-dependent continuous utility

    NARCIS (Netherlands)

    Wakker, P.P.

    1987-01-01

    For the expected utility model with state dependent utilities, Karni, Schmeidler & Vind (1983, Econometrica) show how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they do for

  12. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    textabstractFor the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they d

  13. Gaussian processes in non-commutative probability theory

    NARCIS (Netherlands)

    Guţǎ, M.I.

    2002-01-01

    The generalisation of the notion of Gaussian processes from probability theory is investigated in the context of non-commutative probability theory. A non-commutative Gaussian process is viewed as a linear map from an infinite dimensional (real) Hilbert space into an algebra with involution and a po

  14. Transient handover blocking probabilities in road covering cellular mobile networks

    NARCIS (Netherlands)

    Boucherie, Richard J.; Wal, van der Jan

    2003-01-01

    This paper investigates handover and fresh call blocking probabilities for subscribers moving along a road in a traffic jam passing through consecutive cells of a wireless network. It is observed and theoretically motivated that the hand- over blocking probabilities show a sharp peak in the initial

  15. Transient handover blocking probabilities in road covering cellular mobile networks

    NARCIS (Netherlands)

    Boucherie, R.J.; Wal, van der J.

    2002-01-01

    This paper investigates handover and fresh call blocking probabilities for subscribers moving along a road in a traffic jam passing through consecutive cells of a wireless network. It is observed and theoretically motivated that the handover blocking probabilities show a sharp peak in the initial pa

  16. Internal cartel stability with time-dependent detection probabilities

    NARCIS (Netherlands)

    Hinloopen, J.

    2006-01-01

    To account for the illegal nature of price-fixing agreements, per-period detection probabilities that can vary over time are introduced in a dynamic oligopoly. The resulting ICCs for internal cartel stability indicate that for discount factors up to 10% per-period detection probabilities of 5% are n

  17. Probability representation entropy for spin-state tomogram

    OpenAIRE

    Man'ko, O. V.; Man'ko, V. I.

    2004-01-01

    Probability representation entropy (tomographic entropy) of an arbitrary quantum state is introduced. Using the property that the spin tomogram is a standard probability distribution function, the notion of tomographic entropy is discussed. The relation of the tomographic entropy to the Shannon entropy and the von Neumann entropy is elucidated.

  18. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of a single probability or risk level on the choice between risky and non-risky options. The research conducted here examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decision-making about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decision-making about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing-effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame affects the change in preference order only when the possibility of gain (expressed as a probability) is estimated as sufficiently high.

  19. Approximating the Finite-Time Ruin Probability under Interest Force

    NARCIS (Netherlands)

    Brekelmans, R.C.M.; De Waegenaere, A.M.B.

    2000-01-01

    We present an algorithm to determine both a lower and an upper bound for the finite-time probability of ruin for a risk process with constant interest force. We split the time horizon into smaller intervals of equal length and consider the probability of ruin in case premium income for a time interv
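
    As a crude reference for the quantity such bounds bracket, the finite-time ruin probability can also be estimated by brute-force simulation. The sketch below uses a compound-Poisson surplus accruing premiums and constant interest force between claims; the discretization and all parameter values are ours, not the authors' algorithm.

        # Monte Carlo estimate of the finite-time ruin probability under constant
        # interest force delta. Between claims: dU = (c + delta*U) dt.
        import numpy as np

        rng = np.random.default_rng(6)

        def finite_time_ruin(u0=5.0, c=1.2, lam=1.0, mean_claim=1.0, delta=0.03,
                             horizon=10.0, n_paths=20_000):
            ruined = 0
            for _ in range(n_paths):
                u, t = u0, 0.0
                while True:
                    w = rng.exponential(1.0 / lam)          # time to next claim
                    if t + w > horizon:
                        break
                    # exact accrual of premiums plus interest over the gap w
                    u = u * np.exp(delta * w) + c * (np.exp(delta * w) - 1.0) / delta
                    u -= rng.exponential(mean_claim)        # claim payment
                    t += w
                    if u < 0:
                        ruined += 1
                        break
            return ruined / n_paths

        print("P(ruin before T=10):", finite_time_ruin())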

  20. Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment.

    Science.gov (United States)

    Tversky, Amos; Kahneman, Daniel

    1983-01-01

    Judgments under uncertainty are often mediated by intuitive heuristics that are not bound by the conjunction rule of probability. Representativeness and availability heuristics can make a conjunction appear more probable than one of its constituents. Alternative interpretations of this conjunction fallacy are discussed and attempts to combat it…

  1. Randomness as an Equilibrium. Potential and Probability Density

    OpenAIRE

    2002-01-01

    Randomness is viewed through an analogy between a physical quantity, the density of a gas, and a mathematical construct, the probability density. Boltzmann's deduction of the equilibrium distribution of an ideal gas placed in an external potential field then provides a way of viewing the probability density from the perspective of the forces and potentials hidden behind it.

  2. First hitting probabilities for semi markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain...

  3. Canonical transforms, quantumness and probability representation of quantum mechanics

    CERN Document Server

    Man'ko, Margarita A

    2011-01-01

    The linear canonical transforms of position and momentum are used to construct the tomographic probability representation of quantum states, where the fair probability distribution determines the quantum state instead of the wave function or density matrix. The example of the Moshinsky shutter problem is considered.

  4. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    Science.gov (United States)

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  5. Probability Quantization for Multiplication-Free Binary Arithmetic Coding

    Science.gov (United States)

    Cheung, K. -M.

    1995-01-01

    A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.

  6. Data analysis & probability drill sheets : grades 6-8

    CERN Document Server

    Forest, Chris

    2011-01-01

    For grades 6-8, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. Each drill sheet contains warm-up and timed drill activities for the student to practice data analysis & probability concepts.

  7. Probability Theory, Not the Very Guide of Life

    Science.gov (United States)

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  8. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  9. 4th Workshop on Quantum Probability and Applications

    CERN Document Server

    Waldenfels, Wilhelm

    1990-01-01

    These proceedings of the workshop on quantum probability held in Heidelberg, September 26-30, 1988 contains a representative selection of research articles on quantum stochastic processes, quantum stochastic calculus, quantum noise, geometry, quantum probability, quantum central limit theorems and quantum statistical mechanics.

  10. Preservice Elementary Teachers and the Fundamentals of Probability

    Science.gov (United States)

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  11. The Probability Approach to English If-Conditional Sentences

    Science.gov (United States)

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  12. Probability Modeling and Thinking: What Can We Learn from Practice?

    Science.gov (United States)

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  13. Calculation of Probability Maps Directly from Ordinary Kriging Weights

    Directory of Open Access Journals (Sweden)

    Jorge Kazuo Yamamoto

    2010-03-01

    Full Text Available Probability maps are useful for analyzing ores or contaminants in soils, and they help decision-making during exploration work. These probability maps are usually derived from the indicator kriging approach. Ordinary kriging weights can be used to derive probability maps as well. To test these two approaches, a sample data base was randomly drawn from an exhaustive data set. From the exhaustive data set, actual cumulative distribution functions were determined. Thus, estimated and actual conditional cumulative distribution functions were compared. The vast majority of correlation coefficients between estimated and actual probability maps is greater than 0.75. Not only does the ordinary kriging approach work, but it also gives slightly better results than median indicator kriging. Moreover, probability maps from ordinary kriging weights are much easier to compute than with the traditional approach based on either indicator kriging or median indicator kriging.
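
    One natural reading of "probability maps directly from ordinary kriging weights" is to treat the OK weights as probability masses on the neighbouring data values, giving a local conditional cdf at each grid node. The sketch below illustrates that idea with invented numbers; it is our hedged reading, not the authors' exact algorithm.

        # Exceedance probability at one grid node from OK weights treated as
        # probability masses. Note OK weights can be negative in general, which
        # would require a correction step omitted here.
        import numpy as np

        z_neighbors = np.array([1.1, 2.7, 3.4, 0.8, 2.0])      # data around one node
        ok_weights = np.array([0.30, 0.25, 0.15, 0.10, 0.20])  # OK weights, sum to 1

        def prob_exceed(cutoff):
            return float(ok_weights[z_neighbors > cutoff].sum())

        print("P(Z > 2.5) =", prob_exceed(2.5))   # 0.25 + 0.15 = 0.40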

  14. Estimation of State Transition Probabilities: A Neural Network Model

    Science.gov (United States)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
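
    One simple reading of "estimating the transition probability from the state prediction error" is a delta rule applied to the rows of the transition matrix; the sketch below is our paraphrase of that idea, not the authors' exact update.

        # Delta-rule estimation of a 2-state transition matrix from observed
        # transitions, driven by the state prediction error.
        import numpy as np

        rng = np.random.default_rng(7)
        P_true = np.array([[0.8, 0.2],
                           [0.3, 0.7]])          # true transition probabilities
        P_hat = np.full((2, 2), 0.5)             # initial estimate (uniform)
        eta = 0.01                               # learning rate

        s = 0
        for _ in range(20_000):
            s_next = rng.choice(2, p=P_true[s])
            onehot = np.eye(2)[s_next]
            P_hat[s] += eta * (onehot - P_hat[s])   # prediction-error update; rows stay normalized
            s = s_next

        print(np.round(P_hat, 2))                   # approaches P_true for suitable eta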

  15. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed.

  16. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP but elegantly generalizes it and offers a more integrated approach to probability updating.
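
    For readers unfamiliar with JUP: given a partition {B_i} whose marginals are shifted by evidence to new values q_i, Jeffrey's rule sets P_new(A) = sum_i q_i * P(A|B_i). A tiny sketch with invented numbers:

        # Jeffrey's updating rule (JUP): revise P(A) when evidence shifts the
        # marginals of a partition {B_i} to new values q_i. Numbers are made up.
        def jeffrey_update(p_a_given_b, q):
            """p_a_given_b[i] = P(A | B_i); q[i] = new marginal of B_i (sums to 1)."""
            return sum(pa * qi for pa, qi in zip(p_a_given_b, q))

        # Partition: a light is {red, green, blue}; a dim glance shifts the
        # marginals from (0.5, 0.3, 0.2) to (0.7, 0.25, 0.05).
        p_a_given_b = [0.9, 0.4, 0.1]            # P(A | colour)
        print(jeffrey_update(p_a_given_b, [0.7, 0.25, 0.05]))   # 0.735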

  17. Effectiveness of a computer-aided neuroanatomy program for entry-level physical therapy students: anatomy and clinical examination of the dorsal column-medial lemniscal system.

    Science.gov (United States)

    McKeough, D Michael; Mattern-Baxter, Katrin; Barakatt, Edward

    2010-01-01

    The purpose of this study was to determine if a computer-aided instruction learning module improves students' knowledge of the neuroanatomy/physiology and clinical examination of the dorsal column-medial lemniscal (DCML) system. Sixty-one physical therapy students enrolled in a clinical neurology course in entry-level PT educational programs at two universities participated in the study. Students from University-1 (U1) had not had a previous neuroanatomy course, while students from University-2 (U2) had taken a neuroanatomy course in the previous semester. Before and after working with the learning module, students took a paper-and-pencil test on the neuroanatomy/physiology and clinical examination of the DCML system. Kruskal-Wallis one-way ANOVA and Mann-Whitney tests were used to determine if differences existed between neuroanatomy/physiology examination scores and clinical examination scores before and after taking the learning module, and between student groups based on university attended. For students from U1, neuroanatomy/physiology post-test scores improved significantly over pre-test scores (p < 0.001). Neuroanatomy/physiology pre-test scores from U2 were significantly better than those from U1 (p < 0.001); there was no significant difference in post-test scores (p = 0.062). Clinical examination pre-test and post-test scores from U2 were significantly better than those from U1 (p < 0.001). Clinical examination post-test scores improved significantly from the pre-test scores for both U1 (p < 0.001) and U2 (p < 0.001).

  18. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be in

  19. Introducing the Core Probability Framework and Discrete-Element Core Probability Model for efficient stochastic macroscopic modelling

    NARCIS (Netherlands)

    Calvert, S.C.; Taale, H.; Hoogendoorn, S.P.

    2014-01-01

    In this contribution the Core Probability Framework (CPF) is introduced with the application of the Discrete-Element Core Probability Model (DE-CPM) as a new DNL for dynamic macroscopic modelling of stochastic traffic flow. The model is demonstrated for validation in a test case and for computationa

  20. Probability of ventricular fibrillation: allometric model based on the ST deviation

    Directory of Open Access Journals (Sweden)

    Arini Pedro D

    2011-01-01

    Full Text Available Abstract Background Allometry, in general biology, measures the relative growth of a part in relation to the whole living organism. Using reported clinical data, we apply this concept to evaluating the probability of ventricular fibrillation based on electrocardiographic ST-segment deviation values. Methods Data collected in previous reports were used to fit an allometric model in order to estimate ventricular fibrillation probability. Patients presenting either with death, myocardial infarction or unstable angina were included to calculate such probability as log(VFp) = δ + β log(ST), evaluated for three different ST deviations. The coefficients δ and β were obtained as the best fit to the clinical data extended over observational periods of 1, 6, 12 and 48 months from occurrence of the first reported chest pain accompanied by ST deviation. Results Fitting the above equation in log-log representation produced the following overall coefficients: Average β = 0.46, with a maximum = 0.62 and a minimum = 0.42; Average δ = 1.28, with a maximum = 1.79 and a minimum = 0.92. For a 2 mm ST-deviation, the full range of predicted ventricular fibrillation probability extended from about 13% at 1 month up to 86% at 4 years after the original cardiac event. Conclusions These results, at least preliminarily, appear acceptable and still call for full clinical testing. The model seems promising, especially if other parameters were taken into account, such as blood cardiac enzyme concentrations, ischemic or infarcted epicardial areas, or ejection fraction. Considering these results and a few references found in the literature, it is concluded that the allometric model shows good practical predictive value for aiding medical decisions.