WorldWideScience

Sample records for clinical pre-test probability

  1. VIDAS D-dimer in combination with clinical pre-test probability to rule out pulmonary embolism. A systematic review of management outcome studies.

    Science.gov (United States)

    Carrier, Marc; Righini, Marc; Djurabi, Reza Karami; Huisman, Menno V; Perrier, Arnaud; Wells, Philip S; Rodger, Marc; Wuillemin, Walter A; Le Gal, Grégoire

    2009-05-01

    Clinical outcome studies have shown that it is safe to withhold anticoagulant therapy in patients with suspected pulmonary embolism (PE) who have a negative D-dimer result and a low pretest probability (PTP), established either with a PTP model or by clinical gestalt. The objective of the present study was to assess the safety of a negative VIDAS D-dimer result combined with a non-high PTP, as determined with the Wells or Geneva models, for excluding PE. A systematic literature search was conducted using MEDLINE, EMBASE, the Cochrane Register of Controlled Trials and all EBM Reviews. Seven studies (6 prospective management studies and 1 randomised controlled trial) reporting failure rates at three months were included in the analysis. Non-high PTP was defined as "unlikely" using the Wells model, or "low/intermediate" PTP using the Geneva score, the Revised Geneva Score, or clinical gestalt. Two reviewers independently extracted data onto standardised forms. A total of 5,622 patients with low/intermediate or unlikely PTP were assessed using the VIDAS D-dimer. PE was ruled out by a negative D-dimer test in 2,248 (40%, 95% confidence interval [CI] 38.7 to 41.3%) of them. The three-month thromboembolic risk in patients left untreated on the basis of a low/intermediate or unlikely PTP and a negative D-dimer test was 3/2,166 (0.14%, 95% CI 0.05 to 0.41%). In conclusion, the combination of a negative VIDAS D-dimer result and a non-high PTP effectively and safely excludes PE in a substantial proportion of outpatients with suspected PE. PMID:19404542
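    The two headline proportions above can be reproduced directly from the reported counts. A minimal sketch in Python, assuming the Wilson score interval (the abstract does not state which CI method the authors used):

```python
# Reproduce the abstract's proportions and 95% CIs; Wilson score method assumed.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple:
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# PE ruled out by a negative D-dimer: 2,248 of 5,622 patients
lo, hi = wilson_ci(2248, 5622)
print(f"ruled out: {2248/5622:.1%} (95% CI {lo:.1%} to {hi:.1%})")   # 40.0% (38.7% to 41.3%)

# Three-month thromboembolic risk: 3 of 2,166 untreated patients
lo, hi = wilson_ci(3, 2166)
print(f"failure rate: {3/2166:.2%} (95% CI {lo:.2%} to {hi:.2%})")   # 0.14% (0.05% to 0.41%)
```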

  2. A methodological proposal to research patients’ demands and pre-test probabilities using paper forms in primary care settings

    Directory of Open Access Journals (Sweden)

    Gustavo Diniz Ferreira Gusso

    2013-04-01

    Objective: The purpose of this study is to present a methodology for assessing patients’ demands and calculating pre-test probabilities using paper forms in Primary Care. Method: Most developing countries do not use Electronic Health Records (EHR) in primary care settings. This makes it difficult to access information on what occurs within the health center working process. There are essentially two methodologies for assessing patients’ demands and the problems or diagnoses stated by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the ‘reason for the appointment’ with ‘the problem registered’ by doctors. Paper forms were developed with this concept as central. All appointments were classified using the International Classification of Primary Care (ICPC). Discussion: Even in paper form, confrontation between ‘reason for the appointment’ and ‘problem registered’ is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can easily be reproduced in any health center and enables a better understanding of the population profile. The prevalence of many illnesses and diseases is not known for each setting, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers appropriate technology for primary health care workers, with the potential to transform each health center into a research-led practice, contributing directly to patient care.

  3. Accuracy of dual-source CT coronary angiography: first experience in a high pre-test probability population without heart rate control

    Energy Technology Data Exchange (ETDEWEB)

    Scheffel, Hans; Alkadhi, Hatem; Desbiolles, Lotus; Frauenfelder, Thomas; Schertler, Thomas; Husmann, Lars; Marincek, Borut; Leschka, Sebastian [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Plass, Andre; Vachenauer, Robert; Grunenfelder, Juerg; Genoni, Michele [Clinic for Cardiovascular Surgery, Zurich (Switzerland); Gaemperli, Oliver; Schepis, Tiziano [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); University of Zurich, Center for Integrative Human Physiology, Zurich (Switzerland)

    2006-12-15

    The aim of this study was to assess the diagnostic accuracy of dual-source computed tomography (DSCT) for evaluation of coronary artery disease (CAD) in a population with extensive coronary calcifications without heart rate control. Thirty patients (24 male, 6 female, mean age 63.1 ± 11.3 years) with a high pre-test probability of CAD underwent DSCT coronary angiography and invasive coronary angiography (ICA) within 14 ± 9 days. No beta-blockers were administered prior to the scan. Two readers independently assessed image quality of all coronary segments with a diameter ≥ 1.5 mm using a four-point score (1: excellent to 4: not assessable) and qualitatively assessed significant stenoses as narrowing of the luminal diameter > 50%. Causes of false-positive (FP) and false-negative (FN) ratings were assigned to calcifications or motion artifacts. ICA was considered the standard of reference. Mean body mass index was 28.3 ± 3.9 kg/m² (range 22.4-36.3 kg/m²), mean heart rate during CT was 70.3 ± 14.2 bpm (range 47-102 bpm), and mean Agatston score was 821 ± 904 (range 0-3,110). Image quality was diagnostic (scores 1-3) in 98.6% (414/420) of segments (mean image quality score 1.68 ± 0.75); six segments in three patients were considered not assessable (1.4%). DSCT correctly identified 54 of 56 significant coronary stenoses. Severe calcifications accounted for false ratings in nine segments (eight FP/one FN) and motion artifacts in two segments (one FP/one FN). Overall sensitivity, specificity, positive and negative predictive value for evaluating CAD were 96.4, 97.5, 85.7, and 99.4%, respectively. First experience indicates that DSCT coronary angiography provides high diagnostic accuracy for assessment of CAD in a high pre-test probability population with extensive coronary calcifications and without heart rate control. (orig.)
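    The reported accuracy figures follow from a per-segment 2 × 2 table that can be reconstructed from the counts in the abstract (54 true positives, 2 false negatives, 9 false positives, 414 assessable segments). A minimal sketch, assuming that reconstruction is correct:

```python
# Per-segment diagnostic accuracy from the counts quoted in the abstract.
TP, FN, FP = 54, 2, 9
assessable = 414
TN = assessable - TP - FN - FP   # 349 true-negative segments

sensitivity = TP / (TP + FN)     # 54/56   = 96.4%
specificity = TN / (TN + FP)     # 349/358 = 97.5%
ppv = TP / (TP + FP)             # 54/63   = 85.7%
npv = TN / (TN + FN)             # 349/351 = 99.4%
print(f"Sens {sensitivity:.1%}, Spec {specificity:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```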

  4. Pre-Test Assessment

    Science.gov (United States)

    Berry, Thomas

    2008-01-01

    Pre-tests are a non-graded assessment tool used to determine pre-existing subject knowledge. Typically pre-tests are administered prior to a course to determine knowledge baseline, but here they are used to test students prior to topical material coverage throughout the course. While counterintuitive, the pre-tests cover material the student is…

  5. Probability, clinical decision making and hypothesis testing

    Directory of Open Access Journals (Sweden)

    A Banerjee

    2009-01-01

    Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.

  6. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    OpenAIRE

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 was performed. Of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent computed tomog...

  7. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T.T. van der; Dulmen, S. van

    2014-01-01

    AIMS: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. BACKGROUND: Continuing professional education may be

  8. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T. van der; Dulmen, S. van

    2014-01-01

    Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be

  9. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment. ENVIRONMENTAL PROTECTION... Specified Duty Cycles. § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If...

  10. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    OpenAIRE

    Noordman, J.; van der Weijden, T; Van Dulmen, S.

    2014-01-01

    Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be necessary to refresh and reflect on the communication and motivational interviewing skills of experienced primary care practice nurses. A video-feedback method was designed to improve these skills...

  11. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie;

    2008-01-01

    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low...

  12. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps"; it indicates a strong likelihood, usually a positive inference or judgment based on the present situation.

  13. [Clinical probability of PE: should we use a clinical prediction rule?].

    Science.gov (United States)

    Le Gal, G; Righini, M; Perrier, A

    2008-12-01

    The determination of the clinical pretest probability using clinical prediction models is an important step in the assessment of patients with suspected pulmonary embolism (PE). It helps establish which test or sequence of tests can effectively corroborate or safely rule out PE. For example, it has been demonstrated that it is safe to withhold anticoagulant therapy in patients with negative D-dimer results and low pretest probability at initial presentation. Clinical probability will also increase the diagnostic yield of ventilation-perfusion lung scanning. Compared with clinical gestalt, clinical prediction rules provide a standardized and more reproducible estimate of a patient's probability of having a PE. Clinical prediction models combine aspects of the history and physical examination to categorize a patient's probability of having a disease. The models classify patients as having a low, moderate, or high likelihood of having PE. Clinical prediction models have been validated and are well established for the diagnosis of PE in symptomatic patients. They allow all physicians, whatever their expertise, to reliably determine the clinical pretest probability of PE, and thus safely manage their patients using diagnostic and therapeutic algorithms. PMID:19084205

  14. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    International Nuclear Information System (INIS)

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
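    The post-test probabilities above follow from the odds form of Bayes' theorem. A minimal sketch; note, as an assumption on our part, that the reported post-test values are consistent with a pre-test probability of about 20%, rather than the registry prevalences quoted earlier in the abstract:

```python
# Odds-form Bayes update: pre-test probability + likelihood ratio -> post-test probability.
# The 20% pre-test probability is our inference from the reported results, not a stated input.
def post_test_probability(pre_test: float, likelihood_ratio: float) -> float:
    """Convert a pre-test probability to a post-test probability via a likelihood ratio."""
    pre_odds = pre_test / (1 - pre_test)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

pre = 0.20
for label, lr in [("primary, PET+", 3.9), ("primary, PET-", 0.24),
                  ("recurrence, PET+", 3.96), ("recurrence, PET-", 0.16)]:
    print(f"{label}: {post_test_probability(pre, lr):.1%}")
# primary, PET+: 49.4%   primary, PET-: 5.7%
# recurrence, PET+: 49.7%   recurrence, PET-: 3.8%
```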

  15. Clinical features of probable severe acute respiratory syndrome in Beijing

    Institute of Scientific and Technical Information of China (English)

    Hai-Ying Lu; Xiao-Yuan Xu; Yu Lei; Yang-Feng Wu; Bo-Wen Chen; Feng Xiao; Gao-Qiang Xie; De-Min Han

    2005-01-01

    AIM: To summarize the clinical features of probable severe acute respiratory syndrome (SARS) in Beijing. METHODS: A retrospective study of 801 patients admitted to hospitals in Beijing between March and June 2003 with a diagnosis of probable SARS, moderate type. The clinical manifestations and the laboratory and radiographic data obtained from the 801 cases were analyzed. RESULTS: One to three days after the onset of SARS, the major clinical symptoms were fever (in 88.14% of patients), fatigue, headache, myalgia and arthralgia (25-36%), etc. The counts of WBC (in 22.56% of patients), lymphocytes (70.25%) and CD3, CD4, CD8 positive T cells (70%) decreased. From days 4-7, the nonspecific symptoms became weaker; however, the rates of lower respiratory tract symptoms, such as cough (24.18%), sputum production (14.26%), chest distress (21.04%) and shortness of breath (9.23%), increased, as did the rate of abnormal findings on chest radiograph or CT. The counts of WBC, lymphocytes and CD3, CD4, CD8 positive T cells reached their lowest point. From days 8 to 16, the patients presented progressive cough (29.96%), sputum production (13.09%), chest distress (29.96%) and shortness of breath (35.34%). All patients had infiltrates on chest radiograph or CT, some with multiple infiltrates. Two weeks later, patients' respiratory symptoms started to alleviate, the infiltrates in the lung began to be absorbed gradually, and the counts of WBC, lymphocytes and CD3, CD4, CD8 positive T cells returned to normal. CONCLUSION: The data reported here provide evidence that the course of SARS can be divided into four stages, namely the initial stage, progressive stage, fastigium and convalescent stage.

  16. Bayesian probability of success for clinical trials using historical data.

    Science.gov (United States)

    Ibrahim, Joseph G; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F; Heyse, Joseph F

    2015-01-30

    Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein's work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
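    The paper's methodology is more general, but the core idea of a Bayesian probability of success can be illustrated with a simple "assurance" computation: average the power of the future trial over the posterior of the treatment effect given current data. A minimal sketch using normal approximations; all numbers below are illustrative assumptions, not the paper's:

```python
# Assurance sketch: Bayesian probability of success for a future two-arm trial,
# averaging frequentist power over posterior draws of the true effect.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

post_mean, post_sd = 0.3, 0.15       # hypothetical posterior of the effect from current data
n_per_arm, sigma = 200, 1.0          # planned phase III size, assumed known outcome SD
se_future = sigma * np.sqrt(2 / n_per_arm)
z_crit = norm.ppf(0.975)             # two-sided 5% significance

# Monte Carlo over posterior draws: power of the future trial at each drawn effect
delta = rng.normal(post_mean, post_sd, size=100_000)
power_given_delta = norm.sf(z_crit - delta / se_future)
print(f"Bayesian probability of success ~= {power_given_delta.mean():.2f}")
```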

  17. Assessment of clinical utility of ¹⁸F-FDG PET in patients with head and neck cancer: a probability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von [Division of Nuclear Medicine, University Hospital Zurich, Raemistrasse 100, 8091, Zurich (Switzerland); Steurer, Johann; Bachmann, Lucas M. [Horten Centre, University of Zurich, Zurich (Switzerland)

    2003-04-01

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the

  18. Ventilation-perfusion scanning and pulmonary angiography: correlation in clinical high-probability pulmonary embolism

    International Nuclear Information System (INIS)

    During a 3-year period, 173 clinically selected patients underwent pulmonary angiography to confirm or exclude acute pulmonary embolism. All patients had undergone ventilation-perfusion (V/Q) scanning (167 patients) or perfusion scanning alone (six) before angiography. Angiography was done because the results of the V/Q scanning did not satisfy the clinician's need for certainty. The results of the V/Q scans and the angiographic studies were compared to determine the relative accuracy of V/Q scanning in this clinical setting. Pulmonary embolism was found in seven (15%) of 47 patients with low-probability scans, 11 (32%) of 34 patients with intermediate-probability scans, 22 (39%) of 57 patients with indeterminate scans, and 23 (66%) of 35 patients with high-probability scans. In this clinically selected population, low-probability scans were more accurate in excluding pulmonary embolism than high-probability scans were in establishing that diagnosis

  19. Knowledge of the D-dimer test result influences clinical probability assessment of pulmonary embolism

    NARCIS (Netherlands)

    R.A. Douma; J.B.F. Kessels; H.R. Büller; V.E.A. Gerdes

    2010-01-01

    Background: In patients with suspected pulmonary embolism (PE), an unlikely or non-high probability assessment combined with a normal D-dimer test can safely exclude the diagnosis. We studied the influence of early D-dimer knowledge on clinical probability assessment. Methods: A questionnaire was se

  20. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch;

    2016-01-01

    the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  1. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t

  2. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn [RaySearch Laboratories, Sveavägen 44, Stockholm SE-111 34 (Sweden); Forsgren, Anders [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, Stockholm SE-100 44 (Sweden)

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
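    The objective being maximized, the probability that every clinical goal holds under random setup error, can be estimated by Monte Carlo once a plan and an uncertainty set are fixed. A minimal illustrative sketch, with an assumed Gaussian setup-error model and a hypothetical goal check standing in for a full dose computation:

```python
# Monte Carlo estimate of P(all clinical goals satisfied) under setup uncertainty.
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup-error model: independent Gaussian shifts, 3 mm SD per axis.
shifts = rng.normal(0.0, 3.0, size=(100_000, 3))

# Hypothetical surrogate for the goal check: the plan was made robust against
# every shift inside a 5 mm sphere, so the clinical goals hold exactly there.
satisfied = np.linalg.norm(shifts, axis=1) <= 5.0
print(f"P(all clinical goals satisfied) ~= {satisfied.mean():.2f}")
```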

  3. Clinical and radiological parameters of patients with lung thromboembolism, diagnosed by high probability ventilation / perfusion scintigraphies

    International Nuclear Information System (INIS)

    Background: pulmonary embolism (PE) remains an elusive diagnosis and still causes too many unexpected deaths. Because of this, noninvasive investigations are performed when pulmonary embolism is suspected. Objective: to determine the clinical and X-ray findings in patients with a diagnosis of pulmonary embolism by high-probability ventilation/perfusion (V/Q) lung scan. Materials and methods: inpatient medical records of 91 patients with clinically suspected PE and high- or low-probability V/Q lung scans were analyzed (PIOPED criteria). Results: there was a statistically significant correlation with four clinical findings: hemoptysis (p value = 0.02, odds ratio = 8.925), tachycardia (p value = 0.02, odds ratio = 3.5), chest pain (p value = 0.01, odds ratio = 1.87), and recent surgery (p value = 0.02, odds ratio = 2.762). 70.7% of chest X-rays were normal (p value < 0.001). Conclusion: the clinical and X-ray findings in patients with a diagnosis of PE by high-probability V/Q lung scan were: hemoptysis, tachycardia, chest pain, recent surgery and a normal chest X-ray. This is important because it would help to select the patients in whom the V/Q lung scan will have the maximal performance (Au)
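    For readers unfamiliar with the reported statistics, an odds ratio and its Woolf 95% confidence interval are computed from a 2 × 2 exposure-by-outcome table. A minimal sketch with invented counts, since the abstract reports only the resulting odds ratios:

```python
# Odds ratio with Woolf (log-based) 95% CI from a 2x2 table; counts are hypothetical.
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a, b = finding present in PE+ / PE-; c, d = finding absent in PE+ / PE-."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1/a + 1/b + 1/c + 1/d)
    return or_, exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)

# Hypothetical: 8 of 40 PE+ patients with hemoptysis vs 1 of 51 PE- patients.
or_, lo, hi = odds_ratio_ci(8, 1, 32, 50)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```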

  4. Low-probability ventilation-perfusion scintigrams: clinical outcomes in 99 patients

    International Nuclear Information System (INIS)

    To evaluate the reliability of low-probability ventilation-perfusion (V-P) scintigrams in excluding pulmonary embolism (PE), the authors reviewed the clinical records of 99 consecutive patients whose V-P studies had been interpreted as indicating a low probability of PE. None of the 99 patients were referred for pulmonary angiography. Seven of the hospitalized patients died during the index admission and seven additional hospitalized patients died 1-5 months after discharge from the hospital. None were thought clinically to have died as a result of PE, and autopsy disclosed no PE in two. Follow-up information was obtained for 69 surviving patients not treated with anticoagulants. None of these patients were thought clinically to have had PE during follow-up of at least 2 weeks' duration (greater than 2 months in 93% and greater than 6 months in 75%). The results suggest that major short-term morbidity or death attributable to PE is quite infrequent in patients with low-probability V-P scintigrams

  5. Cognitive Laboratory Experiences : On Pre-testing Computerised Questionnaires

    NARCIS (Netherlands)

    Snijkers, G.J.M.E.

    2002-01-01

    In the literature on questionnaire design and survey methodology, pre-testing is mentioned as a way to evaluate questionnaires (i.e. investigate whether they work as intended) and control for measurement errors (i.e. assess data quality). As the American Statistical Association puts it (ASA, 1999, p

  6. CT abnormality in multiple sclerosis: analysis based on 28 probable cases and correlation with clinical manifestations

    International Nuclear Information System (INIS)

    In order to investigate the occurrence and nature of CT abnormality and its correlation with clinical manifestations in multiple sclerosis, 34 CT records obtained from 28 consecutive patients with probable multiple sclerosis were reviewed. Forty-six percent of all cases showed abnormal CT. Dilatation of cortical sulci was found in 39%; dilatation of the lateral ventricle in 36%; dilatation of prepontine or cerebello-pontine cistern and the fourth ventricle, suggesting brainstem atrophy, in 18%; dilatation of cerebellar sulci, superior cerebellar cistern and cisterna magna, suggesting cerebellar atrophy, in 11%. Low density area was found in the cerebral hemisphere in 11% of cases. Contrast enhancement, performed on 25 CT records, did not show any change. There was no correlation between CT abnormality and duration of the illness. Although abnormal CT tended to occur more frequently during exacerbations and chronic stable state than during remissions, the difference was not statistically significant. CT abnormalities suggesting brainstem atrophy, cerebellar atrophy or plaques were found exclusively during exacerbations and chronic stable state. The occurrence of CT abnormalities was not significantly different among various clinical forms which were classified based on clinically estimated sites of lesion, except that abnormal CT tended to occur less frequently in cases classified as the optic-spinal form. It is noteworthy that cerebral cortical atrophy and/or dilatation of the lateral ventricle were found in 31% of cases who did not show any clinical sign of cerebral involvement. There was a statistically significant correlation between CT abnormalities and levels of clinical disability. Eighty percent of the bedridden or severely disabled patients showed abnormal CT, in contrast with only 29% of those with moderate, slight or no disability. (author)

  7. Topological characteristics of brainstem lesions in clinically definite and clinically probable cases of multiple sclerosis: An MRI-study

    International Nuclear Information System (INIS)

    Disseminated lesions in the white matter of the cerebral hemispheres and confluent lesions at the borders of the lateral ventricles as seen on MRI are both considered acceptable paraclinical evidence for the diagnosis of multiple sclerosis. Similar changes are, however, also found in vascular diseases of the brain. We therefore aimed at identifying those additional traits in the infratentorial region, which in our experience are not frequently found in cerebrovascular pathology. We evaluated MR brain scans of 68 patients and found pontine lesions in 71% of cases with a clinically definite diagnosis (17 out of 24) and in 33% of cases with a probable diagnosis (14 out of 43). Lesions in the medulla oblongata were present in 50% and 16%, respectively, and in the midbrain in 25% and 7%, respectively. With rare exceptions all brainstem lesions were contiguous with the cisternal or ventricular cerebrospinal fluid spaces. In keeping with post-mortem reports the morphological spectrum ranged from large confluent patches to solitary, well delineated paramedian lesions or discrete linings of the cerebrospinal fluid border zones and were most clearly depicted from horizontal and sagittal T2 weighted SE-sequences. If there is a predilection for the outer or inner surfaces of the brainstem, such lesions can be considered an additional typical feature of multiple sclerosis and can be more reliably weighted as paraclinical evidence for a definite diagnosis. (orig.)

  8. Pre- and Post-Operative Nomograms to Predict Recurrence-Free Probability in Korean Men with Clinically Localized Prostate Cancer

    OpenAIRE

    Minyong Kang; Chang Wook Jeong; Woo Suk Choi; Yong Hyun Park; Sung Yong Cho; Sangchul Lee; Seung Bae Lee; Ja Hyeon Ku; Sung Kyu Hong; Seok-Soo Byun; Hyeon Jeong; Cheol Kwak; Hyeon Hoe Kim; Eunsik Lee; Sang Eun Lee

    2014-01-01

    OBJECTIVES: Although the incidence of prostate cancer (PCa) is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP). We established pre- and post-operative nomograms estimating biochemical recurrence (BCR)-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from...

  9. Clinical radiobiology of glioblastoma multiforme. Estimation of tumor control probability from various radiotherapy fractionation schemes

    Energy Technology Data Exchange (ETDEWEB)

    Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)

    2014-10-15

    The aim of this study was to estimate a radiobiological set of parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (b), and repopulation doubling time (T_d) in a first phase, and subsequently the number of clonogens (N) and the kick-off time for accelerated proliferation (T_k). The results were used to formulate a hypothesis for a schedule expected to significantly improve local control. The 95% confidence intervals (CI_95%) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI_95% are α = 0.12 Gy⁻¹ (0.10-0.14), b = 0.015 Gy⁻² (0.013-0.020), α/b = 8 Gy (5.0-10.8), T_d = 15.4 days (13.2-19.5), N = 1 × 10⁴ (1.2 × 10³ - 1 × 10⁵), and T_k = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (D_prolif), starting after T_k, was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/b ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate a substantial independence of the outcome from the duration of the overall treatment, and an improvement in treatment effectiveness by increasing the total dose without increasing the dose per fraction. (orig.) [German abstract: Estimation of a radiobiological parameter set on the basis of clinical data in...]
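    These parameters plug into the standard Poisson linear-quadratic TCP model with repopulation. A minimal sketch, assuming the common textbook form of that model (the paper's exact formulation may differ), evaluated at the estimates quoted above:

```python
# Poisson LQ TCP with repopulation:
#   TCP = exp(-N * exp(-n*(alpha*d + beta*d^2) + ln2 * max(0, T - Tk) / Td))
from math import exp, log

alpha, beta = 0.12, 0.015          # Gy^-1, Gy^-2 (the paper's alpha and b)
N, Td, Tk = 1e4, 15.4, 37.0        # clonogens, doubling time (days), kick-off time (days)

def tcp(n_fractions: int, d: float, overall_time_days: float) -> float:
    surviving = N * exp(-n_fractions * (alpha * d + beta * d * d)
                        + log(2) * max(0.0, overall_time_days - Tk) / Td)
    return exp(-surviving)

# Illustrative comparison: dose escalation at the same 2 Gy fraction size.
print(f"60 Gy in 30 fx over 40 d: TCP = {tcp(30, 2.0, 40):.3f}")
print(f"70 Gy in 35 fx over 47 d: TCP = {tcp(35, 2.0, 47):.3f}")
```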

  10. On pre-test sensitisation and peer assessment to enhance learning gain in science education

    OpenAIRE

    Bos, Antonius Bernardus Hendrikus

    2009-01-01

    The main part of this thesis focuses on designing, optimising, and studying the embedding of two types of interventions: pre-testing and peer assessment, both supported by or combined with ICT tools. Pre-test sensitisation is used intentionally to boost the learning gain of the main intervention, an interactive, multimodal learning environment designed for the pre-training of science concepts. The results show a high learning gain, especially after applying a pre-test by interaction of t...

  11. A Clinical model to identify patients with high-risk coronary artery disease

    NARCIS (Netherlands)

    Y. Yang (Yelin); L. Chen (Li); Y. Yam (Yeung); S. Achenbach (Stephan); M. Al-Mallah (Mouaz); D.S. Berman (Daniel); M.J. Budoff (Matthew); F. Cademartiri (Filippo); T.Q. Callister (Tracy); H.-J. Chang (Hyuk-Jae); V.Y. Cheng (Victor); K. Chinnaiyan (Kavitha); R.C. Cury (Ricardo); A. Delago (Augustin); A. Dunning (Allison); G.M. Feuchtner (Gudrun); M. Hadamitzky (Martin); J. Hausleiter (Jörg); R.P. Karlsberg (Ronald); P.A. Kaufmann (Philipp); Y.-J. Kim (Yong-Jin); J. Leipsic (Jonathon); T.M. LaBounty (Troy); F.Y. Lin (Fay); E. Maffei (Erica); G.L. Raff (Gilbert); L.J. Shaw (Leslee); T.C. Villines (Todd); J.K. Min (James K.); B.J.W. Chow (Benjamin)

    2015-01-01

    Objectives: This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Background: Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify th

  12. On pre-test sensitisation and peer assessment to enhance learning gain in science education

    NARCIS (Netherlands)

    Bos, Antonius Bernardus Hendrikus

    2009-01-01

    The main part of this thesis focuses on designing, optimising, and studying the embedding of two types of interventions: pre-testing and peer assessment, both supported by or combined with ICT tools. Pre-test sensitisation is used intentionally to boost the learning gain of the main intervention,

  13. [Dynamic obstruction to left ventricular outflow during dobutamine stress echocardiography: the probable mechanisms and clinical implications].

    Science.gov (United States)

    Scandura, S; Arcidiacono, S; Felis, S; Barbagallo, G; Deste, W; Drago, A; Calvi, V; Giuffrida, G

    1998-11-01

    We observed the development of left ventricular outflow tract dynamic obstruction in some patients during dobutamine stress echocardiography. The purpose of this study was to identify the possible mechanisms and to consider the clinical implications. From 11/04/94 to 01/09/97 we studied 547 patients; 42 patients developed dynamic obstruction, defined as a late-peaking Doppler velocity profile that exceeded baseline outflow velocity by at least 1 m/s. The mechanisms encountered were: increased myocardial contractility; systolic anterior motion of the mitral valve; decreased venous return to the left ventricle; and peculiar characteristics of the left ventricular geometry. The results of this study show that the dynamic obstruction is mainly due to the first mechanism and secondarily to some characteristics of the left ventricular geometry. The hypotension observed in a few cases is not related to the dynamic obstruction but to beta-2 receptor hypersensitivity to dobutamine. The symptoms, like dyspnea and chest pain, experienced by these patients are related to the dynamic obstruction rather than to the presence of coronary artery disease. In conclusion, we think that patients who develop dynamic obstruction, without wall motion abnormalities, during dobutamine stress echocardiography may behave pathophysiologically as patients with obstructive hypertrophic cardiomyopathy, in whom diastolic dysfunction and outflow tract obstruction are responsible for symptoms. Therefore, these patients require pharmacological treatment with beta blockers and/or non-dihydropyridine calcium channel blockers. PMID:9922586

  14. Pre- and post-operative nomograms to predict recurrence-free probability in korean men with clinically localized prostate cancer.

    Directory of Open Access Journals (Sweden)

    Minyong Kang

    OBJECTIVES: Although the incidence of prostate cancer (PCa) is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP). We established pre- and post-operative nomograms estimating biochemical recurrence (BCR)-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from June 2004 through July 2011. After exclusion of inappropriate data, we evaluated 2,867 patients for the development of the nomograms. The Cox proportional hazards regression model was used to develop pre- and post-operative nomograms that predict BCR-free probability. Finally, we resampled from our study cohort 200 times to determine the accuracy of our nomograms on internal validation, designated by the concordance index (c-index) and further represented by calibration plots. RESULTS: Over a median of 47 months of follow-up, the estimated BCR-free rate was 87.8% (1 year), 83.8% (2 years), and 72.5% (5 years). In the pre-operative model, Prostate-Specific Antigen (PSA), the proportion of positive biopsy cores, clinical T3a and biopsy Gleason score (GS) were independent predictive factors for BCR, while all relevant predictive factors (PSA, extra-prostatic extension, seminal vesicle invasion, lymph node metastasis, surgical margin, and pathologic GS) were associated with BCR in the post-operative model. The c-index representing predictive accuracy was 0.792 (pre-operative) and 0.821 (post-operative), showing good fit in the calibration plots. CONCLUSIONS: In summary, we developed pre- and post-operative nomograms predicting BCR-free probability after RP in a large Korean cohort with clinically localized PCa. These nomograms will be provided as the mobile application-based SNUH Prostate Cancer Calculator. Our nomograms can identify patients at high risk of disease recurrence
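    The c-index quoted above is Harrell's concordance index: among usable pairs, the fraction in which the higher predicted risk corresponds to the earlier observed recurrence. A minimal sketch of the computation, not the authors' implementation:

```python
# Harrell's concordance index for right-censored time-to-event data.
import numpy as np

def c_index(time, event, risk):
    """time: follow-up time; event: 1 if BCR observed; risk: predicted risk score."""
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant, usable = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                      # pairs are anchored on observed events
        for j in range(n):
            if time[j] > time[i]:         # j was still recurrence-free at i's event time
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5     # ties in risk count half
    return concordant / usable

# Toy data: months to BCR/censoring, event flags, model risk scores.
print(c_index([10, 14, 30, 40], [1, 1, 0, 0], [0.9, 0.7, 0.3, 0.2]))  # 1.0
```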

  15. Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students

    Science.gov (United States)

    Montecinos, Alicia M.

    2014-01-01

    Partly unusual behaviour was found among 14 sophomore students of civil engineering who took a pre-test for a free fall laboratory session, in the context of a general mechanics course. An analysis of the consistency between the mathematical models and the physics models was made. In all cases, the students presented evidence favoring a correct free…

  16. 40 CFR 86.1334-84 - Pre-test engine and dynamometer preparation.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment. ENVIRONMENTAL PROTECTION AGENCY... § 86.1334-84 Pre-test engine and dynamometer preparation. ... from the secondary dilution tunnel. Particulate sample filters need not be stabilized or weighed,...

  17. 40 CFR 92.125 - Pre-test procedures and preconditioning.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment. § 92.125 Pre-test procedures and preconditioning. (a) Locomotive testing. (1) Determine engine lubricating... fuel supply system and purge as necessary; determine that the fuel to be used during emission... valve is measured using in-use pressures and bypass flows (see § 92.118). (C) The response time measured...

  18. Pre-test calculations of SPES experiment - a loss of main feedwater transient

    International Nuclear Information System (INIS)

    Results of a pre-test calculation of the international standard experiment ISP-22 SPES are shown in this paper. The SPES facility represents a model of a three-loop PWR power plant, which was used to perform an experimental loss-of-main-feedwater transient with delayed emergency feedwater. The calculation was performed with the RELAP5/MOD2/36.1 computer code, which we had converted to run on VAX computers. (author)

  19. Screening for pulmonary embolism with a D-dimer assay: do we still need to assess clinical probability as well?

    OpenAIRE

    Hammond, Christopher J; Hassan, Tajek B.

    2005-01-01

    Clinical risk stratification and D-dimer assay can be of use in excluding pulmonary embolism in patients presenting to emergency departments but many D-dimer assays exist and their accuracy varies. We used clinical risk stratification combined with a quantitative latex-agglutination D-dimer assay to screen patients before arranging further imaging if required. Retrospective analysis of a sequential series of 376 patients revealed that no patient with a D-dimer of

  20. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  2. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Prospective validation of a risk calculator which calculates the probability of a positive prostate biopsy in a contemporary clinical cohort

    NARCIS (Netherlands)

    van Vugt, Heidi A.; Kranse, Ries; Steyerberg, Ewout W.; van der Poel, Henk G.; Busstra, Martijn; Kil, Paul; Oomens, Eric H.; de Jong, Igle J.; Bangma, Chris H.; Roobol, Monique J.

    2012-01-01

    Background: Prediction models need validation to assess their value outside the development setting. Objective: To assess the external validity of the European Randomised study of Screening for Prostate Cancer (ERSPC) Risk Calculator (RC) in a contemporary clinical cohort. Methods: The RC calculates

  5. Pre-test CFD Calculations for a Bypass Flow Standard Problem

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson

    2011-11-01

    The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variances in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the level and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data of isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamic (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful in examining the nature of the flow and to see if there are any problems associated with the flow and its measurement. The apparatus is designed to be able to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.

  6. Pre-test analysis of ATLAS SBO with RCP seal leakage scenario using MARS code

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Quang Huy; Lee, Sang Young; Oh, Seung Jong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    This study presents a pre-test calculation for the Advanced Thermal-hydraulic Test Loop for Accident Simulation (ATLAS) SBO experiment with an RCP seal leakage scenario. Initially, turbine-driven auxiliary feedwater pumps are used. Then, the outside cooling water injection method is used for long-term cooling. The analysis results will be useful for conducting the experiment to verify the APR 1400 extended SBO optimum mitigation strategy using outside cooling water injection in the future. The pre-test calculation for the ATLAS extended SBO with RCP seal leakage and outside cooling water injection scenario is performed. After the Fukushima nuclear accident, the capability of coping with an extended station blackout (SBO) has become important. Many NPPs are applying the FLEX approach as the main coping strategy for extended SBO scenarios. In FLEX strategies, outside cooling water injection into the reactor cooling system (RCS) and steam generators (SGs) is considered an effective method to remove residual heat and maintain the inventory of the systems during the accident. It is worthwhile to examine the soundness of the outside cooling water injection method for extended SBO mitigation by both calculation and experimental demonstration. From the calculation results, outside cooling water injection into the RCS and SGs is verified as an effective method during an extended SBO when RCS and SG depressurization is sufficiently performed.

  7. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed
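    A one-line asymptotic captures why these tails behave so differently from the light-tailed case: for subexponential claims, P(S_n > x) ≈ n P(X_1 > x) for large x, i.e. the sum is large because a single claim is huge. A minimal Monte Carlo check with Pareto claims; all parameters are illustrative assumptions:

```python
# Subexponential tail check: P(sum of n claims > x) vs n * P(single claim > x).
import numpy as np

rng = np.random.default_rng(2)
n, x, a = 10, 200.0, 1.5                         # claims per period, threshold, tail index

# rng.pareto draws Lomax samples; adding 1 gives classical Pareto with x_m = 1,
# so the survival function is P(X > x) = x**(-a) for x >= 1.
claims = rng.pareto(a, size=(400_000, n)) + 1.0
p_mc = np.mean(claims.sum(axis=1) > x)
p_asym = n * x**(-a)                             # single-big-claim asymptotic
print(f"Monte Carlo {p_mc:.2e} vs asymptotic {p_asym:.2e}")
```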

  8. Confirmed Datura poisoning in a horse most probably due to D. ferox in contaminated tef hay : clinical communication

    Directory of Open Access Journals (Sweden)

    R. Gerber

    2006-06-01

    Two out of a group of 23 mares exposed to tef hay contaminated with Datura ferox (and possibly D. stramonium) developed colic. The 1st animal was unresponsive to conservative treatment, underwent surgery for severe intestinal atony and had to be euthanased. The 2nd was less seriously affected, responded well to analgesics and made an uneventful recovery. This horse exhibited marked mydriasis on the first 2 days of being poisoned and showed protracted, milder mydriasis for a further 7 days. Scopolamine was chemically confirmed in urine from this horse for 3 days following the colic attack, while atropine could only be detected for 2 days. Scopolamine was also the main tropane alkaloid found in the contaminating plant material, confirming that this had most probably been a case of D. ferox poisoning. Although Datura intoxication of horses from contaminated hay was suspected previously, this is the 1st case in which the intoxication could be confirmed by urine analysis for tropane alkaloids. Extraction and detection methods for atropine and scopolamine in urine are described, employing enzymatic hydrolysis followed by liquid-liquid extraction and liquid chromatography tandem mass spectrometry (LC/MS/MS).

  9. Developing and pre-testing a decision board to facilitate informed choice about delivery approach in uncomplicated pregnancy

    Directory of Open Access Journals (Sweden)

    Wood Stephen

    2009-10-01

    Background: The rate of caesarean sections is increasing worldwide, yet medical literature informing women with uncomplicated pregnancies about the relative risks and benefits of elective caesarean section (CS) compared with vaginal delivery (VD) remains scarce. A decision board may address this gap, providing systematic evidence-based information so that patients can more fully understand their treatment options. The objective of our study was to design and pre-test a decision board to guide clinical discussions and enhance informed decision-making related to delivery approach (CS or VD) in uncomplicated pregnancy. Methods: Development of the decision board involved two preliminary studies to determine women's preferred mode of risk presentation and a systematic literature review for the most comprehensive presentation of medical risks at the time (VD and CS). Forty women were recruited to pre-test the tool. Eligible subjects were of childbearing age (18-40 years) but were not pregnant, in order to avoid raising the expectation among pregnant women that CS was a universally available birth option. Women selected their preferred delivery approach and completed the Decisional Conflict Scale to measure decisional uncertainty before and after reviewing the decision board. They also answered open-ended questions reflecting what they had learned, whether or not the information had helped them to choose between birth methods, and additional information that should be included. Descriptive statistics were used to analyse sample characteristics and women's choice of delivery approach pre/post decision board. Change in decisional conflict was measured using the Wilcoxon signed-rank test for each of the three subscales. Results: The majority of women reported that they had learned something new (n = 37, 92%) and that the tool had helped them make a hypothetical choice between delivery approaches (n = 34, 85%). Women wanted more information about neonatal risks and

  10. Transition probabilities of HER2-positive and HER2-negative breast cancer patients treated with Trastuzumab obtained from a clinical cancer registry dataset.

    Science.gov (United States)

    Pobiruchin, Monika; Bochum, Sylvia; Martens, Uwe M; Kieser, Meinhard; Schramm, Wendelin

    2016-06-01

    Records of female breast cancer patients were selected from a clinical cancer registry and separated into three cohorts according to HER2-status (human epidermal growth factor receptor 2) and treatment with or without Trastuzumab (a humanized monoclonal antibody). Propensity score matching was used to balance the cohorts. Afterwards, documented information about disease events (recurrence of cancer, metastases, remission of local/regional recurrences, remission of metastases and death) found in the dataset was leveraged to calculate the annual transition probabilities for every cohort. PMID:27054173
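
    A sketch of how annual transition probabilities can be derived from the kind of event counts a registry yields, assuming the common rate-to-probability conversion used in Markov cohort models; all counts and person-years below are invented, and the cohort labels are only illustrative:

    import math

    # Invented event counts and follow-up for three illustrative cohorts.
    cohorts = {
        "HER2+ with Trastuzumab":    {"events": 12, "person_years": 480.0},
        "HER2+ without Trastuzumab": {"events": 25, "person_years": 430.0},
        "HER2-negative":             {"events": 18, "person_years": 510.0},
    }

    for name, d in cohorts.items():
        rate = d["events"] / d["person_years"]   # events per person-year
        prob = 1.0 - math.exp(-rate)             # annual transition probability
        print(f"{name}: p(transition per year) = {prob:.3f}")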

  11. TOPFLOW-PTS experiments. Pre-test calculations with NEPTUNE_CFD code

    International Nuclear Information System (INIS)

    A hypothetical Small Break Loss Of Coolant Accident is identified as one of the most severe transients, since it can lead to a severe Pressurized Thermal Shock on the Reactor Pressure Vessel (RPV). Depending on the operating conditions, it may result in two-phase flow configurations in the cold legs, and advanced two-phase flow simulations are required to reliably assess the integrity of the RPV wall. The associated needs for development and/or validation of these advanced models are substantial, and the ongoing TOPFLOW-PTS experimental program was designed to provide a well-documented database to meet them. This paper focuses on pre-test NEPTUNE_CFD simulations of the TOPFLOW-PTS experiments; these simulations were performed to (i) help in the definition of the test matrix and test procedure, and (ii) check the presence of the different key physical phenomena at the mock-up scale. (author)

  12. Probability of Extraprostatic Disease According to the Percentage of Positive Biopsy Cores in Clinically Localized Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Thiago N. Valette

    2015-06-01

    Full Text Available Abstract Objective: Prediction of extraprostatic disease in clinically localized prostate cancer is relevant for treatment planning of the disease. The purpose of this study was to explore the usefulness of the percentage of positive biopsy cores to predict the chance of extraprostatic cancer. Materials and Methods: We evaluated 1787 patients with localized prostate cancer submitted to radical prostatectomy. The percentage of positive cores in prostate biopsy was correlated with the pathologic outcome of the surgical specimen. In the final analysis, a correlation was made between categorical ranges of positive cores (10% intervals) and the risk of extraprostatic extension and/or bladder neck invasion, seminal vesicle involvement or metastasis to iliac lymph nodes. Student's t test was used for statistical analysis. Results: For each 10% of positive cores we observed a progressively higher prevalence of extraprostatic disease. The risk of cancer beyond the prostate capsule for … Conclusion: The percentage of positive cores in prostate biopsy can predict the risk of cancer outside the prostate. Our study shows that the percentage of positive prostate biopsy fragments helps predict the chance of extraprostatic cancer and may have a relevant role in the patient's management.
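
    A sketch of the decile analysis the abstract describes: bin the percentage of positive cores into 10% intervals and tabulate the prevalence of extraprostatic disease per bin. The data are synthetic, with an invented risk model that simply builds in a rising risk:

    import random

    random.seed(1)

    # Synthetic cohort: (% positive cores, extraprostatic disease yes/no),
    # with risk rising with the percentage of positive cores.
    patients = []
    for _ in range(1787):
        pct = random.uniform(0, 100)
        risk = 0.10 + 0.005 * pct
        patients.append((pct, random.random() < risk))

    # Group into 10% intervals and report prevalence per interval.
    bins = [[] for _ in range(10)]
    for pct, positive in patients:
        bins[min(int(pct // 10), 9)].append(positive)

    for i, b in enumerate(bins):
        if b:
            print(f"{i*10:3d}-{(i+1)*10:3d}%: prevalence = {sum(b)/len(b):.2f} (n = {len(b)})")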

  13. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    Science.gov (United States)

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the types associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. There is the challenge that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact sparse data should have on decisions. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participation in a real-world health-screening project, and also by deployment in an industrial platform called the BioIngine, a cognitive computing platform for health management. PMID:26386548

  14. Pre-test prediction report LOBI-MOD2 Test BT-12 large steam line break

    International Nuclear Information System (INIS)

    The RETRAN-02 code has been selected by the CEGB for independent assessment of the thermal hydraulic component of the intact circuit fault safety case for Sizewell B. An important source of validation data for RETRAN is the European Community sponsored LOBI-MOD2 Integral Test Facility. One component of the agreed LOBI test matrix is the large (100%) steam line break test BT-12, for which the UK has been designated as partner country. This report details the pre-test predictions undertaken in support of Test BT-12 using the RETRAN-02/Mod 3 code. Three separate analyses are presented. In addition to the best estimate prediction, two scoping predictions are presented which respectively minimise and maximise the primary cooldown. The best estimate calculation was undertaken using dynamic slip with multi-node steam generator representations. The maximum cooldown was obtained using a single bubble rise volume broken loop steam generator model to minimise the liquid carryover to the break. The minimum cooldown used full noding for the broken loop steam generator but without slip (i.e. equal phase velocities) to maximise the carryover. A number of modelling difficulties had to be overcome, including steady state initialisation at the zero feed and steam flow hot standby condition. (author)

  15. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A

    2007-09-01

    Full Text Available Abstract Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion arm and 55 completed the video arm. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable to no pre-test information.
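
    The abstract compares questionnaire scores between independent arms with the Wilcoxon rank-sum test. A minimal sketch with hypothetical scores (scipy's ranksums implements this test):

    from scipy.stats import ranksums

    # Hypothetical 26-item questionnaire scores for the two Trial Two arms.
    video      = [20, 21, 19, 22, 18, 20, 21, 19]
    discussion = [19, 20, 18, 21, 19, 20, 18, 20]

    # Two-sample, non-parametric comparison of the score distributions.
    stat, p = ranksums(video, discussion)
    print(f"rank-sum statistic = {stat:.2f}, p = {p:.2f}")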

  16. Strong association between serological status and probability of progression to clinical visceral leishmaniasis in prospective cohort studies in India and Nepal.

    Directory of Open Access Journals (Sweden)

    Epco Hasker

    Full Text Available INTRODUCTION: Asymptomatic persons infected with the parasites causing visceral leishmaniasis (VL usually outnumber clinically apparent cases by a ratio of 4-10 to 1. We assessed the risk of progression from infection to disease as a function of DAT and rK39 serological titers. METHODS: We used available data on four cohorts from villages in India and Nepal that are highly endemic for Leishmania donovani. In each cohort two serosurveys had been conducted. Based on results of initial surveys, subjects were classified as seronegative, moderately seropositive or strongly seropositive using both DAT and rK39. Based on the combination of first and second survey results we identified seroconvertors for both markers. Seroconvertors were subdivided in high and low titer convertors. Subjects were followed up for at least one year following the second survey. Incident VL cases were recorded and verified. RESULTS: We assessed a total of 32,529 enrolled subjects, for a total follow-up time of 72,169 person years. Altogether 235 incident VL cases were documented. The probability of progression to disease was strongly associated with initial serostatus and with seroconversion; this was particularly the case for those with high titers and most prominently among seroconvertors. For high titer DAT convertors the hazard ratio reached as high as 97.4 when compared to non-convertors. The strengths of the associations varied between cohorts and between markers but similar trends were observed between the four cohorts and the two markers. DISCUSSION: There is a strongly increased risk of progressing to disease among DAT and/or rK39 seropositives with high titers. The options for prophylactic treatment for this group merit further investigation, as it could be of clinical benefit if it prevents progression to disease. Prophylactic treatment might also have a public health benefit if it can be corroborated that these asymptomatically infected individuals are infectious
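
    The hazard ratios above come from survival models fitted to cohort follow-up (about 235 cases over roughly 72,000 person-years). A crude sketch of the underlying quantities, group-wise incidence rates and their ratio, with invented counts chosen only to land near the reported magnitude; this simple ratio is not the paper's method:

    # Invented counts for two serological groups.
    cases_high_titer, py_high_titer = 40, 900.0      # high-titer seroconvertors
    cases_negative,   py_negative   = 30, 65000.0    # non-convertors

    rate_high = cases_high_titer / py_high_titer     # VL cases per person-year
    rate_neg  = cases_negative / py_negative
    print(f"incidence: {rate_high:.4f} vs {rate_neg:.5f} per person-year")
    print(f"crude rate ratio = {rate_high / rate_neg:.0f}")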

  17. Association between high biomarker probability of Alzheimer's disease and improvement of clinical outcomes after shunt surgery in patients with idiopathic normal pressure hydrocephalus.

    Science.gov (United States)

    Kazui, Hiroaki; Kanemoto, Hideki; Yoshiyama, Kenji; Kishima, Haruhiko; Suzuki, Yukiko; Sato, Shunsuke; Suehiro, Takashi; Azuma, Shingo; Yoshimine, Toshiki; Tanaka, Toshihisa

    2016-10-15

    We examined the effect of the pathology of Alzheimer's disease (AD) on improvement of clinical symptoms after shunt surgery in patients with idiopathic normal pressure hydrocephalus (iNPH). Forty-four iNPH patients were classified into 18 patients with (iNPH/AD+) and 26 patients without (iNPH/AD-) the combination of low amyloid β42 and high total tau in cerebrospinal fluid (CSF). We compared improvements after lumbo-peritoneal shunt surgery (LPS) between the two groups in the Timed Up & Go Test, 10-m reciprocating walking test, Digit Symbol Substitution Test, attention test, delayed recall test, Mini-Mental State Examination, iNPH grading scale, Neuropsychiatric Inventory, Zarit Burden Interview, and other evaluations. Three months after LPS, gait, urination, overall cognition, psychomotor speed, attention, and neuropsychiatric symptoms significantly improved in both groups, but the improvement in delayed recall and the reduction of caregiver burden were significantly greater in iNPH/AD- than in iNPH/AD+. In addition, improvement in delayed recall score after LPS was significantly and negatively correlated with the probability of AD as judged by amyloid β42 and total tau levels in CSF. Three months after LPS, almost all of the triad symptoms decreased in iNPH patients with and without AD pathology, but memory improved only in iNPH patients without AD pathology. PMID:27653897

  18. The Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS): a Pre-test Study

    OpenAIRE

    de Jong, Merel; Tamminga, Sietske J; de Boer, Angela G E M; Frings-Dresen, Monique H.W.

    2016-01-01

    Background Returning to and continuing work is important to many cancer survivors, but also represents a challenge. We know little about subjective work outcomes and how cancer survivors perceive their return to work. Therefore, we developed the Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS). Our aim was to pre-test the items of the initial QWLQ-CS on acceptability and comprehensiveness. In addition, item retention was performed by pre-assessing the relevance scores an...

  19. HIV pre-test information, discussion or counselling? A review of guidance relevant to the WHO European Region.

    Science.gov (United States)

    Bell, Stephen A; Delpech, Valerie; Raben, Dorthe; Casabona, Jordi; Tsereteli, Nino; de Wit, John

    2016-02-01

    In the context of a shift from exceptionalism to normalisation, this study examines recommendations/evidence in current pan-European/global guidelines regarding pre-test HIV testing and counselling practices in health care settings. It also reviews new research not yet included in guidelines. There is consensus that verbal informed consent must be gained prior to testing, individually, in private, confidentially, in the presence of a health care provider. All guidelines recommend pre-test information/discussion delivered verbally or via other methods (information sheet). There is agreement about a minimum standard of information to be provided before a test, but guidelines differ regarding discussion about issues encouraging patients to think about implications of the result. There is heavy reliance on expert consultation in guideline development. Referenced scientific evidence is often more than ten years old and based on US/UK research. Eight new papers are reviewed. Current HIV testing and counselling guidelines have inconsistencies regarding the extent and type of information that is recommended during pre-test discussions. The lack of new research underscores a need for new evidence from a range of European settings to support the process of expert consultation in guideline development.

  20. Comparison of different coupling CFD–STH approaches for pre-test analysis of a TALL-3D experiment

    Energy Technology Data Exchange (ETDEWEB)

    Papukchiev, Angel, E-mail: angel.papukchiev@grs.de [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany); Jeltsov, Marti; Kööp, Kaspar; Kudinov, Pavel [KTH Royal Institute of Technology, Stockholm (Sweden); Lerchl, Georg [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany)

    2015-08-15

    Highlights: • Thermal-hydraulic system codes and CFD tools are coupled. • Pre-test calculations for the TALL-3D facility are performed. • Complex flow and heat transfer phenomena are modeled. • Comparative analyses have been performed. - Abstract: The system thermal-hydraulic (STH) code ATHLET was coupled with the commercial 3D computational fluid dynamics (CFD) software package ANSYS CFX to improve ATHLET simulation capabilities for flows with pronounced 3D phenomena such as flow mixing and thermal stratification. Within the FP7 European project THINS (Thermal Hydraulics of Innovative Nuclear Systems), validation activities for coupled thermal-hydraulic codes are being carried out. The TALL-3D experimental facility, operated by KTH Royal Institute of Technology in Stockholm, is designed for thermal-hydraulic experiments with lead-bismuth eutectic (LBE) coolant at natural and forced circulation conditions. GRS carried out pre-test simulations with ATHLET–ANSYS CFX for the TALL-3D experiment T01, while KTH scientists performed these analyses with the coupled code RELAP5/STAR-CCM+. In experiment T01 the main circulation pump is stopped, which leads to an interesting thermal-hydraulic transient with local 3D phenomena. In this paper, the TALL-3D behavior during T01 is analyzed, and the results of the coupled pre-test calculations performed by GRS (ATHLET–ANSYS CFX) and KTH (RELAP5/STAR-CCM+) are directly compared.

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
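
    One classic instance of the 1/e phenomenon (the article's three problems are not reproduced here): the probability that a random permutation has no fixed point, a derangement, tends to 1/e as n grows. A quick Monte Carlo check:

    import math, random

    def has_no_fixed_point(n):
        perm = list(range(n))
        random.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    trials = 100_000
    hits = sum(has_no_fixed_point(52) for _ in range(trials))
    print(f"estimate = {hits / trials:.4f}, 1/e = {1 / math.e:.4f}")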

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. Pre-test habituation improves the reliability of a handheld test of mechanical nociceptive threshold in dairy cows

    DEFF Research Database (Denmark)

    Raundal, P. M.; Andersen, P. H.; Toft, Nils;

    2015-01-01

    Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool, where the applied stimulation force was monitored continuously relative to a pre-encoded target force. The effect on MNT of two pre-testing habituation procedures was assessed in two different experiments comprising a total of 88 sound Holstein dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to a lowered intra-individual coefficient of variation of MNT (P …) and … force during stimulations (P …)
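
    The reliability measure named above, the intra-individual coefficient of variation across the five stimulations of one test, is simple to compute; a sketch with hypothetical forces in newtons:

    from statistics import mean, stdev

    # Hypothetical MNT forces (N) from five consecutive stimulations of one cow.
    forces = [14.2, 15.1, 13.8, 14.9, 14.5]

    cv = stdev(forces) / mean(forces)   # intra-individual coefficient of variation
    print(f"intra-individual CV = {cv:.1%}")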

  4. Asbestos and Probable Microscopic Polyangiitis

    OpenAIRE

    George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W

    2004-01-01

    Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described and the possible...

  5. Demographic, clinical and treatment related predictors for event-free probability following low-dose radiotherapy for painful heel spurs - a retrospective multicenter study of 502 patients

    International Nuclear Information System (INIS)

    A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p <0.001); >58/≤58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single dose and prior treatments revealed patients with only one treatment series (p <0.001), an age >58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment in painful heel spurs.

  6. Demographic, clinical and treatment related predictors for event-free probability following low-dose radiotherapy for painful heel spurs - a retrospective multicenter study of 502 patients

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, Ralph [Dept. of Radiotherapy, St. Josefs-Hospital. Wiesbaden (Germany); Micke, Oliver [Dept. of Radiotherapy, Muenster Univ. Hospital (Germany); Reichl, Berthold [Dept. of Radiotherapy, Weiden Hospital (DE)] (and others)

    2007-03-15

    A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p <0.001); >58/≤58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single dose and prior treatments revealed patients with only one treatment series (p <0.001), an age >58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment in painful heel spurs.
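
    The event-free probabilities reported in the two records above are the kind of quantity a Kaplan-Meier estimator yields. A minimal sketch using the lifelines package, with invented follow-up data:

    from lifelines import KaplanMeierFitter

    # Invented follow-up: months observed and whether the event (unchanged or
    # recurrent pain) occurred during observation.
    months = [26, 40, 12, 60, 96, 8, 30, 72, 18, 55]
    event  = [0, 1, 0, 0, 1, 1, 0, 0, 1, 0]

    kmf = KaplanMeierFitter()
    kmf.fit(months, event_observed=event)
    print(kmf.survival_function_)   # estimated event-free probability over time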

  7. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  8. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo (doi:10.1214/11-AOS902)

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
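
    A standard example of the scoring rules discussed above is the Brier score, the mean squared difference between forecast probabilities and observed outcomes; a minimal sketch with invented forecasts:

    # Invented forecast probabilities and the observed outcomes (1 = event occurred).
    forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]
    outcomes  = [1, 1, 0, 0, 0]

    brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)
    print(f"Brier score = {brier:.3f}  (0 is perfect; lower is better)")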

  9. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  10. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  13. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  14. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  15. Introduction to probability models

    CERN Document Server

    Ross, Sheldon M

    2006-01-01

    Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v

  16. Molecular contingencies: reinforcement probability.

    Science.gov (United States)

    Hale, J M; Shimp, C P

    1975-11-01

    Pigeons obtained food by responding in a discrete-trials two-choice probability-learning experiment involving temporal stimuli. A given response alternative, a left- or right-key peck, had 11 associated reinforcement probabilities within each session. Reinforcement probability for a choice was an increasing or a decreasing function of the time interval immediately preceding the choice. The 11 equiprobable temporal stimuli ranged from 1 to 11 sec in 1-sec classes. Preference tended to deviate from probability matching in the direction of maximizing; i.e., the percentage of choices of the preferred response alternative tended to exceed the probability of reinforcement for that alternative. This result was qualitatively consistent with probability-learning experiments using visual stimuli. The result is consistent with a molecular analysis of operant behavior and poses a difficulty for molar theories holding that local variations in reinforcement probability may safely be disregarded in the analysis of behavior maintained by operant paradigms. PMID:16811883
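
    The matching-versus-maximizing contrast in this abstract is easy to quantify. Under a simplified independent-trials reading (not the paper's temporal-stimulus procedure), an agent facing reinforcement probabilities 0.7/0.3 earns less by matching its choice proportions to those probabilities than by always choosing the better option:

    p = 0.7   # reinforcement probability of the better alternative

    # Matching: choose the better option with probability p, the worse with 1 - p.
    matching = p * p + (1 - p) * (1 - p)
    # Maximizing: always choose the better option.
    maximizing = p

    print(f"expected reinforcement, matching   = {matching:.2f}")   # 0.58
    print(f"expected reinforcement, maximizing = {maximizing:.2f}") # 0.70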

  17. Pre-test analysis for the KNGR LBLOCA DVI performance test using a best estimate code MARS

    International Nuclear Information System (INIS)

    Pre-test analysis using the MARS code has been performed for the KNGR (Korean Next Generation Reactor) DVI (Direct Vessel Injection) performance test facility, a full-height, 1/24.3-volume-scaled separate effects test facility focusing on the identification of multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. From the steady state analyses for various test cases at the late reflood condition, the degrees of the major thermal-hydraulic phenomena such as ECC bypass, ECC penetration, steam condensation, and water level sweep-out are quantified. The MARS code analysis results showed that: (a) multi-dimensional flow and temperature behaviors occurred in the downcomer region as expected, (b) the proximity of ECC injection to the break caused more ECC bypass and less steam condensation efficiency, (c) increasing steam flow rate resulted in more ECC bypass and less steam condensation, and (d) the high-velocity steam flow swept out the water in the downcomer just below the cold leg nozzle. These results are comparable with those observed in previous tests such as UPTF and CCTF. (author)

  18. Pre-test of the KYLIN-II thermal-hydraulics mixed circulation LBE loop using RELAP5

    International Nuclear Information System (INIS)

    To investigate the behavior of lead bismuth eutectic (LBE) as coolant in the China LEAd-based Research Reactor, the Institute of Nuclear Energy Safety Technology (INEST), Chinese Academy of Sciences, has built a multi-functional LBE experiment facility, KYLIN-II. The mixed circulation loop, one of the KYLIN-II thermal-hydraulics loops, has the capability to drive the flowing LBE in different ways, such as by pump, gas lift and temperature difference (natural circulation). In this contribution, preliminary numerical simulations in support of the operation and experiment of the KYLIN-II thermal-hydraulics mixed circulation LBE loop have been carried out and the obtained results have been studied. RELAP5 Mod4.0 with an LBE model has been utilized. Pre-test analysis showed the LBE circulation capability can reach the target under the several driving patterns. The maximum velocity in the fuel pin bundles can be larger than 0.15 m/s for natural circulation, 0.5 m/s for gas-enhanced circulation, and 2 m/s for pump-driven circulation. (author)

  19. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

    Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  20. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  1. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  2. Qubit persistence probability

    International Nuclear Information System (INIS)

    In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also Palma, Suominen, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)

  3. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  4. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
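
    A deliberately crude "gas kinetics" sketch of the quantity such studies estimate, treating the satellite population as a uniform gas so that P = 1 - exp(-n·σ·v·T); this is not the paper's method, and every number below is invented:

    import math

    n     = 1e-8                      # object spatial density, per km^3 (invented)
    sigma = math.pi * 0.01 ** 2       # collision cross-section, km^2 (10 m combined radius)
    v_rel = 10.0                      # mean relative velocity, km/s
    T     = 10 * 365.25 * 86400       # mission duration: 10 years, in seconds

    expected_encounters = n * sigma * v_rel * T
    print(f"P(collision) ~ {1 - math.exp(-expected_encounters):.4f}")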

  5. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  6. Pre-Test Assessment of the Use Envelope of the Normal Force of a Wind Tunnel Strain-Gage Balance

    Science.gov (United States)

    Ulbrich, N.

    2016-01-01

    The relationship between the aerodynamic lift force generated by a wind tunnel model, the model weight, and the measured normal force of a strain-gage balance is investigated to better understand the expected use envelope of the normal force during a wind tunnel test. First, the fundamental relationship between normal force, model weight, lift curve slope, model reference area, dynamic pressure, and angle of attack is derived. Then, based on this fundamental relationship, the use envelope of a balance is examined for four typical wind tunnel test cases. The first case looks at the use envelope of the normal force during the test of a light wind tunnel model at high subsonic Mach numbers. The second case examines the use envelope of the normal force during the test of a heavy wind tunnel model in an atmospheric low-speed facility. The third case reviews the use envelope of the normal force during the test of a floor-mounted semi-span model. The fourth case discusses the normal force characteristics during the test of a rotated full-span model. The wind tunnel model's lift-to-weight ratio is introduced as a new parameter that may be used for a quick pre-test assessment of the use envelope of the normal force of a balance. The parameter is derived as a function of the lift coefficient, the dimensionless dynamic pressure, and the dimensionless model weight. Lower and upper bounds of the use envelope of a balance are defined using the model's lift-to-weight ratio. Finally, data from a pressurized wind tunnel is used to illustrate both application and interpretation of the model's lift-to-weight ratio.
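
    A sketch of the quick pre-test check the abstract proposes, in a dimensional variant: compare aerodynamic lift (C_L·q·S) with model weight, since a lift-to-weight ratio near 1 warns that lift and weight may nearly cancel on the balance's normal-force gage at some angle of attack. All values are invented:

    # Invented test condition and model properties.
    C_L = 0.8        # lift coefficient
    q   = 20_000.0   # dynamic pressure, Pa
    S   = 0.35       # model reference area, m^2
    W   = 2_500.0    # model weight, N

    lift = C_L * q * S   # aerodynamic lift, N
    print(f"lift = {lift:.0f} N; lift-to-weight ratio = {lift / W:.2f}")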

  7. Pre-test metyrapone impairs memory recall in fear conditioning tasks: lack of interaction with β-adrenergic activity

    Science.gov (United States)

    Careaga, Mariella B. L.; Tiba, Paula A.; Ota, Simone M.; Suchecki, Deborah

    2015-01-01

    Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs), adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval, in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation as well as their suppressive effects on retrieval are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling on this effect. Metyrapone (GCs synthesis inhibitor; 75 mg/kg), administered 90 min before the first test of contextual or tone fear conditioning (TFC), negatively affected animals’ performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggested that the treatment impaired fear memory retrieval during the first evaluation. The administration immediately after the first test did not affect the animals’ performances in contextual fear conditioning (CFC), suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, metyrapone effects were independent of β-adrenergic signaling, since concurrent administration with propranolol (2 mg/kg), a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration led to negative effects on fear memory retrieval and this action was independent of a β-adrenergic signaling. PMID:25784866

  8. Grouped to Achieve: Are There Benefits to Assigning Students to Heterogeneous Cooperative Learning Groups Based on Pre-Test Scores?

    Science.gov (United States)

    Werth, Arman Karl

    Cooperative learning has been one of the most widely used instructional practices around the world since the early 1980s. Small learning groups have been in existence since the beginning of the human race. These groups have grown in their variance and complexity over time. Classrooms are getting more diverse every year, and instructors need a way to take advantage of this diversity to improve learning. The purpose of this study was to see if heterogeneous cooperative learning groups based on student achievement can be used as a differentiated instructional strategy to increase students' ability to demonstrate knowledge of science concepts and ability to do engineering design. This study includes two different groups made up of two different middle school science classrooms of 25-30 students. These students were given an engineering design problem to solve within cooperative learning groups. One class was put into heterogeneous cooperative learning groups based on students' pre-test scores. The other class was grouped based on random assignment. The study measured the difference between each class's pre-post gains, students' responses to a group interaction form, and interview questions addressing their perceptions of the makeup of their groups. The findings of the study were that there was no significant difference between learning gains for the treatment and comparison groups. There was a significant difference between the treatment and comparison groups in student perceptions of their group's ability to stay on task and manage their time efficiently. Both the comparison and treatment groups had a positive perception of the composition of their cooperative learning groups.

  9. Pre-test metyrapone impairs memory recall in fear conditioning tasks: lack of interaction with β-adrenergic activity

    Directory of Open Access Journals (Sweden)

    Mariella B.L. Careaga

    2015-03-01

    Full Text Available Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs), adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval, in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation as well as their suppressive effects on retrieval are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval, and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling in this effect. Metyrapone (a GCs synthesis inhibitor), administered 90 min before the first test of contextual or auditory fear conditioning, negatively affected animals' performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggested that the treatment impaired fear memory retrieval during the first evaluation. The administration immediately after the first test did not affect the animals' performances in contextual fear conditioning, suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, metyrapone effects were independent of β-adrenergic signaling, since concurrent administration with propranolol, a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration led to negative effects on fear memory retrieval and this action was independent of β-adrenergic signaling.

  10. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  11. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  12. Dynamic update with probabilities

    NARCIS (Netherlands)

    J. van Benthem; J. Gerbrandy; B. Kooi

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant pr

  13. Economy, probability and risk

    Directory of Open Access Journals (Sweden)

    Elena Druica

    2007-05-01

    Full Text Available The science of probabilities has earned a special place because it tried, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not lead to polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of the frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications for economics.

  14. Abstract Models of Probability

    Science.gov (United States)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and a probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements were proposed for such notions as events, independence, random value etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently, the necessity of formalization of quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly, this leads to another concept of randomness that has a nature different from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness corresponding to some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate some randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  15. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  16. The concept of probability

    International Nuclear Information System (INIS)

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  17. Learning Probability in the Arts Stream Classes: Do Colour Balls with STAD Cooperative Learning Help in Improving Students’ Performance?

    OpenAIRE

    Siew Nyet Moi; Abdullah Sopiah; Kueh Ngie King

    2013-01-01

    Aims: 1. To investigate the effects of concrete learning aids (Colour Balls) with the Student Teams-Achievement Division (STAD) cooperative learning (CBCL) method on Form Four Arts Stream students’ performance in probability; 2. To find out students’ perception towards the use of the CBCL method in learning probability. Study Design: Quasi-experimental pre-test post-test control group design. Two treatment groups were employed in this design: CBCL (experimental gr...

  18. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  19. Stochastic Programming with Probability

    CERN Document Server

    Andrieu, Laetitia; Vázquez-Abad, Felisa

    2007-01-01

    In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...

  20. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  1. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  2. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  3. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  5. Probability with Roulette

    Science.gov (United States)

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.

  6. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  7. Asteroidal collision probabilities

    Science.gov (United States)

    Bottke, William F., Jr.; Greenberg, Richard

    1993-01-01

    Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.

  8. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  9. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  10. Launch Collision Probability

    Science.gov (United States)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
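
    The quantity the report computes can be checked by sampling under the same Gaussian-uncertainty assumptions: model the relative position at closest approach as a Gaussian centred on the nominal miss vector, with covariance equal to the sum of the two objects' position covariances, and count how often the sampled separation falls below the combined hard-body radius. The numbers below are invented for illustration, and this Monte Carlo sketch stands in for, and does not reproduce, the report's closed-form solution:

      import numpy as np

      rng = np.random.default_rng(1)

      def collision_probability(miss_vector, cov_combined, hard_radius, n=1_000_000):
          # Sample the relative position in the encounter plane and count
          # samples closer to the origin than the combined object radius.
          samples = rng.multivariate_normal(miss_vector, cov_combined, size=n)
          return np.mean(np.linalg.norm(samples, axis=1) < hard_radius)

      # Illustrative numbers only: a 200 m nominal miss distance, 100 m and
      # 80 m position sigmas (uncorrelated), and a 10 m combined radius.
      cov = np.diag([100.0**2, 80.0**2])
      print(collision_probability(np.array([200.0, 0.0]), cov, 10.0))
      # on the order of 1e-3 for these made-up values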

  11. Evaluation of a reproductive health awareness program for adolescence in urban Tanzania-A quasi-experimental pre-test post-test research

    Directory of Open Access Journals (Sweden)

    Iida Mariko

    2011-06-01

    Full Text Available Abstract Background Sub-Saharan Africa is among the regions where 10% of girls become mothers by the age of 16. The United Republic of Tanzania, located in Sub-Saharan Africa, is one country where teenage pregnancy is a problem facing adolescent girls. Adolescent pregnancy has been identified as one of the reasons for girls dropping out from school. This study's purpose was to evaluate a reproductive health awareness program for the improvement of reproductive health for adolescents in urban Tanzania. Methods A quasi-experimental pre-test and post-test research design was conducted to evaluate adolescents' knowledge, attitude, and behavior about reproductive health before and after the program. Data were collected from students aged 11 to 16, at Ilala Municipal, Dar es Salaam, Tanzania. An anonymous 23-item questionnaire provided the data. The program was conducted using a picture drama, reproductive health materials and group discussion. Results In total, 313 questionnaires were distributed and 305 (97.4%) were useable for the final analysis. The mean age was 12.5 years for girls and 13.2 years for boys. A large minority of both girls (26.8%) and boys (41.4%) had experienced sex, and among the girls who had experienced sex, 51.2% reported that it was by force. The girls' mean score in the knowledge pre-test was 5.9, and 6.8 in the post-test, a significant increase (t = 7.9, p = 0.000). The mean behavior pre-test score was 25.8 and the post-test score was 26.6, also a significant increase (t = 3.0, p = 0.003). The boys' mean score in the knowledge pre-test was 6.4 and 7.0 in the post-test, a significant increase (t = 4.5, p = 0.000). The mean behavior pre-test score was 25.6 and 26.4 in the post-test, again a significant increase (t = 2.4, p = 0.019). However, the pre-test and post-test attitude scores showed no statistically significant difference for either girls or boys. Conclusions Teenagers have sexual experiences including

  12. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  13. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  14. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...

  15. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  17. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  18. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  19. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  20. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  1. Logic and probability

    OpenAIRE

    Quznetsov, G. A.

    2003-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  2. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  3. Logic, Truth and Probability

    OpenAIRE

    Quznetsov, Gunn

    1998-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2014-01-01

    that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  5. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    The current status of advanced theoretical methods for computing transition probabilities of atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test of the theoretical methods.

  6. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex $S_n$ of n-tuples of possible rewards - the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs $(\mu,\nu)$ of fuzzy sets $\mu,\nu \in [0,1]^X$ such that $\mu(x)+\nu(x) \le 1$ for all $x \in X$, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where $(\mu_1,\nu_1) \le (\mu_2,\nu_2)$ whenever $\mu_1 \le \mu_2$ and $\nu_2 \le \nu_1$) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers $I^X$), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category $S_nD$ cogenerated by $S_n = \{(x_1,x_2,\ldots,x_n) \in I^n : \sum_{i=1}^{n} x_i \le 1\}$ carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within $S_nD$.

  7. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  8. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  9. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  10. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  11. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Angles as probabilities

    CERN Document Server

    Feldman, David V

    2008-01-01

    We use a probabilistic interpretation of solid angles to generalize the well-known fact that the inner angles of a triangle sum to 180 degrees. For the 3-dimensional case, we show that the sum of the solid inner vertex angles of a tetrahedron T, divided by 2*pi, gives the probability that an orthogonal projection of T onto a random 2-plane is a triangle. More generally, it is shown that the sum of the (solid) inner vertex angles of an n-simplex S, normalized by the area of the unit (n-1)-hemisphere, gives the probability that an orthogonal projection of S onto a random hyperplane is an (n-1)-simplex. Applications to more general polytopes are treated briefly, as is the related Perles-Shephard proof of the classical Gram-Euler relations.
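
    The 3-dimensional claim is easy to test numerically: for a regular tetrahedron the four vertex solid angles sum to about 2.205 sr, so the predicted probability of a triangular shadow is 2.205/(2*pi), roughly 0.35. A Monte Carlo sketch of that check (the regular tetrahedron and sample counts are arbitrary choices; uniform rotations are drawn via QR decomposition):

      import numpy as np

      rng = np.random.default_rng(2)

      def random_rotation():
          # Haar-uniform random orthogonal matrix via QR of a Gaussian matrix,
          # with column signs fixed so the distribution is uniform.
          q, r = np.linalg.qr(rng.standard_normal((3, 3)))
          return q * np.sign(np.diag(r))

      def in_triangle(p, a, b, c):
          # Barycentric point-in-triangle test in the plane.
          m = np.column_stack([b - a, c - a])
          try:
              s, t = np.linalg.solve(m, p - a)
          except np.linalg.LinAlgError:
              return False
          return s >= 0 and t >= 0 and s + t <= 1

      # Vertices of a regular tetrahedron.
      T = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)

      hits, trials = 0, 20_000
      for _ in range(trials):
          proj = (T @ random_rotation().T)[:, :2]  # orthogonal projection onto a random plane
          # The shadow is a triangle iff one projected vertex falls inside
          # the triangle formed by the other three.
          if any(in_triangle(proj[i], *proj[[j for j in range(4) if j != i]])
                 for i in range(4)):
              hits += 1
      print(hits / trials)  # ~0.35, matching the sum of vertex solid angles / 2*pi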

  14. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
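
    The calculation outlined above reduces to an event rate: categorize the historical fuel-handling events, count the assembly movements involved, and divide. A sketch with invented counts (the actual event and movement totals come from the Framatome ANP 2001a report and are not reproduced here):

      # Hypothetical counts for illustration only; the real figures come
      # from the Framatome ANP 2001a report cited in the abstract.
      misload_events = 4        # FAs placed in the wrong location
      damage_events = 7         # FAs damaged during movement
      total_fa_moves = 250_000  # total fuel-assembly movements surveyed

      p_misload = misload_events / total_fa_moves
      p_damage = damage_events / total_fa_moves
      print(f"P(misload per FA move) = {p_misload:.2e}")
      print(f"P(damage per FA move)  = {p_damage:.2e}")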

  15. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  16. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.;

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...... discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...

  17. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  18. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  19. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t

  20. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  1. Pre-test calculation of reflooding experiments with wider lattice in APWR-geometry (FLORESTAN 2) using the advanced computer code FLUT-FDWR

    International Nuclear Information System (INIS)

    After the reflooding tests in an extremely tight bundle (p/d=1.06, FLORESTAN 1) have been completed, new experiments for a wider lattice (p/d=1.242, FLORESTAN 2), which is employed in the recent APWR design of KfK, are planned at KfK to obtain the benchmark data for validation and improvement of calculation methods. This report presents the results of pre-test calculations for the FLORESTAN 2 experiment using FLUT-FDWR, a modified version of the GRS computer code FLUT for analysis of the most important behaviour during the reflooding phase after a LOCA in the APWR design. (orig.)

  2. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    Science.gov (United States)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through the failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes cognitive bias leading a radiologist to an incorrect diagnosis despite having correctly recognized the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen serving both as a checklist and for input. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
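
    A toy version of the tool's core update step, assuming (purely for illustration) conditionally independent imaging features with invented diagnoses, priors and likelihoods; the paper's actual Bayesian networks encode feature dependencies that this naive sketch ignores:

      # Naive-Bayes style update of pre-test diagnostic probabilities.
      # All diagnoses, priors and likelihoods below are illustrative
      # inventions, not values from the paper's basal ganglia network.
      priors = {"toxoplasmosis": 0.2, "lymphoma": 0.3, "glioma": 0.5}
      likelihoods = {  # P(feature present | diagnosis)
          "ring_enhancement": {"toxoplasmosis": 0.9, "lymphoma": 0.4, "glioma": 0.6},
          "multiple_lesions": {"toxoplasmosis": 0.8, "lymphoma": 0.3, "glioma": 0.2},
      }

      def update(posteriors, feature, present=True):
          # Multiply in the likelihood of one observed feature, then renormalize.
          new = {}
          for dx, p in posteriors.items():
              lk = likelihoods[feature][dx]
              new[dx] = p * (lk if present else 1.0 - lk)
          total = sum(new.values())
          return {dx: p / total for dx, p in new.items()}

      post = update(priors, "ring_enhancement", present=True)
      post = update(post, "multiple_lesions", present=True)
      print(post)  # post-test probabilities after two imaging features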

  3. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  4. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  5. 薏苡叶化学成分的预试验%The Coix Leaves Chemical Composition of the Pre-test

    Institute of Scientific and Technical Information of China (English)

    谭冰; 黄锁义; 严焕宁; 史柳芝; 吕龙祥

    2014-01-01

    Experimental study on the Guangxi Coix leaves of the chemical composition of pre-test Chemical reaction identification method of production of the Coix leaves water extract , ethanol extract and petroleum ether extract of Guangxi Coix leaves chemical composition of the pretest. Through the pre-test , suggesting that from Guangxi Coix leaves contain flavonoids,Phenolic,Coumarin,Volatile oil,Phytosterol,Carbohydrate,Glycosides, Tannin,Organic acids,Alkaloids and other chemical constituents. This test provided the experimental basis for further biologically active constituents of the plant.%对广西薏苡叶的化学成分进行预试验研究。采用化学反应鉴别法分别对广西产薏苡叶水提取液、乙醇提取液和石油醚提取液进行化学成分预试。通过预试验,提示广西产薏苡叶中可能含有黄酮类、酚类、香豆素类、挥发油、植物甾醇、糖类、苷类、鞣质、有机酸、生物碱等化学成分。此试验为进一步进行该植物的生物活性成分研究提供了实验基础。

  6. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities. PMID:27570097
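
    A small simulation of the 'plus noise' idea, using the direct-judgment noise mechanism the authors build on as I understand it (each of n retrieved instances is misread with probability d, so a single estimate is biased toward 0.5); the combination P(A)+P(B)-P(A and B)-P(A or B) has a form in which the noise terms cancel, so its average stays near the probability-theory value of zero. The event distribution and parameter values below are illustrative:

      import numpy as np

      rng = np.random.default_rng(3)

      def noisy_estimate(indicator, d=0.1, n=100):
          # Estimate a probability from n sampled instances, each misread
          # with probability d; the expected estimate is (1 - 2d)*p + d.
          sample = rng.choice(indicator, size=n)
          flips = rng.random(n) < d
          return np.mean(np.where(flips, 1 - sample, sample))

      # A synthetic joint distribution over events A and B.
      N = 10_000
      A = (rng.random(N) < 0.4).astype(int)
      B = ((rng.random(N) < 0.5) | (A == 1)).astype(int)  # correlated with A

      identity, single = [], []
      for _ in range(2_000):
          pA, pB = noisy_estimate(A), noisy_estimate(B)
          pAB, pAoB = noisy_estimate(A & B), noisy_estimate(A | B)
          identity.append(pA + pB - pAB - pAoB)  # noise bias cancels here
          single.append(pA)                      # noise bias does not cancel here

      print(np.mean(identity))          # ~0: agrees with probability theory
      print(np.mean(single), A.mean())  # biased toward 0.5 relative to true P(A)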

  8. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and on to utility. There are profound logical connections between the three steps in Savage's theory; that is, quantitative concepts properly represent qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  9. Establishment of a Clinical Prediction Model to Estimate the Probability of Malignancy in Patients with Solitary Pulmonary Nodules

    Institute of Scientific and Technical Information of China (English)

    张晓辉; 陈成; 曾辉; 宁卫卫; 张楠; 黄建安

    2016-01-01

    Objective To screen the clinical risk factors of lung cancer in patients with solitary pulmonary nodules (SPN), and to build a clinical prediction model to estimate the probability of malignancy. Methods A retrospective analysis was performed on the clinical data and chest imaging characteristics of 270 patients with SPN. Results Among the 270 patients, there were 110 (40.7%) cases of lung cancer and 160 (59.3%) benign lesions. On analysis of imaging characteristics, lobulation, spiculated sign, pleural indentation sign, contrast enhancement and air bronchogram sign were associated with lung cancer (P<0.05). Nodules with clear boundary, calcification and homogeneous density were associated with benign lesions (P<0.05). Single-factor analysis showed that age, smoking history, malignant imaging characteristics and diameter significantly affected the judgment of whether an SPN was benign or malignant (P<0.05). Multivariate analysis revealed that age, malignant imaging characteristics and diameter were independent risk factors of lung cancer in patients with SPN (P<0.01). The clinical prediction model to estimate the probability of malignancy is as follows: $P = e^X/(1+e^X)$, where $X = -5.882 + 0.050 \times \text{age} + 1.672 \times \text{imaging characteristic} + 0.123 \times \text{maximum diameter}$ and e is the base of the natural logarithm. At a cut-off value of 0.46, the sensitivity was 82%, specificity 85%, positive predictive value 80%, and negative predictive value 87%. The area under the ROC curve for the model was 0.901. Conclusion Age, malignant imaging characteristics and diameter are independent risk factors of lung cancer in patients with SPN. The prediction model is sufficiently accurate to estimate the probability of malignancy in patients with SPN.
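
    The published model is a standard logistic regression and can be transcribed directly; the coefficients are those reported in the abstract, while the 0/1 coding of the imaging-characteristic variable and millimetre units for the diameter are assumptions:

      import math

      def spn_malignancy_probability(age_years, malignant_imaging, max_diameter_mm):
          # Logistic model from the abstract:
          # X = -5.882 + 0.050*age + 1.672*imaging + 0.123*diameter,
          # P = e^X / (1 + e^X).
          # `malignant_imaging` is assumed coded 1 if malignant imaging
          # characteristics are present, 0 otherwise; millimetre units
          # for the diameter are likewise an assumption.
          x = -5.882 + 0.050 * age_years + 1.672 * malignant_imaging + 0.123 * max_diameter_mm
          return math.exp(x) / (1.0 + math.exp(x))

      p = spn_malignancy_probability(65, 1, 22)
      print(f"P(malignant) = {p:.2f}, "
            f"classified {'malignant' if p > 0.46 else 'benign'} at the 0.46 cut-off")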

  10. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, is studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy number set is given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, are put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, is given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP are also studied. The entire mathematical description of the RVFP is closed under fuzzy probability operations; as a result, the foundation is laid for perfecting fuzzy probability operation methods.

  11. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-04-01

    Full Text Available In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis is not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  12. The Logic of Parametric Probability

    CERN Document Server

    Norman, Joseph W

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.

  13. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
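
    The puzzle turns entirely on which scenario produced the information; a brute-force enumeration under the usual textbook assumptions (four equally likely ordered two-child families) makes the two standard answers explicit:

      from itertools import product

      # Equally likely two-child families, ordered by birth.
      families = list(product("BG", repeat=2))

      # Scenario 1: we learn that the *first* child is a girl.
      cond1 = [f for f in families if f[0] == "G"]
      print(sum(f[1] == "G" for f in cond1) / len(cond1))  # 0.5

      # Scenario 2: we only learn that *at least one* child is a girl.
      cond2 = [f for f in families if "G" in f]
      print(sum(f == ("G", "G") for f in cond2) / len(cond2))  # 1/3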

  14. Experiences with Pre-test Training for Computer Certificate Examinations

    Institute of Scientific and Technical Information of China (English)

    顾敏

    2013-01-01

    The key to training for the computer high-tech examination is that teachers must be familiar with the form of the test, thoroughly understand the question types in the question bank, and prepare students well with pre-test tutoring; combined with stimulating students' initiative to learn, the exam pass rate will be greatly improved.

  15. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  16. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

    Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10^12/L, leukocytes 2.9×10^9/L, platelets 59×10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  17. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that students who were motivated by the probability workshop showed a positive improvement in their performance on the probability topic compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  18. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  19. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Pretest probability assessment derived from attribute matching

    OpenAIRE

    Hollander Judd E; Diercks Deborah B; Pollack Charles V; Johnson Charles L; Kline Jeffrey A; Newgard Craig D; Garvey J Lee

    2005-01-01

    Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS), and compares the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possib...
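
    A schematic of attribute matching as described: find every reference patient whose attributes exactly match the new patient, and report the ACS rate among the matches as the PTP. The attribute names and the tiny reference database below are invented for illustration (the study uses eight attributes and 14,796 reference patients):

      # Attribute matching for pretest probability (schematic).
      # Each reference record: (attributes tuple, had_ACS flag).
      # Attribute names/values are hypothetical, not the study's variables.
      reference = [
          (("male", "55-65", "typical_pain", "diabetes"), True),
          (("male", "55-65", "typical_pain", "diabetes"), False),
          (("male", "55-65", "typical_pain", "diabetes"), True),
          (("female", "<45", "atypical_pain", "no_diabetes"), False),
      ]

      def pretest_probability(patient, db):
          # PTP = proportion of exactly matching reference patients with ACS.
          matches = [outcome for attrs, outcome in db if attrs == patient]
          if not matches:
              return None  # no match: the method cannot produce an estimate
          return sum(matches) / len(matches)

      print(pretest_probability(("male", "55-65", "typical_pain", "diabetes"), reference))
      # 0.667: two of the three matching reference patients had ACS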

  1. Probabilities of multiple quantum teleportation

    OpenAIRE

    Woesler, Richard

    2002-01-01

    Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e., for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25%, i.e., surprisingly, this resul...

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac...

  4. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.

  5. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  6. Varieties of Belief and Probability

    NARCIS (Netherlands)

    Eijck, D.J.N. van; Ghosh, S.; Szymanik, J.

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for knowledge and b

  7. [Evaluation of the pre-test counseling process in the HIV Testing and Counseling Centers in Rio de Janeiro State: the perception of users and health professionals].

    Science.gov (United States)

    Sobreira, Paula Guidone Pereira; de Vasconcellos, Mauricio Teixeira Leite; Portela, Margareth Crisóstomo

    2012-11-01

    This study sought to evaluate the pre-test counseling process in the HIV Testing and Counseling Centers (CTA) in Rio de Janeiro State, based on the perceptions of users and health professionals. A population survey was performed, based on a structured questionnaire given to a sample of users and counselors of nine CTAs. Quantitative analyses were employed to evaluate the degree of satisfaction in relation to indicators of infrastructure, the way patients are received and treated, the user-counselor relationship, and territoriality, accessibility and availability. Among the CTA users interviewed, 58.1% were very satisfied and 38.7% were satisfied with the care received, according to the analysis of the set of indicators. The majority of health professionals (80.9%) interviewed also declared their satisfaction. Despite the high level of satisfaction, some complaints regarding structural and procedural aspects were identified, which call for the attention of health managers and professionals to enhance the quality of the service rendered. PMID:23175316

  8. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  9. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  10. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. An explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are derived using techniques from martingale theory.
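
Ruin probabilities in such risk models are also easy to estimate by simulation. Below is a rough Monte Carlo sketch using the classical compound Poisson model, with a fixed premium rate and exponential claims, as a simplified stand-in for the paper's model (which randomizes the premium rate and p-thins the claim process); all parameter values are hypothetical.

```python
# Finite-horizon ruin probability for a classical compound Poisson risk
# model: surplus(t) = u0 + c*t - sum of claims up to t.
import random

def ruin_probability(u0=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                     horizon=100.0, n_sims=20_000, seed=1):
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_sims):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)                    # Poisson claim arrivals
            if t > horizon:
                break                                    # survived the horizon
            claims += rng.expovariate(1.0 / mean_claim)  # exponential claim size
            if u0 + c * t - claims < 0:
                ruins += 1                               # ruin occurred
                break
    return ruins / n_sims

print(ruin_probability())  # Monte Carlo estimate of the ruin probability
```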

  11. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  12. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  13. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  14. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  15. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko, [No Value; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  16. Transition probabilities of Br II

    Science.gov (United States)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  17. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  18. Logical, conditional, and classical probability

    OpenAIRE

    Quznetsov, G. A.

    2005-01-01

    Propositional logic is generalized to the field of real numbers. A logical function with all the properties of the classical probability function is obtained. The logical analog of the Bernoulli independent tests scheme is constructed, and the logical analog of the Law of Large Numbers is deduced from the properties of these functions. The logical analog of conditional probability is defined. Consistency is ensured by a model in a suitable variant of nonstandard analysis.

  19. Compliance with endogenous audit probabilities

    OpenAIRE

    Konrad, Kai A.; Lohse, Tim; Qari, Salmai

    2015-01-01

    This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...

  20. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  1. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  2. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal ("belief"). Th...

  3. Trajectory probability hypothesis density filter

    OpenAIRE

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  4. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  5. Diagnostic accuracy of MRI in adults with suspect brachial plexus lesions: A multicentre retrospective study with surgical findings and clinical follow-up as reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Tagliafico, Alberto, E-mail: alberto.tagliafico@unige.it [Institute of Anatomy, Department of Experimental Medicine, University of Genoa, Largo Rosanna Benzi 8, 16132 Genoa (Italy); Succio, Giulia; Serafini, Giovanni [Department of Radiology, Santa Corona Hospital, via XXV Aprile 38, Pietra Ligure, 17027 Savona (Italy); Martinoli, Carlo [Radiology Department, DISC, Università di Genova, Largo Rosanna Benzi 8, 16138 Genova (Italy)

    2012-10-15

    Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated: surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided into five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test probability (the prevalence), negative predictive value (NPV), pre- and post-test odds (OR), likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity with 95% CIs were: 0.810/0.914; (0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient basis is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well.
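
For readers who want to reproduce this style of per-patient analysis, the sketch below computes the same family of quantities from a generic 2x2 contingency table; the counts are illustrative only and are not the study's data.

```python
# Diagnostic statistics from a 2x2 table (tp, fp, fn, tn are hypothetical).
def diagnostic_stats(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    prevalence = (tp + fn) / n                 # pre-test probability
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)                 # LH+
    lr_neg = (1 - sens) / spec                 # LH-
    pre_odds = prevalence / (1 - prevalence)
    post_odds = pre_odds * lr_pos              # odds after a positive test
    post_p = post_odds / (1 + post_odds)       # post-test probability
    accuracy = (tp + tn) / n
    return dict(sens=sens, spec=spec, prevalence=prevalence, ppv=ppv,
                npv=npv, lr_pos=lr_pos, lr_neg=lr_neg, post_p=post_p,
                accuracy=accuracy)

print(diagnostic_stats(tp=42, fp=9, fn=10, tn=96))  # illustrative numbers only
```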

  6. Impact of the clinical context on the 14-3-3 test for the diagnosis of sporadic CJD

    Directory of Open Access Journals (Sweden)

    Sierra-Moros, María José

    2006-07-01

    Abstract Background The 14-3-3 test appears to be a valuable aid for the clinical diagnosis of sporadic Creutzfeldt-Jakob disease (sCJD) in selected populations. However, its usefulness in routine practice has been challenged. In this study, the influence of the clinical context on the performance of the 14-3-3 test for the diagnosis of sCJD is investigated through the analysis of a large prospective clinical series. Methods Six hundred seventy-two Spanish patients with clinically suspected sCJD were analyzed. Clinical classification at sample reception according to the World Health Organization's (WHO) criteria (excluding the 14-3-3 test result) was used to explore the influence of the clinical context on the pre-test probabilities and the positive (PPV) and negative (NPV) predictive values of the 14-3-3 test. Results Predictive values of the test varied greatly according to the initial clinical classification: PPV of 98.8%, 96.5% and 45.0%, and NPV of 26.1%, 66.6% and 100%, for probable sCJDi (n = 115), possible sCJDi (n = 73) and non-sCJDi (n = 484) cases, respectively. According to multivariate and Bayesian analyses, these values represent an improvement in diagnostic certainty compared to clinical data alone. Conclusion In three different contexts of sCJD suspicion, the 14-3-3 assay provides useful information complementary to clinical and electroencephalographic (EEG) data. The test is most useful in supporting a clinical impression, whilst it may prove deceptive when it is not in agreement with clinical data.
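
The strong dependence of PPV and NPV on the initial clinical classification is Bayes' rule applied to a test of fixed accuracy. A small sketch, with illustrative sensitivity and specificity values rather than the study's estimates for the 14-3-3 assay:

```python
# Predictive values as a function of the pre-test probability (Bayes' rule).
def predictive_values(sens, spec, pretest):
    ppv = sens * pretest / (sens * pretest + (1 - spec) * (1 - pretest))
    npv = spec * (1 - pretest) / (spec * (1 - pretest) + (1 - sens) * pretest)
    return ppv, npv

# Hypothetical pre-test probabilities for three clinical contexts:
for label, pretest in [("probable", 0.80), ("possible", 0.40), ("non-sCJD", 0.05)]:
    ppv, npv = predictive_values(sens=0.92, spec=0.80, pretest=pretest)
    print(f"{label}: pre-test {pretest:.2f} -> PPV {ppv:.3f}, NPV {npv:.3f}")
```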

  7. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
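
A simulation sketch of the construction, assuming standardized samples and symmetric per-point quantiles that are widened until the joint coverage reaches 1 - alpha (the paper's exact calibration may differ):

```python
# Simultaneous envelopes for the order statistics of standardized normal
# samples, calibrated so that all n points fall inside jointly with
# probability 1 - alpha.
import numpy as np

def simultaneous_envelope(n, alpha=0.05, n_sims=10_000, seed=0):
    rng = np.random.default_rng(seed)
    sims = np.sort(rng.standard_normal((n_sims, n)), axis=1)
    # standardize each simulated sample (order is preserved)
    sims = (sims - sims.mean(axis=1, keepdims=True)) \
           / sims.std(axis=1, ddof=1, keepdims=True)
    lo, hi = None, None
    # start from the Bonferroni-safe pointwise level, then narrow the bands
    for eps in np.linspace(alpha / n, alpha, 200):
        lo_try = np.quantile(sims, eps / 2, axis=0)
        hi_try = np.quantile(sims, 1 - eps / 2, axis=0)
        coverage = np.mean(np.all((sims >= lo_try) & (sims <= hi_try), axis=1))
        if coverage < 1 - alpha:
            break                  # keep the last level with joint coverage
        lo, hi = lo_try, hi_try
    return lo, hi

lo, hi = simultaneous_envelope(n=20)
print(lo[:3], hi[:3])  # intervals for the three smallest order statistics
```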

  8. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  9. Born Rule and Noncontextual Probability

    CERN Document Server

    Logiurato, Fabrizio

    2012-01-01

    The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular in relation to the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasoning, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...

  10. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is below 0.04.
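
The Gaussian estimate described above reduces to evaluating a normal tail probability. A sketch with hypothetical numbers (real VC curves are frequency-dependent velocity criteria, so the scalar criterion below is a simplification):

```python
# Probability that a zero-mean Gaussian response stays within a criterion.
from math import erf, sqrt

def prob_within_criterion(sigma, criterion):
    """P(|X| <= criterion) for X ~ N(0, sigma^2)."""
    return erf(criterion / (sigma * sqrt(2.0)))

sigma = 0.4e-6      # response standard deviation, metres (hypothetical)
criterion = 1.0e-6  # displacement criterion, metres (hypothetical)
p_exceed = 1 - prob_within_criterion(sigma, criterion)
print(f"P(exceeding criterion) = {p_exceed:.4f}")
```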

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  14. Probability as a physical motive

    CERN Document Server

    Martin, P

    2007-01-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  15. Fusion Probability in Dinuclear System

    CERN Document Server

    Hong, Juhee

    2015-01-01

    Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and transfer of nucleons. In the presence of the coupling between two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has the quasifission barrier and the inner fusion barrier, and the escape rates can be calculated by the Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the cross section of evaporation residue.

  16. Pre-Test CFD for the Design and Execution of the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.

    2014-01-01

    With the increasing costs of physics experiments and simultaneous increase in availability and maturity of computational tools it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to

  17. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  18. Pollock on probability in epistemology

    OpenAIRE

    Fitelson, Branden

    2010-01-01

    In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.

  19. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  20. Quantum correlations; quantum probability approach

    OpenAIRE

    Majewski, W A

    2014-01-01

    This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...

  1. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution, and the Lukasiewicz distribution, are given. These distributions act as "inner kernels" of fuzzy systems. Furthermore, from properties of the probability distributions of fuzzy systems, it is demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. In addition, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimization of reasoning, it has broad prospects for application.
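
A minimal sketch of the COG step referred to above: the crisp output is the membership-weighted mean of the sampled output universe, i.e. the expectation under the normalized membership function, which is why it is optimal in the mean-square sense.

```python
# Centre-of-gravity (COG) defuzzification over a sampled output universe.
def cog_defuzzify(xs, memberships):
    """xs: sampled output values; memberships: aggregated membership grades."""
    total = sum(memberships)
    if total == 0:
        raise ValueError("empty fuzzy set")
    return sum(x * m for x, m in zip(xs, memberships)) / total

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
mu = [0.0, 0.3, 0.8, 0.3, 0.0]   # hypothetical aggregated output fuzzy set
print(cog_defuzzify(xs, mu))     # -> 2.0, the centre of gravity
```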

  2. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, with P much greater than L (a dense system), our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to sparse systems, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...
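
The claim can be verified by exact counting. Under the equal-configuration postulate (stars and bars), the occupancy k of a fixed box has pmf P(k) = C(P-k+L-2, L-2) / C(P+L-1, L-1), which is strictly decreasing in k, so the empty box is always the most probable; a short check:

```python
# Exact occupancy pmf of one box when all configurations of P
# indistinguishable balls in L distinguishable boxes are equally likely.
from math import comb

def occupancy_pmf(P, L):
    total = comb(P + L - 1, L - 1)   # number of configurations (stars & bars)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

pmf = occupancy_pmf(P=100, L=5)      # dense system: P >> L
print(pmf[0] > pmf[20])              # True: empty box beats the average (k=20)
print(pmf[:4])                       # pmf is strictly decreasing in k
```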

  3. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  4. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
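
A brute-force sketch of the quantity being approximated, for a toy zero-memory source (the letter probabilities are hypothetical): enumerate all words of a given length, sort them by probability, and take the probability-weighted average rank.

```python
# Expected number of guesses under the optimal strategy: guess words in
# decreasing order of probability.
from itertools import product
from math import prod

letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # hypothetical alphabet

def expected_guesses(length):
    word_probs = [prod(letter_probs[ch] for ch in w)
                  for w in product(letter_probs, repeat=length)]
    word_probs.sort(reverse=True)                # most likely word first
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))

for n in (1, 2, 4, 8):
    print(n, expected_guesses(n))   # grows far more slowly than 3**n
```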

  5. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    Science.gov (United States)

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  6. Transition probability and preferential gauge

    OpenAIRE

    Chen, C.Y.

    1999-01-01

    This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of the Hamiltonian.

  7. Classical Probability and Quantum Outcomes

    Directory of Open Access Journals (Sweden)

    James D. Malley

    2014-05-01

    There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.

  8. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability, as has been observed.

  9. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  10. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
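
The Zipf-style check described is a one-line regression once the frequencies are tabulated; the counts below are placeholders rather than the paper's values:

```python
# Fit log(probability) against log(rank) to check a Zipf-like relationship.
import numpy as np

counts = {"unknot": 1_250_000, "3_1": 41_000, "4_1": 8_900,
          "5_1": 2_700, "5_2": 4_100}            # hypothetical knot counts
probs = np.array(sorted(counts.values(), reverse=True), dtype=float)
probs /= probs.sum()
ranks = np.arange(1, len(probs) + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(probs), 1)
print(slope)   # a roughly constant slope indicates a Zipf-like law
```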

  11. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous, and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
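
A one-dimensional sketch of the moment-matching step (the paper works with multivariate skew normals and Mathematica's optimizer; scipy and the target moments below are stand-ins): pick skew-normal parameters whose first three moments reproduce the targets by minimizing the summed squared differences.

```python
# Match the mean, variance, and skewness of a 1-D skew normal SN(xi, omega,
# alpha) to target moments by least squares; omega is assumed positive.
import numpy as np
from scipy.optimize import minimize

def sn_moments(xi, omega, alpha):
    delta = alpha / np.sqrt(1 + alpha**2)
    mu_z = delta * np.sqrt(2 / np.pi)
    mean = xi + omega * mu_z
    var = omega**2 * (1 - mu_z**2)
    skew = (4 - np.pi) / 2 * mu_z**3 / (1 - mu_z**2) ** 1.5
    return mean, var, skew

target = (50.0, 30.0, 0.4)   # hypothetical copy-number moments

def objective(params):
    return sum((m - t) ** 2 for m, t in zip(sn_moments(*params), target))

res = minimize(objective, x0=[50.0, 5.0, 1.0], method="Nelder-Mead")
print(res.x, objective(res.x))   # fitted (xi, omega, alpha), near-zero residual
```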

  12. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  13. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between the theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to basic notions of the statistical mechanics of complex systems. The review also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  14. Subjective probability and quantum certainty

    CERN Document Server

    Caves, C M; Schack, R; Caves, Carlton M.; Fuchs, Christopher A.; Schack, Ruediger

    2006-01-01

    In the Bayesian approach to quantum mechanics, probabilities--and thus quantum states--represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Our analysis reveals fundamental differences between our Bayesian approach on the one hand and the Copenhagen interpretation and similar interpretations of quantum states on the other hand. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then show that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that with-certainty predictions derived from such a state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply certainty for a measurement outcome, that outcome would effectively correspond to a preexisting system pr...

  15. The probability of extraterrestrial life

    International Nuclear Information System (INIS)

    Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. In line with this idea there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on other planets somewhere in our galaxy. In this talk we will use only reliable data from scientific observers in order to establish a probability. We will explain the analysis of the physico-chemical principles which allow the evolution of organic molecules on our planet and establish these as the forerunners of life on our planet. On the other hand, the physical processes governing stars, their characteristics and their effects on planets will also be explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space will be given. (Author)

  16. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  17. Tight Bernoulli tail probability bounds

    OpenAIRE

    Dzindzalieta, Dainius

    2014-01-01

    The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universality means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful, for example, in insurance mathematics, for constructing...

  18. Relative transition probabilities of cobalt

    Science.gov (United States)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  19. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  20. The neurologic examination in patients with probable Alzheimer's disease.

    Science.gov (United States)

    Huff, F J; Boller, F; Lucchelli, F; Querriera, R; Beyer, J; Belle, S

    1987-09-01

    Abnormal findings on a standardized neurologic examination were compared between patients with a clinical diagnosis of probable Alzheimer's disease (AD) and healthy control subjects. Aside from mental status findings, the most useful examination findings for differentiating AD from control subjects were the presence of release signs, olfactory deficit, impaired stereognosis or graphesthesia, gait disorder, tremor, and abnormalities on cerebellar testing. These abnormalities probably reflect the different areas of the central nervous system that are affected pathologically in AD. In the clinical diagnosis of AD, particular attention should be given to these aspects of the neurologic examination.

  1. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  2. Nonlocality, Bell's Ansatz and Probability

    CERN Document Server

    Kracklauer, A F

    2006-01-01

    Quantum Mechanics lacks an intuitive interpretation, which is the cause of a generally formalistic approach to its use. This in turn has led to a certain insensitivity to the actual meaning of many words used in its description and interpretation. Herein, we analyze carefully the possible meanings of those terms used in analysis of EPR's contention, that Quantum Mechanics is incomplete, as well as Bell's work descendant therefrom. As a result, many inconsistencies and errors in contemporary discussions of nonlocality, as well as in Bell's Ansatz with respect to the laws of probability, are identified. Evading these errors precludes serious conflicts between Quantum Mechanics and Special Relativity and Philosophy.

  3. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  4. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  5. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
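
    The dice example above is a direct enumeration exercise: list the equally likely outcomes and count those in the event. A minimal Python sketch, taking two fair six-sided dice and the event "the faces sum to 7" as an arbitrary illustrative choice:

        from fractions import Fraction
        from itertools import product

        # All 36 equally likely outcomes of rolling two fair dice.
        outcomes = list(product(range(1, 7), repeat=2))
        # Probability of a specified event: favorable / possible.
        favorable = [o for o in outcomes if sum(o) == 7]
        print(Fraction(len(favorable), len(outcomes)))  # 1/6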

  6. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  7. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. He highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  8. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  9. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  10. Transition probabilities for argon I

    International Nuclear Information System (INIS)

    Transition probabilities for ArI lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparison is made with these values. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities for a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and which allows simplified but realistic non-equilibrium calculations for argon plasmas which deviate from local thermodynamic equilibrium (LTE)

  11. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  12. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  13. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; Marmo, G. (Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  14. Joint Probability Models of Radiology Images and Clinical Annotations

    Science.gov (United States)

    Arnold, Corey Wells

    2009-01-01

    Radiology data, in the form of images and reports, is growing at a high rate due to the introduction of new imaging modalities, new uses of existing modalities, and the growing importance of objective image information in the diagnosis and treatment of patients. This increase has resulted in an enormous set of image data that is richly annotated…

  15. Garlic: A Concise Drug Review with Probable Clinical Uses

    OpenAIRE

    Vineet Singla; Jai Deep Bajaj; Radhika Bhaskar; Bimlesh Kumar

    2012-01-01

    Garlic and its preparations have been widely recognized as an agent for prevention and treatment of cardiovascular diseases and other metabolic disorders, atherosclerosis, hyperlipidemia, thrombosis, hypertension and hypoglycemia. This review discusses the possible mechanism of therapeutic actions of garlic, different extraction procedures along with determination of its constituents, its stability and dissolution method of garlic tablet.

  16. CENTELLA ASIATICA: A CONCISE DRUG REVIEW WITH PROBABLE CLINICAL USES

    Directory of Open Access Journals (Sweden)

    Sushma Tiwari

    2011-03-01

    Centella asiatica (Gotu kola) is an important herb in Ayurvedic medicine, often mentioned in combination with the related European marsh pennywort (Hydrocotyle vulgaris). About 20 species related to Gotu kola grow in most parts of the tropics, in wet pantropical areas such as rice paddies, as well as at rocky, higher elevations. Centella asiatica (Gotu kola) is known as a longevity herb and is used widely in India and Nepal as part of traditional Ayurvedic medicine. In Sanskrit it is called 'Mandūkaparnī', as its leaf resembles a standing frog seen from the back. It is also called 'Brahmi', after the goddess of supreme wisdom, and 'Saraswati', after the goddess of knowledge and wisdom. Its roots and leaves are used for medicinal purposes and provide important health benefits: they support healthy veins and blood vessels, treat skin disorders, help with memory and improve brain function.

  17. Pre-Test and Post-Test Applications to Shape the Education of Phlebotomists in A Quality Management Program: An Experience in A Training Hospital

    Directory of Open Access Journals (Sweden)

    Aykal Güzin

    2016-09-01

    Background: After the introduction of modern laboratory instruments and information systems, the preanalytical phase is the new field of battle. Errors in the preanalytical phase account for approximately half of the total errors in the clinical laboratory. The objective of this study was to share the experience of an education program that was believed to be successful in decreasing the number of rejected samples received from the Emergency Department (ED).

  18. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with the entrance channel mass asymmetry, η (or charge product, Z_p Z_t), as well as with the fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections

  19. Avoiding Negative Probabilities in Quantum Mechanics

    CERN Document Server

    Nyambuya, Golden Gadzirayi

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...

  20. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
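
    Readers wanting to evaluate the reconstructed Prelec form numerically can use the short Python sketch below; the parameter value alpha = 0.65 is an illustrative assumption, not a value taken from the paper:

        import math

        def prelec_weight(p, alpha=0.65):
            # Prelec (1998) weighting: w(p) = exp(-(-ln p)**alpha), with
            # w(0) = 0, w(1) = 1 and fixed point w(1/e) = 1/e for every alpha.
            if p == 0.0:
                return 0.0
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 1 / math.e, 0.5, 0.99):
            # Small probabilities are overweighted, large ones underweighted.
            print(f"w({p:.3f}) = {prelec_weight(p):.3f}")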

  1. Trajectory versus probability density entropy

    Science.gov (United States)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  2. Failure-probability driven dose painting

    Energy Technology Data Exchange (ETDEWEB)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose–response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  3. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  4. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner;

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG......) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade...... on CA. Of the patients who had also an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly...

  5. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  6. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  7. Pre-test analysis of an integral effect test facility for thermal-hydraulic similarities of 6 inches coldleg break and DVI injection line break using MARS-1D

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae Soon; Choi, Ki Yong; Park, Hyun Sik; Euh, Dong Jin; Baek, Won Pil [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    A pre-test analysis of a small-break loss-of-coolant accident (SBLOCA, DVI line break) has been performed for the integral effect test loop of the Korea Atomic Energy Research Institute (KAERI-ITL), the construction of which will be started soon. The KAERI-ITL is a full-height, 1/310 volume-scaled test facility based on the design features of the APR1400 (Korean Next Generation Reactor). This paper briefly introduces the basic design features of the KAERI-ITL and presents the results of a pre-test analysis for a postulated cold leg SBLOCA and DVI line break. Based on the same control logics and accident scenarios, the similarity between the KAERI-ITL and the prototype plant, APR1400, is evaluated using the MARS code, a multi-dimensional best-estimate thermal hydraulic code being developed by KAERI. It is found that the KAERI-ITL and the APR1400 have similar thermal hydraulic responses to the analyzed SBLOCA and DVI line break scenario. It is also verified that the volume scaling law applied to the design of the KAERI-ITL gives reasonable results that preserve similarity with the APR1400. 11 refs., 19 figs., 3 tabs. (Author)

  8. Estimating Small Probabilities for Langevin Dynamics

    OpenAIRE

    Aristoff, David

    2012-01-01

    The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...

  9. Correlation between hypertension and clinical probable Parkinson disease: Cohort analysis of 4,335 people in Linxian County with nutritional intervention

    Institute of Scientific and Technical Information of China (English)

    范金虎; 张亚黎; 刘颖; 孙秀娣; 乔友林

    2006-01-01

    BACKGROUND: Linxian County, China, is one of the areas with the highest incidence of esophageal cancer and gastric cardia cancer in the world, and nutritional deficiency is widespread among local people. In recent years, many studies around the world have suggested that the causes of Parkinson disease (PD) are related to genetic, age, environmental, dietary, nutritional and smoking factors. A growing number of studies have also indicated that primary hypertension may be related to vascular Parkinsonism (VP) and that long-term hypertension predisposes to VP. OBJECTIVE: To investigate the relationship between hypertension and clinical probable Parkinson disease (PPD) in the nutrition-deficient population of Linxian County and provide a theoretical basis for early prevention and treatment of PD. DESIGN: Cross-sectional study. PARTICIPANTS: A total of 4,335 subjects aged over 55 years were selected. These subjects had taken part in the Linxian County nutritional intervention study and first entered the cohort in 1985. METHODS: A prospective cohort study was conducted. ① Case screening: a PD questionnaire (used in American Gebai County) combined with a general neurological examination was adopted. ② Diagnosis of PD: the clinical diagnostic criteria of the UK Parkinson's Disease Society Brain Bank were taken as the criteria for screening PD. Further evaluations for clinical PPD and clinical possible PD were undertaken in subjects who had PD symptoms. Diagnostic criteria for clinical PPD: subjects were diagnosed as having clinical PPD if they presented any two of the following four cardinal features (resting tremor, hypermyotonia, bradykinesia and impairment of postural reflexes) or presented any one of the following features (resting tremor, hypermyotonia and bradykinesia). Diagnostic criteria for clinical possible PD: subjects were diagnosed as having clinical possible PD when they presented any one of the following four
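
    Read literally, the probable-PD criterion above is a threshold rule over cardinal features. A minimal Python sketch of its two-of-four component, using illustrative feature names rather than the study's actual variables:

        # Hypothetical encoding of the "any two of four cardinal features" rule.
        CARDINAL = ("resting_tremor", "hypermyotonia",
                    "bradykinesia", "impaired_postural_reflexes")

        def clinical_probable_pd(findings):
            # True when at least two cardinal features are present.
            return sum(findings.get(f, False) for f in CARDINAL) >= 2

        print(clinical_probable_pd({"resting_tremor": True,
                                    "bradykinesia": True}))  # True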

  10. The Cognitive Substrate of Subjective Probability

    Science.gov (United States)

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  11. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  12. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  13. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  14. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR § 1.1623 (2010-10-01), Mass Media Services, General Procedures: Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number...
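
    The rule quoted above calls for truncation, not rounding, at a fixed number of significant digits. A small Python sketch of one way to implement that (the helper name and the example value are illustrative):

        import math

        def truncate_sig(x, digits=3):
            # Truncate (do not round) a positive value to `digits`
            # significant digits.
            if x == 0.0:
                return 0.0
            exponent = math.floor(math.log10(abs(x)))
            scale = 10 ** (digits - 1 - exponent)
            return math.trunc(x * scale) / scale

        print(truncate_sig(0.046875))  # 0.0468, not the rounded 0.0469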

  15. On the measurement probability of quantum phases

    OpenAIRE

    Schürmann, Thomas

    2006-01-01

    We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.

  16. Uniqueness in ergodic decomposition of invariant probabilities

    OpenAIRE

    Zimmermann, Dieter

    1992-01-01

    We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.

  17. A Case of Probable Ibuprofen-Induced Acute Pancreatitis

    Directory of Open Access Journals (Sweden)

    Paul Magill

    2006-05-01

    Context: Drug-induced pancreatitis is rare, and there have been no prior definite cases reported of ibuprofen-induced pancreatitis. Case report: We present a case of a young man with acute pancreatitis probably secondary to an ibuprofen overdose. Immediately preceding the onset of the attack he took a 51 mg/kg dose of ibuprofen. Other causes of acute pancreatitis were excluded by clinical history, serum toxicology and abdominal imaging. Discussion: In the absence of re-challenge, we believe it is probable that ibuprofen has a causative link with acute pancreatitis.

  18. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  19. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  20. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  1. Pretest probability assessment derived from attribute matching

    Directory of Open Access Journals (Sweden)

    Hollander Judd E

    2005-08-01

    Background: Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method of generating a PTP for acute coronary syndrome (ACS) with a validated logistic regression equation (LRE). Methods: Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for its ability to produce PTP estimates. Results: In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for attribute matching and 0.68 (95% CI 0.62 to 0.77) for the LRE. The attribute matching system categorized 1,670 (24%, 95% CI 23–25%) patients as having a PTP ... Conclusion: Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE.
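
    The attribute-matching idea reduces to a very short computation: select the reference patients whose attribute profile equals the query exactly, then report the outcome fraction among them as the PTP. The Python below is a minimal illustration with hypothetical attribute names and a toy three-record database, not the study's actual program:

        def attribute_match_ptp(database, profile):
            # Keep only records that match the query profile exactly.
            matches = [rec for rec in database
                       if all(rec[k] == v for k, v in profile.items())]
            if not matches:
                return None  # no identical profile in the reference set
            # PTP = fraction of matching patients with the outcome.
            return sum(rec["acs"] for rec in matches) / len(matches)

        db = [{"age_band": "40-49", "male": True, "typical_pain": False, "acs": 0},
              {"age_band": "40-49", "male": True, "typical_pain": False, "acs": 1},
              {"age_band": "40-49", "male": True, "typical_pain": False, "acs": 0}]
        query = {"age_band": "40-49", "male": True, "typical_pain": False}
        print(attribute_match_ptp(db, query))  # 0.333...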

  2. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
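
    Step (v) of the procedure combines three per-pixel quantities into a single number. A one-function Python sketch of that combination rule, with made-up probability values:

        def integrated_probability(p_release, p_impact, p_zonal_release):
            # Integrated spatial landslide probability: the maximum of the
            # pixel's own release probability and the product of its impact
            # probability with the relevant zonal release probability.
            return max(p_release, p_impact * p_zonal_release)

        print(integrated_probability(0.05, 0.40, 0.30))  # 0.12, impact-dominated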

  3. TPmsm: Estimation of the Transition Probabilities in 3-State Models

    Directory of Open Access Journals (Sweden)

    Artur Araújo

    2014-12-01

    One major goal in clinical applications of multi-state models is the estimation of transition probabilities. The usual nonparametric estimator of the transition matrix for non-homogeneous Markov processes is the Aalen-Johansen estimator (Aalen and Johansen 1978). However, two problems may arise from using this estimator: first, its standard error may be large in heavily censored scenarios; second, the estimator may be inconsistent if the process is non-Markovian. The development of the R package TPmsm has been motivated by several recent contributions that account for these estimation problems. Estimation and statistical inference for transition probabilities can be performed using TPmsm. The TPmsm package provides seven different approaches to three-state illness-death modeling. In two of these approaches the transition probabilities are estimated conditionally on current or past covariate measures. Two real data examples are included for illustration of software usage.
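
    TPmsm itself is an R package, but the Aalen-Johansen estimator it builds on is easy to sketch. The Python below is a minimal illustration for fully observed (uncensored) multi-state data: at each event time the hazard increment is dA[i, j] = d_ij / Y_i, and the transition matrix is the ordered product of (I + dA). It is not the TPmsm API:

        import numpy as np

        def aalen_johansen(n_subjects, events, n_states=3):
            # `events` is a list of (time, subject, from_state, to_state);
            # every subject is assumed to start in state 0.
            state = {s: 0 for s in range(n_subjects)}
            P = np.eye(n_states)
            events = sorted(events)
            k = 0
            while k < len(events):
                t = events[k][0]
                batch = [e for e in events if e[0] == t]
                # Y[i] = number of subjects in state i just before time t.
                Y = np.bincount(list(state.values()), minlength=n_states)
                dA = np.zeros((n_states, n_states))
                for _, _, i, j in batch:
                    dA[i, j] += 1.0 / Y[i]
                    dA[i, i] -= 1.0 / Y[i]
                P = P @ (np.eye(n_states) + dA)
                for _, subj, _, j in batch:
                    state[subj] = j
                k += len(batch)
            return P  # P[i, j] estimates Pr(state j at end | state i at time 0)

        # Toy illness-death data: states 0 = healthy, 1 = ill, 2 = dead.
        events = [(1.0, 0, 0, 1), (2.0, 1, 0, 2), (3.0, 0, 1, 2)]
        print(aalen_johansen(4, events))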

  4. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
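
    A Harrell-type estimate of the concordance probability can be computed directly from its definition: among usable pairs (those where the shorter time is an observed event), a pair is concordant when the earlier failure carries the higher risk, and tied discrete risk scores are credited one half. A minimal Python sketch on made-up data:

        def c_index(times, events, risks):
            # times: follow-up times; events: 1 = event observed, 0 = censored;
            # risks: discrete risk scores (higher = expected earlier failure).
            num = den = 0.0
            n = len(times)
            for i in range(n):
                for j in range(n):
                    if times[i] < times[j] and events[i]:
                        den += 1
                        if risks[i] > risks[j]:
                            num += 1        # concordant pair
                        elif risks[i] == risks[j]:
                            num += 0.5      # tied risk groups
            return num / den

        # Hypothetical data with three risk groups (0 = low, 2 = high).
        print(c_index([5, 8, 3, 9], [1, 0, 1, 1], [2, 0, 2, 1]))  # 0.9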

  5. Bell Could Become the Copernicus of Probability

    Science.gov (United States)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
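
    The CHSH test mentioned above is easy to evaluate numerically: with singlet-state correlations E(x, y) = -cos(x - y), the standard measurement angles give S = 2*sqrt(2), exceeding the classical (Kolmogorovian) bound of 2. A short Python check:

        import math

        def chsh(a, a2, b, b2):
            # CHSH statistic built from singlet correlations E(x, y) = -cos(x - y).
            E = lambda x, y: -math.cos(x - y)
            return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

        # Standard optimal angles: S = 2*sqrt(2) ≈ 2.828 > 2.
        print(chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4))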

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  7. Probability and Quantum Paradigms: the Interplay

    Science.gov (United States)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  8. Introduction: Research and Developments in Probability Education

    OpenAIRE

    Manfred Borovcnik; Ramesh Kapadia

    2009-01-01

    In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results, and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the wor...

  9. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  10. Time and probability in quantum cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)

    1990-10-01

    A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and the integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation. (orig.).

  11. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  12. Bayesian logistic betting strategy against probability forecasting

    CERN Document Server

    Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei

    2012-01-01

    We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.

  13. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.

  14. Quantum Statistical Mechanics. III. Equilibrium Probability

    OpenAIRE

    Attard, Phil

    2014-01-01

    Given are a first-principles derivation and formulation of the probabilistic concepts that underlie equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.

  15. Probable Effects Of Exposure To Electromagnetic Waves Emitted From Video Display Terminals On Ocular Functions

    International Nuclear Information System (INIS)

    There is a growing body of evidence that computer use can adversely affect visual health. Considering the rising number of computer users in Egypt, computer-related visual symptoms might take an epidemic form. In view of that, this study was undertaken to find out the magnitude of visual problems in computer operators and their relationship with various personal and workplace factors. Aim: To evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals on some visual functions. Subjects and Methods: One hundred fifty computer operators working in different institutes were randomly selected. They were asked to fill in a pre-tested questionnaire (written in Arabic) after giving their verbal consent. The selected exposed subjects underwent the following clinical assessment: 1- Visual acuity measurement. 2- Refraction (using an autorefractometer). 3- Measurement of ocular dryness defects using the following diagnostic tests: Schirmer test, fluorescein staining, Rose Bengal staining, tear break-up time (TBUT) and the LIPCOF test (lid-parallel conjunctival folds). A control group included one hundred fifty participants working in fields that do not necessitate exposure to video display terminals. Inclusion criteria were as follows: a minimum of three symptoms of computer vision syndrome (CVS), a minimum of one year of exposure to VDTs, and a minimum of 6 h/day on 5 working days/week. Exclusion criteria included candidates with ocular pathology such as glaucoma, optic atrophy, diabetic retinopathy or papilledema. The following complaints were studied: 1- Tired eyes. 2- Burning eyes with excessive tear production. 3- Dry, sore eyes. 4- Blurred near vision (letters on the screen run together). 5- Asthenopia. 6- Neck, shoulder and back aches, and overall bodily fatigue or tiredness. An interventional protective measure was administered to the selected subjects from the exposed group; it included the following (1

  16. Combination of clinical and v/q scan assessment for the diagnosis of pulmonary embolism: a 2-year outcome prospective study

    Energy Technology Data Exchange (ETDEWEB)

    Barghouth, G.; Boubaker, A.; Delaloye, A.B. [Univ. Hospital, Lausanne (Switzerland). Dept. of Nuclear Medicine; Yersin, B. [Dept. of Internal Medicine, Univ. Hospital, Lausanne (Switzerland); Doenz, F.; Schnyder, P. [Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland). Dept. of Radiology

    2000-09-01

    With the aim of evaluating the efficiency of our diagnostic approach in patients with suspected acute pulmonary embolism (PE), we prospectively studied 143 patients investigated by means of a ventilation/perfusion (V/Q) lung scan. A pre-test clinical probability of PE (P_clin) was assigned to all patients by the clinicians, and scans were interpreted blinded to the clinical assessment. A 2-year follow-up of our patients was systematically performed and was possible in 134 cases. The distribution of clinical probabilities was high P_clin in 22.5%, intermediate P_clin in 24% and low P_clin in 53.5%, whereas the distribution of scan categories was high P_scan in 14%, intermediate P_scan in 18%, low P_scan in 57% and normal P_scan in 11%. The final prevalence of PE was 24.5%. High P_scan and normal P_scan were always conclusive (19 and 15 cases, respectively). A low P_scan associated with a low P_clin could exclude PE in 43/45 cases (96%). None of the patients in whom the diagnosis of PE was discarded had a major event related to PE during the 2-year follow-up. Overall, the combined assessment of clinical and scintigraphic probabilities allowed confirmation or exclusion of PE in 80% of subjects (107/134) and proved to be a valuable tool for selecting patients who needed pulmonary angiography, which was required in 20% of our patients (27/134). (orig.)
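
    The combination of a pre-test clinical probability with a scan category is, at bottom, a Bayesian update. A minimal Python sketch of the odds-form calculation, with hypothetical numbers rather than likelihood ratios taken from this study:

        def post_test_probability(pre_test_p, likelihood_ratio):
            # Bayes' theorem in odds form: post-test odds equal pre-test
            # odds multiplied by the likelihood ratio of the test result.
            pre_odds = pre_test_p / (1.0 - pre_test_p)
            post_odds = pre_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # E.g., a 20% pre-test probability of PE and a scan category with an
        # assumed likelihood ratio of 0.1 give about 2.4% post-test probability.
        print(post_test_probability(0.20, 0.1))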

  17. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  18. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  19. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  20. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability

  1. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  2. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  3. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  4. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
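
    To make the point of the record above concrete, here is a small sketch (an illustrative example, not from the article) comparing the sequential-conditioning and hypergeometric routes to the same without-replacement probability:

    from math import comb

    # P(both cards drawn are aces) from a standard 52-card deck, two ways.

    # Sequential conditioning: P(first is ace) * P(second is ace | first is ace)
    p_sequential = (4 / 52) * (3 / 51)

    # Hypergeometric: choose 2 of the 4 aces and 0 of the 48 non-aces
    p_hypergeom = comb(4, 2) * comb(48, 0) / comb(52, 2)

    assert abs(p_sequential - p_hypergeom) < 1e-12
    print(p_sequential)  # 0.00452..., i.e. 1/221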

  5. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  6. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
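
    A toy sketch of the life-table mechanics described above, with hypothetical q_x values over coarse 25-year age steps (real life tables use single-year probabilities):

    # Hypothetical probability of dying within each 25-year step starting at age x.
    qx = {0: 0.02, 25: 0.03, 50: 0.10, 75: 0.45}

    survivors = 100_000.0  # radix: initial cohort size
    for age, q in qx.items():
        print(f"age {age:>2}: survivors l_x = {survivors:>9.0f}, q_x = {q:.2f}")
        survivors *= 1.0 - q  # those who survive the step reach the next age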

  7. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship-ship c...

  8. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  9. The probability premium: a graphical representation

    NARCIS (Netherlands)

    L.R. Eeckhoudt; R.J.A. Laeven

    2015-01-01

    We illustrate that Pratt’s probability premium can be given a simple graphical representation allowing a direct comparison to the equivalent but more prevalent concept of risk premium under expected utility. We also show that the probability premium’s graphical representation under the dual theory m

  10. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  11. Incorporating medical interventions into carrier probability estimation for genetic counseling

    Directory of Open Access Journals (Sweden)

    Katki Hormuzd A

    2007-03-01

    Full Text Available Background: Mendelian models for predicting who may carry an inherited deleterious mutation of known disease genes based on family history are used in a variety of clinical and research activities. People presenting for genetic counseling are increasingly reporting risk-reducing medical interventions in their family histories because, recently, a slew of prophylactic interventions have become available for certain diseases. For example, oophorectomy reduces risk of breast and ovarian cancers, and is now increasingly being offered to women with family histories of breast and ovarian cancer. Mendelian models should account for medical interventions because interventions modify mutation penetrances and thus affect the carrier probability estimate. Methods: We extend Mendelian models to account for medical interventions by accounting for post-intervention disease history through an extra factor that can be estimated from published studies of the effects of interventions. We apply our methods to incorporate oophorectomy into the BRCAPRO model, which predicts a woman's risk of carrying mutations in BRCA1 and BRCA2 based on her family history of breast and ovarian cancer. This new BRCAPRO is available for clinical use. Results: We show that accounting for interventions undergone by family members can seriously affect the mutation carrier probability estimate, especially if the family member has lived many years post-intervention. We show that interventions have more impact on the carrier probability as the benefits of intervention differ more between carriers and non-carriers. Conclusion: These findings imply that carrier probability estimates that do not account for medical interventions may be seriously misleading and could affect a clinician's recommendation about offering genetic testing. The BayesMendel software, which allows one to implement any Mendelian carrier probability model, has been extended to allow medical interventions, so future
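
    The Bayesian core of such carrier probability estimation can be sketched as below; the prior, penetrances and intervention effect are illustrative assumptions, not BRCAPRO parameters, and a real model would sum over full pedigrees rather than a single relative.

    prior_carrier = 0.01          # P(carrier) before seeing disease history
    pen_carrier = 0.40            # P(disease by this age | carrier), no intervention
    pen_noncarrier = 0.05         # P(disease by this age | non-carrier)
    rr_intervention = 0.5         # hypothetical risk reduction from the intervention

    def carrier_probability(affected, intervened):
        """Posterior P(carrier) given one person's disease/intervention history."""
        pc = pen_carrier * (rr_intervention if intervened else 1.0)
        pn = pen_noncarrier * (rr_intervention if intervened else 1.0)
        like_c = pc if affected else (1.0 - pc)
        like_n = pn if affected else (1.0 - pn)
        num = like_c * prior_carrier
        return num / (num + like_n * (1.0 - prior_carrier))

    # Remaining unaffected after the intervention is weaker evidence against
    # carriership than remaining unaffected without it:
    print(f"unaffected, no intervention: {carrier_probability(False, False):.4f}")
    print(f"unaffected, intervened:      {carrier_probability(False, True):.4f}")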

  12. Multivariate Markov chain analysis of the probability of pregnancy in infertile couples undergoing assisted reproduction

    NARCIS (Netherlands)

    J. McDonnell; A.J. Goverde (Angelique); J.P.W. Vermeiden; F.F.H. Rutten (Frans)

    2002-01-01

    BACKGROUND: Estimating the probability of pregnancy leading to delivery and the influence of clinical factors on that probability is of fundamental importance in the treatment counselling of infertile couples. A variety of statistical techniques have been used to analys

  13. Angular anisotropy representation by probability tables

    International Nuclear Information System (INIS)

    In this paper, we improve point-wise or group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Drawing on our experience of cross-section description acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment-based probability tables, Dirac (DPT) and Step-wise (SPT) Probability Tables, where the angular probability distribution is represented by Dirac functions or by a step-wise function, respectively. First, we show how we can improve the equi-probable cosine representation of point-wise anisotropy by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (which can induce negative values), and finally we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)
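
    A toy check of the negativity problem mentioned above, assuming an illustrative, sharply forward-peaked angular distribution: its low-order Legendre expansion takes negative values, which is what the moment-based probability tables are designed to avoid.

    import numpy as np
    from numpy.polynomial import legendre

    mu = np.linspace(-1.0, 1.0, 2001)          # scattering cosine grid
    dmu = mu[1] - mu[0]
    f_true = np.exp(8.0 * (mu - 1.0))          # sharply forward-peaked toy distribution

    order = 5
    coeffs = []
    for l in range(order + 1):
        P_l = legendre.legval(mu, [0.0] * l + [1.0])   # P_l evaluated on the grid
        moment = np.sum(f_true * P_l) * dmu            # crude quadrature for the moment
        coeffs.append((2 * l + 1) / 2.0 * moment)

    f_expanded = legendre.legval(mu, coeffs)
    # The minimum comes out negative for this toy case, i.e. not a valid density.
    print(f"minimum of the order-{order} Legendre expansion: {f_expanded.min():.4f}")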

  14. Survival probability in patients with liver trauma.

    Science.gov (United States)

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using anatomical and physiological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to the University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
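
    A minimal sketch of how a fitted logistic model of this kind turns patient covariates into a survival probability; the coefficients below are hypothetical placeholders, not the paper's estimates.

    import math

    def survival_probability(injury_grade, multi_organ, days_hospitalized, operative):
        """Logistic model: survival probability from (hypothetical) fitted coefficients."""
        z = (4.0
             - 0.9 * injury_grade          # higher trauma grade lowers survival
             - 1.1 * int(multi_organ)      # liver plus other organs injured
             + 0.05 * days_hospitalized
             - 0.6 * int(operative))       # intervention vs conservative care
        return 1.0 / (1.0 + math.exp(-z))

    print(f"{survival_probability(2, False, 10, False):.3f}")  # milder injury
    print(f"{survival_probability(5, True, 3, True):.3f}")     # severe injury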

  15. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.

  16. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  17. Basic Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first

  18. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  19. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    HongyanSUN; LongWANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ function, asymmetric and symmetric, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.

  20. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
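
    A tiny Monte Carlo in the spirit of the article (illustrative, not the authors' simulations): the same latent effect yields different linear probability model coefficients when only the spread of the predictor differs between groups.

    import numpy as np

    rng = np.random.default_rng(1)

    def lpm_coefficient(x_scale, b=1.0, n=200_000):
        """OLS slope of a binary outcome on x, i.e. the LPM coefficient."""
        x = rng.normal(0.0, x_scale, n)
        # Identical latent effect b in both groups; logistic latent error.
        y = (b * x + rng.logistic(0.0, 1.0, n) > 0).astype(float)
        return np.cov(x, y)[0, 1] / np.var(x)

    print(f"group A (sd of x = 1): {lpm_coefficient(1.0):.3f}")
    print(f"group B (sd of x = 3): {lpm_coefficient(3.0):.3f}")  # smaller, despite same b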

  2. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate computations. This paper is the first to consider this kind of multidimensional data analysis (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  3. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  4. Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities

    OpenAIRE

    Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina

    2012-01-01

    More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer number. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...

  5. Choosing information variables for transition probabilities in a time-varying transition probability Markov switching model

    OpenAIRE

    Andrew J. Filardo

    1998-01-01

    This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...

  6. Probability of spent fuel transportation accidents

    Energy Technology Data Exchange (ETDEWEB)

    McClure, J. D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10^-9/mile.
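
    The order-of-magnitude arithmetic behind such a per-mile estimate is simple; the mileage figure below is a made-up placeholder chosen only to reproduce the quoted rate, whereas the report derives its figure from actual shipping volumes.

    accidents = 1          # one recorded spent fuel transport accident since 1971
    total_miles = 2.0e6    # hypothetical cumulative shipment mileage
    rate = accidents / total_miles
    print(f"estimated accident rate: {rate:.1e} per mile")  # 5.0e-07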

  7. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  8. Transition probabilities in superfluid He4

    International Nuclear Information System (INIS)

    The transition probabilities between various states of superfluid helium-4 are found by using the approximation method of Bogolyubov and making use of his canonical transformations for different states of transitions. (author)

  9. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
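
    In the spirit of the study (with synthetic stand-in data, not the Czech incident records), a building-fire probability model based on logistic regression can be sketched as follows:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5_000
    # Synthetic building parameters: floor area (m^2), age (years), residential flag.
    X = np.column_stack([
        rng.uniform(50, 5000, n),
        rng.uniform(0, 120, n),
        rng.integers(0, 2, n),
    ])
    # Synthetic ground truth: fire risk grows with area and age.
    logits = -6.0 + 0.0006 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 2]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    new_building = [[1200.0, 40.0, 1.0]]  # area, age, residential
    print(f"P(fire) = {model.predict_proba(new_building)[0, 1]:.3f}")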

  10. Inclusion probability with dropout: an operational formula.

    Science.gov (United States)

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence slightly modifying the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
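
    For orientation, the classic no-dropout inclusion probability that the paper generalizes can be sketched as follows; allele frequencies are hypothetical. For one locus, the probability that a random person is "not excluded" is the squared sum of the frequencies of the alleles visible in the mixture, and loci are combined by the product rule.

    def locus_pi(visible_allele_freqs):
        """P(inclusion) at one locus under Hardy-Weinberg, no dropout."""
        return sum(visible_allele_freqs) ** 2

    loci = [
        [0.10, 0.22, 0.08],   # hypothetical frequencies of alleles seen at locus 1
        [0.15, 0.30],         # locus 2
        [0.05, 0.12, 0.20],   # locus 3
    ]
    cpi = 1.0
    for freqs in loci:
        cpi *= locus_pi(freqs)  # cumulative probability of inclusion across loci
    print(f"cumulative probability of inclusion: {cpi:.2e}")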

  11. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    The design significant wave height is usually determined as the wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
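
    A sketch of the Rayleigh-based step, assuming illustrative inputs: given a design significant wave height, find the individual wave height whose probability of being exceeded by the largest of N waves equals a target encounter probability.

    import math

    def design_individual_wave(hs, n_waves, encounter_p):
        """Individual wave height h with P(max of n_waves > h) = encounter_p."""
        # Rayleigh model for a single wave: P(H > h) = exp(-2 * (h / Hs)**2).
        per_wave_exceedance = 1.0 - (1.0 - encounter_p) ** (1.0 / n_waves)
        return hs * math.sqrt(-math.log(per_wave_exceedance) / 2.0)

    # E.g. Hs = 8 m design sea state, 1000 waves, 10% encounter probability.
    print(f"{design_individual_wave(8.0, 1000, 0.10):.1f} m")  # about 2.1 * Hs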

  12. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  13. Asymmetry of the work probability distribution

    OpenAIRE

    Saha, Arnab; Bhattacharjee, J. K.

    2006-01-01

    We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.

  14. Transition Probability and the ESR Experiment

    Science.gov (United States)

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  15. Transition Probability Estimates for Reversible Markov Chains

    OpenAIRE

    Telcs, Andras

    2000-01-01

    This paper provides transition probability estimates of transient reversible Markov chains. The key condition of the result is the spatial symmetry and polynomial decay of the Green's function of the chain.

  16. The Animism Controversy Revisited: A Probability Analysis

    Science.gov (United States)

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  17. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  18. Avoiding Negative Probabilities in Quantum Mechanics

    OpenAIRE

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless questi...

  19. Breakdown Point Theory for Implied Probability Bootstrap

    OpenAIRE

    Lorenzo Camponovo; Taisuke Otsu

    2011-01-01

    This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...

  20. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  1. The Pauli equation for probability distributions

    International Nuclear Information System (INIS)

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  2. The Pauli equation for probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man'ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man'ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it

    2001-04-27

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  3. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  4. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous-time risk model in which the claim process is modelled by a linear time series model. The time is discretized stochastically using the times when claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
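
    For flavor, here is what an exponential (Lundberg-type) bound looks like in the classical Cramér-Lundberg model with exponential claims; this is a standard textbook illustration, not the paper's linear time series model.

    import math

    mu = 10.0      # mean claim size
    theta = 0.2    # relative safety loading of the premium

    # Adjustment (Lundberg) coefficient for exponential claims: psi(u) <= exp(-R*u).
    R = theta / ((1.0 + theta) * mu)
    for u in (0.0, 50.0, 100.0, 200.0):
        exact = math.exp(-R * u) / (1.0 + theta)  # exact psi(u) for exponential claims
        bound = math.exp(-R * u)                  # Lundberg exponential upper bound
        print(f"u = {u:>5.0f}: psi(u) = {exact:.4f} <= {bound:.4f}")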

  5. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters...

  6. A case concerning the improved transition probability

    OpenAIRE

    Tang, Jian; Wang, An Min

    2006-01-01

    As is well known, existing perturbation theory can be applied to calculations of energies, states and transition probabilities in many quantum systems. However, in our view there are different paths and methods by which its precision and efficiency can be improved. Following an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a Hydrogen atom in a constant magnetic field. We find the results obt...

  7. Atomic transition probabilities of neutral samarium

    International Nuclear Information System (INIS)

    Absolute atomic transition probabilities from a combination of new emission branching fraction measurements using Fourier transform spectrometer data with radiative lifetimes from recent laser induced fluorescence measurements are reported for 299 lines of the first spectrum of samarium (Sm i). Improved values for the upper and lower energy levels of these lines are also reported. Comparisons to published transition probabilities from earlier experiments show satisfactory and good agreement with two of the four published data sets. (paper)

  8. Validation of fluorescence transition probability calculations

    OpenAIRE

    M. G. PiaINFN, Sezione di Genova; P. Saracco(INFN, Sezione di Genova); Manju Sudhaka(INFN, Sezione di Genova)

    2015-01-01

    A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimen...

  9. Generalized couplings and convergence of transition probabilities

    OpenAIRE

    Kulik, Alexei; Scheutzow, Michael

    2015-01-01

    We provide sufficient conditions for the uniqueness of an invariant measure of a Markov process as well as for the weak convergence of transition probabilities to the invariant measure. Our conditions are formulated in terms of generalized couplings. We apply our results to several SPDEs for which unique ergodicity has been proven in a recent paper by Glatt-Holtz, Mattingly, and Richards and show that under essentially the same assumptions the weak convergence of transition probabilities actu...

  10. Semiclassical transition probabilities for interacting oscillators

    OpenAIRE

    Khlebnikov, S. Yu.

    1994-01-01

    Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular {\\em minkowskian} trajectories have approximate stopping points or, equivalently, are approximately pe...

  11. Country Default Probabilities: Assessing and Backtesting

    OpenAIRE

    Vogl, Konstantin; Maltritz, Dominik; Huschens, Stefan; Karmann, Alexander

    2006-01-01

    We address the problem how to estimate default probabilities for sovereign countries based on market data of traded debt. A structural Merton-type model is applied to a sample of emerging market and transition countries. In this context, only few and heterogeneous default probabilities are derived, which is problematic for backtesting. To deal with this problem, we construct likelihood ratio test statistics and quick backtesting procedures.

  12. Transition probability studies in 175Au

    OpenAIRE

    Grahn, Tuomas; Watkins, H.; Joss, David; Page, Robert; Carroll, R. J.; Dewald, A.; Greenlees, Paul; Hackstein, M.; Herzberg, Rolf-Dietmar; Jakobsson, Ulrika; Jones, Peter; Julin, Rauno; Juutinen, Sakari; Ketelhut, Steffen; Kröll, Th

    2013-01-01

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms...

  13. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  14. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
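
    The zero-inflated binomial mixture likelihood described above can be sketched as follows, mixing the detection probability over a Beta distribution by numerical integration; parameter values are illustrative.

    import math

    def site_likelihood(y, J, psi, a, b, grid=2000):
        """P(y detections in J visits) with p ~ Beta(a, b), zero-inflated by psi."""
        # Numerically integrate the binomial(J, p) pmf against the Beta(a, b) density.
        total, dp = 0.0, 1.0 / grid
        for i in range(grid):
            p = (i + 0.5) * dp
            beta_pdf = (p ** (a - 1) * (1 - p) ** (b - 1)
                        * math.gamma(a + b) / (math.gamma(a) * math.gamma(b)))
            total += math.comb(J, y) * p ** y * (1 - p) ** (J - y) * beta_pdf * dp
        # Unoccupied sites (probability 1 - psi) can only produce y = 0.
        return psi * total + (1 - psi) * (1.0 if y == 0 else 0.0)

    # A site never detected in 5 visits can still be occupied:
    print(f"P(y=0) = {site_likelihood(0, 5, psi=0.6, a=2, b=5):.3f}")
    print(f"P(y=2) = {site_likelihood(2, 5, psi=0.6, a=2, b=5):.3f}")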

  15. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
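
    A sketch of the count-to-probability step described above; the Weibull shape and scale are illustrative placeholders, not the calibrated values from the study.

    import math

    def weibull_cdf(n, shape, scale):
        return 1.0 - math.exp(-((n / scale) ** shape))

    def prob_large_event(n_small, horizon, shape=1.4, scale=100.0):
        """P(large event within the next `horizon` small events | n_small so far)."""
        # Conditional probability from the Weibull survival function.
        survive_now = 1.0 - weibull_cdf(n_small, shape, scale)
        survive_later = 1.0 - weibull_cdf(n_small + horizon, shape, scale)
        return 1.0 - survive_later / survive_now

    for n in (10, 50, 150):
        print(f"{n:>3} small events so far -> P = {prob_large_event(n, 25):.3f}")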

  16. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  17. An Orientation Program for Clinical Adjunct Faculty.

    Science.gov (United States)

    Rice, Gwendolyn

    2016-01-01

    Having highly competent clinical faculty in an institution of higher learning is a prerequisite for graduating safe nurses in the future. The purpose of this project was to increase each clinical nurse's knowledge and skills for the new role of clinical adjunct nursing faculty. Successful implementation of this program will help promote consistency in effective job performance of clinical adjunct faculty and facilitate achievement of the projected goals and outcomes. This orientation program was presented in a one-day face-to-face encounter with twelve (12) adjunct faculty members, both tenured and on the tenure track. These faculty members were hired by the City Colleges of Chicago (CCC) School of Nursing Program at Malcolm X College. Presentations were delivered to attendees with a lesson plan. Pre-test, post-test and evaluation forms were administered, and it was agreed that an orientation program should be developed and presented to all newly hired clinical adjunct nursing faculty at CCC. PMID:26930766

  18. Probability to retrieve testicular spermatozoa in azoospermic patients

    Institute of Scientific and Technical Information of China (English)

    H.-J.Glander; L.-C.Horn; W.Dorschner; U.Paasch; J.Kratzsch

    2000-01-01

    Aim: The probability of retrieving spermatozoa from testicular tissue for intracytoplasmic sperm injection into oocytes is of interest for the counselling of infertility patients. We investigated the relation of sperm retrieval to clinical data and histological pattern in testicular biopsies from azoospermic patients. Methods: In 264 testicular biopsies from 142 azoospermic patients, the testicular tissue was shredded to separate the spermatozoa, and histological semi-thin sections were evaluated using the Johnsen score. Results: The retrieval of spermatozoa correlated significantly (P < 0.05) with serum FSH > 18 U/L, testicular volume < 5 mL, mean Johnsen score < 5, and maximum Johnsen score < 7.

  19. Presmoothing the transition probabilities in the illness-death model

    OpenAIRE

    Amorim, Ana Paula de; De Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2011-01-01

    One major goal in clinical applications of multi-state models is the estimation of transition probabilities. In a recent paper, Meira-Machado, de Uña-Álvarez and Cadarso-Suárez (2006) introduce a substitute for the Aalen-Johansen estimator in the case of a non-Markov illness-death model. The idea behind their estimator is to weight the data by the Kaplan-Meier weights pertaining to the distribution of the total survival time of the process. In this paper we propose a modi...

  20. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  1. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  2. Consistent probabilities in loop quantum cosmology

    CERN Document Server

    Craig, David A

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler-DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce vs. a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation v...

  3. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    A global power balance code employing Monte Carlo techniques has been developed to study the 'probability of ignition' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs

  4. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  5. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  6. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the links between probability and possibility established by different authors at different points in time, on the basis of some well known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and suggests a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of entropy of fuzzy numbers are also presented in this article.

  7. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  8. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.

  9. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  10. Match probabilities in racially admixed populations.

    Science.gov (United States)

    Lange, K

    1993-02-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors.
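
    As a point of reference for the product rule mentioned above, here is a hedged numerical illustration (the locus names are real STR markers, but the allele frequencies are invented): under Hardy-Weinberg and linkage equilibrium, per-locus genotype frequencies simply multiply. The paper's contribution is a bounding method for admixed populations, not this textbook calculation.

```python
# Each entry: (locus, allele freq p, allele freq q or None if homozygous).
loci = [
    ("D3S1358", 0.25, 0.21),   # heterozygous: genotype freq = 2*p*q
    ("vWA",     0.28, None),   # homozygous:   genotype freq = p**2
    ("FGA",     0.19, 0.33),
]

match_prob = 1.0
for name, p, q in loci:
    match_prob *= (2 * p * q) if q is not None else p ** 2

print(f"Multilocus match probability: {match_prob:.2e}")
```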

  11. Introduction: Research and Developments in Probability Education

    Directory of Open Access Journals (Sweden)

    Manfred Borovcnik

    2009-10-01

    Full Text Available In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further, driven by the ideas of interactivity and the potential of electronic publishing. As often happens, the medium of research influences the results, and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the world and the result of an experiment in electronic communication. For the convenience of international readers, abstracts in Spanish and German have been supplied, as well as hints for navigation to linked electronic materials.

  12. Sampling Quantum Nonlocal Correlations with High Probability

    Science.gov (United States)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩), where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure in the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.

  13. Conditional Probabilities and Collapse in Quantum Measurements

    Science.gov (United States)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that by including both the system and the apparatus in the quantum description of the measurement process, and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.
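
    For orientation, the collapse-postulate expression the authors compare against can be written as follows; this is a standard textbook formula, not the paper's conditional-probability construction.

```latex
% Post-measurement state by the collapse (Luders) postulate, where P_a
% projects onto the eigenspace associated with the observed result a:
\[
  \rho' \;=\; \frac{P_a\,\rho\,P_a}{\operatorname{Tr}(P_a\,\rho)},
  \qquad
  \Pr(b \mid a) \;=\; \operatorname{Tr}\bigl(P_b\,\rho'\bigr).
\]
```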

  14. Probability, statistics, and decision for civil engineers

    CERN Document Server

    Benjamin, Jack R

    2014-01-01

    Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and

  15. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
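
    As a reference point for that application, here is a minimal sketch of the Benjamini-Hochberg step-up procedure itself; the p-values are invented for illustration, and the paper's contribution is the joint probability calculation, not this algorithm.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean rejection mask controlling the FDR at level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresholds = q * np.arange(1, m + 1) / m      # q * k / m, k = 1..m
    below = p[order] <= thresholds
    # Reject the k smallest p-values, where k is the largest index
    # whose sorted p-value sits under its threshold.
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.52]))
```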

  16. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  17. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth;

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  18. Steering in spin tomographic probability representation

    Science.gov (United States)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property known for the two-qubit state in terms of specific inequalities for the correlation function is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  19. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  20. Probability groups as orbits of groups

    International Nuclear Information System (INIS)

    The set of double cosets of a group with respect to a subgroup and the set of orbits of a group with respect to a group of automorphisms have structures which can be studied as multigroups, hypergroups or Pasch geometries. When the subgroup or the group of automorphisms are finite, the multivalued products can be provided with some weightages forming so-called Probability Groups. It is shown in this paper that some abstract probability groups can be realized as orbit spaces of groups. (author)

  1. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
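
    A quick way to see how such a 50-50 intuition can fail is to simulate a schedule in which trains run every 10 minutes in each direction but the uptown train arrives one minute after the downtown one; this is a hedged sketch of the classic resolution, with the schedule assumed purely for illustration.

```python
import random

# Trains every 10 minutes each way; the downtown train passes at minute
# 0 of each cycle, the uptown (mother's) train at minute 1. Marvin
# boards whichever train comes first after his random arrival time.
def fraction_with_mother(trials=100_000):
    with_mother = 0
    for _ in range(trials):
        t = random.uniform(0.0, 10.0)   # arrival within one 10-min cycle
        # The uptown train is first only if he lands in the one-minute
        # window between the two trains.
        with_mother += (t <= 1.0)
    return with_mother / trials

print(fraction_with_mother())   # ~0.10, i.e. 2 dinners in 20 workdays
```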

  2. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  3. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  4. A structural model of intuitive probability

    CERN Document Server

    Dessalles, Jean-Louis

    2011-01-01

    Though the ability of human beings to deal with probabilities has been put into question, the assessment of rarity is a crucial competence underlying much of human decision-making and is pervasive in spontaneous narrative behaviour. This paper proposes a new model of rarity and randomness assessment, designed to be cognitively plausible. Intuitive randomness is defined as a function of structural complexity. It is thus possible to assign probability to events without being obliged to consider the set of alternatives. The model is tested on Lottery sequences and compared with subjects' preferences.

  5. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  6. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  7. Quantum probability and quantum decision making

    CERN Document Server

    Yukalov, V I

    2016-01-01

    A rigorous general definition of quantum probability is given, which is valid for elementary events and for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  8. Transition probability studies in 175Au

    International Nuclear Information System (INIS)

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms of available systematics as a function of atomic number and aligned angular momentum.

  9. Electric quadrupole transition probabilities for atomic lithium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii, and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  10. Poisson spaces with a transition probability

    OpenAIRE

    Landsman, N. P.

    1997-01-01

    The common structure of the space of pure states $P$ of a classical or a quantum mechanical system is that of a Poisson space with a transition probability. This is a topological space equipped with a Poisson structure, as well as with a function $p: P \times P \to [0,1]$, with certain properties. The Poisson structure is connected with the transition probabilities through unitarity (in a specific formulation intrinsic to the given context). In classical mechanics, where $p(\rho,\sigma)=\delta_{\rho...

  11. What is probability? The importance of probability when dealing with technical risks

    International Nuclear Information System (INIS)

    The book handles the following themes: - different aspects in connection with the probability concept including the mathematical fundamentals, - the importance of the probability concepts for the estimation of the effects of various activities, - the link between risk and time and the utilisation of concepts for describing this link, - the application of the probability concept in various engineering fields, - complementary attempts for the probabilistic safety analysis of systems. figs., tabs., refs

  12. Reduced reward-related probability learning in schizophrenia patients

    Directory of Open Access Journals (Sweden)

    Yılmaz A

    2012-01-01

    Full Text Available Alpaslan Yilmaz1,2, Fatma Simsek2, Ali Saffet Gonul2,31Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; 2Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; 3Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale but that it was not related to antipsychotic dose. In conclusion, the present study showed that the schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation

  13. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  14. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  15. Probable Bright Supernova discovered by PSST

    Science.gov (United States)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  16. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  17. Probability Theories and the Justification of Theism

    OpenAIRE

    Portugal, Agnaldo Cuoco

    2003-01-01

    In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne's use of Bayes's theorem to justify the belief that there is a God. Swinburne's contribution here lies in the scope of his project and the interpretation he adopts for Bayes's formula, a very important theorem of the probability calculus.

  18. The Pauli Equation for Probability Distributions

    OpenAIRE

    Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.

    2000-01-01

    The "marginal" distributions for measurable coordinate and spin projection is introduced. Then, the analog of the Pauli equation for spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  19. The Pauli Equation for Probability Distributions

    CERN Document Server

    Mancini, S; Man'ko, V I; Tombesi, P

    2001-01-01

    The "marginal" distributions for measurable coordinate and spin projection is introduced. Then, the analog of the Pauli equation for spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  20. Partially Specified Probabilities: Decisions and Games

    OpenAIRE

    Ehud Lehrer

    2012-01-01

    The paper develops a theory of decision making based on partially specified probabilities. It takes an axiomatic approach using Anscombe and Aumann's (1963) setting, and is based on the concave integral for capacities. This theory is then expanded to interactive models in order to extend Nash equilibrium by introducing the concept of partially specified equilibrium. (JEL C70, D81, D83)

  1. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  2. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    not depend on the type of variable action. A probability based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted...

  3. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  4. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  5. Learning a Probability Distribution Efficiently and Reliably

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
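
    As a point of reference, the inversion idea itself can be sketched in a few lines; this is a hedged illustration of sampling a finite distribution through its CDF, with placeholder probabilities rather than learned estimates as in the algorithm above.

```python
import bisect
import itertools
import random

values = ["a", "b", "c", "d"]
probs  = [0.1, 0.4, 0.3, 0.2]
cdf = list(itertools.accumulate(probs))        # [0.1, 0.5, 0.8, 1.0]

def sample():
    u = random.random()                        # uniform in [0, 1)
    return values[bisect.bisect_left(cdf, u)]  # invert the CDF

print([sample() for _ in range(10)])
```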

  6. Adiabatic transition probability for a tangential crossing

    OpenAIRE

    Watanabe, Takuya

    2006-01-01

    We consider a time-dependent Schrödinger equation whose Hamiltonian is a $2\times 2$ real symmetric matrix. We study, using an exact WKB method, the adiabatic limit of the transition probability in the case where several complex eigenvalue crossing points accumulate to one real point.

  7. Markov Chains with Stochastically Stationary Transition Probabilities

    OpenAIRE

    Orey, Steven

    1991-01-01

    Markov chains on a countable state space are studied under the assumption that the transition probabilities $(P_n(x,y))$ constitute a stationary stochastic process. An introductory section exposing some basic results of Nawrotzki and Cogburn is followed by four sections of new results.

  8. Dynamic Estimation of Credit Rating Transition Probabilities

    OpenAIRE

    Berd, Arthur M.

    2009-01-01

    We present a continuous-time maximum likelihood estimation methodology for credit rating transition probabilities, taking into account the presence of censored data. We perform rolling estimates of the transition matrices with exponential time weighting with varying horizons and discuss the underlying dynamics of transition generator matrices in the long-term and short-term estimation horizons.

  9. A real formula for transition probabilities

    Directory of Open Access Journals (Sweden)

    Alessandra Luati

    2007-10-01

    Full Text Available Transition probabilities between states in two dimensional quantum systems are derived as functions of unit vectors in R3 instead of state vectors in C2. This can be done once states and von Neumann measurements acting on C2 are represented by means of vectors on the unit sphere of R3.
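
    The standard special case behind such formulas, stated here only for orientation (a familiar identity, not the paper's derivation): for pure qubit states with Bloch vectors u and v on the unit sphere of R3,

```latex
\[
  \left|\langle \psi \mid \varphi \rangle\right|^{2}
  \;=\; \frac{1 + \mathbf{u}\cdot\mathbf{v}}{2},
\]
% so the transition probability depends only on the two unit vectors
% in R^3, which is the representation the article exploits.
```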

  10. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
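
    For contrast with the learning algorithms discussed above, here is a hedged sketch of the smoothed-counting baseline for filling in one entry of a conditional probability table; the variables and records are invented for illustration and are not from the paper's car insurance network.

```python
from collections import Counter

data = [
    {"age": "young", "claims": "high"},
    {"age": "young", "claims": "low"},
    {"age": "old",   "claims": "low"},
    {"age": "old",   "claims": "low"},
]

def cpt(records, child, parent, alpha=1.0):
    """Estimate P(child | parent) by counting, with Laplace smoothing."""
    pair = Counter((r[parent], r[child]) for r in records)
    parents = {r[parent] for r in records}
    children = {r[child] for r in records}
    table = {}
    for pa in parents:
        total = sum(pair[(pa, ch)] for ch in children)
        for ch in children:
            table[(pa, ch)] = (pair[(pa, ch)] + alpha) / (total + alpha * len(children))
    return table

print(cpt(data, "claims", "age"))
```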

  11. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coeffcients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposi...

  12. The albedo effect on neutron transmission probability.

    Science.gov (United States)

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and that after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
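
    A stripped-down analog version of such a calculation can be sketched as follows; unlike the study above it has no vacuum channel and no exponential biasing, and the slab thickness and scattering probability are illustrative assumptions (a thinner slab is used so a quick run actually transmits something).

```python
import math
import random

def transmission(thickness=5.0, ps=0.8, histories=100_000):
    """Fraction of neutrons crossing a slab of given thickness (in mean
    free paths), with per-collision scattering probability ps."""
    transmitted = 0
    for _ in range(histories):
        x, mu = 0.0, 1.0                              # normal incidence
        while True:
            # Sample a free flight; 1 - random() avoids log(0).
            x += mu * -math.log(1.0 - random.random())
            if x >= thickness:
                transmitted += 1                      # crossed right face
                break
            if x < 0.0 or random.random() > ps:       # leaked or absorbed
                break
            mu = random.uniform(-1.0, 1.0)            # isotropic scatter
    return transmitted / histories

print(transmission())
```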

  13. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  14. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  15. Need for probabilities in cancer litigation

    International Nuclear Information System (INIS)

    The third article in a series on radiation and the courts considers the new concept of probability of causation (PC), which the author concludes is the best of imperfect approaches. The problem arises from the medical inability to state that the particular cancer was the result of a specific exposure to radiation. Epidemiological and statistical evidence as the basis for probable cause has precedents in other situations where absolute certainty is unattainable. Alternatives to PC include threshold dose levels, referrals to judges with demonstrated scientific understanding, and improvements in the legal skills used in trying cases. Although PC yields more consistent results, the approach is best because it encompasses the advantages of the other approaches. 20 references

  16. Success Probability Assessment Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Xuan Chen

    2010-04-01

    Full Text Available The Bayesian method is superior to the classical statistical method for small-sample tests. However, its evaluation results are not so good if subjective prior information intervenes. This paper focuses on assessing the success probability of success-or-failure tests of weapon products, and proposes a fused evaluation method based on information entropy. Firstly, data from equivalent surrogate tests are converted into the prior information of an equivalent source by information entropy theory. Secondly, the prior distribution of the success probability is identified via the Bootstrap method, and the posterior distribution is provided by the Bayesian method with the information of prototype tests in succession. Lastly, an example is given, and the results show that the proposed method is effective and valuable. Defence Science Journal, 2010, 60(3), pp. 271-275, DOI: http://dx.doi.org/10.14429/dsj.60.353

  17. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i,j) for getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i,j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
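
    By way of contrast with the mixture-model estimate described above, the simplest empirical alternative is a transition matrix of normalized one-step counts; this is a hedged sketch with invented rating sequences (0 = best grade), not the paper's data.

```python
import numpy as np

sequences = [
    [0, 0, 1, 1, 2],
    [1, 1, 1, 0, 0],
    [2, 2, 1, 1, 1],
]

n_states = 3
counts = np.zeros((n_states, n_states))
for seq in sequences:
    for k, l in zip(seq, seq[1:]):        # consecutive quarters
        counts[k, l] += 1

row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts),
              where=row_sums > 0)
print(P)   # P[k, l] estimates Pr(next rating = l | current rating = k)
```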

  18. Earthquake probabilities: theoretical assessments and reality

    Science.gov (United States)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately-designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  19. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  20. Radiationless transition probabilities in muonic 209Bi

    International Nuclear Information System (INIS)

    The probability for non-radiative (n.r.) excitations in muonic 209Bi was determined from a (μ-, γγ)-measurement by comparing the intensities of muonic X-ray transitions in single and coincidence spectra. The values of Pn.r.(3p→1s)=(17.9±2.0)% and Pn.r.(3d→1s)=(3.0±2.2)% were measured for the first time. The strength of the n.r. decay of the 2p-level was found to be (4.2±2.2)%. The n.r. transition probabilities of two subcomplexes of the (2p→1s)-transition leading to different mean excitation energies are (3.2±1.8)% and (5.0±2.0)%, respectively. (orig.)

  1. Investigation of probable decays in rhenium isotopes

    International Nuclear Information System (INIS)

    Making use of the effective liquid drop model (ELDM), the feasibility of proton, alpha and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of Rhenium in the mass range 150 < A < 200, the half-lives of proton and alpha decays and probable cluster decays are calculated, taking the barrier potential to be the effective liquid drop one, which is the sum of the Coulomb, surface and centrifugal potentials. The calculated half-lives for proton decay from various Rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttall plots of the probable decays are analysed and their respective slopes and intercepts are evaluated

  2. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  3. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, ..., p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
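
    One simple unbiased construction, shown here as a hedged sketch for orientation (not necessarily identical to the article's iid method): normalizing i.i.d. exponentials yields a draw from the flat Dirichlet(1, ..., 1) distribution, i.e. a uniform point on the probability simplex.

```python
import numpy as np

def random_prob_vector(d, rng=None):
    """Return an unbiased random probability vector of length d."""
    rng = rng or np.random.default_rng()
    e = rng.exponential(scale=1.0, size=d)   # i.i.d. Exp(1) draws
    return e / e.sum()                       # normalize onto the simplex

p = random_prob_vector(5)
print(p, p.sum())   # nonnegative components summing to 1
```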

  4. Registration probability of alphas in cellulose nitrate

    International Nuclear Information System (INIS)

    Registration 'thresholds' of alpha particles in cellulose nitrate plastic present a statistical behaviour. The effect depends on etching conditions. It is particularly large in strong etching conditions, in which registration is transposed to high energies: up to 7.7 MeV for the conditions and energies studied. 'Registration probability' expresses more adequately the effect of registration constraints. The study of registration probability indicates that the 'target theory' can describe the effect. The parameters of target theory, m (number of targets) and D0 (the equivalent of biological dose D37), were found to be: m = 5 and D0 = 3 × 10^7 erg cm^-3. Dose distribution around the trajectory of alphas of various energies is estimated. It is also deduced that track development takes place when the required dose for registration is deposited at a distance r ≥ 20 Å from the particle trajectory. (author)

  5. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  6. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  7. An introduction to measure-theoretic probability

    CERN Document Server

    Roussas, George G

    2004-01-01

    This book provides in a concise, yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability and other related areas should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail. Excellent exposition marked by a clear, coherent and logical development of the subject; easy to understand, detailed discussion of material; complete proofs.

  8. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  9. Calculating nuclear accident probabilities from empirical frequencies

    OpenAIRE

    Ha-Duong, Minh; Journé, V.

    2014-01-01

    Since there is no authoritative, comprehensive and public historical record of nuclear power plant accidents, we reconstructed a nuclear accident data set from peer-reviewed and other literature. We found that, in a sample of five random years, the worldwide historical frequency of a nuclear major accident, defined as an INES level 7 event, is 14 %. The probability of at least one nuclear accident rated at level ≥4 on the INES scale is 67 %. These numbers are subject...

  10. Unseated septifoil non-detection probability

    International Nuclear Information System (INIS)

    The frequency that the Savannah River K-Reactor would proceed beyond hydraulic startup with a septifoil not properly seated is estimated in this report. It summarizes previous work on this subject, incorporates concerns about the utility of individual septifoil pressure measurements, and discusses two proposed techniques that could lower the non-detection probability to the point that this issue could be beyond Design Basis consideration

  11. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  12. Computational methods for probability of instability calculations

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based upon the roots of the characteristic equation or Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
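
    To make the quantity concrete, here is a hedged toy version of the problem: for a single-degree-of-freedom system x'' + c x' + k x = 0, the Routh-Hurwitz condition reduces to c > 0 and k > 0, so the probability of instability under assumed input distributions can be estimated directly; the distributions below are invented, and the paper's reliability-analysis methods are far more efficient than this brute-force sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
c = rng.normal(loc=1.0, scale=0.6, size=n)   # assumed damping distribution
k = rng.normal(loc=2.0, scale=1.0, size=n)   # assumed stiffness distribution

# Unstable whenever the Routh-Hurwitz condition (c > 0 and k > 0) fails.
p_instability = np.mean(~((c > 0) & (k > 0)))
print(f"P(instability) ~ {p_instability:.4f}")
```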

  13. Quantile Probability and Statistical Data Modeling

    OpenAIRE

    Parzen, Emanuel

    2004-01-01

    Quantile and conditional quantile statistical thinking, as I have innovated it in my research since 1976, is outlined in this comprehensive survey and introductory course in quantile data analysis. We propose that a unification of the theory and practice of statistical methods of data modeling may be possible by a quantile perspective. Our broad range of topics of univariate and bivariate probability and statistics are best summarized by the key words. Two fascinating practical examples are g...

  14. A probability loophole in the CHSH

    CERN Document Server

    Geurdes, J F

    2014-01-01

    In the present paper a robustness stress-test of the CHSH experiments for Einstein locality and causality is designed and employed. Random A and B from dice and coins, but based on a local model, run "parallel" to a real experiment. We found a local causal model with a nonzero probability to violate the CHSH inequality for some relevant quartets $\mathcal{Q}$ of settings in the series of trials.

  15. A probability loophole in the CHSH

    Directory of Open Access Journals (Sweden)

    Han Geurdes

    2014-01-01

    Full Text Available In the present paper a robustness stress-test of the CHSH experiments for Einstein locality and causality is designed and employed. Random A and B from dice and coins, but based on a local model, run ”parallel” to a real experiment. We found a local causal model with a nonzero probability to violate the CHSH inequality for some relevant quartets Q of settings in the series of trials.

  16. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  17. Investigation of Flood Inundation Probability in Taiwan

    Science.gov (United States)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan is located at a special point in the path of typhoons from the northeast Pacific Ocean, and is also situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there exist many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change flood inundation characteristics. According to the 7th work item of article 22 in the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from deteriorating, investigation and analysis of disaster potential, hazard degree and situation simulation must proceed with scientific approaches. However, flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps to show the flood probability of each grid cell, which can serve as a basis for emergency evacuation when typhoons arrive and extremely torrential rain begins. The research selects the Hebauyu watershed of Chiayi County as the demonstration area. Owing to the uncertainties of the data used, sensitivity analysis is first conducted using Latin Hypercube sampling (LHS). The LHS data sets are next input into an integrated numerical model, developed here to assess flood inundation hazards in coastal lowlands, based on the extension of the 1-D river routing model and the 2-D inundation routing model. Finally, flood inundation probabilities are calculated and flood inundation probability maps are obtained. Flood inundation probability maps can serve as an alternative to the old flood potential maps and as a reference for building new hydraulic infrastructure in the future.

  18. Free Energy Changes, Fluctuations, and Path Probabilities

    OpenAIRE

    Hoover, William G.; Hoover, Carol G.

    2011-01-01

    We illustrate some of the static and dynamic relations discovered by Cohen, Crooks, Evans, Jarzynski, Kirkwood, Morriss, Searles, and Zwanzig. These relations link nonequilibrium processes to equilibrium isothermal free energy changes and to dynamical path probabilities. We include ideas suggested by Dellago, Geissler, Oberhofer, and Schoell-Paschinger. Our treatment is intended to be pedagogical, for use in an updated version of our book: Time Reversibility, Computer Simulation, and Chaos. C...

  19. Probable Psittacosis Outbreak Linked to Wild Birds

    OpenAIRE

    Telfer, Barbara L.; Moberley, Sarah A.; Hort, Krishna P.; Branley, James M.; Dominic E. Dwyer; Muscatello, David J; Correll, Patricia K.; England, John; McAnulty, Jeremy M.

    2005-01-01

    In autumn 2002, an outbreak of probable psittacosis occurred among residents of the Blue Mountains district, Australia. We conducted a case-control study to determine independent risk factors for psittacosis by comparing exposures between hospitalized patients and other residents selected randomly from the telephone directory. Of the 59 case-patients with laboratory results supportive of psittacosis, 48 participated in a case-control study with 310 controls. Independent risk factors were resi...

  20. Picturing mobility: Transition probability color plots

    OpenAIRE

    Philippe Kerm

    2011-01-01

    This talk presents a simple graphical device for visualization of patterns of income mobility. The device uses color palettes to picture information contained in transition matrices created from a fine partition of the marginal distributions. The talk explains how these graphs can be constructed using the user-written package spmap from Maurizio Pisati, briefly presents the wrapper command transcolorplot (for transition probability color plots) and demonstrates how such graphs are effective f...

  1. Transition Probability (Fidelity) and Its Relatives

    OpenAIRE

    Uhlmann, Armin

    2011-01-01

    Transition Probability (fidelity) for pairs of density operators can be defined as "functor" in the hierarchy of "all" quantum systems and also within any quantum system. The introduction of "amplitudes" for density operators allows for a more intuitive treatment of these quantities, also pointing to a natural parallel transport. The latter is governed by a remarkable gauge theory with strong relations to the Riemann-Bures metric.

  2. Continuum ionization transition probabilities of atomic oxygen

    Science.gov (United States)

    Samson, J. A. R.; Petrosky, V. E.

    1974-01-01

    The technique of photoelectron spectroscopy was employed in the investigation. Atomic oxygen was produced in a microwave discharge operating at a power of 40 W and at a pressure of approximately 20 mtorr. The photoelectron spectrum of the oxygen with and without the discharge is shown. The atomic states can be clearly seen. In connection with the measurement of the probability for transitions into the various ionic states, the analyzer collection efficiency was determined as a function of electron energy.

  3. Transition choice probabilities and welfare in ARUM's

    OpenAIRE

    de Palma, André; Kilani, Karim

    2009-01-01

    We study the descriptive and the normative consequences of price and/or other attributes changes in additive random utility models. We first derive expressions for the transition choice probabilities associated to these changes. A closed-form formula is obtained for the logit. We then use these expressions to compute the cumulative distribution functions of the compensating variation conditional on ex-ante and/or ex-post choices. The unconditional distribution is also provided. The conditiona...

  4. Ragnar Frisch and the Probability Approach

    OpenAIRE

    BJERKHOLT, Olav; DUPONT, Ariane

    2011-01-01

    The title hints at the attention given to the lack of probability considerations in the econometric work of the recognized pioneer of econometrics, Ragnar Frisch. Clues to a better understanding of his position may be found in his comprehensive archive and correspondence. This essay gives a brief overview of Frisch's scientific archive and exhibits from his search for econometric methods. It also sets out a selection of letters exchanged between Frisch and other leading members of the Econome...

  5. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    Science.gov (United States)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  6. A Tutorial Introduction to the Logic of Parametric Probability

    OpenAIRE

    Norman, Joseph W.

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are so...

  7. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are established by using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
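    As a hedged illustration of the binary logit approach described above (not the authors' code or data), the sketch below fits a logistic regression on synthetic loan records whose effect directions mimic the reported findings: repayment probability rising with the loan sum and the borrower's remoteness, and falling with the month of birth. All field names and coefficients are hypothetical.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    loan_k = rng.uniform(1.0, 50.0, n)    # hypothetical loan sum, thousands
    remote = rng.uniform(0.0, 300.0, n)   # hypothetical distance from lender, km
    birth_mo = rng.integers(1, 13, n)     # borrower's month of birth

    # Synthetic outcome with the effect signs reported in the abstract.
    z = -1.5 + 0.03 * loan_k + 0.004 * remote - 0.05 * birth_mo
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-z))

    X = np.column_stack([loan_k, remote, birth_mo])
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.coef_)                       # fitted effect directions
    print(model.predict_proba(X[:3])[:, 1])  # estimated repayment probabilities
    ```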

  8. School and conference on probability theory

    International Nuclear Information System (INIS)

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  9. Instability of Wave Trains and Wave Probabilities

    Science.gov (United States)

    Babanin, Alexander

    2013-04-01

    Centre for Ocean Engineering, Science and Technology, Swinburne University of Technology, Melbourne, Australia, ababanin@swin.edu.au. Design criteria in ocean engineering, whether for a one-in-50-years or a one-in-5000-years event, are hardly ever based on measurements, but rather on statistical distributions of relevant metocean properties. Of utmost interest is the tail of the distribution, that is, rare events such as the highest waves, which have low probability. Engineers have long since realised that the superposition of linear waves with a narrow-banded spectrum, as depicted by the Rayleigh distribution, underestimates the probability of extreme wave heights and crests, which is a critical shortcoming as far as engineering design is concerned. Ongoing theoretical and experimental efforts have been under way for decades to address this issue. The typical approach is to treat all possible waves in the ocean, or at a particular location, as a single ensemble for which some comprehensive solution can be obtained. Oceanographic knowledge, however, now indicates that no single, unified comprehensive solution is available. We would expect the probability distributions of wave height to depend on a) whether the waves are at the spectral peak or at the tail; b) the wave spectrum and mean steepness in the wave field; c) the directional distribution of the peak waves; d) whether the waves are in deep water, in intermediate depth or in shallow water; e) wave breaking; f) the wind, particularly if it is very strong, and the currents if they have suitable horizontal gradients. Probability distributions in the different circumstances according to these groups of conditions should be different, and combining them together introduces inevitable scatter. The scatter and the accuracy will not improve by increasing the bulk data quality and quantity, and it hides the actual distribution of extremes. The groups have to be separated and their probability
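    As a baseline for the underestimation described above, the narrow-band (Rayleigh) exceedance probability for individual wave heights is P(H > h) = exp(-2h²/Hs²), where Hs is the significant wave height; observed extreme-wave statistics sit above this tail. A minimal sketch, not taken from the abstract:

    ```python
    import numpy as np

    def rayleigh_exceedance(h, hs):
        """Narrow-band (Rayleigh) probability that an individual wave height exceeds h."""
        return np.exp(-2.0 * (h / hs) ** 2)

    hs = 4.0                   # significant wave height, m
    for h in (4.0, 6.0, 8.0):  # 8 m = 2*Hs, the usual 'freak wave' threshold
        print(h, rayleigh_exceedance(h, hs))  # ~3.4e-4 at h = 2*Hs
    ```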

  10. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers, under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, and thereby to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.
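    For orientation, the sketch below implements Wald's classical SPRT for a Bernoulli parameter, with thresholds set from the error probabilities alpha and beta so that alpha + beta plays the role of a pre-assigned sum. It is a generic illustration, not the generalized modified method of the paper.

    ```python
    import numpy as np

    def sprt_bernoulli(xs, p0, p1, alpha, beta):
        """Wald SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1 observations."""
        upper = np.log((1 - beta) / alpha)  # crossing accepts H1
        lower = np.log(beta / (1 - alpha))  # crossing accepts H0
        llr = 0.0
        for n, x in enumerate(xs, start=1):
            llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
            if llr >= upper:
                return "accept H1", n
            if llr <= lower:
                return "accept H0", n
        return "undecided", len(xs)

    rng = np.random.default_rng(1)
    data = rng.random(1000) < 0.6  # true p = 0.6
    print(sprt_bernoulli(data, p0=0.5, p1=0.6, alpha=0.025, beta=0.025))
    ```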

  11. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  12. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework of doctoral dissertation research in progress in the field of Mathematics Education, in particular in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like the Parcae and the Moirai, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  13. Probable Mechanisms of Needling Therapies for Myofascial Pain Control

    Directory of Open Access Journals (Sweden)

    Li-Wei Chou

    2012-01-01

    Full Text Available Myofascial pain syndrome (MPS) has been defined as a regional pain syndrome clinically characterized by muscle pain caused by myofascial trigger points (MTrPs). An MTrP is defined as a hyperirritable spot in a palpable taut band of skeletal muscle fibers. Appropriate treatment of MTrPs can effectively relieve the clinical pain of MPS. Needling therapies, such as MTrP injection, dry needling, or acupuncture (AcP), can effectively eliminate pain immediately. AcP is probably the first reported technique for treating MPS patients with dry needling, based on Traditional Chinese Medicine (TCM) theory. The possible mechanisms of AcP analgesia have been studied and published in recent decades. The analgesic effect of AcP is hypothesized to be related to the immune, hormonal, and nervous systems. Compared to the slow-acting hormonal system, the nervous system acts in a faster manner. Given these complexities, AcP analgesia cannot be explained by any single mechanism. There are several principles for the selection of acupoints based on TCM principles: the "Ah-Shi" point, proximal or remote acupoints on the meridian, and extra-meridian acupoints. Correlations between acupoints and MTrPs are discussed. Some clinical and animal studies of remote AcP for MTrPs and the possible mechanisms of remote effectiveness are reviewed and discussed.

  14. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  15. Intermediate Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner--developing sp

  16. Acceleration Detection of Large (Probably) Prime Numbers

    Directory of Open Access Journals (Sweden)

    Dragan Vidakovic

    2013-02-01

    Full Text Available In order to avoid unnecessary applications of the Miller-Rabin algorithm to the number in question, we resort to trial division by a few initial prime numbers, since such divisions take less time. How far we should go with such division is the question we are trying to answer in this paper. In theory the matter is fully resolved; in practice, however, the theoretical answer is of little use. Therefore, we present a solution that is probably irrelevant to theorists, but very useful to people who have spent many nights producing large (probably) prime numbers using their own software.
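    A minimal sketch of the strategy under discussion: cheap trial division by a few initial primes first, with Miller-Rabin applied only to the survivors. The length of the small-prime list below is illustrative, not the paper's recommended cut-off.

    ```python
    import random

    SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

    def miller_rabin(n, rounds=20):
        """Probabilistic primality test; error probability at most 4**(-rounds)."""
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False  # a is a witness: n is composite
        return True

    def is_probable_prime(n):
        """Trial division first; Miller-Rabin only if no small prime divides n."""
        if n < 2:
            return False
        for p in SMALL_PRIMES:
            if n % p == 0:
                return n == p
        return miller_rabin(n)

    print(is_probable_prime(2**89 - 1))  # True: a Mersenne prime
    ```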

  17. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro

  18. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  19. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  20. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and we use the collected data to estimate overall satisfaction and calibrate the model. The results show good fitness between the model and the real data. This model has applications in business marketing for managing relationship satisfaction.

  1. Numerical Ultimate Ruin Probabilities under Interest Force

    Directory of Open Access Journals (Sweden)

    Juma Kasozi

    2005-01-01

    Full Text Available This work addresses the issue of ruin of an insurer whose portfolio is exposed to insurance risk arising from the classical surplus process. The availability of a positive interest rate in the financial world forces the insurer to invest in a risk-free asset. We derive a linear Volterra integral equation of the second kind and apply an order-four block-by-block method in conjunction with the Simpson rule to solve the Volterra equation for ultimate ruin. This probability is arrived at by taking a linear combination of two solutions to the Volterra integral equation. The several numerical examples given show that our results are excellent and reliable.
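    The record's order-four block-by-block scheme is not reproduced here; as a simpler stand-in, the sketch below solves a linear Volterra equation of the second kind, f(t) = g(t) + ∫₀ᵗ K(t,s) f(s) ds, with the trapezoidal rule and checks it against a case with a known exact solution.

    ```python
    import numpy as np

    def volterra2_trapezoid(g, K, T, n):
        """Solve f(t) = g(t) + int_0^t K(t,s) f(s) ds on [0,T] by the trapezoidal rule."""
        t = np.linspace(0.0, T, n + 1)
        h = T / n
        f = np.empty(n + 1)
        f[0] = g(t[0])
        for i in range(1, n + 1):
            acc = 0.5 * K(t[i], t[0]) * f[0]
            acc += sum(K(t[i], t[j]) * f[j] for j in range(1, i))
            # The unknown f[i] appears on both sides; solve for it explicitly.
            f[i] = (g(t[i]) + h * acc) / (1.0 - 0.5 * h * K(t[i], t[i]))
        return t, f

    # Test case: g = 1, K = 1 gives f' = f with f(0) = 1, so f(t) = e^t.
    t, f = volterra2_trapezoid(lambda t: 1.0, lambda t, s: 1.0, T=1.0, n=100)
    print(f[-1], np.e)  # ~2.71828 for both
    ```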

  2. Lifetimes and transition probabilities in Kr V

    International Nuclear Information System (INIS)

    Weighted oscillator strengths (gf), weighted transition probabilities (gA) and lifetimes are presented for all experimentally known dipole transitions and levels of Kr V. Values were determined by four methods. Three of them are based on the Hartree-Fock method, including relativistic corrections and core-polarization effects, with electrostatic parameters optimized by a least-squares procedure in order to obtain energy levels adjusted to the corresponding experimental values. The fourth method is based on a relativistic multiconfigurational Dirac-Fock approach. In addition, 47 new classified lines belonging to the Kr V spectrum are presented.

  3. Necessity of Exact Calculation for Transition Probability

    Institute of Scientific and Technical Information of China (English)

    LIU Fu-Sui; CHEN Wan-Fang

    2003-01-01

    This paper shows that exact calculation of the transition probability can make some systems deviate seriously from the Fermi golden rule. It also shows that the corresponding exact calculation of the phonon-induced hopping rate for the deuteron in the Pd-D system with many-body electron screening, proposed by Ichimaru, can explain the experimental facts observed in the Pd-D system, and predicts that the perfection and low dimensionality of the Pd lattice are very important for the phonon-induced hopping rate enhancement in the Pd-D system.

  4. Improved Ar(II) transition probabilities

    OpenAIRE

    Danzmann, K.; de Kock, M

    1986-01-01

    Precise Ar(II) branching ratios have been measured on a high-current hollow cathode with a 1-m Fourier transform spectrometer. Absolute transition probabilities for 11 Ar(II) lines were calculated from these branching ratios and lifetime measurements published by Mohamed et al. For the prominent 4806 Å line, the present result is A_ik = 7.12×10^7 s^-1 ± 2.8%, which is in excellent agreement with recent literature data derived from pure argon diagnostics, two-wavelength interferometry, and Hβ-diagn...

  5. Calculation of radiative transition probabilities and lifetimes

    Science.gov (United States)

    Zemke, W. T.; Verma, K. K.; Stwalley, W. C.

    1982-01-01

    Procedures for calculating bound-bound and bound-continuum (free) radiative transition probabilities and radiative lifetimes are summarized. Calculations include rotational dependence and R-dependent electronic transition moments (no Franck-Condon or R-centroid approximation). Detailed comparisons of theoretical results with experimental measurements are made for bound-bound transitions in the A-X systems of LiH and Na2. New bound-free results are presented for LiH. New bound-free results and comparisons with very recent fluorescence experiments are presented for Na2.

  6. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term ''flowing interval spacing'' as opposed to fractured spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled, ''Probability Distribution for Flowing Interval Spacing'', (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) ''Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses'' (CRWMS M and O 1999a) and (2) ''Incorporation of Heterogeneity in SZ Flow and Transport Analyses'', (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be

  7. Optimal Reinsurance with Heterogeneous Reference Probabilities

    Directory of Open Access Journals (Sweden)

    Tim J. Boonen

    2016-07-01

    Full Text Available This paper studies the problem of optimal reinsurance contract design. We let the insurer use dual utility, and the premium is given by an extended Wang's premium principle. The novel contribution is that we allow for heterogeneity in the beliefs regarding the underlying probability distribution. We characterize layer-reinsurance as an optimal reinsurance contract. Moreover, we characterize layer-reinsurance as the optimal contract when the insurer faces costs of holding regulatory capital. We illustrate this in cases where both firms use the Value-at-Risk or the conditional Value-at-Risk.

  8. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  9. Subsequent investigation and management of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    The authors wished to determine the proportion of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy (IVQS) who proceed to further imaging for investigation of thromboembolism, to identify the defining clinical parameters and to determine the proportion of patients who have a definite imaging diagnosis of thromboembolism prior to discharge from hospital on anticoagulation therapy. One hundred and twelve VQS studies performed at the Flinders Medical Centre over a 9-month period were reported as having intermediate category and probability for pulmonary embolism. Medical case notes were available for review in 99 of these patients and from these the pretest clinical probability, subsequent patient progress and treatment were recorded. Eight cases were excluded because they were already receiving anticoagulation therapy. In the remaining 91 patients the pretest clinical probability was considered to be low in 25; intermediate in 30; and high in 36 cases. In total, 51.6% (n = 47) of these patients (8% (n = 2) with low, 66% (n = 20) with intermediate, and 69.4% (n = 25) with high pretest probability) proceeded to CT pulmonary angiography (CTPA) and/or lower limb duplex Doppler ultrasound (DUS) evaluation. Of the patients with IVQS results, 30.7% (n = 28) were evaluated with CTPA. No patient with a low, all patients with a high and 46% of patients with an intermediate pretest probability initially received anticoagulation therapy. This was discontinued in three patients with high and in 12 patients with intermediate clinical probability prior to discharge from hospital. Overall, 40% of patients discharged on anticoagulation therapy (including 39% of those with a high pretest probability) had a positive imaging diagnosis of thromboembolism. The results suggest that, although the majority of patients with intermediate-to-high pretest probability and IVQS proceed to further imaging investigation, CTPA is relatively underused in

  10. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  11. Probability-consistent spectrum and code spectrum

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by Code for Seismic Design of Buildings (GB50011-2001). Sometimes, there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for the key projects would be lower than that for the general industry and civil buildings. In the paper, the relation between PCS and CDS is discussed by using the ideal simple potential seismic source. The results show that in the most areas influenced mainly by the potential sources of the epicentral earthquakes and the regional earthquakes, PCS is generally lower than CDS in the long periods. We point out that the long-period response spectra of the code should be further studied and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than CDS.

  12. Decay Probability Ratio of Pentaquark Theta^+ State

    CERN Document Server

    Chen, X; Ma, B Q; Chen, Xun; Mao, Yajun; Ma, Bo-Qiang

    2003-01-01

    The pentaquark state $\Theta^{+}(uudd\bar{s})$ has been observed to decay via two modes: $\Theta^+\to n K^+$ and $\Theta^+ \to p K^0$. The decay probability ratio of the two modes is studied with general symmetry considerations of isospin, spin, and parity. We arrive at the result $\frac{\Gamma(\Theta^+\to nK^+)}{\Gamma(\Theta^+\to pK^0)} =\frac{(\alpha-\beta)^2}{(\alpha+\beta)^2}(\frac{k_1}{k_2})^{2L+1}$, which is valid whether the $\Theta^+$ is a pure isoscalar or isovector state, or an isotensor state with a mixture of isoscalar and isovector components with coefficients $\alpha$ and $\beta$. The dependence on the spin and parity of the pentaquark $\Theta^+$ state is found to be small, owing to the small difference between the center-of-mass decay momenta $k_1$ and $k_2$ of the two decay modes. Future experimental results on the decay probability ratio may provide information about the isospin configuration of the pentaquark $\Theta^+$ state.

  13. Atomic Transition Probabilities Scandium through Manganese

    International Nuclear Information System (INIS)

    Atomic transition probabilities for about 8,800 spectral lines of five iron-group elements, Sc(Z = 21) to Mn(Z = 25), are critically compiled, based on all available literature sources. The data are presented in separate tables for each element and stage of ionization and are further subdivided into allowed (i.e., electric dipole-E1) and forbidden (magnetic dipole-M1, electric quadrupole-E2, and magnetic quadrupole-M2) transitions. Within each data table the spectral lines are grouped into multiplets, which are in turn arranged according to parent configurations, transition arrays, and ascending quantum numbers. For each line the transition probability for spontaneous emission and the line strength are given, along with the spectroscopic designation, the wavelength, the statistical weights, and the energy levels of the upper and lower states. For allowed lines the absorption oscillator strength is listed, while for forbidden transitions the type of transition is identified (M1, E2, etc.). In addition, the estimated accuracy and the source are indicated. In short introductions, which precede the tables for each ion, the main justifications for the choice of the adopted data and for the accuracy rating are discussed. A general introduction contains a discussion of our method of evaluation and the principal criteria for our judgements

  14. Estimating flood exceedance probabilities in estuarine regions

    Science.gov (United States)

    Westra, Seth; Leonard, Michael

    2016-04-01

    Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).

  15. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than rho_k = 40 rho_m are disfavored by more than 99.99%, with a peak value at rho_Lambda = 7.9 x 10^-123 and rho_k = 4.3 rho_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  16. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
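    A hedged sketch of a permutation test for model performance, in the spirit of the study but simplified (plain L2-penalized logistic regression and AUC instead of the authors' LASSO and likelihood setup): refit on label-permuted data to build a null distribution for the cross-validated AUC.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                             # synthetic covariates
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200)) > 0  # synthetic outcome

    def cv_auc(X, y):
        p = cross_val_predict(LogisticRegression(), X, y, cv=5,
                              method="predict_proba")[:, 1]
        return roc_auc_score(y, p)

    observed = cv_auc(X, y)
    null = [cv_auc(X, rng.permutation(y)) for _ in range(200)]
    p_value = (1 + sum(a >= observed for a in null)) / (1 + len(null))
    print(observed, p_value)  # genuine signal => small p_value
    ```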

  17. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren;

    2013-01-01

    To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.

  18. Estimation of Transition Probabilities Using Median Absolute Deviations

    OpenAIRE

    Kim, C. S.; Schaible, Glenn D.

    1988-01-01

    The probability-constrained minimum absolute deviations (MAD) estimator appears to be superior to the probability-constrained quadratic programming estimator in estimating transition probabilities with limited aggregate time series data. Furthermore, one can reduce the number of columns in the probability-constrained MAD simplex tableau by adopting the median property.

  19. System Geometries and Transit/Eclipse Probabilities

    Directory of Open Access Journals (Sweden)

    Howard A.

    2011-02-01

    Full Text Available Transiting exoplanets provide access to data to study the mass-radius relation and internal structure of extrasolar planets. Long-period transiting planets allow insight into planetary environments similar to the Solar System where, in contrast to hot Jupiters, planets are not constantly exposed to the intense radiation of their parent stars. Observations of secondary eclipses additionally permit studies of exoplanet temperatures and large-scale exo-atmospheric properties. We show how transit and eclipse probabilities are related to planet-star system geometries, particularly for long-period, eccentric orbits. The resulting target selection and observational strategies represent the principal ingredients of our photometric survey of known radial-velocity planets with the aim of detecting transit signatures (TERMS.
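    For a concrete sense of the geometry dependence, a commonly used approximation for the geometric transit probability, averaged over the argument of periastron, is p ≈ (R* + Rp) / (a(1 - e²)). The sketch below evaluates it for illustrative values (they are not taken from the paper).

    ```python
    R_SUN_AU = 0.00465  # solar radius in astronomical units

    def transit_probability(a_au, e=0.0, r_star=1.0, r_planet=0.0):
        """Geometric transit probability ~ (R_star + R_planet) / (a (1 - e^2)),
        averaged over the argument of periastron; radii in solar units, a in AU."""
        return (r_star + r_planet) * R_SUN_AU / (a_au * (1.0 - e ** 2))

    print(transit_probability(0.05))        # hot-Jupiter distance: ~9%
    print(transit_probability(1.0))         # 1 AU orbit: ~0.5%
    print(transit_probability(1.0, e=0.5))  # eccentricity raises the average probability
    ```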

  20. On the probability of dinosaur fleas.

    Science.gov (United States)

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-11

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders, is not supported by currently available data.

  1. Transit Probabilities Around Hypervelocity and Runaway Stars

    CERN Document Server

    Fragione, Giacomo

    2016-01-01

    In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. In this paper, we explore the possibility of detecting planetary transits around hypervelocity and runaway stars, which should host very compact planetary systems as a consequence of their turbulent origin. We find that the probability of a multi-planetary transit is $10^{-3}\lesssim P\lesssim 10^{-1}$. We therefore need to observe $\sim 10-1000$ high-velocity stars to spot a transit. We predict that the European Gaia satellite, along with TESS, could spot such transits.

  2. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  3. Generalized Bures products from free probability

    CERN Document Server

    Jarosz, Andrzej

    2012-01-01

    Inspired by the theory of quantum information, I use two non-Hermitian random matrix models - a weighted sum of circular unitary ensembles and a product of rectangular Ginibre unitary ensembles - as building blocks of three new products of random matrices which are generalizations of the Bures model. I apply the tools of both Hermitian and non-Hermitian free probability to calculate the mean densities of their eigenvalues and singular values in the thermodynamic limit, along with their divergences at zero; the results are supported by Monte Carlo simulations. I pose and test conjectures concerning the relationship between the two densities (exploiting the notion of the N-transform), the shape of the mean domain of the eigenvalues (an extension of the single ring theorem), and the universal behavior of the mean spectral density close to the domain's borderline (using the complementary error function).

  4. Some improved transition probabilities for neutral carbon

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Charlotte Froese [Atomic Physics Division, National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2006-05-14

    An earlier paper (Zatsarinny O and Froese Fischer C 2002 J. Phys. B: At. Mol. Opt. Phys. 35 4669) presented oscillator strengths for transitions from the 2p^2 ^3P term to high-lying excited states of carbon. The emphasis was on the accurate prediction of energy levels relative to the ionization limit and allowed transition data from the ground state. The present paper reports some refined transition probability calculations for transitions from 2p^2 ^3P, ^1D, and ^1S to all odd levels up to 2p3d ^3P. Particular attention is given to intercombination lines where relativistic effects are most important.

  5. Priority probability deceleration deadline-aware TCP

    Institute of Scientific and Technical Information of China (English)

    Jin Ye; Jing Lin; Jiawei Huang

    2015-01-01

    In modern data centers, because of the deadline-agnostic congestion control in the transmission control protocol (TCP), many deadline-sensitive flows cannot finish before their deadlines. Therefore, providing a higher deadline meeting ratio becomes a critical challenge in the typical online data intensive (OLDI) applications of data center networks (DCNs). However, a problem we name priority synchronization is identified in this paper, which severely decreases the deadline meeting ratio. To solve this problem, we propose a priority probability deceleration (P2D) deadline-aware TCP. By using novel probabilistic deceleration, P2D prevents the priority synchronization problem. Simulation results show that P2D increases the deadline meeting ratio by 20% compared with D2TCP.

  6. Using L/E Oscillation Probability Distributions

    CERN Document Server

    Aguilar-Arevalo, A A; Bugel, L; Cheng, G; Church, E D; Conrad, J M; Dharmapalan, R; Djurcic, Z; Finley, D A; Ford, R; Garcia, F G; Garvey, G T; Grange, J; Huelsnitz, W; Ignarra, C; Imlay, R; Johnson, R A; Karagiorgi, G; Katori, T; Kobilarcik, T; Louis, W C; Mariani, C; Marsh, W; Mills, G B; Mirabal, J; Moore, C D; Mousseau, J; Nienaber, P; Osmanov, B; Pavlovic, Z; Perevalov, D; Polly, C C; Ray, H; Roe, B P; Russell, A D; Shaevitz, M H; Spitz, J; Stancu, I; Tayloe, R; Van de Water, R G; White, D H; Wickremasinghe, D A; Zeller, G P; Zimmerman, E D

    2014-01-01

    This paper explores the use of $L/E$ oscillation probability distributions to compare experimental measurements and to evaluate oscillation models. In this case, $L$ is the distance of neutrino travel and $E$ is a measure of the interacting neutrino's energy. While comparisons using allowed and excluded regions for oscillation model parameters are likely the only rigorous method for these comparisons, the $L/E$ distributions are shown to give qualitative information on the agreement of an experiment's data with a simple two-neutrino oscillation model. In more detail, this paper also outlines how the $L/E$ distributions can be best calculated and used for model comparisons. Specifically, the paper presents the $L/E$ data points for the final MiniBooNE data samples and, in the Appendix, explains and corrects the mistaken analysis published by the ICARUS collaboration.
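    The simple two-neutrino model referenced above has the textbook form P = sin²(2θ) · sin²(1.27 Δm² L/E), with Δm² in eV² and L/E in km/GeV. A minimal sketch for evaluating the probability along an L/E axis (parameter values are illustrative, not MiniBooNE results):

    ```python
    import numpy as np

    def p_oscillation(l_over_e, sin2_2theta, dm2):
        """Two-neutrino appearance probability; l_over_e in km/GeV, dm2 in eV^2."""
        return sin2_2theta * np.sin(1.27 * dm2 * l_over_e) ** 2

    l_over_e = np.linspace(0.1, 3.0, 10)  # km/GeV
    print(p_oscillation(l_over_e, sin2_2theta=0.01, dm2=1.0))
    ```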

  7. Comprehensive transition probabilities in Mo I

    International Nuclear Information System (INIS)

    Transition probabilities for 2835 lines in Mo I between 2548 Å and 10565 Å have been measured by combining radiative level lifetimes, excited level populations measured in an inductively coupled plasma (ICP) source, and emission branching ratios measured with the ICP source and with a hollow cathode discharge source. We show that the level populations in the ICP source approximate a thermal distribution and use this property to interpolate between levels of known lifetime to find the population of levels of unknown lifetime. Comparison of the spectra from the two different sources distinguishes between Mo I and Mo II lines and detects hidden blends and self-absorption in the hollow cathode source. Improved excitation energies for many Mo I levels were extracted from the high resolution Fourier transform spectrum, and 27 new levels were found. (orig.)

  8. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P(sub c)) for conjunction events. The model attempts to predict the location and magnitude of the peak P(sub c) value for an event by assuming the progression of P(sub c) values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak (P(sub c)) and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
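    A minimal sketch of the first-order idea (fitting the logarithm of P(sub c) for numerical convenience, and leaving out the Bayesian prior machinery the abstract mentions): fit a downward-opening parabola to a conjunction's P(sub c) history and read off the vertex as the predicted peak. The data points below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical history: days to closest approach vs. log10 of Pc.
    t = np.array([-5.0, -4.0, -3.0, -2.0, -1.5])
    log_pc = np.array([-7.2, -6.1, -5.4, -5.0, -4.9])

    a, b, c = np.polyfit(t, log_pc, 2)  # a < 0 for a downward-opening parabola
    t_peak = -b / (2 * a)               # vertex: predicted time of the peak Pc
    pc_peak = 10 ** np.polyval([a, b, c], t_peak)
    print(t_peak, pc_peak)
    ```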

  9. Probability of seeing increases saccadic readiness.

    Directory of Open Access Journals (Sweden)

    Thérèse Collins

    Full Text Available Associating movement directions or endpoints with monetary rewards or costs influences movement parameters in humans, and associating movement directions or endpoints with food reward influences movement parameters in non-human primates. Rewarded movements are facilitated relative to non-rewarded movements. The present study examined to what extent successful foveation facilitated saccadic eye movement behavior, with the hypothesis that foveation may constitute an informational reward. Human adults performed saccades to peripheral targets that either remained visible after saccade completion or were extinguished, preventing visual feedback. Saccades to targets that were systematically extinguished were slower and easier to inhibit than saccades to targets that afforded successful foveation, and this effect was modulated by the probability of successful foveation. These results suggest that successful foveation facilitates behavior, and that obtaining the expected sensory consequences of a saccadic eye movement may serve as a reward for the oculomotor system.

  10. Probability of detection for corrosion defects

    Energy Technology Data Exchange (ETDEWEB)

    Rudlin, J.R.; Kenzie, B.W. [TWI Ltd., Cambridge (United Kingdom)

    2002-07-01

    A variety of NDT techniques have been made available by the industry to assess the wall thickness of a pipe. These include manual ultrasonics, automated ultrasonics, magnetic flux leakage, guided wave ultrasonics and pulsed eddy current. These methods have been evaluated for the detection and sizing of localised corrosion on a set of around 50 test pipes in a project carried out by TWI, University College London, Technical Software Consultants and Bureau Veritas. Trials of the inspection methods were carried out in various situations including coated pipe and for corrosion under insulation. This paper describes the trials carried out and discusses some of the difficulties involved in producing and using probability of detection data when assessing corrosion. (orig.)

  11. Ignorance is not bliss: Statistical power is not probability of trial success.

    Science.gov (United States)

    Zierhut, M L; Bycott, P; Gibbs, M A; Smith, B P; Vicini, P

    2016-04-01

    The purpose of this commentary is to place probability of trial success, or assurance, in the context of decision making in drug development, and to illustrate its properties in an intuitive manner for the readers of Clinical Pharmacology and Therapeutics. The hope is that this will stimulate a dialog on how assurance should be incorporated into a quantitative decision approach for clinical development and trial design that uses all available information.

  12. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for calculating the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal", so that the logic probability can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
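    The minterm expansion gives one concrete orthogonalization: minterms are pairwise disjoint, so the signal probability is simply the sum of the probabilities of the satisfying assignments. The brute-force sketch below illustrates the principle for independent inputs; it is not the paper's algorithm, which avoids full enumeration.

    ```python
    from itertools import product

    def signal_probability(f, p):
        """Probability that Boolean function f of independent inputs evaluates to 1.
        Minterms are mutually orthogonal, so their probabilities add directly."""
        total = 0.0
        for bits in product((0, 1), repeat=len(p)):
            if f(*bits):
                term = 1.0
                for b, pi in zip(bits, p):
                    term *= pi if b else (1.0 - pi)
                total += term
        return total

    # f(a, b, c) = a OR (b AND c); inputs independent with the given 1-probabilities.
    f = lambda a, b, c: a or (b and c)
    print(signal_probability(f, [0.5, 0.7, 0.2]))  # 0.5 + 0.5*0.7*0.2 = 0.57
    ```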

  13. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  14. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    Science.gov (United States)

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
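    The sense in which matching is nonoptimal is easy to make concrete: if one alternative is reinforced with probability p, matching earns p² + (1-p)² on average while always choosing the more likely alternative earns max(p, 1-p). A two-line check:

    ```python
    p = 0.7                               # reinforcement probability of alternative A
    matching = p * p + (1 - p) * (1 - p)  # choose A with probability p
    maximizing = max(p, 1 - p)            # always choose the more likely alternative
    print(matching, maximizing)           # 0.58 vs 0.70
    ```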

  15. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
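
    As a generic illustration of the last of these components (not WIPP data), the following sketch builds an empirical complementary cumulative distribution function, P(consequence > x), from Monte Carlo samples of a hypothetical consequence measure:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical consequence measure (e.g., a normalized release);
# the lognormal choice is purely illustrative.
releases = rng.lognormal(mean=-1.0, sigma=1.0, size=10_000)

# Empirical CCDF: for each sorted value x_i, the fraction of samples
# strictly greater than x_i.
x = np.sort(releases)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size

# Exceedance probability at a particular threshold:
print(f"P(release > 1.0) ~= {(releases > 1.0).mean():.4f}")
```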

  16. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  17. Logic and probability in quantum mechanics

    CERN Document Server

    1976-01-01

    During the academic years 1972-1973 and 1973-1974, an intensive seminar on the foundations of quantum mechanics met at Stanford on a regular basis. The extensive exploration of ideas in the seminar led to the organization of a double issue of Synthese concerned with the foundations of quantum mechanics, especially with the role of logic and probability in quantum mechanics. About half of the articles in the volume grew out of this seminar. The remaining articles have been solicited explicitly from individuals who are actively working in the foundations of quantum mechanics. Seventeen of the twenty-one articles appeared in Volume 29 of Synthese. Four additional articles and a bibliography on the history and philosophy of quantum mechanics have been added to the present volume. In particular, the articles by Bub, Demopoulos, and Lande, as well as the second article by Zanotti and myself, appear for the first time in the present volume. In preparing the articles for publication I am much indebted to ...

  18. Essays on probability elicitation scoring rules

    Science.gov (United States)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting the advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and to also consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
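
    A minimal numerical sketch of the contrast drawn in the abstract (all distributions assumed for illustration): the quadratic (Brier-type) score of f(θ) against the indicator d(θ), versus its score against a distribution g(θ) that models an aleatory Θ, plus an entropy-like alternative.

```python
import numpy as np

# Expert's elicited distribution f over three possible outcomes of Theta.
f = np.array([0.6, 0.3, 0.1])

# Classical use: Theta turned out to be outcome 0, so d is an indicator.
d = np.array([1.0, 0.0, 0.0])
brier_vs_observation = np.sum((f - d) ** 2)

# Extension sketched in the abstract: Theta is aleatory with distribution g,
# so score the adherence of f to g instead of to a single observation.
g = np.array([0.5, 0.4, 0.1])
brier_vs_distribution = np.sum((f - g) ** 2)

# A KL-divergence (entropy-like) alternative for the same comparison.
kl = np.sum(g * np.log(g / f))

print(brier_vs_observation, brier_vs_distribution, kl)
```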

  19. Lectures on probability and statistics. Revision

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.

  20. Probability of rupture of multiple fault segments

    Science.gov (United States)

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting both to geologic slip rates and to an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies to fit a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  1. Lectures on Probability, Entropy, and Statistical Physics

    CERN Document Server

    Caticha, Ariel

    2008-01-01

    These lectures deal with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty? Or, at least, are there rules that could in principle be followed by an ideally rational mind when discussing scientific matters? What makes one statement more plausible than another? How much more plausible? And then, when new information is acquired how do we change our minds? Or, to put it differently, are there rules for learning? Are there rules for processing information that are objective and consistent? Are they unique? And, come to think of it, what, after all, is information? It is clear that data contains or conveys information, but what does this precisely mean? Can information be conveyed in other ways? Is information physical? Can we measure amounts of information? Do we need to? Our goal is to develop the main tools for inductive inference--probability and entropy--from a thoroughly Bayesian point of view a...

  2. Do aftershock probabilities decay with time?

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
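
    The arithmetic behind this claim is easy to check. With rate K/t, the expected count in a window spanning t ± 10% is the integral of K/u from 0.9t to 1.1t, which equals K·ln(11/9) regardless of t (the value of K below is arbitrary):

```python
import math

K = 100.0   # Omori productivity constant, arbitrary for the example

# Expected aftershocks in the window [0.9 t, 1.1 t] for rate K/t:
# integral of K/u du = K * (ln(1.1 t) - ln(0.9 t)) = K * ln(11/9), for any t.
for t in (1, 7, 30, 365, 36_500):   # a day, week, month, year, century
    count = K * (math.log(1.1 * t) - math.log(0.9 * t))
    print(f"t = {t:>6} days: expected count = {count:.4f}")
```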

  3. XI Symposium on Probability and Stochastic Processes

    CERN Document Server

    Pardo, Juan; Rivero, Víctor; Bravo, Gerónimo

    2015-01-01

    This volume features lecture notes and a collection of contributed articles from the XI Symposium on Probability and Stochastic Processes, held at CIMAT Mexico in September 2013. Since the symposium was part of the activities organized in Mexico to celebrate the International Year of Statistics, the program included topics from the interface between statistics and stochastic processes. The book starts with notes from the mini-course given by Louigi Addario-Berry with an accessible description of some features of the multiplicative coalescent and its connection with random graphs and minimum spanning trees. It includes a number of exercises and a section on unanswered questions. Further contributions provide the reader with a broad perspective on the state of the art of active areas of research. Contributions by: Louigi Addario-Berry Octavio Arizmendi Fabrice Baudoin Jochen Blath Loïc Chaumont J. Armando Domínguez-Molina Bjarki Eldon Shui Feng Tulio Gaxiola Adrián González Casanova Evgueni Gordienko Daniel...

  4. Lectures on probability and statistics. Revision

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.

  5. Implications of conflicting definitions of probability to health risk communication: a case study of familial cancer and genetic counselling.

    Science.gov (United States)

    O'Doherty, Kieran C

    2007-02-01

    The question of what probability actually is has long been debated in philosophy and statistics. Although the concept of probability is fundamental to many applications in the health sciences, these debates are generally not well known to health professionals. This paper begins with an outline of some of the different interpretations of probability. Examples are provided of how each interpretation manifests in clinical practice. The discipline of genetic counselling (familial cancer) is used to ground the discussion. In the second part of the paper, some of the implications that different interpretations of probability may have in practice are examined. The main purpose of the paper is to draw attention to the fact that there is much contention as to the nature of the concept of probability. In practice, this creates the potential for ambiguity and confusion. This paper constitutes a call for deeper engagement with the ways in which probability and risk are understood in health research and practice.

  6. BETASCAN: probable beta-amyloids identified by pairwise probabilistic analysis.

    Directory of Open Access Journals (Sweden)

    Allen W Bryan

    2009-03-01

    Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant. The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid.

  7. MATHEMATICAL EXPECTATION ABOUT DISCRETE RANDOM VARIABLE WITH INTERVAL PROBABILITY OR FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The characteristics of, and algorithms for, DRVIPs (discrete random variables with interval probabilities) and DRVFPs of the second kind (discrete random variables with crisp events and fuzzy probabilities) are investigated. Using the fuzzy resolution theorem, computing the mathematical expectation of a DRVFP can be translated into computing the mathematical expectation of a series of DRVIPs. Computing the mathematical expectation of a DRVIP is a typical linear programming problem. A practical calculating formula for the mathematical expectation of a DRVIP was obtained using Dantzig's simplex method. An example indicates that the result obtained with this formula agrees completely with the result obtained by the linear programming method, while the derived formula is simpler to apply.
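
    The linear-programming view is easy to reproduce. The sketch below (illustrative values; scipy assumed available) brackets the expectation of a discrete random variable whose outcome probabilities are only known to lie in intervals:

```python
import numpy as np
from scipy.optimize import linprog

# Discrete random variable: values x_i with interval probabilities [lo_i, hi_i].
x = np.array([1.0, 2.0, 5.0])
lo = np.array([0.2, 0.3, 0.1])
hi = np.array([0.5, 0.6, 0.4])

# The probabilities must sum to 1; the intervals become box bounds.
A_eq, b_eq = np.ones((1, 3)), np.array([1.0])
bounds = list(zip(lo, hi))

# Lower bound on E[X]: minimize sum(x_i * p_i).
low = linprog(c=x, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
# Upper bound on E[X]: minimize -sum(x_i * p_i).
high = linprog(c=-x, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

print(f"E[X] lies in [{low.fun:.3f}, {-high.fun:.3f}]")   # [1.800, 3.000]
```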

  8. On Explicit Probability Densities Associated with Fuss-Catalan Numbers

    OpenAIRE

    Liu, Dang-Zheng; Song, Chunwei; Wang, Zheng-Dong

    2010-01-01

    In this note we give explicitly a family of probability densities, the moments of which are Fuss-Catalan numbers. The densities appear naturally in random matrices, free probability and other contexts.
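
    For reference, under one standard convention the Fuss-Catalan numbers are A_n(p, r) = r/(np + r) · C(np + r, n); the snippet below computes them and recovers the ordinary Catalan numbers at p = 2, r = 1 (a quick sanity check, not code from the paper):

```python
from math import comb

def fuss_catalan(n, p, r=1):
    """A_n(p, r) = r / (n p + r) * C(n p + r, n); the division is exact."""
    return r * comb(n * p + r, n) // (n * p + r)

# p = 2 recovers the ordinary Catalan numbers 1, 1, 2, 5, 14, 42, ...
print([fuss_catalan(n, 2) for n in range(6)])
# p = 3 gives 1, 1, 3, 12, 55, 273, the moment sequence of one such density.
print([fuss_catalan(n, 3) for n in range(6)])
```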

  9. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. PMID:25704578
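
    One way the probability format can matter is through nonlinear probability weighting. The sketch below prices a simple prospect under the one-parameter Tversky-Kahneman (1992) weighting function from cumulative prospect theory; the prospect and the parameter value are assumed for illustration, not taken from this study.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) one-parameter probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

value, p = 100.0, 0.08   # prospect: win 100 with probability .08

ev = p * value               # normative expected value
cpt = tk_weight(p) * value   # weighted price, with a linear value function

print(f"EV = {ev:.2f}, CPT-weighted price = {cpt:.2f}")
# The small probability is overweighted, so the weighted price exceeds EV.
```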

  10. Probability-summation model of multiple laser-exposure effects.

    Science.gov (United States)

    Menendez, A R; Cheney, F E; Zuclich, J A; Crump, P

    1993-11-01

    A probability-summation model is introduced to provide quantitative criteria for discriminating independent from interactive effects of multiple laser exposures on biological tissue. Data that differ statistically from predictions of the probability-summation model indicate the action of sensitizing (synergistic/positive) or desensitizing (hardening/negative) biophysical interactions. Interactions are indicated when response probabilities vary with changes in the spatial or temporal separation of exposures. In the absence of interactions, probability-summation parsimoniously accounts for "cumulative" effects. Data analyzed using the probability-summation model show instances of both sensitization and desensitization of retinal tissue by laser exposures. Other results are shown to be consistent with probability-summation. The relevance of the probability-summation model to previous laser-bioeffects studies, models, and safety standards is discussed and an appeal is made for improved empirical estimates of response probabilities for single exposures.
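
    The model's central computation reduces to the independence formula: the probability of a response to n exposures is one minus the product of the single-exposure non-response probabilities. A minimal sketch with assumed numbers:

```python
def summation_probability(ps):
    """P(response to at least one exposure), assuming independent effects."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

# Three exposures, each with a 10% single-exposure response probability:
predicted = summation_probability([0.10, 0.10, 0.10])
print(f"predicted combined probability: {predicted:.3f}")   # 0.271

# An observed combined probability well above this prediction would suggest
# sensitization; well below it, desensitization (hardening).
```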

  11. Polarization Mode Dispersion Probability Distribution for Arbitrary Mode Coupling

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The probability distribution of the differential group delay (DGD) for arbitrary mode coupling is simulated with the Monte Carlo method. By fitting the simulation results, we obtain the probability distribution function of the DGD for arbitrary mode coupling.
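
    A toy version of such a simulation (a sketch of the general approach, not the paper's model): treat each fiber section as contributing a fixed-length, randomly oriented PMD vector, take the DGD as the magnitude of the vector sum, and examine the resulting distribution, which approaches a Maxwellian in the strong-coupling limit. Section count and per-section DGD are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sections, n_trials, tau = 50, 20_000, 0.1   # per-section DGD (ps), toy values

# Each section contributes a PMD vector of fixed length tau and random
# direction; the fiber's DGD is the magnitude of the vector sum.
v = rng.normal(size=(n_trials, n_sections, 3))
v /= np.linalg.norm(v, axis=2, keepdims=True)
dgd = np.linalg.norm(tau * v.sum(axis=1), axis=1)

# The histogram of `dgd` is what a distribution fit would be run against.
print(f"mean DGD ~ {dgd.mean():.3f} ps, rms ~ {np.sqrt((dgd**2).mean()):.3f} ps")
```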

  12. The Concept of Probability in the Work of Lord Keynes

    OpenAIRE

    Alberto Landro

    2014-01-01

    The interpretation given by Keynes to the notion of probability leads to a model in which probability is understood as a degree of rational belief, conceived as a relationship between a body of knowledge and a proposition or set of propositions. A thoughtful analysis of the "Treatise on Probability" allows one to conclude that: i) the Keynesian model is not a consequence but an extension of "Principia Mathematica" and "Problems of Philosophy", in which the approach to the concept of probability is ...

  13. How to Read Probability Distributions as Statements about Process

    OpenAIRE

    Frank, Steven A.

    2014-01-01

    Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...

  14. Contingency and its two indices within conditional probability analysis

    OpenAIRE

    Watson, John S.

    1997-01-01

    Four theoretical bases for detecting a contingency between behavior and consequent stimuli are considered: contiguity, correlation, conditional probability, and logical implication. It is argued that conditional probability analysis is statistically the most powerful of these options, in part due to its provision of two indices of contingency: a forward time probability that reinforcement follows behavior and a backward time probability that behavior precedes reinforcement. Evidence is cited ...
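
    The two indices are straightforward to compute from a 2x2 record of behavior and consequence. All counts below are hypothetical:

```python
# Counts from a hypothetical behavior/reinforcement record:
#                 reinforced   not reinforced
# behavior            30             20
# no behavior         10             40
b_r, b_nr, nb_r, nb_nr = 30, 20, 10, 40

# Forward-time index: P(reinforcement | behavior)
forward = b_r / (b_r + b_nr)                         # 0.60

# Backward-time index: P(behavior preceded | reinforcement)
backward = b_r / (b_r + nb_r)                        # 0.75

# Base rate of reinforcement, for comparison:
base = (b_r + nb_r) / (b_r + b_nr + nb_r + nb_nr)    # 0.40

print(forward, backward, base)
# Conditional probabilities exceeding the base rate indicate a positive
# behavior-reinforcement contingency.
```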

  15. Loss bounds for uncertain transition probabilities in Markov decision processes

    OpenAIRE

    Jaillet, Patrick; Mastin, Dana Andrew

    2012-01-01

    We analyze losses resulting from uncertain transition probabilities in Markov decision processes with bounded nonnegative rewards. We assume that policies are precomputed using exact dynamic programming with the estimated transition probabilities, but the system evolves according to different, true transition probabilities. Given a bound on the total variation error of estimated transition probability distributions, we derive upper bounds on the loss of expected total reward. The approach ana...
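
    A compact sketch of the setting (a made-up two-state MDP, not the paper's analysis): compute the optimal policy from estimated transition probabilities, then measure the lost expected total reward when the world follows the true probabilities.

```python
import numpy as np

gamma = 0.9
# P[a][s, s']: transition matrices; R[s, a]: bounded nonnegative rewards.
P_true = [np.array([[0.8, 0.2], [0.3, 0.7]]),    # action 0
          np.array([[0.1, 0.9], [0.0, 1.0]])]    # action 1
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# Estimated transitions: action 1 from state 0 is off by 0.4 in total variation.
P_est = [P_true[0],
         np.array([[0.5, 0.5], [0.0, 1.0]])]

def optimal_policy(P, n_iter=500):
    """Greedy policy from value iteration under transition model P."""
    V = np.zeros(2)
    for _ in range(n_iter):
        Q = R + gamma * np.column_stack([P[a] @ V for a in (0, 1)])
        V = Q.max(axis=1)
    return Q.argmax(axis=1)

def policy_value(policy, P):
    """Exact value of a stationary policy: solve (I - gamma P_pi) V = R_pi."""
    P_pi = np.array([P[policy[s]][s] for s in (0, 1)])
    R_pi = np.array([R[s, policy[s]] for s in (0, 1)])
    return np.linalg.solve(np.eye(2) - gamma * P_pi, R_pi)

pi_est = optimal_policy(P_est)     # policy precomputed on the estimates
pi_true = optimal_policy(P_true)   # policy under perfect knowledge
loss = policy_value(pi_true, P_true) - policy_value(pi_est, P_true)
print(f"policies: est={pi_est}, true={pi_true}, per-state loss: {loss}")
```

    Here the estimation error flips the choice in state 0, and the printed loss is the quantity the paper's bounds control in terms of the total variation error.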

  16. Transition probabilities for diffusion equations by means of path integrals

    OpenAIRE

    Goovaerts, Marc; DE SCHEPPER, Ann; Decamps, Marc

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...
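
    For orientation, the classical closed-form benchmark that such path-integral calculations reproduce is the transition density of Brownian motion with drift mu and volatility sigma (a standard result, stated here for context rather than taken from the paper):

```latex
p_t(x, y) = \frac{1}{\sigma\sqrt{2\pi t}}
            \exp\!\left( -\frac{(y - x - \mu t)^2}{2 \sigma^2 t} \right)
```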

  17. Transition probabilities for diffusion equations by means of path integrals.

    OpenAIRE

    Goovaerts, Marc; De Schepper, A; Decamps, M.

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...

  18. Series Jackson networks and non-crossing probabilities

    OpenAIRE

    Dieker, A.B.; Warren, J.

    2008-01-01

    This paper studies the queue length process in series Jackson networks with external input to the first station. We show that its Markov transition probabilities can be written as a finite sum of non-crossing probabilities, so that questions on time-dependent queueing behavior are translated to questions on non-crossing probabilities. This makes previous work on non-crossing probabilities relevant to queueing systems and allows new queueing results to be established. To illustrate the latter,...

  19. The Elicitation of Subjective Probabilities with Applications in Agricultural Economics

    OpenAIRE

    Patricia E. NORRIS; Randall A. Kramer

    1990-01-01

    Probability judgements are important components of decision making under uncertainty. In particular, economic decisions can be aided by ensuring more accurate assessment of probabilities and more realistic modelling of economic problems through the inclusion of subjective probabilities. The purpose of this paper is to describe the techniques which can be used to elicit subjective probabilities and the ways in which these techniques can be incorporated into agricultural economics research. The...